8.0 - 12.0 years
0 Lacs
Pune, Maharashtra
On-site
You should have 8-12 years of experience, a strong overall knowledge of the Software Development Life Cycle, and expertise in Microsoft .NET (C#, Web API, .NET Core) version 6 and above. Your responsibilities will include API development, testing, and ensuring operational stability through proficient coding practices. Additionally, you should have experience working with AWS services such as EC2, S3, Lambda, DynamoDB, and API Gateway. Experience in Angular UI development is also required for this role. Hands-on experience with SQL databases (MySQL/Oracle DB/SQL Server) and NoSQL databases is necessary, along with experience in developing and debugging database queries and working with Elasticsearch. Strong analytical, problem-solving, time management, and organizational skills are essential, and knowledge of microservices-based architecture is expected. Nice-to-have skills include experience with JavaScript or JavaScript-based frameworks. Advanced skills in core .NET, AWS services, and Angular UI development, along with a solid understanding of the Software Development Life Cycle, are preferred.

About Virtusa: At Virtusa, we value teamwork, quality of life, and professional and personal development. You will be part of a global team of 27,000 professionals who prioritize your growth and provide exciting projects and opportunities. We work with state-of-the-art technologies and believe in nurturing new ideas and fostering excellence in a collaborative team environment.
Posted 1 month ago
5.0 - 10.0 years
0 Lacs
Karnataka
On-site
The Chapter Lead, Full Stack Engineering Java Spring, WRB Tech role is a hands-on developer position covering both back-end and front-end development. In addition to technical responsibilities, the role is accountable for people management and capability development within the Chapter. As the Chapter Lead, you will oversee the execution of functional standards and best practices, provide technical assistance to Chapter members, and ensure the quality of the code repository. You will act as a conduit for the wider domain strategy and prioritize capacity for technical debt. This role emphasizes capability building rather than owning applications or delivery.

In terms of business responsibilities, you will contribute hands-on to the Engineering squad, balance skills and capabilities across teams in partnership with relevant stakeholders, and work towards improving automation, simplification, and innovative use of the latest technology trends. You will adopt and embed Change Delivery Standards, ensure clear role descriptions and expectations, and follow the Chapter operating model to build capability and performance effectively.

As a Chapter Lead, you will be accountable for people management and capability development within your Chapter. You will review metrics on capabilities and performance, drive continual improvement, and focus on the development of people and capabilities as a top priority. Risk management and governance are also key aspects of the role, involving effective capacity risk management, adherence to risk management standards, and ensuring compliance with required policies.

To qualify for this role, you should hold a Bachelor's or Master's degree in Computer Science, Engineering, or a related field, and have a comprehensive background in full-stack development. With at least 10 years of experience in software development, including 5 years focused on full-stack projects involving Java and Spring frameworks, you should also have demonstrable experience in front-end development and strong communication and team-building skills. The ability to lead in dynamic, iterative development environments is essential. Specific technical competencies required for this role include proficiency in Java (Spring), RESTful APIs, relational databases (e.g., MySQL or Postgres), JavaScript ES6, ReactJS or Angular or Vue, HTML5/CSS/DOM/responsive design principles, network fundamentals, security fundamentals, and CI/CD. Additionally, knowledge of NoSQL databases (e.g., MongoDB or Cassandra) and native mobile development (iOS/Android) would be advantageous.

Standard Chartered is an international bank committed to making a positive difference for its clients, communities, and employees. The organization values diversity, inclusion, integrity, continuous improvement, innovation, and collective growth. As an employee, you can expect core bank funding for retirement savings, medical and life insurance, flexible time-off policies, proactive wellbeing support, continuous learning opportunities, and a supportive and inclusive work environment. If you are looking for a purpose-driven career in a bank that values diversity and inclusion, Standard Chartered welcomes you to join their team and contribute to driving commerce and prosperity through unique diversity and inclusive practices.
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
Indore, Madhya Pradesh
On-site
You should have expert-level proficiency in Python and Python frameworks, or in Java. You must have hands-on experience with AWS development, PySpark, Lambdas, CloudWatch (alerts), SNS, SQS, CloudFormation, Docker, ECS, Fargate, and ECR. Deep experience with key AWS services is required, including Compute (PySpark, Lambda, ECS), Storage (S3), Databases (DynamoDB, Snowflake), Networking (VPC, Route 53, CloudFront, API Gateway), DevOps/CI-CD (CloudFormation, CDK), Security (IAM, KMS, Secrets Manager), and Monitoring (CloudWatch, X-Ray, CloudTrail), as well as NoSQL databases like Cassandra and PostgreSQL. You should have very strong hands-on knowledge of using Python for integrations between systems through different data formats. Expertise in deploying and maintaining applications in AWS, along with hands-on experience with Kinesis streams and auto-scaling, is essential. Designing and implementing distributed systems and microservices, and following best practices for scalability, high availability, and fault tolerance are key responsibilities. Strong problem-solving and debugging skills are necessary for this role. You should also have the ability to lead technical discussions and mentor junior engineers. Excellent written and verbal communication skills are a must. You should be comfortable working in agile teams with modern development practices and collaborating with business and other teams to understand business requirements and work on project deliverables. Participation in requirements gathering, understanding requirements, and designing solutions based on the available framework and code, as well as experience with data engineering tools or ML platforms (e.g., Pandas, Airflow, SageMaker), are required. An AWS certification such as AWS Certified Solutions Architect or Developer is preferred. This position is based in multiple locations in India, including Indore, Mumbai, Noida, Bangalore, and Chennai.

Qualifications:
- Bachelor's degree or foreign equivalent from an accredited institution required. Consideration will be given to three years of progressive experience in the specialty in lieu of every year of education.
- At least 8 years of Information Technology experience.
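For illustration only (not part of the original posting): a minimal sketch of the kind of Python Lambda handler this role describes, consuming SQS messages and persisting them to DynamoDB with boto3. The message shape, the `orders` table, and its `order_id` key are assumptions.

```python
import json

import boto3

# Assumption: a DynamoDB table named "orders" exists in this account/region.
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("orders")


def handler(event, context):
    """Entry point for an SQS-triggered Lambda: persist each message to DynamoDB."""
    records = event.get("Records", [])
    for record in records:
        payload = json.loads(record["body"])  # assumed JSON message body
        table.put_item(
            Item={
                "order_id": payload["order_id"],  # assumed partition key
                "status": payload.get("status", "RECEIVED"),
            }
        )
    return {"processed": len(records)}
```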
Posted 1 month ago
2.0 - 7.0 years
0 Lacs
Karnataka
On-site
You should have 2-7 years of experience in Python with a good understanding of big data ecosystems and frameworks such as Hadoop and Spark. Your experience should include developing data processing tasks using PySpark and expertise in at least one popular cloud provider, preferably AWS. Additionally, you should possess good knowledge of any RDBMS/NoSQL database with strong SQL writing skills. Experience with data warehouse tools like Snowflake and any ETL tool would be a plus. Strong analytical and problem-solving capabilities are essential, along with excellent verbal and written communication skills. Client-facing skills are required, as you will work directly with clients to build trusted relationships with stakeholders. The ability to collaborate effectively across global teams is crucial. You should have a strong understanding of data structures, algorithms, object-oriented design, and design patterns. Experience with multi-dimensional data, data curation processes, and data quality improvement is desired, as is general knowledge of business processes, data flows, and quantitative models. An independent thinker who is willing to engage, challenge, and learn new technologies would be an ideal fit for this role.

Role & Responsibilities:
- Maintain high-quality coding standards and deliver work within the stipulated time frame.
- Review the work of team members and occasionally provide guidance.
- Develop an understanding of the Work Breakdown Structure and assist the manager in delivering it.
- Develop sector initiatives such as credential building and knowledge management.
- Act as a team lead and proficiently deliver key responsibilities aligned with the project plan.
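For illustration only (not part of the posting): a minimal PySpark sketch of the kind of data processing task described above, reading a CSV from S3 and producing a simple aggregation. The bucket path and column names are assumptions.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-daily-summary").getOrCreate()

# Assumption: a headered CSV with columns order_date, region, amount.
orders = (
    spark.read.option("header", True)
    .option("inferSchema", True)
    .csv("s3://example-bucket/raw/orders/")
)

daily_summary = (
    orders.groupBy("order_date", "region")
    .agg(
        F.count("*").alias("order_count"),
        F.sum("amount").alias("total_amount"),
    )
)

# Write the aggregated result back as Parquet, partitioned by date.
daily_summary.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/orders_daily/"
)
```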
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
Karnataka
On-site
You will be joining HCL Software, the Product Development Division of HCL Tech, which develops, markets, sells, and supports over 20 product families across domains such as Customer Experience, Digital Solutions, Secure DevOps, and Security & Automation. With offices and labs worldwide, we cater to thousands of customers and strive to drive their success through continuous product innovation.

As a Sr. NodeJS Developer in our Product team with a minimum of 6 years of experience, you are expected to possess the following skills:
- A B.Tech/BE in Computer Science or a related technical field, or exceptional skills in related areas with practical software engineering experience.
- Proficiency in programming languages such as NodeJS, ReactJS, or JavaScript.
- A good understanding of DevSecOps practices, CI/CD tools, high-level design, and code reviews.
- Expertise in microservices and API development, implementing applications with MongoDB or any NoSQL database.
- The ability to analyze customer-reported product issues and provide timely short-term and long-term fixes.
- Excellent communication skills with customers.
- Familiarity with Docker, Kubernetes, OpenShift, or similar technologies is essential; experience with AWS and Azure DevOps is a plus.

Your responsibilities will include:
- Hands-on coding and code review as part of product development.
- Delivering quality features and bug fixes, and ensuring the product is scalable and user-friendly.
- Applying DevSecOps practices and CI/CD automation for frequent value delivery to customers.
- Acting as a client advocate, communicating with customers on upcoming features and issues, and working closely with Product Management on roadmap priorities.
- Managing customer escalations as L3/SME, providing innovative troubleshooting solutions with quick turnaround on product fixes.
- Creating technical write-ups for documentation team publications.
- Developing product-related vlogs or blogs for customers upon feature releases.
- Demonstrating product features to customers, partners, and pre-sales teams.
- Collaborating with global product teams for seamless coordination and development.
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
Kochi, Kerala
On-site
The ideal candidate for this role should be ready to join immediately and can share their details via email for quick processing. Act fast for immediate attention!

Roles and Responsibilities: Lead end-to-end Java-based solution design, development, and integration in a healthcare ecosystem. You will collaborate with product managers, business analysts, and architects to deliver high-quality solutions aligned with compliance standards such as HIPAA. Your role will involve guiding and mentoring development teams, enforcing coding standards, and ensuring adherence to design principles. Additionally, you will conduct technical reviews, identify risks, and suggest performance and scalability improvements. It will be your responsibility to drive DevOps practices, CI/CD pipelines, and cloud deployment strategies.

Must-Have Skills:
- 10+ years of experience in Java/J2EE development and solution design
- Expertise in Spring Boot, microservices, REST APIs, and SQL/NoSQL databases
- Strong knowledge of healthcare standards such as HL7, FHIR, EDI, or HIPAA compliance
- Experience with cloud platforms (AWS/Azure/GCP) and containerization (Docker, Kubernetes)
- Excellent leadership, communication, and stakeholder management skills

Good-to-Have Skills:
- Familiarity with Kafka, Elasticsearch, or CI/CD tools like Jenkins and GitHub Actions
- Knowledge of security frameworks (OAuth2, JWT) and healthcare data privacy policies
- Certification in cloud or healthcare technologies is a plus

Candidates with the above qualifications and skills are encouraged to apply by sending their details to nitin.patil@ust.com.
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
Thiruvananthapuram, Kerala
On-site
The ideal candidate should be ready to join immediately and can share their details via email at nitin.patil@ust.com for quick processing. Act fast for immediate attention!

Roles and Responsibilities: Lead end-to-end Java-based solution design, development, and integration in a healthcare ecosystem. Collaborate with product managers, business analysts, and architects to deliver high-quality solutions aligned with compliance standards (e.g., HIPAA). Guide and mentor development teams, enforce coding standards, and ensure adherence to design principles. Conduct technical reviews, identify risks, and suggest performance and scalability improvements. Drive DevOps practices, CI/CD pipelines, and cloud deployment strategies.

Must-Have Skills:
- 10+ years of experience in Java/J2EE development and solution design.
- Expertise in Spring Boot, microservices, REST APIs, and SQL/NoSQL databases.
- Strong knowledge of healthcare standards such as HL7, FHIR, EDI, or HIPAA compliance.
- Experience with cloud platforms (AWS/Azure/GCP) and containerization (Docker, Kubernetes).
- Excellent leadership, communication, and stakeholder management skills.

Good-to-Have Skills:
- Familiarity with Kafka, Elasticsearch, or CI/CD tools like Jenkins and GitHub Actions.
- Knowledge of security frameworks (OAuth2, JWT) and healthcare data privacy policies.
- Certification in cloud or healthcare technologies is a plus.
Posted 1 month ago
2.0 - 7.0 years
0 Lacs
Karnataka
On-site
You should have 2-7 years of experience in Python with a good understanding of big data ecosystems and frameworks such as Hadoop and Spark. Your experience should include developing data processing tasks using PySpark. Expertise in at least one popular cloud provider, preferably AWS, would be a plus. Additionally, you should possess good knowledge of any RDBMS/NoSQL database with strong SQL writing skills. Experience with data warehouse tools like Snowflake and any one ETL tool would be advantageous. Your skills should include strong analytical and problem-solving capability, excellent verbal and written communication skills, and the ability to work directly with clients to build trusted relationships with stakeholders. You should also be able to collaborate effectively across global teams and have a strong understanding of data structures, algorithms, object-oriented design, and design patterns. Experience in the use of multi-dimensional data, data curation processes, and the measurement and improvement of data quality is required. General knowledge of business processes, data flows, and quantitative models that generate or consume data is also preferred. You should be an independent thinker, willing to engage, challenge, and learn new technologies.

Role & Responsibilities:
- Maintain high-quality coding standards and deliver work within the stipulated time frame.
- Review the work of team members and occasionally provide guidance.
- Develop an understanding of the Work Breakdown Structure and assist the manager in delivering it.
- Develop sector initiatives such as credential building and knowledge management.
- Act as a team lead and proficiently deliver key responsibilities in line with the project plan.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY's GDS Tax Technology team's mission is to develop, implement, and integrate technology solutions that better serve our clients and engagement teams. As a member of EY's core Tax practice, you'll develop deep tax technical knowledge and outstanding database, data analytics, and programming skills. Ever-increasing regulations require tax departments to gather, organize, and study more data than ever before. Handling the variety and volume of data is often extremely challenging and time-consuming for a company. EY's GDS Tax Technology team members work closely with partners, clients, and tax technical subject matter experts to develop and incorporate technology solutions that enhance value-add, improve efficiencies, and enable clients with disruptive and market-leading tools supporting Tax. EY is currently seeking a Generative AI Application Developer (.NET) to join our Tax Technology practice in Bangalore and Kolkata, India.

**The Opportunity:** We're looking for Tax Seniors with expertise in full-stack application development using .NET C# for Generative AI applications to join the TTT team in the Tax Service Line. This is a fantastic opportunity to be part of a pioneering firm while being instrumental in the growth of a new service offering.

**Your Key Responsibilities:**
- Design, develop, and implement AI agents/plugins/interfaces and APIs, ensuring integration with various systems aligns with the core product/platform development strategy.
- Estimate and manage technical efforts, including work breakdown structures, risks, and solutions, while adhering to development methodologies and KPIs.
- Maintain effective communication within the team and with stakeholders, proactively managing expectations and collaborating on problem-solving.
- Contribute to the refinement of development/engineering methodologies and standards, anticipating potential issues and leading the resolution process.

**Skills and Attributes for Success:**

**Must-Have:**
- Skilled in full-stack application development with .NET C#, REST APIs, React or any other TypeScript-based UI framework, and SQL databases.
- Advanced knowledge of Azure services such as Azure App Service, Azure Functions, Entra ID, etc.
- Containerization: Docker, Azure Container Apps, Azure Kubernetes Service (AKS).
- NoSQL databases such as Cosmos DB or MongoDB.
- Working experience with source control such as Git or TFVC.
- CI/CD pipelines: Azure DevOps, GitHub Actions, etc.
- Generative AI application development with Azure OpenAI, Semantic Kernel, and vector databases such as Azure AI Search, Postgres, etc.
- Fundamental understanding of various types of Large Language Models (LLMs).
- Fundamental understanding of Retrieval-Augmented Generation (RAG) techniques.
- Fundamental understanding of classical AI/ML.
- Skilled in advanced prompt engineering.

**Nice-to-Have:**
- Awareness of various AI agent/agentic workflow frameworks and SDKs.
- Graph databases such as Neo4j.
- Experience with M365 Copilot Studio.
- Microsoft Azure AI-900/AI-102 certification.

**Behavioural Skills:**
- Excellent learning ability.
- Strong communication skills.
- Flexibility to work both independently and as part of a larger team.
- Strong analytical skills and attention to detail.
- The ability to adapt your work style to work with both internal and client team members.

**To qualify for the role, you must have:**
- A Bachelor's or Master's degree in Software Engineering, Information Technology, BE, or B.Tech.
- An overall 5-9 years of experience.

**Ideally, you'll also have:**
- Thorough knowledge of the Tax or Finance domain.
- Strong analytical skills and attention to detail.

**What We Look For:**
- A team of people with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment.
- An opportunity to be a part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide.
- Opportunities to work with EY TAS practices globally with leading businesses across a range of industries.

**What We Offer:** EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations - Argentina, China, India, the Philippines, Poland, and the UK - and with teams from all EY service lines, geographies, and sectors, playing a vital role in the delivery of the EY growth strategy. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We'll introduce you to an ever-expanding ecosystem of people, learning, skills, and insights that will stay with you throughout your career.
- Continuous learning: You'll develop the mindset and skills to navigate whatever comes next.
- Success, as defined by you: We'll provide the tools and flexibility, so you can make a meaningful impact, your way.
- Transformative leadership: We'll give you the insights, coaching, and confidence to be the leader the world needs.
- Diverse and inclusive culture: You'll be embraced for who you are and empowered to use your voice to help others find theirs.

EY | Building a better working world. EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Haryana
On-site
The ideal candidate for this role will have experience in developing RESTful APIs using .NET Core and related technologies.

Responsibilities:
- Design, develop, and maintain efficient and scalable backend APIs using .NET Core.
- Write clean, well-structured, and testable code in C#.
- Implement and maintain reliable database interactions using Oracle, SQL Server, and NoSQL databases (MongoDB and Couchbase).
- Integrate with external services and APIs as needed.
- Write unit test cases.
- Ensure the performance, quality, and security of APIs.
- Identify and address technical debt, code quality, and performance issues.
- Collaborate with frontend developers to ensure seamless integration.
- Maintain and improve existing APIs.
- Work with project managers to deliver projects on time and on budget.
- Stay up to date with emerging trends and technologies in API development.

Requirements:
- 5-7 years of experience in developing REST APIs using .NET Core (a minimum of 5 years in .NET Core is a must).
- Proficiency in C#, .NET Core, and related back-end technologies.
- Experience with Entity Framework Core and other ORMs.
- Experience integrating with Oracle, Postgres, SQL Server, and NoSQL databases; any additional database integration skills are valued.
- Solid understanding of REST API design principles.
- Experience with microservice architecture.
- Experience integrating with queue and messaging services (MQueue, Kafka, etc.).
- Experience implementing secure authentication and authorization mechanisms (OAuth, JWT).
- Experience with cloud platforms and containerization.
- Experience with unit testing tools (NUnit, MSTest, etc.).
- Proven experience in identifying and implementing performance optimizations.
- Strong understanding of object-oriented programming, design patterns, and software architecture principles.
- Strong problem-solving and analytical skills.

Please note that this job description was sourced from hirist.tech.
Posted 1 month ago
7.0 - 11.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
As a Senior Backend Engineer with over 7 years of experience, you will be responsible for compiling and analyzing data, processes, and code to troubleshoot problems and identify areas for improvement. Your role will involve implementing REST APIs using Spring Boot (Java), NodeJS, and Python, along with documenting the design of the technical implementation and creating technical diagrams based on the finalized approach. In addition, you will be tasked with developing ideas for new programs, products, or features by monitoring industry developments and trends, optimizing applications for maximum speed and scalability, and building reusable code and libraries for future use. Collaboration and problem-solving with other team members, as well as reviewing, testing, and debugging code, will be crucial aspects of your role. To excel in this position, you should have a strong grasp of OOP concepts, data structures and algorithms, and the web development cycle. Building REST APIs, ensuring maintainable code with unit tests, and preparing build scripts for projects using Maven/Gradle are key requirements. The ability to quickly adapt to new technologies, proficiency in SQL/NoSQL databases such as MySQL, MongoDB, and Postgres, and familiarity with design patterns and cloud technologies will be advantageous. If you possess a logical approach to problem-solving, excellent verbal and written communication skills, good analytical capabilities, and exposure to the ad-tech domain, you are the ideal candidate for this role.
Posted 1 month ago
4.0 - 9.0 years
16 - 20 Lacs
Bengaluru
Hybrid
Job Overview: We are looking for a Full Stack Developer to produce scalable software solutions. You will be part of a cross-functional team that is responsible for the full software development life cycle, from conception to deployment. As a Full Stack Developer, you should be comfortable with both front-end and back-end coding languages, development frameworks, and third-party libraries.

Role & responsibilities:
- Design front-end and back-end (server-side) code architecture.
- Work on currently developed applications (both front-end and back-end) to add new features and integrate the application with different partners and third-party APIs.
- Build the front-end of applications with appealing, mobile-responsive visual design.
- Develop and manage well-functioning databases and applications.
- Develop and integrate APIs with front-end components to enable seamless communication between the client and server.
- Write and execute unit, integration, and end-to-end tests to ensure code quality and functionality.
- Troubleshoot and debug the software and fix bugs.
Posted 1 month ago
5.0 - 10.0 years
7 - 17 Lacs
Bengaluru
Work from Office
About this role: Wells Fargo is seeking a Lead Software Engineer (Lead Data Engineer).

In this role, you will:
- Lead complex technology initiatives, including those that are companywide with broad impact.
- Act as a key participant in developing standards and companywide best practices for engineering complex and large-scale technology solutions for technology engineering disciplines.
- Design, code, test, debug, and document for projects and programs.
- Review and analyze complex, large-scale technology solutions for tactical and strategic business objectives, the enterprise technological environment, and technical challenges that require in-depth evaluation of multiple factors, including intangibles or unprecedented technical factors.
- Make decisions in developing standard and companywide best practices for engineering and technology solutions, requiring an understanding of industry best practices and new technologies, influencing and leading the technology team to meet deliverables and drive new initiatives.
- Collaborate and consult with key technical experts, the senior technology team, and external industry groups to resolve complex technical issues and achieve goals.
- Lead projects and teams, or serve as a peer mentor.

Required Qualifications:
- 5+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education.

Desired Qualifications:
- 5+ years of experience in data engineering.
- 5+ years of overall software development experience.
- 5+ years of Python development experience, including 3+ years with the Spark framework.
- 5+ years of Oracle or SQL Server experience in designing, coding, and delivering database applications.
- Expert knowledge and considerable development experience with at least two or more of the following: Kafka, ETL, Big Data, NoSQL databases, S3 or other object stores.
- Strong understanding of data flow design and how to implement your designs in Python.
- Experience in writing and debugging complex PL/SQL or T-SQL stored procedures.
- Excellent troubleshooting and debugging skills.
- Ability to analyze a feature story, design a robust solution for it, and create specs for complex business rules and calculations.
- Ability to understand business problems and articulate a corresponding solution.
- Excellent verbal, written, and interpersonal communication skills.

Job Expectations:
- Strong knowledge and understanding of the Dremio framework.
- Database query design and optimization.
- Strong experience using the development ecosystem of applications (JIRA, ALM, GitHub, uDeploy (Urban Code Deploy), Jenkins, Artifactory, SVN, etc.).
- Knowledge and understanding of multiple source code version control systems, working with branches, tags, and labels.
Posted 1 month ago
3.0 - 7.0 years
30 - 35 Lacs
Noida, Sector 123
Work from Office
What you'll be doing:
- Understand business requirements and identify the best approach to serve the requirement without risking the system's conceptual sanity.
- Drive implementation-approach discussions by taking complete ownership of partially defined problems.
- Practice TDD and help us retain/improve our code coverage.
- Optimise flows within systems for better performance in response times and memory consumption.
- Tear down unused flows and concepts to make way for new and evolved ones.

What are we looking for?
- Someone with at least 2 years of experience building backend systems using Java.
- Experience with NoSQL databases.
- Someone who takes ownership of delivery, not only closing their own work but also identifying dependencies and making sure they stay on track towards completion.
- Someone who can work both independently and in collaboration with the team as the situation demands.
- Someone who is not averse to learning and growth and has an open mind about inputs/suggestions from any stakeholder.
- Experience with DSA.
- Experience building backend systems in Java + Spring Boot.
- Experience working on products with massive user scale, especially 1 Mn+ DAUs.
Posted 1 month ago
5.0 - 7.0 years
7 - 9 Lacs
Bengaluru
Work from Office
Shift: (GMT+05:30) Asia/Kolkata (IST)

What do you need for this opportunity? Must-have skills required: Java, Spring, System Design, Microservices.

Infrrd is looking for someone who has:
- 5-8 years of experience developing back ends for web-based applications using Java.
- Experience mentoring a 4+ member offshore team, and experience on projects that require onsite-offshore coordination with an agile SCRUM delivery model.
- Excellent understanding of Core Java and the Spring framework (Spring, Spring MVC, Spring Data).
- Ability to work hands-on during development and coding, with strong debugging and problem-solving skills.
- Solid command of data structures.
- Experience with any NoSQL database.
- Experience with system design and microservices architecture.
- Working experience with tools such as Jira, any Java IDE, and GitHub.
- Good understanding of Kubernetes and AWS DevOps.

What you will do:
- Provide strong technical anchorship and be a primary gatekeeper of the Java team.
- Design and architect scalable solutions with high-level and low-level design specifications.
- Conduct code reviews, ensure code quality, implement code automation, and mentor team members on best practices and design principles.
- Ensure the deliverables are of the highest quality in terms of functional and technical aspects through the scrum process.
- Ensure in-sprint defects are closed with the highest quality and that any production defects are taken care of for a specific duration post-release.
- Work with the PM, architect, and the teams to manage the team and the deliverables from technical and functional aspects.
- Contribute individually, as well as provide team mentoring and guidance at appropriate instances.
Posted 2 months ago
6.0 - 8.0 years
25 - 27 Lacs
Hyderabad, Chennai
Work from Office
- Proficient in Nest.js
- Strong experience with Angular
- Experience with RESTful API development and integration
- MongoDB or any NoSQL database knowledge
- Familiarity with authentication and authorization (JWT, OAuth)
- Good understanding of server-side templating and error handling
- DevOps basics (CI/CD, Docker, etc.) is a plus
- Strong debugging and troubleshooting skills
Posted 2 months ago
6.0 - 8.0 years
25 - 27 Lacs
Gurugram, Bengaluru
Work from Office
- Proficient in Nest.js
- Strong experience with Angular
- Experience with RESTful API development and integration
- MongoDB or any NoSQL database knowledge
- Familiarity with authentication and authorization (JWT, OAuth)
- Good understanding of server-side templating and error handling
- DevOps basics (CI/CD, Docker, etc.) is a plus
- Strong debugging and troubleshooting skills
Posted 2 months ago
6.0 - 11.0 years
10 - 20 Lacs
Bengaluru
Work from Office
Position Overview: We are seeking an experienced and skilled Senior Database Developer to join our dynamic team. The ideal candidate will have at least 8 years of hands-on experience in database development, with a strong focus on Neo4j (graph) databases. The role involves working on cutting-edge projects, contributing to data modelling, and ensuring the scalability and efficiency of our database systems.

Responsibilities:
- Design, develop, and maintain databases, with a primary focus on Cypher/graph databases.
- Modify databases according to requests and perform tests.
- Perform advanced querying, performance tuning of databases, and optimization of database systems.
- Solve database usage issues and malfunctions.
- Analyze all databases, monitor them against design specifications, and prepare associated test strategies.
- Evaluate and engineer efficient backup and recovery processes for various databases.
- Promote uniformity of database-related programming effort by developing methods and procedures for database programming.
- Remain current with the industry by researching available products and methodologies to determine the feasibility of alternative database management systems, communication protocols, middleware, and query tools.
- Liaise with developers to improve applications and establish best practices.
- Ensure the performance, security, and scalability of database systems.
- Develop and optimize PL/SQL queries for efficient data storage and retrieval.
- Implement and maintain data models, ensuring accuracy and alignment with business needs.
- Train, mentor, and motivate junior team members.
- Contribute to assessing the team's performance evaluation.
- Stay updated on emerging database technologies and contribute to continuous improvement initiatives.

Skills Required:
- 6+ years of work experience as a database developer.
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Proficiency in Neo4j (graph) databases is mandatory.
- Strong experience with PL/SQL, data modeling, and database optimization techniques.

Why us?
- Impactful Work: Your contributions will play a pivotal role in ensuring the quality and reliability of our platform.
- Professional Growth: We believe in investing in our employees' growth and development. You will have access to various learning resources, books, training programs, and opportunities to enhance your technical skills and expand your knowledge.
- Collaborative Culture: We value teamwork and collaboration. You will work alongside talented professionals from diverse backgrounds, including developers, product managers, and business analysts, to collectively solve challenges and deliver exceptional software.

Benefits:
- Health insurance covered for you and your family.
- Quarterly team outings, team lunches twice a month, and personal and professional learning and development sessions.
- Top performers win a chance at an international trip fully sponsored by the company.
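For illustration only (not part of the posting): a minimal sketch of the kind of Neo4j/Cypher work described above, using the official Python driver. The connection URI, credentials, and graph model (Employee nodes, REPORTS_TO relationships) are assumptions.

```python
from neo4j import GraphDatabase

# Assumption: a local Neo4j instance reachable with these credentials.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

with driver.session() as session:
    # Create (or re-use) two Employee nodes and the reporting relationship between them.
    session.run(
        "MERGE (e:Employee {name: $employee}) "
        "MERGE (m:Employee {name: $manager}) "
        "MERGE (e)-[:REPORTS_TO]->(m)",
        employee="Asha",
        manager="Ravi",
    )

    # Query the direct reports of a manager.
    result = session.run(
        "MATCH (e:Employee)-[:REPORTS_TO]->(m:Employee {name: $manager}) "
        "RETURN e.name AS name",
        manager="Ravi",
    )
    print([record["name"] for record in result])

driver.close()
```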
Posted 2 months ago
1.0 - 6.0 years
3 - 8 Lacs
Hyderabad
Work from Office
Role Description: We are looking for an Associate Data Engineer with deep expertise in writing data pipelines to build scalable, high-performance data solutions. The ideal candidate will be responsible for developing, optimizing, and maintaining complex data pipelines, integration frameworks, and metadata-driven architectures that enable seamless access and analytics. This role calls for a deep understanding of big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management.

Roles & Responsibilities:
- Own the development of complex ETL/ELT data pipelines to process large-scale datasets.
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions.
- Ensure data integrity, accuracy, and consistency through rigorous quality checks and monitoring.
- Explore and implement new tools and technologies to enhance the ETL platform and the performance of the pipelines.
- Proactively identify and implement opportunities to automate tasks and develop reusable frameworks.
- Be eager to understand the biotech/pharma domain and build highly efficient data pipelines to migrate and deploy complex data across systems.
- Work in an Agile and Scaled Agile (SAFe) environment, collaborating with cross-functional teams, product owners, and Scrum Masters to deliver incremental value.
- Use JIRA, Confluence, and Agile DevOps tools to manage sprints, backlogs, and user stories.
- Support continuous improvement, test automation, and DevOps practices in the data engineering lifecycle.
- Collaborate and communicate effectively with product teams and cross-functional teams to understand business requirements and translate them into technical solutions.

What we expect of you

Must-Have Skills:
- Experience in data engineering with a focus on Databricks, AWS, Python, SQL, and Scaled Agile methodologies.
- Proficiency in and a strong understanding of data processing and transformation with big data frameworks (Databricks, Apache Spark, Delta Lake) and distributed computing concepts.
- Strong understanding of AWS services, with the ability to demonstrate it.
- Ability to quickly learn, adapt, and apply new technologies.
- Strong problem-solving and analytical skills.
- Excellent communication and teamwork skills.
- Experience with the Scaled Agile Framework (SAFe), Agile delivery, and DevOps practices.

Good-to-Have Skills:
- Data engineering experience in the biotechnology or pharma industry.
- Exposure to APIs and full-stack development.
- Experience with SQL/NoSQL databases and vector databases for large language models.
- Experience with data modeling and performance tuning for both OLAP and OLTP databases.
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.

Education and Professional Certifications:
- Bachelor's degree and 2 to 5+ years of Computer Science, IT, or related field experience, OR Master's degree and 1 to 4+ years of Computer Science, IT, or related field experience.
- AWS Certified Data Engineer preferred.
- Databricks certification preferred.
- Scaled Agile SAFe certification preferred.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Ability to learn quickly; organized and detail-oriented.
- Strong presentation and public speaking skills.
Posted 2 months ago
6.0 - 8.0 years
8 - 10 Lacs
Hyderabad
Work from Office
Role Description: We are looking for a highly motivated, expert Data Engineer who can own the design and development of complex data pipelines, solutions, and frameworks. The ideal candidate will be responsible for designing, developing, and optimizing data pipelines, data integration frameworks, and metadata-driven architectures that enable seamless data access and analytics. This role calls for deep expertise in big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management.

Roles & Responsibilities:
- Design, develop, and maintain complex ETL/ELT data pipelines in Databricks using PySpark, Scala, and SQL to process large-scale datasets.
- Understand the biotech/pharma or related domains and build highly efficient data pipelines to migrate and deploy complex data across systems.
- Design and implement solutions to enable unified data access, governance, and interoperability across hybrid cloud environments.
- Ingest and transform structured and unstructured data from databases (PostgreSQL, MySQL, SQL Server, MongoDB, etc.), APIs, logs, event streams, images, PDFs, and third-party platforms.
- Ensure data integrity, accuracy, and consistency through rigorous quality checks and monitoring; bring expertise in data quality, validation, and verification frameworks.
- Innovate, explore, and implement new tools and technologies to enhance efficient data processing.
- Proactively identify and implement opportunities to automate tasks and develop reusable frameworks.
- Work in an Agile and Scaled Agile (SAFe) environment, collaborating with cross-functional teams, product owners, and Scrum Masters to deliver incremental value.
- Use JIRA, Confluence, and Agile DevOps tools to manage sprints, backlogs, and user stories.
- Support continuous improvement, test automation, and DevOps practices in the data engineering lifecycle.
- Collaborate and communicate effectively with product teams and cross-functional teams to understand business requirements and translate them into technical solutions.

Must-Have Skills:
- Hands-on experience with data engineering technologies such as Databricks, PySpark, Spark SQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies.
- Proficiency in workflow orchestration and performance tuning of big data processing.
- Strong understanding of AWS services.
- Ability to quickly learn, adapt, and apply new technologies.
- Strong problem-solving and analytical skills.
- Excellent communication and teamwork skills.
- Experience with the Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices.

Good-to-Have Skills:
- Data engineering experience in the biotechnology or pharma industry.
- Experience in writing APIs to make data available to consumers.
- Experience with SQL/NoSQL databases and vector databases for large language models.
- Experience with data modeling and performance tuning for both OLAP and OLTP databases.
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.

Education and Professional Certifications:
- Any degree and 6-8 years of experience.
- AWS Certified Data Engineer preferred.
- Databricks certification preferred.
- Scaled Agile SAFe certification preferred.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Ability to learn quickly; organized and detail-oriented.
- Strong presentation and public speaking skills.
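For illustration only (not part of the posting): a minimal PySpark/Delta Lake sketch of the kind of ELT step this role describes, with a simple data-quality gate before writing. The source path, target table, and column names are assumptions.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("patient-events-elt").getOrCreate()

# Assumption: raw JSON event files landed by an upstream system.
raw = spark.read.json("s3://example-bucket/raw/patient_events/")

# Basic cleansing: normalize column names, parse timestamps, drop bad rows.
clean = (
    raw.withColumnRenamed("patientId", "patient_id")
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .dropna(subset=["patient_id", "event_ts"])
    .dropDuplicates(["patient_id", "event_ts"])
)

# Simple data-quality gate: fail the job if too many rows were rejected.
raw_count = raw.count()
rejected = raw_count - clean.count()
if raw_count > 0 and rejected / raw_count > 0.05:
    raise ValueError(f"Rejected {rejected} rows, above the 5% threshold")

# Append to a managed Delta table (assumes the 'curated' schema already exists).
clean.write.format("delta").mode("append").saveAsTable("curated.patient_events")
```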
Posted 2 months ago
9.0 - 12.0 years
9 - 12 Lacs
Hyderabad / Secunderabad, Telangana, India
On-site
Role Description: We are looking for a highly motivated, expert Senior Data Engineer who can own the design and development of complex data pipelines, solutions, and frameworks. The ideal candidate will be responsible for designing, developing, and optimizing data pipelines, data integration frameworks, and metadata-driven architectures that enable seamless data access and analytics. This role calls for deep expertise in big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management.

Roles & Responsibilities:
- Design, develop, and maintain scalable ETL/ELT pipelines to support structured, semi-structured, and unstructured data processing across the Enterprise Data Fabric.
- Implement real-time and batch data processing solutions, integrating data from multiple sources into a unified, governed data fabric architecture.
- Optimize big data processing frameworks using Apache Spark, Hadoop, or similar distributed computing technologies to ensure high availability and cost efficiency.
- Work with metadata management and data lineage tracking tools to enable enterprise-wide data discovery and governance.
- Ensure data security, compliance, and role-based access control (RBAC) across data environments.
- Optimize query performance, indexing strategies, partitioning, and caching for large-scale data sets.
- Develop CI/CD pipelines for automated data pipeline deployments, version control, and monitoring.
- Implement data virtualization techniques to provide seamless access to data across multiple storage systems.
- Collaborate with cross-functional teams, including data architects, business analysts, and DevOps teams, to align data engineering strategies with enterprise goals.
- Stay up to date with emerging data technologies and best practices, ensuring continuous improvement of Enterprise Data Fabric architectures.

Must-Have Skills:
- Hands-on experience with data engineering technologies such as Databricks, PySpark, Spark SQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies.
- Proficiency in workflow orchestration and performance tuning of big data processing.
- Strong understanding of AWS services.
- Experience with Data Fabric, Data Mesh, or similar enterprise-wide data architectures.
- Ability to quickly learn, adapt, and apply new technologies.
- Strong problem-solving and analytical skills.
- Excellent communication and teamwork skills.
- Experience with the Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices.

Good-to-Have Skills:
- Deep expertise in the biotech and pharma industries.
- Experience in writing APIs to make data available to consumers.
- Experience with SQL/NoSQL databases and vector databases for large language models.
- Experience with data modeling and performance tuning for both OLAP and OLTP databases.
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.

Education and Professional Certifications:
- 9 to 12 years of Computer Science, IT, or related field experience.
- AWS Certified Data Engineer preferred.
- Databricks certification preferred.
- Scaled Agile SAFe certification preferred.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Ability to learn quickly; organized and detail-oriented.
- Strong presentation and public speaking skills.
Posted 2 months ago
1.0 - 3.0 years
3 - 5 Lacs
Hyderabad
Work from Office
What you will do: You will play a key role as part of the Operations Generative AI (GenAI) Product team, delivering cutting-edge, innovative GenAI solutions across various Process Development functions (Drug Substance, Drug Product, Attribute Sciences, and Combination Products) in Operations.

Role Description: As a Full Stack Sr Associate Software Engineer, you will contribute to the development and maintenance of our GenAI web applications across various Process Development functions (Drug Substance, Drug Product, Attribute Sciences, and Combination Products) in Operations. You'll be working on both front-end and back-end technologies. This role is ideal for recent graduates or early-career professionals looking to gain hands-on experience in software development.

Roles & Responsibilities:
- Develop and maintain web-based front-end applications using modern web frameworks (React, Angular, FastAPI).
- Build and maintain back-end services using languages like Python, Java, or Node.js.
- Collaborate with the design and product teams to understand user needs and translate them into technical requirements.
- Write clean, efficient, and well-tested code.
- Participate in code reviews and provide constructive feedback.
- Maintain system uptime and optimal performance.
- Learn and adapt to new technologies and industry trends such as prompt engineering, AI tools, and Retrieval-Augmented Generation (RAG) frameworks.

What we expect of you: We are all different, yet we all use our unique contributions to serve patients. The [vital attribute] professional we seek is a [type of person] with these qualifications.

Basic Qualifications:
- Master's degree and 1 to 3 years of experience in full-stack software engineering or web development, OR
- Bachelor's degree and 3 to 5 years of experience in full-stack software engineering or web development, OR
- Diploma and 7 to 9 years of experience in full-stack software engineering or web development.
- Cloud computing certificate preferred.

Functional Skills:
- Experience with API integration, serverless, and microservices architecture.
- Experience with AWS, SQL/NoSQL databases, and vector databases for large language models.
- Experience with popular large language models (e.g., OpenAI models).
- Experience with language-model frameworks such as LangChain or LlamaIndex.
- Experience with prompt engineering and model fine-tuning.
- Experience with DevOps CI/CD build and deployment pipelines.
- Experience with design patterns, data structures, and test-driven development.

Preferred Qualifications:
- Professional certifications: AWS, Data Science certifications (preferred).

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Strong presentation and public speaking skills.
Posted 2 months ago
8.0 - 12.0 years
18 - 20 Lacs
Pune, Chennai, Coimbatore
Work from Office
Use Node.js and NoSQL systems to design and maintain scalable APIs and real-time data pipelines. The role focuses on API development, data integration, and cloud infrastructure, leveraging Apache Kafka and GCP to build robust, event-driven backend systems.

Required candidate profile: Experience building APIs using Node.js and TypeScript; API integration solutions (REST, GraphQL, webhooks); managing applications using Docker, Kubernetes, and GCP; strong Node.js, TypeScript, and JavaScript (ES2019+) skills.
Posted 3 months ago
3.0 - 8.0 years
5 - 10 Lacs
Chennai
Hybrid
Duration: 8 months. Work type: Onsite.

Position Description: Looking for qualified Data Scientists who can develop scalable solutions to complex real-world problems using machine learning, big data, statistics, and optimization. Potential candidates should have hands-on experience in applying first-principles methods, machine learning, data mining, and text mining techniques to build analytics prototypes that work on massive datasets. Candidates should have experience in manipulating both structured and unstructured data in various formats, sizes, and storage mechanisms. Candidates should have excellent problem-solving skills with an inquisitive mind to challenge existing practices. Candidates should have exposure to multiple programming languages and analytical tools and be flexible in using the requisite tools/languages for the problem at hand.

Skills Required: Machine Learning, GenAI, LLM
Skills Preferred: Python, Google Cloud Platform, BigQuery

Experience Required: 3+ years of hands-on experience in using machine learning/text mining tools and techniques such as clustering, classification, decision trees, random forests, support vector machines, deep learning, neural networks, reinforcement learning, and other numerical algorithms.

Experience Preferred: 3+ years of experience in at least one of the following languages: Python, R, MATLAB, SAS. Experience with Google Cloud Platform (GCP) including Vertex AI, BigQuery, DBT, NoSQL databases, and the Hadoop ecosystem.

Education Required: Bachelor's Degree
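For illustration only (not part of the posting): a minimal scikit-learn sketch of one of the classification techniques listed above (random forests); the synthetic dataset stands in for real project data.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a real tabular dataset.
X, y = make_classification(
    n_samples=5000, n_features=20, n_informative=8, random_state=42
)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Fit a random forest and evaluate on the hold-out split.
model = RandomForestClassifier(n_estimators=200, max_depth=10, random_state=42)
model.fit(X_train, y_train)

preds = model.predict(X_test)
print(f"Hold-out accuracy: {accuracy_score(y_test, preds):.3f}")
```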
Posted 3 months ago
4.0 - 8.0 years
6 - 10 Lacs
Tamil Nadu
Work from Office
Duration: 12 months.

Position Description:
- Serve as a core member of the secure coding product team that enables the design, development, and creation of secure coding practices.
- Develop application software and RESTful services using GCP and the Spring Framework.
- Experience building distributed, service-oriented, cloud microservice-based architectures.
- Use Test-Driven Development and code pairing/mobbing practices.
- Develop components across all tiers of the application stack.
- Continuously integrate and deploy developed software.
- Modify CI/CD pipelines and scripts as necessary to improve continuous integration practices.
- Consult with the product manager to identify the minimal viable product and decompose features by story slicing.
- Collaborate with other product teams on integrations, testing, and deployments.

Skills Required: React, JavaScript, Application Support, BigQuery, Application Testing, Application Design, Coding, Angular, Spring, Application Development, Developer, Java, Web Services

Experience Required:
- Experience in cloud services engineering, including Pivotal Cloud Foundry (GCP, JFrog, GitHub, Spring, Angular), RESTful services, and CI/CD pipelines (Tekton or similar).
- Experience with Swagger, logging/tracing, Conformance, Dynatrace, Spring Security, and SonarQube.
- Understanding of Spring Cloud Data, Spring Security, OAuth, and service monitoring in the cloud.
- Experience in application testing, release management, and support activities.
- Experience with various Software Development Life Cycle methods such as Agile.

Experience Preferred:
- 4+ years of development experience (purchasing/automotive industry experience a plus), preferably utilizing Java, Spring, Angular, React, Web Services, etc.
- 3 years of experience designing and building technical solutions using Java technologies such as Spring, Spring Boot, Web Services, and microservice architecture.
- Comprehensive understanding of relational databases (Microsoft SQL Server, PostgreSQL), NoSQL databases, and flat-file processing concepts.
- Strong knowledge of design patterns and principles, with experience developing web services, REST APIs, and related architectures.
- Exposure to automated testing concepts, tools, and frameworks.
- Excellent communication skills: the ability to engage in deep technical discussions with customers and peers and become a trusted technical advisor.

Education Required: Bachelor's Degree
Posted 3 months ago