
23667 PostgreSQL Jobs - Page 20


4.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site

At IF MedTech, we are dedicated to revolutionizing healthcare through cutting-edge medical device design, development, and pilot manufacturing. Our global team collaborates with experts across medical, engineering, business, and research domains to bring innovative solutions that enhance healthcare and improve lives. Join us in our mission to drive innovation and make a global impact in the medical technology sector.

Responsibilities:
● Develop and optimize Android-native components using Java/Kotlin.
● Collaborate with the Flutter developer to integrate Android modules into cross-platform builds.
● Ensure secure data handling, storage, and permissions in compliance with HIPAA.
● Support responsive and accessible UI integration for mobile and web interfaces.
● Design, develop, and maintain backend services using Python FastAPI and Java.
● Implement RESTful and GraphQL APIs for seamless data flow between devices, apps, and servers.
● Apply Java frameworks such as Spring Boot where applicable for service development.
● Optimize backend services for high-volume, low-latency medical data processing.
● Work with relational databases (PostgreSQL, MySQL) and NoSQL databases (MongoDB).
● Design efficient schemas and implement secure data access patterns.
● Ensure database architecture supports scalability, redundancy, and compliance.
● Deploy and manage applications on AWS (EC2, S3, RDS, Lambda, API Gateway, etc.).
● Set up CI/CD pipelines for automated build, test, and deployment.
● Use Docker for containerization and reproducible environments.
● Monitor and optimize cloud resource usage for cost efficiency.
● Adhere to ISO 13485 and HIPAA standards for medical software development.
● Participate in code reviews, unit testing, and automated quality checks.
● Maintain proper documentation as per medical device software lifecycle requirements.

Qualifications:
● Bachelor’s degree in Computer Science, Software Engineering, or a related field.
● Experience: 2–4 years in full-stack development with proven backend and Android expertise.
● Must-have: Proficiency in Java for Android and backend services.
● Plus: Knowledge of Spring/Spring Boot.
● Strong in Python (FastAPI).
● Proficiency in AWS services and deployment pipelines.
● Experience with relational and NoSQL databases.
● Familiarity with HIPAA and ISO 13485 guidelines.
● Hands-on with Git, Docker, and CI/CD workflows.
● Ability to work independently and collaborate effectively.
● Strong problem-solving, adaptability, and attention to detail.

Join IF MedTech to drive innovation in healthcare technology and develop software solutions that transform lives worldwide!
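The listing above asks for efficient schemas and secure data access patterns. As a hedged illustration (the table names, columns, and sample values are hypothetical, and sqlite3 stands in for PostgreSQL so the sketch is self-contained), a parameterized-query access pattern might look like:

```python
import sqlite3

# Hypothetical schema standing in for the PostgreSQL design the role describes.
DDL = """
CREATE TABLE patients (
    id INTEGER PRIMARY KEY,
    mrn TEXT NOT NULL UNIQUE,     -- medical record number
    created_at TEXT NOT NULL DEFAULT (datetime('now'))
);
CREATE TABLE readings (
    id INTEGER PRIMARY KEY,
    patient_id INTEGER NOT NULL REFERENCES patients(id),
    kind TEXT NOT NULL,           -- e.g. 'heart_rate'
    value REAL NOT NULL,
    taken_at TEXT NOT NULL
);
CREATE INDEX idx_readings_patient ON readings(patient_id, taken_at);
"""

def latest_reading(conn, mrn, kind):
    # Parameterized query: user input is never interpolated into SQL,
    # which is the core of a secure data-access pattern.
    return conn.execute(
        """SELECT r.value, r.taken_at
           FROM readings r JOIN patients p ON p.id = r.patient_id
           WHERE p.mrn = ? AND r.kind = ?
           ORDER BY r.taken_at DESC LIMIT 1""",
        (mrn, kind),
    ).fetchone()

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)
conn.execute("INSERT INTO patients (id, mrn) VALUES (1, 'MRN-001')")
conn.executemany(
    "INSERT INTO readings (patient_id, kind, value, taken_at) VALUES (1, 'heart_rate', ?, ?)",
    [(72.0, "2024-01-01T10:00"), (75.0, "2024-01-01T11:00")],
)
print(latest_reading(conn, "MRN-001", "heart_rate"))  # most recent reading
```

The composite index on (patient_id, taken_at) supports the join-plus-latest lookup without a full table scan, which matters for the high-volume, low-latency processing the posting mentions.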

Posted 2 days ago


8.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

About ValGenesis
ValGenesis is a leading digital validation platform provider for life sciences companies. The ValGenesis suite of products is used by 30 of the top 50 global pharmaceutical and biotech companies to achieve digital transformation, total compliance, and manufacturing excellence/intelligence across their product lifecycle. Learn more about working for ValGenesis, the de facto standard for paperless validation in Life Sciences: https://www.youtube.com/watch?v=tASq7Ld0JsQ

About the Role:
We are looking for an experienced software engineering manager to join our cloud product engineering team and build the next generation of applications for our global customers. If you are a technology enthusiast with a passion for developing enterprise cloud products with quality, security, and performance, we are eager to discuss the role with you.

Responsibilities:
Understand the business requirements and technical constraints, and architect, design, and develop accordingly.
Participate in the complete development life cycle.
Lead and review the architecture/design/code of yourself and others.
Develop enterprise application features/services using Azure cloud services, C# .NET Core, ReactJS, etc., implementing DevSecOps principles.
Act as a hands-on technical lead or manager of a scrum team.
Own and be accountable for the quality, performance, security, and sustenance of the respective product deliverables.
Strive for self-excellence while enabling the success of the team and stakeholders.
Appraise the performance of self, peers, and team members.
Manage critical conversations and the career paths of team members.

Requirements
8 to 12 years of experience in developing enterprise software products
Strong knowledge of C#, .NET Core, Azure DevOps
Working knowledge of JS frameworks – React, Angular, TypeScript, Express, Node, etc.
Experience in coaching/guiding/leading a scrum team of junior and senior full-stack engineers
Experience in cross-functional collaboration with product owners, senior management, etc.
Experience in Micro-Services and/or Micro-Frontend architecture
Experience in container-based development, AKS, Service Fabric, etc.
Experience with messaging queues such as RabbitMQ and Kafka
Experience with Azure services such as Azure Logic Apps and Azure Functions
Experience with relational and NoSQL databases such as MS SQL Server, PostgreSQL, and MongoDB
Strong knowledge of code quality, code monitoring, performance engineering, and test automation tools
Knowledge of reporting solutions such as Power BI and Apache Superset

We’re on a Mission
In 2005, we disrupted the life sciences industry by introducing the world’s first digital validation lifecycle management system. ValGenesis VLMS® revolutionized compliance-based corporate validation activities and has remained the industry standard. Today, we continue to push the boundaries of innovation ― enhancing and expanding our portfolio beyond validation with an end-to-end digital transformation platform. We combine our purpose-built systems with world-class consulting services to help every facet of GxP meet evolving regulations and quality expectations.

The Team You’ll Join
Our customers’ success is our success. We keep the customer experience centered in our decisions, from product to marketing to sales to services to support. Life sciences companies exist to improve humanity’s quality of life, and we honor that mission.
We work together. We communicate openly, support each other without reservation, and never hesitate to wear multiple hats to get the job done.
We think big. Innovation is the heart of ValGenesis. That spirit drives product development as well as personal growth. We never stop aiming upward.
We’re in it to win it. We’re on a path to becoming the number one intelligent validation platform in the market, and we won’t settle for anything less than being a market leader.

How We Work
Our Chennai, Hyderabad, and Bangalore offices are onsite, 5 days per week. We believe that in-person interaction and collaboration foster creativity and a sense of community, and are critical to our future success as a company.

ValGenesis is an equal-opportunity employer that makes employment decisions on the basis of merit. Our goal is to have the best-qualified people in every job. All qualified applicants will receive consideration for employment without regard to race, religion, sex, sexual orientation, gender identity, national origin, disability, or any other characteristics protected by local law.

Posted 2 days ago


4.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About Our Client:
Founded in 2020, the company is a digital platform in the spirituality and wellness sector, offering tailored apps that help users with personal growth and well-being. It combines technology with traditional practices to deliver engaging content and experiences that drive long-term user retention. The brand aims to cater to individuals seeking high-quality, authentic devotional products, blending traditional craftsmanship with modern convenience.

Job Role: Python Developer / Golang Developer
Location: HSR Layout, Bangalore
Experience: 4–8 years
Qualification: Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field

About the Role:
We are looking for a senior software developer (Python) with a strong background in backend technologies to join our high-performing engineering team. You will play a key role in designing, building, and maintaining mission-critical services that scale to millions of users.

Key Responsibilities:
Develop and maintain robust, scalable backend systems using Python and Golang.
Design efficient data models and queries for PostgreSQL and MongoDB.
Build secure and performant APIs for mobile and web applications.
Drive cloud-native development and infrastructure setup on AWS.
Collaborate with cross-functional teams, including product, mobile, and DevOps.
Optimize systems for performance, reliability, and scalability.
Conduct code reviews, write unit tests, and improve development processes.
Troubleshoot, debug, and resolve production-level issues.

Requirements:
Backend development experience with Python and Golang.
Strong command of relational (PostgreSQL) and document (MongoDB) databases.
Practical experience deploying applications on AWS (EC2, ECS, Lambda, RDS, S3).
Proficiency in designing RESTful APIs and working in service-oriented architectures.
Familiarity with Docker, Git, CI/CD tools, and cloud monitoring practices.
Ability to write clean, testable, and maintainable code.
Strong analytical and debugging skills with a performance-first mindset.

About Hireginie:
Hireginie is a prominent talent search company specializing in connecting top talent with leading organizations. We are committed to excellence and offer customized recruitment solutions across industries, ensuring a seamless and transparent hiring process. Our mission is to empower both clients and candidates by matching the right talent with the right opportunities, fostering growth and success for all.
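APIs that "scale to millions of users", as the listing above puts it, usually page results by cursor rather than offset. As an illustrative sketch only (the item shape and function name are hypothetical, not from the posting), cursor-style pagination in Python might look like:

```python
# Hedged sketch of cursor (keyset) pagination over an id-sorted collection.
def paginate(items, limit, after_id=None):
    """Return up to `limit` items with id after `after_id`, plus the cursor
    for the next page (None when exhausted). Assumes ascending-id order."""
    start = 0
    if after_id is not None:
        # Resume just past the last id the client saw.
        start = next((i + 1 for i, it in enumerate(items) if it["id"] == after_id),
                     len(items))
    page = items[start:start + limit]
    more = len(page) == limit and start + limit < len(items)
    next_cursor = page[-1]["id"] if more else None
    return page, next_cursor

items = [{"id": i, "name": f"user{i}"} for i in range(1, 6)]
page, cursor = paginate(items, limit=2)
print([it["id"] for it in page], cursor)  # first page and the next-page cursor
```

Against PostgreSQL the same idea becomes `WHERE id > %s ORDER BY id LIMIT %s`, which stays fast on large tables where `OFFSET` degrades.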

Posted 2 days ago


1.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In data analysis at PwC, you will focus on utilising advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. You will leverage skills in data manipulation, visualisation, and statistical modelling to support clients in solving complex business problems.

Focused on relationships, you are building meaningful client connections and learning how to manage and inspire others. Navigating increasingly complex situations, you are growing your personal brand, deepening technical expertise, and developing awareness of your strengths. You are expected to anticipate the needs of your teams and clients and to deliver quality. Embracing increased ambiguity, you are comfortable when the path forward isn’t clear, you ask questions, and you use these moments as opportunities to grow.

Skills
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:
Respond effectively to the diverse perspectives, needs, and feelings of others.
Use a broad range of tools, methodologies, and techniques to generate new ideas and solve problems.
Use critical thinking to break down complex concepts.
Understand the broader objectives of your project or role and how your work fits into the overall strategy.
Develop a deeper understanding of the business context and how it is changing.
Use reflection to develop self-awareness, enhance strengths, and address development areas.
Interpret data to inform insights and recommendations.
Uphold and reinforce professional and technical standards (e.g. refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements.

Responsibilities
Build software applications by following coding standards and build appropriate unit tests, integration tests, and deployment scripts.
Assist in defining software architectures and collaborate with team leaders to explore existing systems, determine areas of complexity and potential risks to successful implementation, and expand the application's capabilities.
Translate designs and style guides provided by the UI/UX team into functional user interfaces, ensuring cross-browser compatibility and performance.
Communicate continually with the client and project teams regarding progress on the development effort.
Assist project managers in delivering projects on time, within budget, and at the required quality level.
Take responsibility for the successful delivery of solutions.
Contribute to continual improvements in user interface, software architecture, and new technologies.

Required Qualifications
Bachelor's degree in computer science, information technology, or a related area (equivalent work experience will be considered).
1+ years' experience in developing business applications in a full software development life cycle using web technologies.
Advanced experience with Node.js, ReactJS, JavaScript, TypeScript, HTML5, CSS3, SASS, Python, and web service integration.
Advanced experience with SQL and relational databases, especially PostgreSQL and Microsoft SQL Server.
Practical experience in designing and implementing enterprise infrastructure and platforms required for cloud computing (e.g., Azure, AWS, GCP).
Demonstrated understanding of Azure DevOps, Azure Synapse Analytics, Databricks, Delta Lake, and Lakehouse.
Experience in designing, developing, and optimizing data processing applications using Apache Spark in Databricks.
Capable of writing efficient Spark jobs in languages such as Scala, Python, PySpark, and Spark SQL.
Practical experience in containerized deployment, microservice architecture, and micro-frontend architecture.
Experience with Nginx, Docker, and Redis.
Experience using at least one of the following cloud platforms: Azure, AWS, GCP.
Familiarity with the application and integration of Generative AI, prompt engineering, and Large Language Models (LLMs) in enterprise solutions.
Ability to independently implement an entire business module from frontend to backend.
Excellent interpersonal skills, particularly in balancing requirements, managing expectations, collaborating with team members, and driving effective results.
Proactive attitude, ability to work independently, and a desire to continuously learn new skills and technologies.
Excellent written and verbal communication skills in English.

Additional Or Preferred Qualifications
Master’s degree in computer science, information technology, or related majors.
Technical lead experience.
3+ years’ experience in developing business applications in a full software development life cycle using web technologies.
Experience using Azure and either AWS or GCP.
Experience with data visualization tools such as Power BI or Tableau.

Posted 2 days ago


0 years

0 Lacs

Nagpur, Maharashtra, India

On-site

Company Overview:
We are an established energy auditing and consulting firm leveraging digital tools to offer cutting-edge insights and automation to our clients. We’re developing applications to automate energy calculations and proposal generation. We're looking to take this initiative to the next level — deploying it on AWS and adding features like reporting, projections, ticketing, and dashboards.

Job Description:
We're looking for a Full-Stack Developer who can take ownership of our web application. You’ll be responsible for both frontend and backend development, deployment, and ongoing maintenance of our app — with a strong emphasis on delivering value-added features for energy auditing clients.

Responsibilities:
● Enhance the existing Next.js app for performance, usability, and feature expansion.
● Develop modules for dynamic report generation (PDF/Excel) and savings projections.
● Build a secure and scalable AWS deployment (EC2/S3/Lambda/Vercel, etc.).
● Implement client login portals, dashboards, and audit data visualization.
● Collaborate with interns and other team members for fast prototyping and rollouts.
● Maintain DevOps workflows: CI/CD, monitoring, backups, uptime.

Required Skills & Experience:
● Strong experience with Next.js, React, and Tailwind CSS
● Solid backend knowledge: Node.js, Express, REST/GraphQL APIs
● Experience with AWS services: EC2, S3, Lambda, Amplify or Vercel
● Familiarity with database integration (PostgreSQL, MongoDB, or Firebase)
● Familiarity with PDF/Excel report generation tools (jsPDF, Puppeteer, SheetJS)
● GitHub for version control and collaborative workflows

Nice-to-Have:
● Experience with data-driven applications and charts (Recharts/Chart.js)
● Background in energy systems, sustainability, or engineering
● Exposure to time-series forecasting or integrating Python ML models
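The savings-projection modules mentioned above reduce to straightforward arithmetic. As a hedged sketch (the baseline, tariff, and escalation figures are illustrative assumptions, not figures from the firm), a projection helper might look like:

```python
# Hypothetical savings-projection helper; all input figures are illustrative.
def project_savings(baseline_kwh, efficiency_gain, tariff, years, escalation=0.0):
    """Yearly cost savings from cutting consumption by `efficiency_gain`
    (a fraction), with an optional annual tariff escalation rate."""
    savings = []
    for year in range(years):
        # Tariff compounds each year; saved energy stays constant.
        rate = tariff * (1 + escalation) ** year
        savings.append(round(baseline_kwh * efficiency_gain * rate, 2))
    return savings

# 100,000 kWh baseline, 12% reduction, Rs. 8/kWh, 3 years, 5% tariff escalation
print(project_savings(100_000, 0.12, 8.0, 3, 0.05))
```

A report module would feed per-year values like these into the PDF/Excel generators (jsPDF, SheetJS) the posting lists, or chart them with Recharts.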

Posted 2 days ago


6.0 years

0 Lacs

India

Remote

Job Title: Application Support Engineer L3
Location: Remote (working in the Australia time zone, 5 AM–2 PM IST)

About the Role
As an L3 Application Support Engineer, you will serve as the escalation point for complex technical issues, ensuring high-quality support for our enterprise SaaS platform used by health professionals and patients. This role is deeply embedded within the Engineering team, requiring strong troubleshooting skills, debugging capabilities, and collaboration with Product and Development teams. You’ll also play a key role in improving documentation, automating processes, and enhancing platform reliability.

Key Responsibilities
Technical Escalation & Issue Resolution:
o Act as the highest level of support within the Support Team.
o Investigate and resolve critical incidents, analyzing logs and application behavior.
o Work closely with L1/L2 teams to troubleshoot and resolve complex issues.
o Replicate and document software bugs for the Development team.
Collaboration & Process Improvement:
o Work with the Engineering team to debug issues, propose fixes, and contribute to code-level improvements.
o Improve support documentation, build playbooks, and optimize incident management processes.
o Enhance monitoring and alerting through platforms like Datadog.
Technical Operations & Monitoring:
o Perform log analysis, SQL queries, and API debugging to diagnose issues.
o Monitor AWS infrastructure, CI/CD pipelines, and application performance to identify potential failures proactively.
o Maintain uptime and performance using observability tools.

Requirements
6+ years in Technical Application Support, DevOps, or Site Reliability Engineering (SRE).
Strong troubleshooting skills with technologies such as Node.js, PostgreSQL, Git, AWS, and CI/CD.
Hands-on experience with monitoring tools like Datadog and uptime monitoring solutions.
Proficiency in debugging APIs, SQL queries, and logs.
Experience managing support cases through the full lifecycle (triage, reproduction, resolution).
Ability to write detailed bug reports and collaborate effectively with developers.
Strong knowledge of ticketing systems such as Freshdesk and ClickUp, and best practices for incident management.
Comfortable with on-call rotations and managing high-priority incidents.

Preferred Skills
Familiarity with Terraform, Kubernetes, or Docker.
Experience writing scripts to automate support tasks.
Knowledge of healthcare SaaS environments and regulatory considerations.

This role is ideal for problem-solvers who love debugging, enjoy working closely with engineering teams, and thrive in fast-paced, customer-centric environments.

Key Requirements:
Minimum 6+ years in Technical Application Support.
Strong troubleshooting skills with technologies such as Node.js, PostgreSQL, Git, AWS, and CI/CD.
Hands-on experience with monitoring tools like Datadog and uptime monitoring solutions.
Proficiency in debugging APIs and SQL queries, and in performing log analysis.
Strong knowledge of ticketing systems such as Freshdesk and ClickUp.
Excellent English communication skills for handling Australian clients.

Location: Remote (working in the Australia time zone, 5 AM–2 PM IST)
Compensation: Up to Rs. 15–20 LPA
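The log analysis this role describes often starts with a quick error-rate pass over access logs. As an illustrative sketch only (the nginx-like log format and endpoint paths are assumptions, not from the posting), a first-pass triage script might look like:

```python
import re
from collections import Counter

# Assumed access-log shape: method, path, protocol, then a 3-digit status code.
LOG_LINE = re.compile(r'"\w+ (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def error_counts(lines):
    """Count 5xx responses per path - a quick first cut when triaging an incident."""
    counts = Counter()
    for line in lines:
        m = LOG_LINE.search(line)
        if m and m.group("status").startswith("5"):
            counts[m.group("path")] += 1
    return counts

sample = [
    '10.0.0.1 - - [01/Jan/2024] "GET /api/patients HTTP/1.1" 200 512',
    '10.0.0.2 - - [01/Jan/2024] "POST /api/bookings HTTP/1.1" 502 87',
    '10.0.0.3 - - [01/Jan/2024] "POST /api/bookings HTTP/1.1" 500 90',
]
print(error_counts(sample).most_common(1))  # worst-offending endpoint first
```

In practice a tool like Datadog would surface this aggregation directly; a script of this shape is useful for ad-hoc analysis of raw log exports.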

Posted 2 days ago


0 years

0 Lacs

India

Remote

Job Title: Junior Java Developer
Location: India (Remote)
Job Type: Full-time

About the Role:
We are seeking a passionate and motivated Junior Java Developer to join our development team. This role is ideal for someone with a strong foundation in Java programming who is eager to learn and contribute to real-world projects. You will work closely with senior developers to design, develop, test, and maintain Java-based applications.

Key Responsibilities:
Assist in the design, development, and maintenance of Java applications.
Write clean, efficient, and well-documented code.
Debug, troubleshoot, and fix application issues.
Collaborate with the team to gather requirements and provide technical solutions.
Participate in code reviews and contribute to improving coding standards.
Test and deploy applications to ensure functionality and performance.
Stay updated with new Java features, frameworks, and tools.

Required Skills & Qualifications:
Bachelor’s degree in Computer Science, IT, or a related field (or equivalent experience).
Strong knowledge of Java SE (Java Standard Edition) and OOP principles.
Basic understanding of Java EE, Spring, or Hibernate (preferred).
Familiarity with relational databases (MySQL, PostgreSQL, etc.) and SQL.
Knowledge of basic web technologies (HTML, CSS, JavaScript) is a plus.
Good problem-solving skills and attention to detail.
Ability to work in a team and communicate effectively.

Preferred Skills:
Experience with version control tools (Git).
Exposure to RESTful APIs and microservices architecture.
Understanding of Agile/Scrum methodologies.

Posted 2 days ago


0.0 years

0 - 0 Lacs

Thrissur, Kerala

On-site

Company Overview
Aitrich Technologies is a forward-thinking technology company headquartered in Thrissur, Kerala, with operations across Engineering Services, Business Solutions, and Technology Training. Since 2010, we’ve been committed to innovation, excellence, and nurturing talent.

Job Overview
We’re hiring a mid-level Java Developer to design, build, and ship secure, scalable web apps and microservices. You’ll work end-to-end—from grooming user stories to deployment—collaborating with product, UX, QA, and DevOps. This role suits someone with 2–4+ years of hands-on development who’s comfortable owning features and improving code quality.

Key Responsibilities
Backend & APIs
Build RESTful services with Java 11+ and Spring Boot (Web, Data, Security).
Model domains using JPA/Hibernate; handle pagination, caching, and N+1 avoidance.
Implement authentication/authorization (JWT/OAuth2), validation, and global exception handling.
Integrate external APIs; document endpoints with OpenAPI/Swagger.
Data & Persistence
Design normalized schemas and write performant SQL (PostgreSQL/MySQL).
Use NoSQL (MongoDB/Redis) where appropriate for caching, document storage, or queues.
Manage migrations with Flyway/Liquibase; ensure backup/restore readiness.
Web UI (Server-Side)
Implement server-rendered views using JSP/JSTL/Servlets (or Thymeleaf).
Wire up forms, input validation, and session handling; collaborate with frontend teams for SPA integrations.
Dev Tools, CI/CD & Deployment
Maintain developer-level pipelines (GitHub Actions/GitLab/Jenkins) with build/test stages (Maven/Gradle).
Write unit/integration tests (JUnit/Mockito) and uphold code quality gates.
Containerize services with Docker; deploy to Tomcat/Jetty and assist in environment configuration.
Quality, Observability & Performance
Apply SOLID principles and common design patterns; participate in code reviews.
Add logging/metrics/tracing (SLF4J/Logback, ELK/Prometheus/Grafana).
Profile the JVM (GC, heap, threads) and tune SQL queries and indexes.
Ways of Working (Agile/SCRUM)
Participate in sprint planning, estimation, daily stand-ups, reviews, and retros.
Break down epics into stories/tasks; maintain concise technical documentation/ADRs.

Must-Have Qualifications
2+ years of hands-on Java development experience.
Strong Core Java (collections, concurrency, streams) and REST API design.
Spring Boot (Web/Data/Security), Hibernate/JPA.
SQL design & tuning (PostgreSQL/MySQL) and working knowledge of a NoSQL store (MongoDB/Redis).
Git workflows, Maven/Gradle, unit/integration testing with JUnit/Mockito.
Basic Docker and Linux server familiarity; developer-level CI/CD exposure.
Clear communication, ownership mindset, and collaborative attitude.

Good-to-Have (Bonus)
AI tools for coding & productivity (e.g., GitHub Copilot), prompt-assisted test/data generation.
AWS basics (EC2, S3, RDS, IAM) or container orchestration exposure (ECS/EKS).
Angular fundamentals (components, services, RxJS) for SPA integrations.
Messaging/eventing (Kafka/RabbitMQ), microservices patterns (config server, circuit breaker).
Security hardening (CORS/CSRF, OAuth2/OIDC), SonarQube, SAST/DAST familiarity.
Product development mindset: instrument features, read telemetry, iterate with users.

Success Metrics (What Good Looks Like)
Consistent on-time delivery of sprint commitments with low defect escape rates.
Maintainable, well-tested code (coverage & quality thresholds met).
Measurable performance improvements on critical endpoints/queries.
Positive peer code-review feedback and effective cross-team collaboration.

Education
Bachelor’s/Master’s in CS/IT/Engineering—or equivalent practical experience with strong projects.

Reporting & Work Mode
Reports to: Engineering Manager / Tech Lead
Job Type: Full-time
Pay: ₹15,000.00 - ₹25,000.00 per month
Benefits: Health insurance, Provident Fund
Work Location: In person
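The "N+1 avoidance" the responsibilities above mention means fetching related rows in one batched query rather than one query per parent. As a language-neutral illustration (the schema and data are hypothetical, and sqlite3 stands in for PostgreSQL/JPA so the sketch runs standalone), the batched form looks like:

```python
import sqlite3

# Illustrative schema: authors and their books.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE books (id INTEGER PRIMARY KEY, author_id INTEGER, title TEXT);
INSERT INTO authors VALUES (1, 'Ann'), (2, 'Bob');
INSERT INTO books VALUES (1, 1, 'A1'), (2, 1, 'A2'), (3, 2, 'B1');
""")

def books_by_author(conn, author_ids):
    """One batched IN query instead of one query per author (the N+1 trap)."""
    placeholders = ",".join("?" * len(author_ids))
    rows = conn.execute(
        f"SELECT author_id, title FROM books WHERE author_id IN ({placeholders})",
        author_ids,
    ).fetchall()
    grouped = {aid: [] for aid in author_ids}
    for aid, title in rows:
        grouped[aid].append(title)
    return grouped

print(books_by_author(conn, [1, 2]))  # {1: ['A1', 'A2'], 2: ['B1']}
```

In the JPA setting the role actually targets, the equivalent tools are `JOIN FETCH` in JPQL or `@EntityGraph`, which collapse the per-parent lazy loads into a single statement the same way.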

Posted 2 days ago


0 years

0 Lacs

Pune, Maharashtra, India

On-site

Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart, and accessible. Our technology and innovation, partnerships, and networks combine to deliver a unique set of products and services that help people, businesses, and governments realize their greatest potential.

Title and Summary
Lead Software Engineer (Full Stack): Biometric Authentication

We work to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart, and accessible. Using secure data and networks, partnerships, and passion, our innovations and solutions help individuals, financial institutions, governments, and businesses realize their greatest potential. Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. We cultivate a culture of inclusion for all employees that respects their individual strengths, views, and experiences. We believe that our differences enable us to be a better team – one that makes better decisions, drives innovation, and delivers better business results.

Overview
Background
The Mastercard Authentication Program owns how consumer authentication works for both in-store and e-commerce transactional use cases. The primary purpose of this role is to develop and deliver best-of-breed authentication products for e-commerce transactional use cases that will drive uptake and penetration for the products and revenue for Mastercard. The authentication products that fall within this role’s responsibilities are ID Check, Token Authentication Service, and Token Authentication Framework. If you want a challenging project that is changing the way people do payments, this is the role for you!
As a Lead Software Engineer, you will:
Design and develop secure, reliable, and scalable solutions for globally distributed, customer-facing products.
Support development teams and work with stakeholders, promoting agile development.
Integrate our systems with third-party SaaS products, ensuring seamless data flow and functionality.
Research, create, and evaluate technical solution alternatives for the business needs using current and upcoming technologies and frameworks.
Stay hands-on at all times, collaborating by writing interfaces, tests (unit and acceptance), and architecture fitness functions outside of meeting rooms.
Participate in architectural discussions and code reviews, and contribute to a collaborative engineering culture.
Work with business/product owners to architect and deliver new services that introduce new products and bundles.
Ensure the quality, performance, and security of our applications through testing, optimization, and adherence to best practices.
Contribute to and lead initiatives, engaging and mentoring engineers at all levels to improve the craftsmanship of software engineering.

Technologies:
Microservices architecture and development
Java, Spring Boot, RESTful APIs, OpenAPI specification
Front-end development: JavaScript, HTML/CSS, using React / Angular / Vue.js frameworks
Experience working with more than one object-oriented programming language: C/C++ or Python with Flask, or Node.js (preferable)
Proven experience working with major cloud platforms (Azure, AWS) and a strong understanding of cloud-based services, with specific expertise in container orchestration using AKS or EKS
Experience deploying and managing applications in Docker containers
Databases: SQL and NoSQL databases such as Oracle, MongoDB, PostgreSQL, Redis
Secure communication (HTTPS, TLS, OAuth) and security best practices such as data encryption and protection against vulnerabilities

About You
Bachelor’s degree in Information Systems, Information Technology, Computer Science, or Engineering, or equivalent work experience.
Experience with various architectural patterns, including high-performance, high-availability transaction processing and multi-tiered web applications.
Hands-on experience in designing solutions and full-stack development in modern technologies for large enterprise technology platforms and systems.
Hands-on experience coding microservices in Java, building UI/UX with frameworks such as React and Angular, Spring Boot, RDBMS, Oracle, and event-driven architecture.
Hands-on experience integrating vendor and open-source products into a cohesive system.
Experience deploying applications using CI/CD pipelines, Docker containers, and Kubernetes to cloud platforms is preferred.
Experience designing and implementing solutions focusing on non-functional concerns: performance, scalability, availability, extensibility, supportability, usability.
Operates with urgency, fairness, and decency to address challenges and solve for new opportunities.
Strong communicator able to maintain internal and external alignment.
Familiar with cutting-edge industry trends and a thorough understanding of development methodologies and standards.
Able to succinctly articulate architecture patterns of complex systems, with business and technical implications, to executive and customer stakeholders.
Experienced in agile and modern SDLC practices (Scrum/Kanban/Continuous Delivery/DevOps/quality engineering) and the delivery situations they are used for.

Good to Have
Familiarity with the payments industry, payment processing, reporting, and the data & analytics domain.
Understanding of image processing techniques and computer vision principles.
Exposure to trending technologies (AI/ML, IoT, bots, quantum computing) and architectures.
Exposure to security best practices for biometric systems, including data encryption, authentication protocols, and vulnerability mitigation.
Our Teams and Values
We work within small collaborative teams consisting of software engineers and product managers.
Our customers’ success is at the core of what we do.
We are diverse and inclusive teams from many backgrounds and with many experiences.
We believe in doing well by doing good through inclusive growth and making ethical and environmentally responsible decisions.

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization. Therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard’s security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach; and
Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.

Posted 2 days ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

We are seeking a talented Software Engineer to join our AI platform team. In this role, you will contribute to the development of cutting-edge software solutions leveraging Java 8+, the Spring Framework, Spring Boot, REST APIs, microservices, and Kafka, while supporting data-driven initiatives with generative AI and machine learning at their core. Responsibilities: Develop and maintain Java-based applications to meet business and technical needs. Collaborate with cross-disciplinary teams, including data scientists and business analysts, to design and deliver robust solutions. Architect and implement scalable microservices-based applications using Spring Boot. Build, consume, and document RESTful APIs and ensure seamless integration with other services. Use Kafka-based messaging to implement asynchronous communication in distributed systems. Ensure code quality through unit testing, debugging, and optimization using tools like JUnit and Mockito. Monitor and troubleshoot performance issues to maintain application reliability. Contribute to CI/CD pipelines, enhancing code deployment automation and delivery processes. Stay informed about emerging technologies and best practices to continuously improve applications. Requirements: 4-8 years of experience as a Software Engineer or in a similar role. 3+ years of hands-on experience in Java development. Knowledge of Java 8+ and frameworks like Spring and Spring Boot. Background in building RESTful APIs and microservices architectures. Proficiency in messaging systems such as Kafka for integration and communication. Expertise in databases such as PostgreSQL or Oracle, and in persistence frameworks like Hibernate/JPA, for data storage and interaction. Familiarity with CI/CD tools such as Jenkins or GitLab CI/CD. Understanding of authentication mechanisms, including OAuth2, JWT, and Spring Security. Experience with testing frameworks such as JUnit, TestNG, or Mockito for maintaining application quality. English level B1+ for effective communication. Nice to have: Experience working within the financial services industry. Certification in Azure or related cloud technologies. Familiarity with other programming languages and frameworks. Background in Agile methodologies and DevOps practices.
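The posting above centers on Kafka-based asynchronous messaging between microservices. The role itself is Java/Spring Boot; purely as an illustration of the decoupled producer/consumer pattern Kafka provides, here is a minimal stdlib-Python sketch in which a queue stands in for a topic (all payload fields and names are invented):

```python
import queue
import threading

events = queue.Queue()  # stands in for a Kafka topic
processed = []

def producer():
    # Emit three sample events, then a sentinel meaning "no more events".
    for payment_id in range(3):
        events.put({"payment_id": payment_id, "status": "CREATED"})
    events.put(None)

def consumer():
    # Consume asynchronously until the sentinel arrives.
    while True:
        msg = events.get()
        if msg is None:
            break
        processed.append({**msg, "status": "SETTLED"})

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()

print(len(processed))  # 3 events consumed asynchronously
```

In a real Kafka deployment the queue would be a durable, partitioned topic and the sentinel would be unnecessary, since consumers track offsets rather than end-of-stream markers.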

Posted 2 days ago

Apply

6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Position Summary: AI & Data. In this age of disruption, organizations need to navigate the future with confidence, embracing decision-making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science, and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together, the offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: implement large-scale data ecosystems, including data management, governance, and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms; leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe actions; and drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise, and providing as-a-service offerings for continuous insights and improvements. Education And Experience: Bachelor’s or Master’s degree in Computer Science, Engineering, Information Technology, or a related field. 3–6 years of hands-on experience in Scala development, preferably in a data engineering or data pipeline context. Key Responsibilities: Collaborate with business analysts and stakeholders to gather and analyze requirements for data pipeline solutions. Design, develop, and maintain scalable data pipelines using Scala and related technologies. Write clean, efficient, and well-documented Scala code for data ingestion, transformation, and processing. Develop and execute unit, integration, and end-to-end tests to ensure data quality and pipeline reliability. Orchestrate and schedule data pipelines using tools such as Apache Airflow, Oozie, or similar workflow schedulers.
Monitor, troubleshoot, and optimize data pipelines for performance and reliability. Participate in code reviews, provide constructive feedback, and adhere to best practices in software development. Document technical solutions, data flows, and pipeline architectures. Work closely with DevOps and Data Engineering teams to deploy and maintain solutions in production environments. Stay current with emerging technologies and industry trends in big data and Scala development. Required Skills & Qualifications Strong proficiency in Scala, including functional programming concepts. Experience building and maintaining ETL/data pipelines. Solid understanding of data structures, algorithms, and software engineering principles. Experience with workflow orchestration/scheduling tools (e.g., Apache Airflow, Oozie, Luigi, or similar). Familiarity with distributed data processing frameworks (e.g., Apache Spark, Kafka, Flink). Proficiency in writing unit and integration tests for data pipelines. Experience with version control systems (e.g., Git). Strong problem-solving skills and attention to detail. Excellent communication and collaboration skills. Preferred Skills & Qualifications Experience with cloud platforms (AWS, Azure, or GCP) and related data services. Knowledge of SQL and NoSQL databases (e.g., PostgreSQL, Cassandra, MongoDB). Familiarity with containerization and orchestration tools (Docker, Kubernetes). Exposure to CI/CD pipelines and DevOps practices. Experience with data modeling and data warehousing concepts. Knowledge of other programming languages (e.g., Python, Java) is a plus. Experience working in Agile/Scrum environments. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. 
Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture: Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development: At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive: At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips: From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 308597
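The posting above stresses writing unit-testable transforms for data pipelines. The role is Scala, but the ingest-transform-validate shape is language-agnostic; as an illustration only, here is a minimal Python sketch of a transform that quarantines malformed rows instead of failing the run (all field names are invented):

```python
def transform(records):
    """Drop malformed rows and normalise the amount field to float."""
    out = []
    for r in records:
        try:
            out.append({"id": r["id"], "amount": float(r["amount"])})
        except (KeyError, ValueError, TypeError):
            continue  # skip malformed input rather than aborting the pipeline
    return out

raw = [{"id": 1, "amount": "10.5"},   # valid row
       {"id": 2, "amount": "oops"},   # bad amount: dropped
       {"amount": "3"}]               # missing id: dropped
clean = transform(raw)
print(clean)  # [{'id': 1, 'amount': 10.5}]
```

Because the transform is a pure function over plain records, unit tests can exercise it without any Spark, Kafka, or Airflow infrastructure, which is exactly what makes pipelines of this shape testable.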

Posted 2 days ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Key Responsibilities: • Architect & Build Scalable Systems: Design and implement petabyte-scale lakehouse architectures (Apache Iceberg, Delta Lake) to unify data lakes and warehouses. • Real-Time Data Engineering: Develop and optimize streaming pipelines using Kafka, Pulsar, and Flink to process structured/unstructured data with low latency. • High-Performance Applications: Leverage Java to build scalable, high-throughput data applications and services. • Modern Data Infrastructure: Leverage modern data warehouses and query engines (Trino, Spark) for sub-second operations and analytics on real-time data. • Database Expertise: Work with RDBMS (PostgreSQL, MySQL, SQL Server) and NoSQL (Cassandra, MongoDB) systems to manage diverse data workloads. • Data Governance: Ensure data integrity, security, and compliance across multi-tenant systems. • Cost & Performance Optimization: Manage production infrastructure for reliability, scalability, and cost efficiency. • Innovation: Stay ahead of trends in the data ecosystem (e.g., open table formats, stream processing) to drive technical excellence. • API Development (Optional): Build and maintain Web APIs (REST/GraphQL) to expose data services internally and externally. Qualifications: • 8+ years of data engineering experience with large-scale (petabyte-level) systems. • Expert proficiency in Java for data-intensive applications. • Hands-on experience with lakehouse architectures, stream processing (Flink), and event streaming (Kafka/Pulsar). • Strong SQL skills and familiarity with RDBMS/NoSQL databases. • Proven track record in optimizing query engines (e.g., Spark, Presto) and data pipelines. • Knowledge of data governance, security frameworks, and multi-tenant systems. • Experience with cloud platforms (AWS, GCP, Azure) and infrastructure-as-code (Terraform). What we offer: • A unique experience in the fintech industry, with a leading, fast-growing company. • A good atmosphere at work and a comfortable working environment.
• Additional benefit of Group Health Insurance - OPD Health Insurance • Coverage for Self + Family (Spouse and up to 2 Children) • Attractive Leave benefits like Maternity, Paternity Benefit, Vacation leave & Leave Encashment • Reward & Recognition – Monthly, Quarterly, Half yearly & yearly. • Loyalty benefits • Employee referral program
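The streaming work described above revolves around windowed aggregation over event streams. The role names Flink and Java; purely as a concept sketch, here is the tumbling-window counting that such pipelines perform, reduced to stdlib Python (window size, keys, and timestamps are invented):

```python
from collections import defaultdict

WINDOW = 60  # seconds per tumbling window

def window_counts(events):
    """Count events per (key, window) bucket.

    Each event is a (timestamp_seconds, key) pair; integer division by
    WINDOW assigns it to a non-overlapping (tumbling) window.
    """
    counts = defaultdict(int)
    for ts, key in events:
        counts[(key, ts // WINDOW)] += 1
    return dict(counts)

events = [(5, "card"), (30, "card"), (70, "card"), (10, "upi")]
print(window_counts(events))
# {('card', 0): 2, ('card', 1): 1, ('upi', 0): 1}
```

A real Flink job adds what this sketch omits: event-time watermarks for late data, keyed state that survives restarts, and parallel partitioned execution.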

Posted 2 days ago

Apply

6.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Position Summary: AI & Data. In this age of disruption, organizations need to navigate the future with confidence, embracing decision-making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science, and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together, the offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: implement large-scale data ecosystems, including data management, governance, and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms; leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe actions; and drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise, and providing as-a-service offerings for continuous insights and improvements. Education And Experience: Bachelor’s or Master’s degree in Computer Science, Engineering, Information Technology, or a related field. 3–6 years of hands-on experience in Scala development, preferably in a data engineering or data pipeline context. Key Responsibilities: Collaborate with business analysts and stakeholders to gather and analyze requirements for data pipeline solutions. Design, develop, and maintain scalable data pipelines using Scala and related technologies. Write clean, efficient, and well-documented Scala code for data ingestion, transformation, and processing. Develop and execute unit, integration, and end-to-end tests to ensure data quality and pipeline reliability. Orchestrate and schedule data pipelines using tools such as Apache Airflow, Oozie, or similar workflow schedulers.
Monitor, troubleshoot, and optimize data pipelines for performance and reliability. Participate in code reviews, provide constructive feedback, and adhere to best practices in software development. Document technical solutions, data flows, and pipeline architectures. Work closely with DevOps and Data Engineering teams to deploy and maintain solutions in production environments. Stay current with emerging technologies and industry trends in big data and Scala development. Required Skills & Qualifications Strong proficiency in Scala, including functional programming concepts. Experience building and maintaining ETL/data pipelines. Solid understanding of data structures, algorithms, and software engineering principles. Experience with workflow orchestration/scheduling tools (e.g., Apache Airflow, Oozie, Luigi, or similar). Familiarity with distributed data processing frameworks (e.g., Apache Spark, Kafka, Flink). Proficiency in writing unit and integration tests for data pipelines. Experience with version control systems (e.g., Git). Strong problem-solving skills and attention to detail. Excellent communication and collaboration skills. Preferred Skills & Qualifications Experience with cloud platforms (AWS, Azure, or GCP) and related data services. Knowledge of SQL and NoSQL databases (e.g., PostgreSQL, Cassandra, MongoDB). Familiarity with containerization and orchestration tools (Docker, Kubernetes). Exposure to CI/CD pipelines and DevOps practices. Experience with data modeling and data warehousing concepts. Knowledge of other programming languages (e.g., Python, Java) is a plus. Experience working in Agile/Scrum environments. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. 
Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture: Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development: At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive: At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips: From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 308597

Posted 2 days ago

Apply

6.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Role: Oracle Database Engineer. Location: Gurgaon. Experience: 6+ years. Role Overview: We are looking for a highly skilled senior Oracle database engineer to join our innovative team. The ideal candidate will possess significant experience in database design, development, and administration, with a strong emphasis on Oracle databases, alongside familiarity with ETL tools and Snowflake. Experience migrating databases from Oracle to PostgreSQL is an added advantage. Key Responsibilities: Database Design & Development: Design, implement, and maintain robust Oracle database solutions. Develop and manage ETL processes to ensure data accuracy and quality. Collaborate with business stakeholders to gather requirements and develop scalable database solutions. Performance Tuning & Optimisation: Monitor and enhance database performance and efficiency. Implement industry best practices for database management and security compliance. Migration Expertise: Lead projects on database migration from Oracle to PostgreSQL. Assist in establishing strategies and methodologies for successful database transitions. Documentation & Training: Produce and maintain comprehensive documentation for database architecture, processes, and procedures. Provide training and support to team members and end users. Quality Assurance: Ensure adherence to data governance and regulatory compliance. Conduct regular database backups and develop disaster recovery plans. Required Qualifications: Extensive experience with Oracle Database. Proficiency with ETL tools (e.g., Informatica, Talend, Apache NiFi). Solid understanding of Snowflake and its integration with existing systems. Proven experience designing and implementing complex database solutions. Familiarity with database migration processes, especially from Oracle to PostgreSQL. Desired Skills: Strong analytical and problem-solving abilities.
Excellent verbal and written communication skills. Ability to work collaboratively in a team environment and mentor junior members.
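A core step in the Oracle-to-PostgreSQL migrations described above is validating that the target matches the source. As a minimal sketch only, here is the simplest such check, row-count parity per table, with two in-memory SQLite databases standing in for Oracle and PostgreSQL (table and column names are invented):

```python
import sqlite3

def row_count(conn, table):
    """Return the number of rows in a table."""
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

source = sqlite3.connect(":memory:")   # stands in for the Oracle source
target = sqlite3.connect(":memory:")   # stands in for the PostgreSQL target

# Populate both sides identically, as a successful migration would.
for db in (source, target):
    db.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
    db.executemany("INSERT INTO customers VALUES (?, ?)",
                   [(1, "a"), (2, "b"), (3, "c")])

assert row_count(source, "customers") == row_count(target, "customers")
print("row counts match")
```

Real migration validation goes further, comparing checksums or sampled rows per table, since equal counts alone do not prove the data transferred intact.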

Posted 2 days ago

Apply

2.0 - 4.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Summary: We are seeking a detail-oriented Full Stack Engineer with strong debugging and performance optimization skills. The primary responsibility of this role is to maintain existing systems, fix bugs, resolve production issues, and continuously enhance application performance. The ideal candidate should be proficient in React, Java Spring Boot, and PostgreSQL, and must be an expert in debugging across the stack. Key Responsibilities: Investigate, analyze, and fix bugs across frontend and backend codebases. Debug and resolve production issues with quick turnaround and root cause analysis. Improve the performance of existing systems (both backend APIs and frontend UI). Collaborate with development teams to implement sustainable technical solutions. Optimize queries and ensure database efficiency using PostgreSQL. Participate in code reviews and suggest performance improvements. Contribute to documentation related to bug fixes and improvements. Required Skills & Qualifications: 2-4 years of experience in full stack development and system maintenance. Strong proficiency in React.js, JavaScript, HTML, and CSS. Solid backend development experience in Java Spring Boot. In-depth knowledge of PostgreSQL and query optimization. Expertise in debugging production systems and troubleshooting real-time issues. Good understanding of performance tuning techniques and tools. Familiarity with version control (Git) and CI/CD pipelines. Nice to Have: Experience with unit/integration testing tools (JUnit, Jest). Experience with monitoring tools (Site24x7, Grafana, Prometheus). Background in microservices and distributed systems. Experience with automated testing tools and frameworks. Educational Qualification: Bachelor's degree in Computer Science, IT, or a related field.
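The query-optimization work this role describes often starts with eliminating N+1 query patterns. As an illustrative sketch, here is that refactor reduced to a runnable example, with SQLite standing in for PostgreSQL and an invented schema:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE users  (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INTEGER, total REAL);
    INSERT INTO users  VALUES (1, 'asha'), (2, 'ravi');
    INSERT INTO orders VALUES (1, 1, 10.0), (2, 1, 5.0), (3, 2, 7.5);
""")

# N+1 pattern: one extra query per user, which degrades linearly with data size.
totals_slow = {
    name: db.execute("SELECT SUM(total) FROM orders WHERE user_id = ?",
                     (uid,)).fetchone()[0]
    for uid, name in db.execute("SELECT id, name FROM users")
}

# Optimised version: a single aggregated join, one round trip to the database.
totals_fast = dict(db.execute("""
    SELECT u.name, SUM(o.total)
    FROM users u JOIN orders o ON o.user_id = u.id
    GROUP BY u.name
"""))

print(totals_slow == totals_fast)  # True: same result, one query instead of N+1
```

On PostgreSQL the same investigation would typically begin with `EXPLAIN ANALYZE` on the per-row query to confirm the repeated index lookups before rewriting it as a join.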

Posted 2 days ago

Apply

6.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Position Summary: AI & Data. In this age of disruption, organizations need to navigate the future with confidence, embracing decision-making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science, and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together, the offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: implement large-scale data ecosystems, including data management, governance, and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms; leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe actions; and drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise, and providing as-a-service offerings for continuous insights and improvements. Education And Experience: Bachelor’s or Master’s degree in Computer Science, Engineering, Information Technology, or a related field. 3–6 years of hands-on experience in Scala development, preferably in a data engineering or data pipeline context. Key Responsibilities: Collaborate with business analysts and stakeholders to gather and analyze requirements for data pipeline solutions. Design, develop, and maintain scalable data pipelines using Scala and related technologies. Write clean, efficient, and well-documented Scala code for data ingestion, transformation, and processing. Develop and execute unit, integration, and end-to-end tests to ensure data quality and pipeline reliability. Orchestrate and schedule data pipelines using tools such as Apache Airflow, Oozie, or similar workflow schedulers.
Monitor, troubleshoot, and optimize data pipelines for performance and reliability. Participate in code reviews, provide constructive feedback, and adhere to best practices in software development. Document technical solutions, data flows, and pipeline architectures. Work closely with DevOps and Data Engineering teams to deploy and maintain solutions in production environments. Stay current with emerging technologies and industry trends in big data and Scala development. Required Skills & Qualifications Strong proficiency in Scala, including functional programming concepts. Experience building and maintaining ETL/data pipelines. Solid understanding of data structures, algorithms, and software engineering principles. Experience with workflow orchestration/scheduling tools (e.g., Apache Airflow, Oozie, Luigi, or similar). Familiarity with distributed data processing frameworks (e.g., Apache Spark, Kafka, Flink). Proficiency in writing unit and integration tests for data pipelines. Experience with version control systems (e.g., Git). Strong problem-solving skills and attention to detail. Excellent communication and collaboration skills. Preferred Skills & Qualifications Experience with cloud platforms (AWS, Azure, or GCP) and related data services. Knowledge of SQL and NoSQL databases (e.g., PostgreSQL, Cassandra, MongoDB). Familiarity with containerization and orchestration tools (Docker, Kubernetes). Exposure to CI/CD pipelines and DevOps practices. Experience with data modeling and data warehousing concepts. Knowledge of other programming languages (e.g., Python, Java) is a plus. Experience working in Agile/Scrum environments. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. 
Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture: Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development: At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive: At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips: From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 308597

Posted 2 days ago

Apply

6.0 years

0 Lacs

Gandhinagar, Gujarat, India

On-site

Job Title: Odoo Tech Lead / Team Leader (6+ Years Experience) Experience Required: Minimum 6 Years in Odoo Development & Team Management Employment Type: Full-time. About the Role: We are seeking an experienced and proactive Odoo Technical Lead / Team Leader who will not only lead and mentor a team but also actively write code and build modules in the initial phase of projects. This is a hands-on leadership role for someone who is passionate about solving complex business challenges with scalable ERP solutions using Odoo. You will oversee the technical architecture, supervise code quality, and ensure timely delivery, while also working closely with clients and functional teams. Key Responsibilities: ● Lead the technical planning, architecture, and end-to-end implementation of Odoo-based solutions ● Write and review code for custom modules, especially in the initial stages of the project ● Guide junior and mid-level developers through design decisions, reviews, and problem-solving ● Translate functional requirements into detailed technical solutions ● Manage project timelines, code quality, deployments, and documentation ● Collaborate with functional consultants and QA to ensure delivery accuracy and system performance ● Handle complex customizations and third-party API integrations ● Ensure adherence to coding standards, version control, and CI/CD practices ● Stay up to date with new features in Odoo and emerging ERP technologies Required Skills & Qualifications: ● Bachelor’s or Master’s in Computer Science or related field ● Minimum 6 years of Odoo development experience across multiple versions (v10 to latest) ● Must have the ability to write Odoo modules from scratch and modify core functionalities if needed ● Agile/Scrum experience with tools like Jira or ClickUp ● Strong command of Python, PostgreSQL, XML, JavaScript, QWeb, and Odoo ORM ● Solid understanding of backend and frontend customization in both Community and Enterprise editions ● Experience with tools 
like Odoo.sh, GitHub, Docker, Jenkins, etc. ● Hands-on experience with REST APIs and 3rd-party app integrations ● Familiarity with business domains like Sales, Purchase, Inventory, Manufacturing, HR, Accounting ● Strong leadership, problem-solving, and communication skills ● Experience in performance tuning and large database management in Odoo ● Capable of managing a team and delivering projects independently Preferred Qualifications: ● Odoo Certification (Technical or Functional) is highly desirable
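The role above requires writing Odoo modules from scratch. Every custom module starts from a `__manifest__.py` descriptor; the sketch below shows that skeleton as a plain dictionary. The module name, version string, dependencies, and file paths are all invented for illustration:

```python
# Hypothetical __manifest__.py contents for a custom Odoo module.
manifest = {
    "name": "Example Custom Module",
    "version": "17.0.1.0.0",            # convention: <odoo major>.<module version>
    "depends": ["base", "sale"],        # modules Odoo loads before this one
    "data": [
        "security/ir.model.access.csv", # access rights are listed first
        "views/example_views.xml",      # QWeb/XML views come after
    ],
    "installable": True,
    "application": False,               # True only for top-level apps
}
print(sorted(manifest["depends"]))  # ['base', 'sale']
```

Ordering inside `data` matters: Odoo loads the files sequentially, so access-rule files must precede the views and records that reference the models they secure.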

Posted 2 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

About Us: Capco, a Wipro company, is a global technology and management consulting firm. We were named Consultancy of the Year in the British Bank Awards and have been ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial, and energy sectors. We are recognized for our deep transformation execution and delivery. WHY JOIN CAPCO? You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry, on the projects that will transform the financial services industry. MAKE AN IMPACT: Innovative thinking, delivery excellence, and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services. #BEYOURSELFATWORK: Capco has a tolerant, open culture that values diversity, inclusivity, and creativity. CAREER ADVANCEMENT: With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands. DIVERSITY & INCLUSION: We believe that diversity of people and perspectives gives us a competitive advantage. Job Title: Java Developer. Experience: 5+ years. Location: Pune. In this role, you will: Work towards priorities defined by product owners in the business to collaboratively build out the product/platform. Maintain a clear view of the technology strategy for the design and delivery of the technical aspects of the product, focusing not only on business delivery but also on constantly remediating tech debt. Deliver tasks end to end with high quality and in line with the design and architecture laid out, striving for no post-implementation issues. Take part in production support, environment management, release support, and automation implementation as part of the day job. Ensure that quality (code/performance) and discipline (TDD, BDD, unit testing, JIRA usage, etc.) are always maintained. Maintain our Agile and delivery principles. Work with UX and Architecture to ensure that the design-driven ethos is upheld. Collaborate with the business and the team, with DevOps principles maintained at all times. To be successful in this role, you should meet the following requirements: Demonstrable experience of continuous-delivery software development methods, including TDD and automated testing (including non-functional performance testing). Experience working on high-volume data integration and throughput requirements (profiling). Experience of microservice architecture and REST services. Experience developing microservices and deploying to containerized environments. A background of solid architectural work is advantageous. Technical specifics: Java 17 or above, Spring Boot components, and the Spring framework. Proficiency with ServiceNow development, such as scripting, workflows, and integrations. Oracle, PostgreSQL, MySQL. Some experience with NoSQL, Elastic, Google Cloud, Kubernetes, Ansible, or AI/ML is good to have. Non-technical: Strong communication skills, with experience interfacing with IT leads/delivery managers, architects, business product owners, and IT offshore teams. Strive to be a role model for your peers.

Posted 2 days ago

Apply

3.0 years

15 - 17 Lacs

Pune, Maharashtra, India

On-site

Role & Responsibilities Design, implement, and maintain backend services and RESTful APIs using Python and frameworks like Django or Flask. Collaborate with product owners and UI/UX designers to translate business requirements into technical solutions. Optimize application performance through code reviews, profiling, and effective caching strategies. Integrate with SQL/NoSQL databases, ensuring data integrity and efficient query performance. Develop and maintain automated tests (unit, integration) to ensure code quality and reliability. Participate in agile ceremonies, contribute to sprint planning, and drive continuous improvement initiatives. Skills & Qualifications Must-Have 3+ years of hands-on experience in Python development with strong OOP and scripting skills. Proficiency in Django or Flask for building web applications and APIs. Solid experience with relational (PostgreSQL/MySQL) and NoSQL (MongoDB) databases. Hands-on knowledge of RESTful API design principles and microservices architecture. Familiarity with Git workflows, branching strategies, and code review tools. Strong problem-solving skills, debugging techniques, and command over Linux/Unix environments. Preferred Experience with containerization technologies such as Docker and orchestration using Kubernetes. Exposure to CI/CD pipelines and infrastructure as code (Jenkins, GitLab CI, Terraform). Knowledge of asynchronous task queues (Celery, RabbitMQ) and real-time messaging systems. Benefits & Culture Highlights Collaborative on-site environment with open communication and agile best practices. Continuous learning culture: access to training budgets, certifications, and tech workshops. Clear career progression paths and regular performance feedback to fuel professional growth. Skills: python,git,oop,django,sql,microservices,restful apis,flask,nosql,linux/unix
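As a loose illustration of the "effective caching strategies" this role mentions, a read-through cache on an expensive lookup can be sketched with Python's standard `functools.lru_cache`. The function name and returned data below are hypothetical stand-ins for a database query or remote API call:

```python
from functools import lru_cache

# Hypothetical backing-store call counter, to show the cache working.
CALLS = {"count": 0}

@lru_cache(maxsize=128)
def get_user_profile(user_id):
    # Stand-in for an expensive DB query or HTTP request.
    CALLS["count"] += 1
    return {"id": user_id, "plan": "basic"}

get_user_profile(1)    # miss: hits the backing store
get_user_profile(1)    # hit: served from the in-process cache
print(CALLS["count"])  # the backing store was queried only once
```

In a multi-process deployment (e.g., several Gunicorn workers) a shared cache such as Redis or Memcached would replace this per-process approach.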

Posted 2 days ago

Apply

5.0 years

0 Lacs

Bengaluru East, Karnataka, India

Remote

Req ID: 336888
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a SQL Developer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Once you are here, you will:
Developer: SQL (PostgreSQL/ETL), data analysis, Agile process knowledge.
Act as the first point of escalation for daily service issues along with the PM, and be a primary point of contact for stakeholders.
Proficiency in SQL, data environments, and data transformation tools (Python).
Strong understanding of ETL data pipelines, including integration with APIs and databases.
Hands-on experience with cloud-based data warehousing solutions (Snowflake).
Knowledge of SDLC and Agile development techniques.
Practical experience with source control (Git, SVN, etc.).
Knowledge of design, development, and data linkages inside RDBMS and file data stores (CSV, XML, JSON, etc.) for MS SQL Server databases.
Thorough understanding of development methods for batch and real-time system integration.
Prepare/review test scripts and unit-test changes.
Provide training, support, and leadership to the larger project team.

Required Qualifications:
5+ years' experience in SQL (PostgreSQL/ETL), data analysis, and Agile processes, in a consulting role that includes completing at least 4 projects as a developer.

Preferred Experience:
Prior experience with a software development methodology, Agile preferred.
Experience with data migration using Data Loader.

Ideal Mindset:
Problem solver: you are creative but also practical in finding solutions to problems that may arise in the project, to avoid potential escalations.
Analytical: you like to dissect complex processes and can help forge a path based on your findings.

#salesforce

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services.
We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com Whenever possible, we hire locally to NTT DATA offices or client sites. This ensures we can provide timely and effective support tailored to each client’s needs. While many positions offer remote or hybrid work options, these arrangements are subject to change based on client requirements. For employees near an NTT DATA office or client site, in-office attendance may be required for meetings or events, depending on business needs. At NTT DATA, we are committed to staying flexible and meeting the evolving needs of both our clients and employees. NTT DATA recruiters will never ask for payment or banking information and will only use @nttdata.com and @talent.nttdataservices.com email addresses. If you are requested to provide payment or disclose banking information, please submit a contact us form, https://us.nttdata.com/en/contact-us . NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us . 
This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here . If you'd like more information on your EEO rights under the law, please click here . For Pay Transparency information, please click here .
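The ETL skills this SQL Developer posting calls for (data transformation in Python, CSV/JSON file stores) can be sketched with only the standard library. The field names and sample data below are invented for illustration; a real pipeline would extract from and load into PostgreSQL or Snowflake rather than in-memory strings:

```python
import csv
import io
import json

def extract(csv_text):
    # Parse delimited text into a list of dicts (one per row).
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    # Normalize types and drop records with a missing amount.
    out = []
    for r in rows:
        if r.get("amount"):
            out.append({"id": int(r["id"]), "amount": float(r["amount"])})
    return out

def load(rows):
    # Serialize for the target store; a real loader would INSERT instead.
    return json.dumps(rows)

raw = "id,amount\n1,10.5\n2,\n3,7\n"
result = load(transform(extract(raw)))
print(result)
```

The same extract/transform/load split maps directly onto a scheduled batch job, with each stage unit-testable in isolation.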

Posted 2 days ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Looking for immediate joiners (currently serving notice period).
Role: Data Engineer
Experience: 3-5 Years
Location: Hyderabad
Work Mode: Hybrid
Interview Mode: Face-to-Face

Experience, Qualifications, Knowledge and Skills
Bachelor's degree (B.A./B.S.) from a four-year college or university and two to four years of related experience and/or training, or an equivalent combination of education and experience.
2+ years of healthcare industry experience preferred.
3+ years of experience with SQL, database design, optimization, and tuning.
3+ years of experience with an open-source relational database (e.g., PostgreSQL).
3+ years of experience using GitHub.
3+ years of experience in shell scripting and one other object-oriented language such as Python or PHP.
3+ years of experience with continuous integration and development methodologies and tools such as Jenkins.
3+ years of experience in an Agile development environment.
Time management skills, professionalism, attention to detail, conscientiousness, and teamwork.
Programming skills, particularly SQL, shell scripting, and Python.
Strong oral and written communication skills.
Note: Candidates must have hands-on experience with PostgreSQL, SQL, Python, Shell scripting & ETL.
If you are interested, please share an updated resume with prasanna@intellistaff.in
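As a rough sketch of the SQL-plus-Python combination this role requires, the example below uses Python's built-in `sqlite3` module as a stand-in for PostgreSQL (the `claims` table and its data are hypothetical). The same SQL would run against PostgreSQL through a driver such as psycopg2:

```python
import sqlite3

# In-memory database as a stand-in for a PostgreSQL connection.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE claims (id INTEGER PRIMARY KEY, member TEXT, amount REAL)"
)
conn.executemany(
    "INSERT INTO claims (member, amount) VALUES (?, ?)",
    [("A", 120.0), ("B", 80.0), ("A", 50.0)],
)
# Index the grouping column, as a tuning step for larger tables.
conn.execute("CREATE INDEX idx_claims_member ON claims(member)")

totals = conn.execute(
    "SELECT member, SUM(amount) FROM claims GROUP BY member ORDER BY member"
).fetchall()
print(totals)
```

On PostgreSQL you would additionally inspect the plan with `EXPLAIN ANALYZE` to confirm the index is used.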

Posted 2 days ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Key Responsibilities System Architecture & Event-Driven Design • Design and implement event-driven architectures using Apache Kafka to orchestrate distributed microservices and streaming pipelines. • Define scalable message schemas (e.g., JSON/Avro), data contracts, and versioning strategies to support AI-powered services. • Architect hybrid event + request-response systems to balance real-time streaming and synchronous business logic. Backend & AI/ML Integration • Develop Python-based microservices using FastAPI, enabling both standard business logic and AI/ML model inference endpoints. • Collaborate with AI/ML teams to operationalize ML models (e.g., classification, recommendation, anomaly detection) via REST APIs, batch processors, or event consumers. • Integrate model-serving platforms such as SageMaker, MLflow, or custom Flask/ONNX-based services. Cloud-Native & Serverless Deployment (AWS) • Design and deploy cloud-native applications using AWS Lambda, API Gateway, S3, CloudWatch, and optionally SageMaker or Fargate. • Build AI/ML-aware pipelines that automate retraining, inference triggers, or model selection based on data events. • Implement autoscaling, monitoring, and alerting for high-throughput AI services in production. Data Engineering & Database Integration • Ingest and manage high-volume structured and unstructured data across MySQL, PostgreSQL, and MongoDB. • Enable AI/ML feedback loops by capturing usage signals, predictions, and outcomes via event streaming. • Support data versioning, feature store integration, and caching strategies for efficient ML model input handling. Testing, Monitoring & Documentation • Write unit, integration, and end-to-end tests for both standard services and AI/ML pipelines. • Implement tracing and observability for AI/ML inference latency, success/failure rates, and data drift. • Document ML integration patterns, input/output schema, service contracts, and fallback logic for AI systems. 
Preferred Qualifications • 6+ years of backend software development experience with 2+ years in AI/ML integration or MLOps. • Strong experience in productionizing ML models for classification, regression, or NLP use cases. • Experience with streaming data pipelines and real-time decision systems. • AWS Certifications (Developer Associate, Machine Learning Specialty) are a plus. • Exposure to data versioning tools (e.g., DVC), feature stores, or vector databases is advantageous.
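One way to picture the versioned message schemas and data contracts described above is a JSON event envelope whose consumer backfills fields added in later schema versions. This is a hedged sketch with invented field names; the actual Kafka producer/consumer plumbing (and any Avro registry) is omitted:

```python
import json
from dataclasses import asdict, dataclass

# Hypothetical v2 event contract; v1 payloads lacked the "model" field.
@dataclass
class InferenceEvent:
    schema_version: int
    model: str
    prediction: str

def serialize(event: InferenceEvent) -> bytes:
    # In a real pipeline, this payload would be produced to a Kafka topic.
    return json.dumps(asdict(event)).encode("utf-8")

def deserialize(payload: bytes) -> InferenceEvent:
    data = json.loads(payload)
    if data.get("schema_version", 1) < 2:
        # Backfill the field introduced in v2 so old events still parse.
        data["model"] = data.get("model", "unknown")
    return InferenceEvent(
        schema_version=data.get("schema_version", 1),
        model=data["model"],
        prediction=data["prediction"],
    )

# An old v1 payload, still readable under the v2 contract.
v1_payload = json.dumps({"schema_version": 1, "prediction": "fraud"}).encode("utf-8")
event = deserialize(v1_payload)
print(event.model, event.prediction)
```

Keeping the compatibility logic in the consumer lets producers roll forward independently, which is the usual motivation for explicit schema versioning.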

Posted 2 days ago

Apply

10.0 - 15.0 years

0 Lacs

India

Remote

Job Role: Senior Lecturer
Subject: Data Science with good knowledge of AWS, MLOps and Big Data
Location: Remote

Responsibilities:
Develop and manage a robust academic framework for the Data Science vertical. Collaborate with various departments to ensure efficient resource allocation and program delivery. Stay updated with the latest trends in Data Science and emerging technologies to keep the curriculum relevant. Represent the institution at academic and professional conferences, contributing to thought leadership in the Data Science field.

Qualifications:
M.Sc. (Computer Science), MCA (Master of Computer Applications), or B.Tech/M.Tech (Computer Engineering/IT). Doctor of Philosophy (optional). A minimum of 10-15 years of teaching experience in Data Science or related fields. Proven experience in managing large-scale academic programs or corporate training initiatives.

Technical Skills:
Programming languages: Python. Database knowledge: experience with MySQL, Oracle, SQL Server, or PostgreSQL (any one). Data Science expertise: NumPy, Pandas, Matplotlib, Seaborn, Exploratory Data Analysis (EDA). Machine Learning: proficiency with Scikit-learn (sklearn) and experience with ML models for regression, classification, and clustering problems. Big Data: PySpark ML, PySpark NLP, Apache Kafka. MLOps: Git, GitHub, Docker, PyCaret, MLflow. Additional knowledge: familiarity with Tableau or Power BI is advantageous.

Desired Skills:
Strong client-facing and presentation skills. Ability to develop technical solutions tailored to client needs. Strong leadership and collaboration skills, with experience working in cross-functional teams. Exceptional communication and problem-solving abilities.

You can also email sadafa@regenesys.net

Posted 2 days ago

Apply

3.0 years

0 Lacs

India

Remote

About Huzzle At Huzzle, we connect high-performing professionals with global companies across the UK, US, Canada, Europe, and Australia. Our clients include startups, digital agencies, and tech platforms in industries like SaaS, MarTech, FinTech, and EdTech. We match top sales talent to full-time remote roles where they're hired directly into client teams and provided ongoing support by Huzzle. As part of our talent pool, you'll have access to exclusive SDR opportunities matched to your background and preferences. About The Company We're looking for an AI Engineer, or as we like to call it, a Vibe Coder. This isn't your typical engineering gig. You'll play a hybrid role: part engineer, part product visionary, part UX craftsman, pushing the boundaries of what's possible with AI. You'll work across the full stack, invent features that feel like magic, and co-create Olivia's future alongside the founding team. If you thrive in high-agency, zero-handholding environments and want to work on agentic, generative, and conversational AI systems, this role was built for you. Key Responsibilities Full-stack execution: Design, build, and ship core product features using React, Node.js, and our AI-first architecture. AI-first engineering: Prototype and deploy magical features using tools like Augment and Cursor. Design-forward mindset: Craft seamless user experiences—no design background needed, just good taste and intuition. Autonomous systems: Develop scalable, intelligent agents capable of brand-consistent, on-demand generation. Creative API orchestration: Combine tools like OpenAI, Google AI, Anthropic, and Bedrock into intelligent, unified pipelines. Strategic input: Shape product roadmaps and infrastructure decisions as part of a small, founder-led team. Rapid iteration: Build fast, ship faster, and bring a founder's mindset to debugging, feature testing, and performance tuning.
Who You Are A former founder, founding engineer, or technical operator with a deep ownership mentality. A creative problem-solver who codes with empathy and thinks in user workflows, not just code modules. A hands-on AI builder already using tools like Cursor or Augment to supercharge your dev flow. A startup-native who thrives in ambiguity and builds structure from chaos. A UX-aware engineer who sweats the details and instinctively builds interfaces that just feel right. A clear communicator who knows when to loop in others—and when to sprint solo. A relentless learner excited by the future of AI and always hunting for better ways to build. A product thinker who treats features like micro-startups: own the vision, build the thing, ship and iterate. Tech Stack Languages: TypeScript, JavaScript, Python (bonus) Frontend: React Backend: Node.js, Wasp (easy to pick up) Infra: Cloudflare Workers/R2, PostgreSQL, Docker AI & APIs: OpenAI, Anthropic, Google AI, Bedrock, OpenRouter Dev Tools: Cursor, Augment, Git, Linear A Day in the Life Jump into a fast, focused standup to align on goals. Prototype generative features that combine UX, backend, and AI orchestration. Share demos via Loom, jam with founders in Slack, and rapidly ship to prod. Ideate new user flows, sketch mockups, or dive deep into technical tradeoffs. End the day knowing you shipped real value—and helped shape the future of design. Requirements 3+ years of hands-on experience in full-stack development using JavaScript/TypeScript (Node.js, React). Strong understanding of modern backend architecture and scalable infrastructure (PostgreSQL, Docker, Cloudflare, AWS/GCP). Proven experience across product, engineering, and UX research. Proven track record of shipping production-ready products or meaningful side projects. Experience working with, or strong interest in, AI development tools (e.g., OpenAI, Anthropic, Cursor, Augment, Bedrock). Solid grasp of API orchestration and prompt engineering for generative/conversational AI systems. Natural product intuition with a UX-first mindset—you care about how it feels, not just how it works. Comfort working in high-autonomy, high-speed startup environments. Ability to balance speed, quality, and experimentation in an agile development cycle. Excellent communication skills—able to collaborate asynchronously and explain technical decisions clearly. Passionate about AI, startups, and the future of creative tooling. Benefits 💰 Competitive compensation with equity potential at milestones 🌍 Fully remote, async-first culture with high flexibility 🚀 Zero bureaucracy, 100% impact environment 🎨 Creative ownership—you shape what gets built ⚙️ Cutting-edge AI stack and tools 📈 Be a foundational team member at a venture-scale company 🔥 Work on a product people feel when they use it

Posted 2 days ago

Apply

0 years

0 Lacs

Trivandrum, Kerala, India

On-site

We are seeking an experienced Python Solution Architect to join our dynamic team. In this role, you will be responsible for designing and implementing scalable, high-performance software solutions that meet business requirements. You will collaborate with cross-functional teams to define architecture, best practices, and oversee the development process. Job Responsibilities · Architect scalable, efficient, and high-performance Python-based applications. · Design microservices architecture and cloud-native solutions using Python frameworks (e.g., Django, Flask, FastAPI). · Ensure Python solutions align with business goals and enterprise architecture. · Design and manage RESTful APIs and web services, leveraging Python's capabilities. · Expertise in selecting the right Python frameworks, libraries, and tools for different use cases. · Architect and optimize database interactions, including SQL and NoSQL databases. · Ensure efficient data processing, ETL pipelines, and integrations with data analytics platforms (e.g., Pandas, NumPy, SQLAlchemy). · Design seamless integrations with third-party services, APIs, and external systems using Python-based solutions. · Ensure smooth data flow between Python applications and other enterprise systems. · Architect solutions in cloud environments (AWS, GCP, Azure) using Python. · Implement CI/CD pipelines for Python projects and manage infrastructure-as-code (Terraform, Ansible). · Ensure security best practices in Python code (e.g., OWASP, cryptography, input validation). · Lead efforts to comply with data protection and regulatory requirements in Python solutions. · Provide guidance to Python developers on architectural decisions, design patterns, and code quality. · Mentor teams on Python best practices, writing clean, maintainable, and efficient code. · Work closely with customers, business analysts, project managers, and development teams to understand requirements. 
· Communicate complex technical concepts to non-technical stakeholders. · Ensure solutions address functional and non-functional requirements (e.g., performance, scalability, security). Preferred Skills · Deep knowledge of Python frameworks like Django, Flask, or FastAPI. · Proficiency with asynchronous programming in Python (e.g., asyncio, concurrent.futures). · Hands-on experience with designing and deploying microservices-based architectures. · Understanding of containerization technologies like Docker and orchestration tools like Kubernetes. · Strong experience with AWS, GCP, or Azure for deploying and scaling Python applications. · Familiarity with cloud services like Lambda (AWS), Cloud Functions (GCP), or similar. · Experience with CI/CD pipelines and automation tools (e.g., Jenkins, GitLab CI, CircleCI). · Knowledge of Infrastructure-as-Code (IaC) tools like Terraform or Ansible. · Proficiency with relational databases (PostgreSQL, MySQL) and NoSQL databases (MongoDB, Redis). · Experience with database optimization, indexing, and query tuning. · Strong understanding of RESTful APIs, GraphQL, and API documentation standards (e.g., OpenAPI/Swagger). · Experience with integrating third-party services via APIs. · Proficient with Git, GitHub, or GitLab for version control and collaboration in Python projects. · Familiarity with branching strategies (e.g., GitFlow) and code review practices. · Experience with Python security tools and practices (e.g., PyJWT, OAuth2, secure coding). · Familiarity with encryption, authentication, and data protection standards. · Hands-on experience working in Agile environments, familiar with Scrum or Kanban. · Ability to break down complex technical tasks into sprints and manage backlogs. · Knowledge of popular Python AI/ML libraries such as TensorFlow, PyTorch, and Scikit-learn. · Experience with deploying machine learning models in production environments.
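The asynchronous-programming requirement above (asyncio, concurrent.futures) can be illustrated with a minimal asyncio sketch. The `fetch` coroutine is a hypothetical stand-in for an I/O-bound call such as a database query or HTTP request:

```python
import asyncio

async def fetch(name: str, delay: float) -> str:
    # Placeholder for an I/O-bound call (database query, HTTP request).
    await asyncio.sleep(delay)
    return name

async def main() -> list:
    # gather runs both coroutines concurrently on one event loop,
    # so total wall time is roughly the slowest call, not the sum.
    return list(await asyncio.gather(fetch("users", 0.01), fetch("orders", 0.02)))

results = asyncio.run(main())
print(results)
```

`asyncio.gather` preserves argument order regardless of completion order, which keeps downstream code deterministic.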

Posted 2 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies