10.0 - 14.0 years
0 Lacs
Maharashtra
On-site
As the Manager, Data Platform Engineering at Assent, you will play a crucial role in leading and coaching a team of data platform engineers. Your primary responsibility will be to keep your team organized, motivated, and engaged in delivering high-quality, scalable software solutions that address key business challenges. By recruiting, developing, and retaining a high-performing team, you will ensure that Assent's products meet the needs of its customers and align with the company's business objectives. Your role will involve coordinating and allocating work among the team members, driving delivery, and championing product quality. You will collaborate closely with other teams in the product value chain such as Product Management, User Experience, Quality Assurance, Customer Success, and Infrastructure. Additionally, you will visualize upcoming work, manage product releases, and ensure that the software developed follows Assent's guidelines and standards.

To excel in this role, you should have at least 10 years of progressive experience in a data-focused role, with a proven track record in a leadership position. Strong mentoring and coaching skills are essential to keep your team engaged and motivated. A solid technical background, familiarity with AWS, Snowflake, dbt, Python, SQL, Kafka, and experience in working with large volumes of unstructured data are also required. As a strategic and business-minded individual, you should possess strong analytical skills and be able to manage short-term projects as well as long-term strategies effectively. Adaptability, flexibility, and a growth mindset are key attributes that will help you thrive in Assent's dynamic environment.

Your contributions will not only impact the success of Assent but will also contribute to addressing global challenges related to supply chain sustainability. At Assent, we value your talent, energy, and passion.
In addition to competitive financial benefits, we offer wellness programs, flexible work options, volunteer opportunities, and a commitment to diversity, equity, and inclusion. Join us in our mission to create a sustainable future and be a part of a team that values inclusivity, respect, and continuous learning.
Posted 2 days ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Responsibilities
The Sr. Integration Developer (Senior Software Engineer) will work in the Professional Services Team and play a significant role in designing and implementing complex integration solutions using the Adeptia platform. This role requires hands-on expertise in developing scalable and efficient solutions to meet customer requirements. The engineer will act as a key contributor to the team's deliverables while mentoring junior engineers. They will ensure high-quality deliverables by collaborating with cross-functional teams and adhering to industry standards and best practices. Responsibilities include but are not limited to:
- Collaborate with customers' Business and IT teams to understand integration requirements in the B2B/Cloud/API/Data/ETL/EAI Integration space and implement solutions using the Adeptia platform
- Design, develop, and configure complex integration solutions, ensuring scalability, performance, and maintainability
- Take ownership of assigned modules and lead the implementation lifecycle from requirement gathering to production deployment
- Troubleshoot issues during implementation and deployment, ensuring smooth system performance
- Guide team members in addressing complex integration challenges and promote best practices and performance practices
- Collaborate with offshore/onshore and internal teams to ensure timely execution and coordination of project deliverables
- Write efficient, well-documented, and maintainable code, adhering to established coding standards
- Review code and designs of team members, providing constructive feedback to improve quality
- Participate in Agile processes, including Sprint Planning, Daily Standups, and Retrospectives, ensuring effective task management and delivery
- Stay updated with emerging technologies to continuously enhance technical expertise and team skills
Essential Skills: Technical
- 5-7 years of hands-on experience in designing and implementing integration solutions across B2B, ETL, EAI, Cloud, API & Data Integration environments using leading platforms such as Adeptia, Talend, MuleSoft, or equivalent enterprise-grade tools
- Proficiency in designing and implementing integration solutions, including integration processes, data pipelines, and data mappings, to facilitate the movement of data between applications and platforms
- Proficiency in applying data transformation and data cleansing as needed to ensure data quality and consistency across different data sources and destinations
- Good experience in performing thorough testing and validation of data integration processes to ensure accuracy, reliability, and data integrity
- Proficiency in working with SOA, RESTful APIs, and SOAP Web Services with all security policies
- Good understanding and implementation experience with various security concepts, best practices, security standards, and protocols such as OAuth, SSL/TLS, SSO, SAML, and IDP (Identity Provider)
- Strong understanding of XML, XSD, XSLT, and JSON
- Good understanding of RDBMS/NoSQL technologies (MSSQL, Oracle, MySQL)
- Proficiency with transport protocols (HTTPS, SFTP, JDBC) and experience with messaging systems such as Kafka, ASB (Azure Service Bus), or RabbitMQ
- Hands-on experience in Core Java and exposure to commonly used Java frameworks
Desired Skills: Technical
- Familiarity with JavaScript frameworks like ReactJS, AngularJS, or NodeJS
- Exposure to integration standards (EDI, EDIFACT, IDOC)
- Experience with modern web UI tools and frameworks
- Exposure to DevOps tools such as Git, Jenkins, and CI/CD pipelines
Non-Technical
- Onshore experience working directly with customers
- Strong time management skills and the ability to handle multiple priorities
- Detail-oriented and enthusiastic about learning new tools and technologies
- Committed to delivering high-quality results
Flexible, responsible, and focused on quality work. Ability to prioritize tasks, work under pressure, and collaborate with cross-functional teams.

About Adeptia
Adeptia believes business users should be able to access information anywhere, anytime by creating data connections themselves, and its mission is to enable that self-service capability. Adeptia is a unique social network for digital business connectivity for "citizen integrators" to respond quickly to business opportunities and get to revenue faster. Adeptia helps Information Technology (IT) staff to manage this capability while retaining control and security. Adeptia's unified hybrid offering — with simple data connectivity in the cloud, and optional on-premises enterprise process-based integration — provides a competitive advantage to 450+ customers, ranging from Fortune 500 companies to small businesses. Headquartered in Chicago, Illinois, USA and with an office in Noida, India, Adeptia provides world-class support to its customers around the clock. For more, visit www.adeptia.com

Our Locations:
India R&D Centre: Office 56, Sixth floor, Tower-B, The Corenthum, Sector-62, Noida, U.P.
US Headquarters: 332 S Michigan Ave, Unit LL-A105, Chicago, IL 60604, USA
Posted 2 days ago
4.0 - 8.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Backend Engineer with 4 to 8 years of experience, you will be responsible for developing enterprise-grade systems with a focus on backend development using Scala and Akka. You will work on building microservices, high-performance APIs, and real-time event processing engines. Your key responsibilities will include building microservices using Akka HTTP and Akka Streams, designing backend architecture and data processing flows, developing and maintaining RESTful APIs, optimizing application performance, and ensuring system reliability. You will also collaborate with DevOps for CI/CD pipelines and production deployments.

In terms of technical skills, you should have proficiency in Scala and the Akka ecosystem, experience in REST API design and implementation, familiarity with Akka Streams and reactive systems, and knowledge of databases like PostgreSQL, MongoDB, or Cassandra. Experience with containerization tools like Docker and Kubernetes is also required. Preferred skills for this role include an understanding of data pipelines and message queues such as Kafka or RabbitMQ, as well as knowledge of authentication protocols like OAuth2 and JWT. In addition to technical skills, soft skills such as being detail-oriented with strong debugging abilities and the ability to work independently and in teams are essential for success in this role.

This position offers you the opportunity to work on complex backend engineering projects, gain exposure to cloud-native development, and grow into tech lead roles. If you are looking to further develop your skills in Scala, Akka, and backend development in a collaborative environment, this role is perfect for you.
Posted 2 days ago
7.0 years
0 Lacs
India
Remote
Mars Data is hiring a full-time Data Engineer (SQL, Talend ETL, GCP & Azure) in remote locations.
Location: Remote / WFH
Relevant Experience: 7+ years
Job Type: Full-Time
Notice Period: Immediate to 15 days
Shift: Mid Shift (IST)

We are seeking a highly skilled Senior Data Engineer with 7+ years of experience, specializing in SQL, Talend, Google Cloud Platform (GCP) BigQuery, and Microsoft Azure. The ideal candidate will be responsible for designing, building, and optimizing SQL-driven data pipelines, ensuring high performance, scalability, and data integrity.

Required Qualifications
- 7+ years of hands-on experience in SQL development, database performance tuning, and ETL processes
- Expert-level proficiency in SQL, including query optimization, stored procedures, indexing, and partitioning
- Strong experience with Talend for ETL/ELT development
- Hands-on experience with GCP BigQuery and Azure SQL / Synapse Analytics
- Solid understanding of data modelling (relational & dimensional) and cloud-based data architectures
- Proficiency in Python or Shell scripting for automation and workflow management
- Familiarity with CI/CD, Git, and DevOps best practices for data engineering

Nice to Have
- Experience with Apache Airflow or Azure Data Factory for workflow automation
- Knowledge of real-time data streaming (Kafka, Pub/Sub, Event Hubs)
- Cloud certifications in GCP or Azure (e.g., Google Professional Data Engineer, Azure Data Engineer Associate)

Why Join Us?
- Lead and innovate in SQL-driven, cloud-first data solutions
- Work on cutting-edge data engineering projects in a collaborative, agile team
- Opportunities for career growth, leadership, and certifications

Key Responsibilities
- Develop and optimize complex SQL queries, stored procedures, and indexing strategies for large datasets
- Design and maintain ETL/ELT data pipelines using Talend, integrating data from multiple sources
- Architect and optimize data storage solutions on GCP BigQuery and Azure SQL / Synapse Analytics
- Implement best practices for data governance, security, and compliance in cloud environments
- Work closely with data analysts, scientists, and business teams to deliver scalable solutions
- Monitor, troubleshoot, and improve data pipeline performance and reliability
- Automate data workflows and scheduling using orchestration tools (e.g., Apache Airflow, Azure Data Factory)
- Lead code reviews, mentoring, and best practices for junior engineers

Share your resume to hr@marsdata.in
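The SQL tuning and pipeline responsibilities listed for this role can be sketched in miniature with Python's built-in sqlite3: an indexed table, an idempotent upsert so a replayed ETL batch does not duplicate rows, and a query-plan check. Table, column, and index names here are illustrative, not taken from the posting.

```python
import sqlite3

# In-memory database standing in for a warehouse staging table.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE sales (
        order_id INTEGER PRIMARY KEY,
        region   TEXT NOT NULL,
        amount   REAL NOT NULL
    )
""")
# An index so region lookups avoid a full table scan.
conn.execute("CREATE INDEX idx_sales_region ON sales(region)")

def upsert_orders(rows):
    """Idempotent load: re-running the same batch must not duplicate rows."""
    conn.executemany(
        """INSERT INTO sales (order_id, region, amount) VALUES (?, ?, ?)
           ON CONFLICT(order_id) DO UPDATE SET amount = excluded.amount""",
        rows,
    )

batch = [(1, "west", 10.0), (2, "east", 20.0)]
upsert_orders(batch)
upsert_orders(batch)  # replayed batch: still two rows

count = conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0]
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM sales WHERE region = 'west'"
).fetchall()
uses_index = any("idx_sales_region" in str(row) for row in plan)
```

The same `ON CONFLICT`/`MERGE`-style pattern carries over to BigQuery and Azure Synapse, where replay-safe loads are what keeps Talend pipelines restartable.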
Posted 2 days ago
6.0 - 10.0 years
0 Lacs
Karnataka
On-site
The ideal candidate for this role should possess at least 6 years of relevant IT work experience. You must have strong experience in Java 8, Spring Boot, Hibernate, and Spring Data JPA. Additionally, you should have exposure to advanced Java frameworks like Spring Boot and be proficient with SQL. Experience in Microservices and working with Python is a must. In this role, you will be responsible for defining a re-usable components model using ReactJS while considering modularity, scalability, performance, best practices, and integration with service layers such as microservices and CMS. Experience in messaging services is required, and knowledge of Kafka is a plus. Familiarity with multithreading is also beneficial. It would be advantageous to have experience with AWS cloud services. Strong communication skills, problem-solving abilities, and the capacity to work effectively in large, collaborative teams to achieve organizational goals are essential. A passion for building an innovative culture is highly desirable.
Posted 2 days ago
8.0 - 12.0 years
0 Lacs
Haryana
On-site
Our Enterprise Technology division at Macquarie delivers cutting-edge solutions for global operations. We are currently seeking a Vice President who will be responsible for driving strategic direction and operational excellence. The ideal candidate will lead a talented team of engineers, fostering a culture of innovation and collaboration.

At Macquarie, we take pride in bringing together diverse individuals and empowering them to explore a wide range of possibilities. As a global financial services group operating in 34 markets with 55 years of unbroken profitability, we offer a supportive and friendly team environment where everyone's ideas contribute to driving outcomes.

In this key leadership position, you will have the opportunity to lead and mentor a high-performing team of engineers, cultivating a culture of innovation and continuous improvement. Your responsibilities will include developing and executing the strategic roadmap for enterprise data platforms, ensuring alignment with business objectives and timely project delivery. Collaboration with cross-functional teams to deliver effective data solutions, maintaining technical excellence, and embracing innovative technologies will be essential.

The successful candidate should possess:
- Extensive experience in data engineering and managing complex data platform projects
- Demonstrated leadership skills in managing and developing engineering teams
- Proficiency in data architecture, data warehousing, ETL processes, big data technologies (Hadoop, Spark, Kafka), AWS services, Kubernetes, and Docker
- Strong analytical and problem-solving abilities for data-driven decision-making
- Excellent communication and interpersonal skills for engaging and influencing stakeholders

If you are inspired to contribute to building a better future and are excited about the role or working at Macquarie, we encourage you to apply.
About Technology: Technology plays a crucial role in every aspect of Macquarie, for our people, customers, and communities. We are a global team that is passionate about accelerating the digital enterprise, connecting people and data, building platforms and applications, and designing tomorrow's technology solutions.

Our Commitment to Diversity, Equity, and Inclusion: We are dedicated to providing reasonable adjustments to individuals who may require support during the recruitment process and in their working arrangements. If you need additional assistance, please inform us during the application process.
Posted 2 days ago
9.0 - 15.0 years
0 Lacs
Haryana
On-site
The client is a global IT services company headquartered in Southborough, Massachusetts, USA, founded in 1996, with a revenue of $1.8B and 35,000+ associates worldwide. Specializing in digital engineering and IT services, the company helps clients modernize their technology infrastructure, adopt cloud and AI solutions, and accelerate innovation. Partnering with major firms in banking, healthcare, telecom, and media, the client combines deep industry expertise with agile development practices for scalable and cost-effective digital transformation. With operations in over 50 locations across more than 25 countries, the company has delivery centers in Asia, Europe, and North America and is backed by Baring Private Equity Asia. As a Java Full Stack with React Lead, you will be responsible for defining and driving the overall architecture and technical strategy across full-stack applications. Your role involves architecting scalable and secure microservices-based systems using Spring Boot and Java, as well as designing and guiding the development of responsive, user-friendly UIs using React.js and related technologies. You will provide technical leadership across backend and frontend development teams, collaborate with stakeholders to align technical solutions with business goals, and establish coding standards, architectural patterns, and best practices across the engineering organization. To qualify for this position, you should have a Bachelor's or Master's degree in Computer Science, Engineering, or a related discipline, along with 9+ years of hands-on software development experience and at least 3 years of experience in a software architect role. You should possess strong expertise in Java, Spring Boot, and RESTful microservices, as well as solid front-end development skills using React. 
Experience in designing and deploying applications in a cloud environment, specifically Azure, is required, along with familiarity with containerization (Docker) and orchestration (Kubernetes). Additionally, you should have a deep understanding of software architecture principles, including domain-driven design, event-driven architecture, and API gateway patterns. Your proven ability to design systems with high availability, resilience, and scalability, combined with excellent leadership, communication, and interpersonal skills, will be essential for this role. Experience with CI/CD pipelines, DevOps, and infrastructure as code (Terraform, CloudFormation), exposure to GraphQL, gRPC, or messaging systems like Kafka/RabbitMQ, and a strong understanding of data security, GDPR, and compliance standards are also required. Familiarity with working in Agile/Scrum environments is preferred for this position.
Posted 2 days ago
6.0 years
0 Lacs
Kochi, Kerala, India
On-site
Tranzmeo is an innovative leader in the fiber optic sensing industry, dedicated to developing cutting-edge solutions that optimize pipeline monitoring, well monitoring, railways, aerospace, defense, and beyond. We harness the power of AI, ML, and Big Data to push the boundaries of sensing technology and provide real-time insights that matter. We're hiring a passionate and experienced Technical Lead to head our Data Science & Machine Learning initiatives. If you're excited about applying AI/ML to advanced sensing technologies and want to make a tangible impact, we want to hear from you!

What You'll Do:
🔹 Lead the design, development, and deployment of scalable AI/ML models tailored to fiber optic sensing data
🔹 Architect Big Data solutions for processing vast amounts of sensing data in real-time
🔹 Collaborate with product teams, sensor engineers, and stakeholders on innovative projects
🔹 Mentor and manage a talented team of data scientists and engineers
🔹 Ensure data quality, model performance, and security standards
🔹 Stay ahead of developments in AI/ML and fiber optic sensing fields to foster innovation

What We're Looking For:
✅ B.Tech in Computer Science, Data Science, or related fields
✅ 6+ years of experience in data science, machine learning, and Big Data, preferably in sensing or telecom sectors
✅ Proven leadership and team management skills
✅ Strong programming skills (Python)
✅ Experience with cloud platforms (AWS, GCP, Azure)
✅ Hands-on experience with ML frameworks (TensorFlow, PyTorch, scikit-learn)
✅ Skilled in Big Data tools (Hadoop, Spark, Kafka, Airflow)
✅ Good understanding of fiber optic sensing technologies and data characteristics is a plus
✅ Excellent problem-solving, communication, and collaboration skills

Why Join Us?
- Work on innovative sensing technology projects with real-world impact
- Lead the AI and ML strategies in a pioneering industry
- Dynamic and growth-oriented work environment
- Opportunity to shape the future of fiber optic sensing solutions
Posted 2 days ago
2.0 - 6.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
You are an experienced Senior MEAN Stack Developer with 2-4 years of hands-on experience in designing, developing, and maintaining scalable web applications. Your expertise lies in MongoDB, Express.js, Angular, and Node.js (MEAN stack) with strong problem-solving abilities and leadership skills. Your responsibilities will include designing, developing, and deploying full-stack web applications using the MEAN stack. You will architect and optimize scalable, high-performance web applications, develop RESTful APIs and GraphQL services for seamless integration with frontend applications, and implement authentication and authorization mechanisms such as JWT, OAuth, and Role-Based Access Control. Additionally, you will optimize database queries and performance in MongoDB using Mongoose. In this role, you will mentor and guide junior developers, conduct code reviews and technical discussions, integrate third-party APIs, cloud services, and DevOps solutions for automation and deployment. You will also implement CI/CD pipelines, ensure best practices for software development and deployment, troubleshoot complex issues, debug applications, and improve code quality while staying updated with emerging technologies and contributing to the continuous improvement of development. To excel in this position, you should possess 3-5 years of experience in MEAN stack development, strong proficiency in Angular 15+ and frontend optimization techniques, advanced knowledge of Node.js and Express.js, including asynchronous programming and event-driven architecture. Expertise in MongoDB, MySQL & PostgreSQL, building microservices-based architectures, Docker, Kubernetes, CI/CD pipelines, and proficiency in Git, GitHub, or GitLab for version control is essential. 
Experience with message queues, WebSockets, real-time data processing, caching strategies, unit testing, integration testing, TDD, analytical and debugging skills, performance optimization, as well as excellent communication and leadership skills are required.

Skills & Qualifications:
- Strong proficiency in Angular 15+ and frontend optimization techniques
- Advanced knowledge of Node.js and Express.js
- Expertise in MongoDB, MySQL & PostgreSQL
- Experience in building microservices-based architectures
- Proficiency in Docker, Kubernetes, CI/CD pipelines
- Proficiency in Git, GitHub, or GitLab
- Experience with message queues (Redis, RabbitMQ, Kafka)
- Understanding of WebSockets, real-time data processing, caching strategies
- Hands-on experience in unit testing, integration testing, TDD
- Strong analytical and debugging skills
- Experience in performance optimization
- Excellent communication and leadership skills

Additional Skills:
- Experience with GraphQL API development
- Familiarity with AWS, Azure, Google Cloud Platform
- Knowledge of Serverless architecture, cloud functions
- Knowledge of Next.js, React.js
- Experience in Angular Universal (Server-Side Rendering, SSR)
- Knowledge of Nginx, PM2, load balancing strategies
- Exposure to AI/ML-based applications using Node.js
- Utilization of AI tools like ChatGPT
Posted 2 days ago
2.0 - 6.0 years
0 Lacs
Hyderabad, Telangana
On-site
Qualcomm India Private Limited is seeking a Programmer Analyst to join the Identity and Access Management Team. This team is responsible for managing, maintaining, and enhancing production, development, and test systems in a 24x7 mission-critical environment. We are looking for a highly motivated self-starter Java/Grails/UI Developer with excellent interpersonal, communication, problem-solving, and analytical skills. The ideal candidate should have around 3 to 5 years of hands-on technical experience with both Groovy/Grails and Java with the Spring Framework. They must have exposure to building integrations and developing tools for the Identity & Access Management domain.

The qualified candidate should have a minimum of 4+ years of work experience in programming, scripting, and/or automation or IT-relevant work experience with a Bachelor's degree, or 6+ years without a Bachelor's degree. Additionally, they should have 2+ years of experience with database design structures such as MongoDB and MySQL. The candidate should also possess 3-5 years of development experience with Java, J2EE, Spring Boot, and Web Services, along with experience in Agile development methodology, Test-Driven Development, incremental delivery, and CI/CD. A thorough understanding of OOPS concepts, design principles, and implementation of different types of design patterns is required. The candidate should have experience with programming languages like Java, Groovy, and Python, and front-end languages such as Angular, TypeScript, or JavaScript.

Strong communication skills, the ability to collaborate effectively with stakeholders, and taking complete responsibility for the features developed, from coding to deployment, are essential. The candidate should be able to identify and resolve any technical issues, contribute to critical application and product development projects, maintain and enhance existing APIs, and collaborate with cross-functional teams to gather and analyze system requirements.
Proficiency in database management, including MongoDB, SQL, and NoSQL, as well as knowledge of messaging tools like Kafka, MQ, and RabbitMQ, and proficiency in CI/CD are required. The role involves designing, implementing, and maintaining Java-based applications that can be high-volume and low-latency. Proficiency in front-end technologies like HTML, CSS, JavaScript, React, and Angular is a plus. Qualcomm is an equal opportunity employer and is committed to providing an accessible process for individuals with disabilities. If you would like more information about this role, please contact Qualcomm Careers.
Posted 2 days ago
10.0 - 14.0 years
0 Lacs
Karnataka
On-site
The Senior DevOps, Platform, and Infra Security Engineer opportunity at FICO's highly modern and innovative analytics and decision platform involves shaping the next generation security for FICO's Platform. You will address cutting-edge security challenges in a highly automated, complex, cloud & microservices-driven environment inclusive of design challenges and continuous delivery of security functionality and features to the FICO platform as well as the AI/ML capabilities used on top of the FICO platform, as stated by the VP of Engineering. In this role, you will secure the design of the next-generation FICO Platform, its capabilities, and services. You will provide full-stack security architecture design from cloud infrastructure to application features for FICO customers. Collaborating closely with product managers, architects, and developers, you will implement security controls within products. Your responsibilities will also include developing and maintaining Kyverno policies for enforcing security controls in Kubernetes environments and defining and implementing policy-as-code best practices in collaboration with platform, DevOps, and application teams. As a Senior DevOps, Platform, and Infra Security Engineer, you will stay updated with emerging threats, Kubernetes security features, and cloud-native security tools. You will define required controls and capabilities for the protection of FICO products and environments, build and validate declarative threat models in a continuous and automated manner, and prepare the product for compliance attestations while ensuring adherence to best security practices. The ideal candidate for this role should have 10+ years of experience in architecture, security reviews, and requirement definition for complex product environments. Strong knowledge and hands-on experience with Kyverno and OPA/Gatekeeper are preferred. 
Familiarity with industry regulations, frameworks, and practices (e.g., PCI, ISO 27001, NIST) is required. Experience in threat modeling, code reviews, security testing, vulnerability detection, and remediation methods is essential. Hands-on experience with programming languages such as Java and Python, and with securing cloud environments, preferably AWS, is necessary. Moreover, experience in deploying and securing containers, container orchestration, and mesh technologies (e.g., EKS, K8s, Istio), Crossplane for managing cloud infrastructure declaratively via Kubernetes, and certifications in Kubernetes or cloud security (e.g., CKA, CKAD, CISSP) are desirable. Proficiency with CI/CD tools (e.g., GitHub Actions, GitLab CI, Jenkins, Crossplane) is important. The ability to independently drive transformational security projects across teams and organizations and experience with securing event streaming platforms like Kafka or Pulsar are valued. Hands-on experience with ML/AI model security, IaC (e.g., Terraform, CloudFormation, Helm), and CI/CD pipelines (e.g., GitHub, Jenkins, JFrog) will be beneficial.

Joining FICO as a Senior DevOps, Platform, and Infra Security Engineer offers you an inclusive culture reflecting core values, the opportunity to make an impact and develop professionally, highly competitive compensation and benefits programs, and an engaging, people-first work environment promoting work/life balance, employee resource groups, and social events to foster interaction and camaraderie.
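The Kyverno policy-as-code work this listing describes typically looks something like the sketch below: a ClusterPolicy that blocks Pods whose security context allows running as root. This is an illustrative example, not a FICO policy; the policy name and message are made up.

```yaml
apiVersion: kyverno.io/v1
kind: ClusterPolicy
metadata:
  name: require-run-as-non-root
spec:
  validationFailureAction: Enforce
  rules:
    - name: check-run-as-non-root
      match:
        any:
          - resources:
              kinds:
                - Pod
      validate:
        message: "Containers must set runAsNonRoot: true."
        pattern:
          spec:
            securityContext:
              runAsNonRoot: true
```

Policies like this are versioned alongside application manifests, so the "declarative threat model" the posting mentions is enforced at admission time rather than audited after deployment.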
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
Haryana
On-site
You are a talented Software Engineer with hands-on experience in Quarkus and Red Hat Fuse, responsible for designing, developing, and maintaining integration solutions. Your strong proficiency in Java, along with experience in Kafka-based event streaming, RESTful APIs, relational databases, and CI/CD pipelines deployed on OpenShift Container Platform (OCP), will be essential for success in this role. You are passionate about constructing robust microservices and integration systems within a cloud-native environment. Your responsibilities include designing and developing scalable microservices using the Quarkus framework, building and maintaining integration flows and APIs with Red Hat Fuse (Apache Camel), and developing and consuming RESTful web services and APIs. You will design, implement, and optimize Kafka producers and consumers for real-time data streaming and event-driven architecture. Additionally, writing efficient, well-documented, and testable Java code adhering to best practices, working with relational databases for schema design, queries, and performance tuning, and collaborating with DevOps teams to build and maintain CI/CD pipelines are key aspects of your role. You will deploy and manage applications on the OpenShift Container Platform (OCP), participate in code reviews, design discussions, and agile ceremonies, and troubleshoot and resolve production issues with a focus on stability and performance. Staying up-to-date with emerging technologies and recommending improvements will be crucial in this position. 
Your required skills and experience include strong proficiency in Java (Java 8 or above) and the Quarkus framework, expertise in Red Hat Fuse (or Apache Camel) for integration development, proficiency in designing and consuming REST APIs, experience with Kafka for event-driven and streaming solutions, a solid understanding of relational databases and SQL, experience in building and maintaining CI/CD pipelines, hands-on experience deploying applications to OCP, working knowledge of containerization tools like Docker, familiarity with microservices architecture, cloud-native development, and agile methodologies, strong problem-solving skills, the ability to work independently and in a team environment, and good communication and documentation skills.

Preferred qualifications for this role include experience with messaging systems like Kafka, knowledge of other Java frameworks such as Spring Boot, experience with monitoring tools like Prometheus and Grafana, an understanding of security best practices in microservices and API development, and cloud platform experience. You should hold a Bachelor's or Master's degree in Computer Science, Engineering, or a related field, or possess equivalent practical experience to excel in this role.
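Kafka, which this listing leans on for event-driven work, delivers at-least-once by default, so consumers are normally written to be idempotent: a redelivered event must not be applied twice. A framework-free Python sketch of that pattern follows; the event shape and the in-memory dedup store are illustrative stand-ins for a real topic and a durable store.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Event:
    key: str      # e.g. an order id, used for deduplication
    payload: int

class IdempotentConsumer:
    """Applies each event key at most once, even if the broker redelivers."""
    def __init__(self):
        self.seen = set()     # stands in for a durable dedup store
        self.total = 0        # stands in for downstream state

    def handle(self, event: Event) -> bool:
        if event.key in self.seen:
            return False      # duplicate delivery: safely ignored
        self.seen.add(event.key)
        self.total += event.payload
        return True

consumer = IdempotentConsumer()
# The broker redelivers "order-1" after a missed acknowledgement.
stream = [Event("order-1", 5), Event("order-2", 7), Event("order-1", 5)]
applied = [consumer.handle(e) for e in stream]
```

In Quarkus the same discipline applies inside a SmallRye Reactive Messaging consumer; the language here is Python only to keep the sketch self-contained.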
Posted 2 days ago
7.0 years
0 Lacs
Greater Kolkata Area
On-site
Job Title: JD 21 - Software Development Engineer - 3
Department: Product & Engineering Team
Location: Kolkata - Onsite

Job Summary: We are seeking a highly skilled and experienced Software Development Engineer - 3 with deep expertise in React.js, Node.js, MySQL, Apache Kafka/RabbitMQ, and Temporal. The ideal candidate will lead the design and development of scalable, highly concurrent web applications where product performance is at the core, along with distributed systems and real-time data pipelines. You will work closely with cross-functional teams to architect robust backend services and rich, interactive front-end interfaces.

Key Responsibilities:
- Architect and implement RESTful APIs and microservices
- Write and promote reusable components/frameworks
- Design and optimize MySQL database schemas, queries, and indexing strategies
- Integrate and maintain event-driven systems using Apache Kafka/RabbitMQ
- Develop workflows and long-running background processes using Temporal
- Ensure scalability, performance, and reliability of the application stack
- Collaborate with product managers, designers, and QA engineers to deliver high-quality solutions
- Mentor junior developers and participate in code reviews
- Write unit and integration tests to ensure code quality and coverage
- Participate in agile development processes, including sprint planning and retrospectives

Skills & Qualifications:
- 7+ years of professional experience in full-stack software development
- Strong expertise in React.js, including hooks, state management, and component architecture
- Solid backend development experience with Node.js (Express/Nest.js preferred)
- Proficiency with MySQL (query optimization, stored procedures, schema design)
- Hands-on experience with Apache Kafka (producer/consumer patterns, stream processing) or RabbitMQ
- Experience building and maintaining workflows using Temporal
- Strong understanding of software architecture, design patterns, and system design
- Familiarity with containerization and orchestration (Docker, Kubernetes a plus)
- Strong debugging and problem-solving skills
- Excellent communication and team collaboration skills
- Working knowledge of an ORM such as Sequelize
- Strong knowledge of the MVC architecture
- Relevant experience working with MySQL/NoSQL databases preferred
- Must be a team player
- Previous experience with startups will be considered an added advantage

Other Details:
Engagement: Full Time
No. of openings: Multiple - Frontend, Backend and Fullstack
CTC: 24 - 30 LPA
Location: Kolkata - Onsite

About SuperProcure: SuperProcure is a next-generation end-to-end TMS platform with multi-enterprise collaboration for shippers. It digitizes and automates all processes across the logistics value chain, from vehicle sourcing to freight accounting, ensuring stakeholder collaboration, real-time visibility, and transparency. We are determined to make the lives of logistics teams easier, add value, and help establish a fair and beneficial process for businesses. SuperProcure is trusted by a diverse customer base across the manufacturing and construction industries to boost customer serviceability with cutting-edge technology solutions. Indian logistics spend is 14% of GDP, against 7-9% in developed countries. This makes Indian industries less competitive in the international market and means extra spend for domestic consumers. The logistics inefficiency is driven by manual processes spread across multiple stakeholders who work in silos to complete the transportation of goods.
SuperProcure aims to revolutionize Indian logistics to save 1% of GDP spending and make India globally competitive by enabling collaboration on a single platform and driving logistics efficiencies. Our clients include some of the Fortune 500 companies, such as Tata Chemicals, Havells, KEI, ITC, PepsiCo, Tata Consumers, Dawaat, L&T Constructions, Aditya Birla, MP Birla Corporation, Sun Pharma, and many more. SuperProcure is backed by IndiaMart & IIM Calcutta. It has been recognized for its innovation at the CII Industrial Innovation Awards, was recognized amongst the Top 50 Emerging start-ups in India by NASSCOM, and was ranked among Asia's top 10 TMS solution providers by the Global Supply Chain Council (GSCC) and ChainTech. More details about our journey can be found here.

Life @ SuperProcure: SuperProcure operates in an extremely innovative, entrepreneurial, analytical, and problem-solving work culture. Every team member is fully motivated and committed to the company's vision and believes in getting things done. In our organization, every employee is the CEO of what he/she does; from conception to execution, the work needs to be thought through. Our people are the core of our organization, and we believe in empowering them and making them a part of the daily decision-making, which impacts the business and shapes the company's overall strategy. They are constantly provided with resources, mentorship, and support from our highly energetic teams and leadership. SuperProcure is extremely inclusive and believes in collective success. Looking for a bland, routine 9-6 job? PLEASE DO NOT APPLY. Looking for a job where you wake up and add significant value to a $180 Billion logistics industry every day? DO APPLY. Team: SuperProcure's success is fueled by our diverse & talented team of 150+ members, 50%+ of whom are women. Together, we collaborate with a shared passion for innovation and excellence.
From visionary leaders & meticulous engineers to creative designers & customer support specialists, each plays a crucial role in our growth. We build lasting relationships, understand unique needs, and exceed expectations by delivering cutting-edge tailored solutions. Culture: All challenges and fun associated with start-ups. Competitive salary, responsibilities, flat hierarchy, daily challenges, long working hours, delivery pressure, and a fun workplace.
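The posting above pairs Kafka/RabbitMQ messaging with Temporal for long-running background processes. Temporal's core contribution is durable, retryable workflow steps; as a rough in-process sketch of just the retry policy (the durability and persistence across restarts is precisely what Temporal itself provides, and the activity name below is invented):

```python
import time

def run_with_retry(activity, max_attempts=3, base_delay=0.01):
    """Run a workflow activity, retrying with exponential backoff on failure.

    A Temporal worker persists this progress so retries survive process
    restarts; this sketch models only the retry schedule itself.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return activity()
        except Exception:
            if attempt == max_attempts:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))

# A flaky activity that succeeds on its third call.
calls = {"n": 0}
def charge_freight_invoice():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "charged"

print(run_with_retry(charge_freight_invoice))  # succeeds after two retries
```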
Posted 2 days ago
2.0 - 6.0 years
0 Lacs
hyderabad, telangana
On-site
As a Software Engineer (A2), your main responsibilities will revolve around designing and developing AI-driven data ingestion frameworks and real-time processing solutions to enhance data analysis and machine learning capabilities across the full technology stack. You will be tasked with deploying, maintaining, and supporting application codes and machine learning models in production environments while ensuring seamless integration with front-end and back-end systems. Additionally, you will create and improve AI solutions that facilitate the smooth flow of data across the data ecosystem, enabling advanced analytics and insights for end users. Your role will also involve conducting business analysis to gather requirements and develop ETL processes, scripts, and machine learning pipelines that meet technical specifications and business needs using both server-side and client-side technologies. You will be responsible for developing real-time data ingestion and stream-analytic solutions leveraging technologies such as Kafka, Apache Spark, Python, and cloud platforms to support AI applications. Utilizing multiple programming languages and tools like Python, Spark, Hive, Presto, Java, and JavaScript frameworks, you will build prototypes for AI models and assess their effectiveness and feasibility. It will be essential to develop application systems adhering to standard software development methodologies to deliver high-performance AI solutions across the full stack. Collaborating with other engineers, you will provide system support to resolve issues and enhance system performance for both front-end and back-end components. Furthermore, you will operationalize open-source AI and data-analytic tools for enterprise-scale applications, ensuring alignment with organizational needs and user interfaces. 
Compliance with data governance policies by implementing and validating data lineage, quality checks, and data classification in AI projects will be a crucial aspect of your role. You will need to understand and follow the company's software development lifecycle effectively to develop, deploy, and deliver AI solutions. In terms of technical skills, you are expected to have strong proficiency in Python, Java, and C++, and familiarity with machine learning frameworks like TensorFlow and PyTorch. A deep understanding of ML, Deep Learning, and NLP algorithms is also required. Proficiency in building backend services using frameworks like FastAPI, Flask, and Django, as well as full-stack development skills with JavaScript frameworks such as React and Angular, will be essential for integrating user interfaces with AI models and data solutions. Preferred technical skills include expertise in big data processing technologies like Azure Databricks and Apache Spark to handle, analyze, and process large datasets for machine learning and AI applications. Additionally, certifications such as Microsoft Certified: Azure Data Engineer Associate or Azure AI Engineer are considered advantageous. To excel in this role, you should possess strong oral and written communication skills to effectively communicate technical and non-technical concepts to peers and stakeholders. Being open to collaborative learning, able to manage project components beyond individual tasks, and having a good understanding of the business objectives driving data needs will be key behavioral attributes for success. This role is suitable for individuals holding a Bachelor's or Master's degree in Computer Science with 2 to 4 years of software engineering experience.
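The ETL and ingestion duties described above reduce to an extract-transform-load pipeline with quality checks between stages. A minimal stdlib sketch (the field names and the validation rule are invented for illustration; a production pipeline would read from Kafka or files and write to a warehouse):

```python
def extract(raw_rows):
    """Extract: yield source records (an in-memory list stands in for the source)."""
    yield from raw_rows

def transform(rows):
    """Transform: normalize fields and drop records that fail validation."""
    for row in rows:
        if row.get("value") is None:
            continue  # quality check: skip incomplete records
        yield {"id": row["id"], "value": float(row["value"])}

def load(rows, sink):
    """Load: append cleaned records to the destination store."""
    for row in rows:
        sink.append(row)

raw = [{"id": 1, "value": "3.5"}, {"id": 2, "value": None}, {"id": 3, "value": "7"}]
warehouse = []
load(transform(extract(raw)), warehouse)
print(warehouse)  # two clean rows; the invalid record is filtered out
```

Because each stage is a generator, records stream through one at a time, which is the same property that makes such pipelines extend naturally to real-time processing.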
Posted 2 days ago
2.0 - 6.0 years
0 Lacs
noida, uttar pradesh
On-site
Changing the world through digital experiences is what Adobe is all about. Adobe provides everyone, from emerging artists to global brands, with everything they need to design and deliver exceptional digital experiences. The company is passionate about empowering individuals to create beautiful and powerful images, videos, and apps, and to revolutionize how companies engage with customers across all screens. Adobe is on a mission to hire the very best talent and is dedicated to creating exceptional employee experiences where everyone is respected and has equal opportunities. The company values new ideas from all levels within the organization, recognizing that the next big idea could come from anyone. At Adobe, employees are immersed in a work environment that is globally recognized. For 20 consecutive years, Adobe has been listed as one of Fortune's "100 Best Companies to Work For." Employees at Adobe are surrounded by colleagues who are committed to each other's growth and success. If you are looking to make a meaningful impact, Adobe is the place for you. Learn more about the career experiences of Adobe employees on the Adobe Life blog and explore the comprehensive benefits offered by the company.

Role Summary: Digital Experience (DX) is a USD 4B+ business catering to enterprise needs, including 95%+ of Fortune 500 organizations. Adobe Marketo Engage, part of Adobe DX, is the world's largest marketing automation platform, offering enterprises a comprehensive solution to attract, segment, and nurture customers from initial discovery to becoming loyal advocates. The platform enables effective customer engagement across various touchpoints and surfaces. We are seeking talented and passionate engineers to join our team as we expand the business by developing next-generation products and enhancing our current offerings. If you have a passion for innovative technology, we would love to speak with you!
What you'll do: This is an individual contributor role with responsibilities including:
- Developing new services
- Working in full DevOps mode, overseeing multiple engineering phases from early specs to deployment
- Collaborating with architects, product management, and other engineering teams to enhance product features
- Participating in the resolution of production issues and creating preventive solutions for future incidents

Requirements:
- B.Tech / M.Tech degree in Computer Science from a premier institute
- 2+ years of relevant experience in software development
- Strong computer science fundamentals and understanding of algorithm design and performance
- Passion for delivering quality work with persistence and high standards
- Proficiency in Java, Spring Boot, and REST services; good knowledge of MySQL, Postgres, Cassandra, Kafka, Redis, MongoDB, Solr, Elasticsearch, and Spark is an added advantage

Adobe is committed to ensuring accessibility for all users on Adobe.com. If you require accommodation due to a disability or special need during website navigation or the application process, please contact accommodations@adobe.com or call (408) 536-3015.
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
Salesforce has immediate opportunities for software developers who want their lines of code to have a significant and measurable positive impact for users, the company's bottom line, and the industry. You will be working with a group of world-class engineers to build the breakthrough features our customers will love, adopt, and use while keeping our trusted CRM platform stable and scalable. The software engineer role at Salesforce encompasses architecture, design, implementation, and testing to ensure we build products right and release them with high quality. We pride ourselves on writing high-quality, maintainable code that strengthens the stability of the product and makes our lives easier. We embrace the hybrid model and celebrate the individual strengths of each team member while cultivating everyone on the team to grow into the best version of themselves. We believe that autonomous teams with the freedom to make decisions will empower the individuals, the product, the company, and the customers they serve to thrive.

As a Senior Backend Software Engineer, your job responsibilities will include:
- Building new and exciting components in an ever-growing and evolving market technology to provide scale and efficiency.
- Developing high-quality, production-ready code that millions of users of our cloud platform can use.
- Designing, implementing, and tuning robust APIs and API framework-related features that perform and scale in a multi-tenant environment.
- Working in a Hybrid Engineering model and contributing to all phases of the SDLC, including design, implementation, code reviews, automation, and testing of features.
- Building efficient components/algorithms in a microservice multi-tenant SaaS cloud environment.
- Conducting code reviews, mentoring junior engineers, and providing technical guidance to the team (depending on seniority level).

Required Skills:
- Mastery of multiple programming languages and platforms.
- 5+ years of backend software development experience, including designing and developing distributed systems at scale.
- Deep knowledge of object-oriented programming and languages such as Java, Python, Scala, C#, Go, Node.js, and C++.
- Strong PostgreSQL/SQL skills and experience with relational and non-relational databases, including writing queries.
- A deep understanding of software development best practices and demonstrated leadership skills.
- Degree or equivalent relevant experience required. Experience will be evaluated based on the core competencies for the role (e.g. extracurricular leadership roles, military experience, volunteer roles, work experience, etc.).

Preferred Skills:
- Experience developing SaaS products over public cloud infrastructure (AWS/Azure/GCP).
- Experience with Big Data/ML and S3.
- Hands-on experience with streaming technologies like Kafka.
- Experience with Elasticsearch.
- Experience with Terraform, Kubernetes, and Docker.
- Experience working in a high-paced, rapidly growing multinational organization.

Benefits & Perks:
- Comprehensive benefits package including well-being reimbursement, generous parental leave, adoption assistance, fertility benefits, and more.
- World-class enablement and on-demand training with Trailhead.com.
- Exposure to executive thought leaders and regular 1:1 coaching with leadership.
- Volunteer opportunities and participation in our 1:1:1 model for giving back to the community.

For more details, visit [Salesforce Benefits](https://www.salesforcebenefits.com/).
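APIs that "perform and scale in a multi-tenant environment", as the responsibilities above put it, start with strict tenant isolation at the data-access layer. A minimal sketch of that idea only (the store, tenant, and field names are hypothetical, not Salesforce's actual architecture):

```python
class TenantScopedStore:
    """Every read and write is keyed by tenant_id, so the isolation rule
    lives in one place instead of being repeated at each call site."""

    def __init__(self):
        self._rows = []  # each row carries its owning tenant

    def insert(self, tenant_id, record):
        self._rows.append({"tenant_id": tenant_id, **record})

    def query(self, tenant_id):
        # The tenant filter is enforced inside the store, never left to callers.
        return [r for r in self._rows if r["tenant_id"] == tenant_id]

store = TenantScopedStore()
store.insert("acme", {"name": "lead-1"})
store.insert("globex", {"name": "lead-2"})
print(store.query("acme"))  # only acme's row is visible
```

Centralizing the filter this way is the design choice that keeps a shared schema safe as tenant count grows.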
Posted 2 days ago
4.0 - 10.0 years
0 Lacs
pune, maharashtra
On-site
As a Software Engineer specializing in Backend Development, you will have the opportunity to progress towards a senior role while tackling unique and challenging technological issues for clients. Your role will involve delving into customer pain points to deliver effective and actionable solutions. Additionally, you will gain exposure to architectural innovation, performance optimization, and diverse projects that will enhance your skill set and enrich your portfolio. Working closely with end clients, particularly large enterprises, you will encounter real-world challenges and gain insights into practical business scenarios. You will play a crucial part in projects, with the potential for incentives based on your contributions to project success. Moreover, you will be engaged in optimizing the performance of a platform handling substantial data volumes ranging from 5 to 8 petabytes. Collaboration opportunities await as you work alongside engineers from renowned tech giants such as Google, AWS, and ELK. Furthermore, as you progress within the company, you will have the chance to assume leadership responsibilities and even establish your own team during project engagements. Clear paths for career advancement within the organization will be available, contingent upon your performance and demonstrated capabilities. Joining a culture that values innovation and promotes experimentation, you will have a platform to voice your ideas, receive recognition for your contributions, and enjoy a hands-off management approach that emphasizes accountability and ownership over your tasks. In this role, your primary focus will revolve around Software Engineering, specifically Backend Development with expertise in AWS, Elasticsearch, J2EE, Java, Kafka, MongoDB, Spring, and Spring Boot technologies. A background in Computer Science from a reputable institution, complemented by a strong academic track record, is expected. 
Proficiency in Elasticsearch, a problem-solving mindset, and a solid grasp of data structures and algorithms are essential. With 4 to 10 years of hands-on development experience, particularly in building products for large enterprises, you will bring valuable skills to this dynamic role. Don't miss this chance to be part of a forward-thinking company that offers a platform for growth, learning, and professional development, where your skills and contributions are highly valued.
Posted 2 days ago
3.0 years
0 Lacs
Andhra Pradesh, India
On-site
At PwC, our people in infrastructure focus on designing and implementing robust, secure IT systems that support business operations. They enable the smooth functioning of networks, servers, and data centres to optimise performance and minimise downtime. In infrastructure engineering at PwC, you will focus on designing and implementing robust and scalable technology infrastructure solutions for clients. Your work will involve network architecture, server management, and cloud computing experience.

Data Modeler Job Description: We are looking for candidates with a strong background in data modeling, metadata management, and data system optimization. You will be responsible for analyzing business needs, developing long-term data models, and ensuring the efficiency and consistency of our data systems. Key areas of expertise include:
- Analyze and translate business needs into long-term solution data models.
- Evaluate existing data systems and recommend improvements.
- Define rules to translate and transform data across data models.
- Work with the development team to create conceptual data models and data flows.
- Develop best practices for data coding to ensure consistency within the system.
- Review modifications of existing systems for cross-compatibility.
- Implement data strategies and develop physical data models.
- Update and optimize local and metadata models.
- Utilize canonical data modeling techniques to enhance data system efficiency.
- Evaluate implemented data systems for variances, discrepancies, and efficiency.
- Troubleshoot and optimize data systems to ensure optimal performance.
- Strong expertise in relational and dimensional modeling (OLTP, OLAP).
- Experience with data modeling tools (Erwin, ER/Studio, Visio, PowerDesigner).
- Proficiency in SQL and database management systems (Oracle, SQL Server, MySQL, PostgreSQL).
- Knowledge of NoSQL databases (MongoDB, Cassandra) and their data structures.
- Experience working with data warehouses and BI tools (Snowflake, Redshift, BigQuery, Tableau, Power BI).
- Familiarity with ETL processes, data integration, and data governance frameworks.
- Strong analytical, problem-solving, and communication skills.

Qualifications:
- Bachelor's degree in Engineering or a related field.
- 3 to 5 years of experience in data modeling or a related field.
- 4+ years of hands-on experience with dimensional and relational data modeling.
- Expert knowledge of metadata management and related tools.
- Proficiency with data modeling tools such as Erwin, PowerDesigner, or Lucid.
- Knowledge of transactional databases and data warehouses.

Preferred Skills:
- Experience in cloud-based data solutions (AWS, Azure, GCP).
- Knowledge of big data technologies (Hadoop, Spark, Kafka).
- Understanding of graph databases and real-time data processing.
- Certifications in data management, modeling, or cloud data engineering.
- Excellent communication and presentation skills.
- Strong interpersonal skills to collaborate effectively with various teams.
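Dimensional modeling, called out above, splits analytics data into fact tables (measurable events) and dimension tables (descriptive attributes), joined by surrogate keys into a star schema. A minimal sketch using Python's built-in SQLite (all table and column names are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension table: descriptive attributes of a product.
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
    -- Fact table: one row per sale event, linked to the dimension by surrogate key.
    CREATE TABLE fact_sales (
        sale_id INTEGER PRIMARY KEY,
        product_key INTEGER REFERENCES dim_product(product_key),
        quantity INTEGER,
        amount REAL
    );
""")
conn.execute("INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware')")
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                 [(1, 1, 2, 20.0), (2, 1, 1, 10.0)])

# Typical OLAP query: aggregate the facts, sliced by a dimension attribute.
row = conn.execute("""
    SELECT p.category, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.category
""").fetchone()
print(row)  # ('Hardware', 30.0)
```

The same separation is what modeling tools like Erwin or PowerDesigner capture at the conceptual and physical levels.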
Posted 2 days ago
8.0 - 12.0 years
0 Lacs
bhubaneswar
On-site
As a Lead Engineer specializing in Java Full-Stack, you will be an integral part of our core engineering team, utilizing your 8 to 12 years of experience to drive innovation and tackle complex challenges across full-stack applications. Your role involves developing scalable, secure, and high-performance applications using Java, Spring, and related technologies. You will be responsible for building quick Proof of Concepts (POCs) to validate ideas and lead technical direction. In this position, you will own full-stack features utilizing Angular, HTML, CSS, Spring Boot, and AWS. Your expertise will be crucial in working with both SQL and NoSQL databases such as PostgreSQL, MySQL, and MongoDB. Leveraging AWS for integration, deployment, scalability, and DevOps will be a key aspect of your responsibilities. Furthermore, you will utilize Kafka or RabbitMQ for event-driven and asynchronous communication, integrating Retool and AWS Amplify for rapid UI development. Containerizing applications using Docker and managing them via Kubernetes will also fall under your purview. Setting up and managing CI/CD pipelines using Jenkins or equivalent tools, as well as applying Agile practices, will be essential in ensuring efficient project delivery. To excel in this role, you must possess a strong background in Java, Spring Boot, Spring Data/JPA, and Spring Cloud. Proficiency in Angular, HTML, and CSS is also required. Experience with Python and Node.js for backend tasks or microservices, along with a solid understanding of SQL and NoSQL databases, is highly valued. Hands-on experience with AWS services, Kafka, RabbitMQ, Docker, Kubernetes, Jenkins, Retool, and AWS Amplify will be beneficial. Additionally, your qualifications should include a Bachelor's degree in Computer Science, Engineering, or a related field, coupled with at least 8 years of experience in full-stack development. 
A problem-solving attitude, a passion for learning new technologies, and strong communication skills are essential for success in this role.
Posted 2 days ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
As a Full Stack Developer with 4+ years of professional experience, you will be responsible for designing, developing, and deploying high-quality web applications using C#, ASP.Net MVC, ASP.Net Web API, and .Net Core. You will integrate with Microsoft Dynamics CRM and Microsoft Business Central to ensure seamless data flow and business logic. Your role will involve developing and optimizing SQL Server and MongoDB databases, including writing complex queries, stored procedures, and managing database performance. Additionally, you will be building and maintaining user interfaces with HTML, CSS, and JavaScript, focusing on responsive and efficient design. You will leverage Redis cache and Kafka to improve system performance, reliability, and scalability. Working with cloud platforms such as Azure/AWS, you will deploy and maintain applications in a cloud-native environment. Your responsibilities will also include writing clean, maintainable, and well-documented code, participating in code reviews, troubleshooting, and debugging to ensure a high level of quality in all deliverables. At GlobalLogic, we prioritize a culture of caring, where you will experience an inclusive environment of acceptance and belonging. We are committed to your continuous learning and development, providing you with opportunities to grow personally and professionally. You will have the chance to work on meaningful projects that make an impact and help clients reimagine what's possible. We believe in the importance of balance and flexibility, offering various work arrangements to help you achieve the perfect balance between work and life. GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world's largest companies. By joining us, you become part of a high-trust organization where integrity is key. We value truthfulness, candor, and integrity in everything we do, both internally and with our clients.
If you are looking to work in a dynamic environment with cutting-edge technologies and impactful projects, GlobalLogic offers a rewarding opportunity to grow and contribute to innovative solutions shaping the digital landscape.
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
noida, uttar pradesh
On-site
You are a talented Software Engineer with hands-on experience in Quarkus and Red Hat Fuse, responsible for designing, developing, and maintaining integration solutions. Your strong proficiency in Java, along with experience in Kafka-based event streaming, RESTful APIs, relational databases, and CI/CD pipelines deployed on OpenShift Container Platform (OCP) will be essential for this role. You are passionate about building robust microservices and integration systems in a cloud-native environment. Your key responsibilities will include designing and developing scalable microservices using the Quarkus framework. You will build and maintain integration flows and APIs leveraging Red Hat Fuse (Apache Camel) for enterprise integration patterns. Developing and consuming RESTful web services and APIs will be part of your routine tasks. Additionally, you will design, implement, and optimize Kafka producers and consumers for real-time data streaming and event-driven architecture. Writing efficient, well-documented, and testable Java code adhering to best practices is crucial. Collaborating with DevOps teams to build and maintain CI/CD pipelines for automated build, test, and deployment workflows will be part of your responsibilities. You will deploy and manage applications on OpenShift Container Platform (OCP) following containerization best practices (Docker). Participating in code reviews, design discussions, and agile ceremonies is expected from you. Troubleshooting and resolving production issues with a focus on stability and performance is also an important aspect of your role. You will need to keep up-to-date with emerging technologies and recommend improvements to the existing systems. Your strong experience with Java (Java 8 or above) and the Quarkus framework, expertise in Red Hat Fuse (or Apache Camel) for integration development, and proficiency in designing and consuming REST APIs are required skills for this position. 
Experience with Kafka for event-driven and streaming solutions, a solid understanding of relational databases and SQL, and experience in building and maintaining CI/CD pipelines will be beneficial. Hands-on experience deploying applications to OpenShift Container Platform (OCP), working knowledge of containerization tools like Docker, familiarity with microservices architecture, cloud-native development, and agile methodologies are also expected from you. Strong problem-solving skills, the ability to work independently as well as in a team environment, and good communication and documentation skills are essential for success in this role. Preferred qualifications include experience with messaging systems like Kafka, knowledge of other Java frameworks such as Spring Boot, experience with monitoring tools such as Prometheus and Grafana, understanding of security best practices in microservices and API development, and cloud platform experience. You should hold a Bachelor's or Master's degree in Computer Science, Engineering, or a related field, or possess equivalent practical experience for this position.
Posted 2 days ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Summary: We are seeking Python Engineers to design backend systems, APIs, and data pipelines for healthcare AI applications. The role focuses on creating scalable and secure solutions for seamless data integration and communication, with a strong emphasis on Generative AI technologies and LLM (Large Language Model) APIs.

Key Responsibilities:
- Backend Development: Develop and maintain robust APIs using FastAPI. Handle integrations with external APIs, including LLM APIs such as OpenAI, Anthropic, and Google Gemini. Design efficient database schemas for healthcare data storage.
- Data Pipeline Management: Implement ETL processes for structured and unstructured data. Create workflows for real-time data processing and delivery.
- System Optimization: Optimize backend systems for performance and scalability. Conduct unit testing and debugging to ensure robust code quality.
- Collaboration and Documentation: Work with DevOps engineers for seamless deployments. Document code and processes to maintain high project transparency.

Skills and Qualifications:
- Core Skills: Experience with RAG development; proficiency in frameworks like LangChain, LlamaIndex, and FastAPI; experience with relational (PostgreSQL, MySQL), NoSQL (MongoDB), and vector databases (e.g., Qdrant, Faiss); experience with embedding models for search and chatbot applications; familiarity with LLM APIs such as OpenAI, Anthropic, and Google Gemini.
- Healthcare Knowledge: Familiarity with healthcare APIs (FHIR, HL7). Knowledge of security standards for healthcare applications.
- Additional Skills: Experience with containerization (Docker) and orchestration tools (Kubernetes). Knowledge of message brokers like RabbitMQ or Kafka.

Salary: As per industry
Job Mode: Hybrid
(ref:hirist.tech)
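The RAG skills listed above hinge on embedding-based retrieval: rank stored chunks by cosine similarity to the query vector, then feed the top matches to the LLM. A dependency-free sketch of the retrieval step with toy 3-dimensional "embeddings" (a real system would use model-generated vectors and a store such as Qdrant or Faiss; the chunk texts are invented):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Toy vector store: (chunk text, embedding) pairs.
store = [
    ("patient intake form fields", [0.9, 0.1, 0.0]),
    ("invoice payment terms",      [0.0, 0.2, 0.9]),
    ("HL7 message segments",       [0.8, 0.3, 0.1]),
]

def retrieve(query_vec, k=2):
    """Return the k chunks whose embeddings are most similar to the query."""
    ranked = sorted(store, key=lambda item: cosine(query_vec, item[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

print(retrieve([1.0, 0.2, 0.0]))  # the two clinically related chunks rank first
```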
Posted 2 days ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
As a Backend Developer specializing in Scala & Akka Streams, you will be responsible for creating stream flows using Akka Streams, handling real-time ingestion and transformation logic, and monitoring and tuning the performance of stream operations. You will work on building low-latency, real-time pipelines in a data-driven product environment. To excel in this role, you must possess proficiency with Akka Streams and materializers, have Scala development experience, and be familiar with async processing and stream composition. Additionally, knowledge of Kafka and Alpakka integrations, as well as stream supervision and error recovery, will be considered a plus. Beyond technical skills, soft skills such as being result-oriented and committed, and demonstrating strong ownership and collaboration, are highly valued. By joining our team, you will have the opportunity to solve streaming challenges at scale and work on high-throughput backend systems. If you have 4 to 8 years of experience and are looking for a role where you can contribute to building robust and efficient streaming solutions, this position could be the right fit for you.
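Akka Streams composes a Source through Flow stages into a Sink; the stage-by-stage shape of such a pipeline (though not the backpressure machinery, which Akka's materializer provides) can be sketched language-neutrally with Python generators. In Scala this would be `Source ... via ... to Sink` with a supervision strategy; the sensor data below is invented:

```python
def source(events):
    """Source: emits elements downstream one at a time."""
    yield from events

def to_celsius(stream):
    """Flow stage: per-element transformation logic."""
    for reading in stream:
        yield {"sensor": reading["sensor"], "celsius": (reading["f"] - 32) * 5 / 9}

def drop_invalid(stream):
    """Flow stage: drop malformed elements instead of failing the whole stream
    (Akka Streams expresses this with stream supervision / error recovery)."""
    for item in stream:
        if item["celsius"] > -273.15:
            yield item

def sink(stream):
    """Sink: terminal stage that materializes a result."""
    return list(stream)

pipeline = sink(drop_invalid(to_celsius(source(
    [{"sensor": "a", "f": 212.0}, {"sensor": "b", "f": -1000.0}]
))))
print(pipeline)  # only the physically valid reading survives
```

Because each stage pulls from the one upstream, elements flow through lazily, which loosely mirrors the demand-driven pull that underlies Akka's backpressure model.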
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
raipur
On-site
As a Backend Developer at Fundtec, you will architect, develop, and optimize robust backend systems to support cutting-edge fintech products. Your responsibilities include designing scalable architectures, integrating systems, and collaborating with cross-functional teams to create high-performance applications that drive innovation in the financial services domain. You will design, develop, and maintain backend systems, RESTful and GraphQL APIs, and microservices using Node.js. Implementing scalable and secure architecture following best practices in modular service design is crucial. Additionally, you will build and manage messaging-based communication systems using Kafka, optimize systems for high performance and reliability, and design database schemas and queries in PostgreSQL. Leading architectural discussions, contributing to system-level decisions, and ensuring seamless integration of backend services with front-end applications and third-party systems are key aspects of the role. Working with containerized environments using Docker and orchestration tools like Kubernetes is also required. Maintaining code quality by writing clean, efficient, and well-documented code following modern development standards is essential. You will conduct thorough code reviews, offer mentorship, and provide technical guidance to junior developers. Collaborating with front-end developers, product managers, and DevOps engineers to deliver end-to-end solutions is a significant part of the job. In planning sessions, you will translate business requirements into technical deliverables. Monitoring and troubleshooting production systems to ensure uptime and performance, as well as identifying and applying optimizations for performance bottlenecks, are part of your responsibilities. 
To be successful in this role, you should have 5+ years of hands-on experience in backend development, strong proficiency in Node.js, experience with event-driven architecture using Kafka, and solid experience with PostgreSQL. In-depth knowledge of RESTful APIs, GraphQL, microservices architecture, cloud platforms (AWS, Azure, or GCP), and containerization and orchestration (Docker, Kubernetes) is required, along with a strong understanding of system design, architecture patterns, and security best practices, and a degree in Computer Science, Engineering, or a related field. Experience or interest in the fintech domain is a plus. Soft skills such as problem-solving, analytical ability, communication, collaboration, independence, and time management in a fast-paced environment are also valued. Benefits include health insurance, paid sick time, and paid time off. The schedule is full-time, Monday to Friday, with a day shift.
Posted 2 days ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
As a Full Stack Developer at our organization, you will bring your 4+ years of professional experience to design, develop, and deploy high-quality web applications using C#, ASP.Net MVC, ASP.Net Web API, and .Net Core. Your strong proficiency in C# and the .Net Framework, along with hands-on experience with Microsoft Dynamics CRM and Microsoft Business Central, will be instrumental in ensuring seamless data flow and business logic integration. Your solid knowledge of SQL Server and MongoDB, and experience with database design and performance optimization, will be put to use in developing and optimizing databases, writing complex queries and stored procedures, and managing database performance. Proficiency in developing RESTful/GraphQL services using ASP.Net Web API, and in front-end technologies like HTML, CSS, JavaScript, and responsive web design, will be key in building and maintaining user interfaces with a focus on responsive and efficient design. You will leverage caching with Redis and message brokers such as Kafka to improve system performance, reliability, and scalability. Practical experience with cloud platforms like Azure or AWS will enable you to deploy and maintain applications in a cloud-native environment. Your ability to write clean, maintainable, and well-documented code will contribute to the overall quality of deliverables. As part of our team, you will participate in code reviews, troubleshooting, and debugging to ensure a high level of quality in all deliverables. Your excellent communication skills, both written and verbal, will allow you to work effectively within a team and collaborate with colleagues on various projects. At GlobalLogic, we offer a culture of caring where people come first, continuous learning and development opportunities, interesting and meaningful work, balance and flexibility, and a high-trust organization where integrity is key.
Join us to be part of a team that engineers impact for and with clients worldwide, creating innovative solutions and shaping the digital landscape. About GlobalLogic: GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for collaborating with the world's largest companies to create innovative digital products and experiences. Together we transform businesses, redefine industries, and work on cutting-edge solutions that shape the world today.
Posted 2 days ago