
16,979 NoSQL Jobs - Page 14

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

1.0 years

0 Lacs

Kochi, Kerala, India

On-site

Node.js Backend Developer (with React.js knowledge) - 1+ yrs exp | Immediate Joiners Preferred
Location: Kochi, Kerala | Type: Full-time

We are seeking a talented and motivated Node.js Backend Developer with a minimum of 1+ years of experience in Node.js to join our growing team. If you are passionate about building robust, scalable, and efficient server-side applications and have a solid grasp of modern JavaScript, we want to hear from you. While your primary focus will be on the backend, a good understanding of React.js is a significant plus.

What You'll Do:
• Design, develop, and maintain high-performance, scalable backend services and APIs using Node.js and its frameworks.
• Work with databases (SQL/NoSQL) to design efficient schemas and manage data.
• Collaborate closely with our frontend (React.js) developers to integrate user-facing elements with server-side logic.
• Implement secure coding practices and ensure data protection.
• Optimize applications for maximum speed, scalability, and stability.
• Write clean, maintainable, and well-tested code.
• Participate in code reviews, debugging, and troubleshooting.

What You Bring:
• 1+ years of hands-on experience developing with Node.js.
• Strong proficiency in JavaScript (ES6+) and asynchronous programming concepts.
• Good knowledge of TypeScript is a significant advantage.
• Experience with Node.js frameworks like Express.js, NestJS, or similar.
• Solid understanding of RESTful API design and implementation.
• Familiarity with database systems (e.g., PostgreSQL, MongoDB, MySQL).
• Experience with version control systems (Git).
• Good understanding of React.js and frontend development principles is a strong advantage.
• Knowledge of unit testing for backend applications.

About You:
• A clear and effective communicator, comfortable collaborating with cross-functional teams.
• Proactive, self-driven, with strong problem-solving abilities.
• Dedicated to writing high-quality, performant code.

Perks:
• Competitive compensation package
• Opportunities for continuous learning and professional growth
• A friendly, supportive, and collaborative work environment

#nodejs #backenddeveloper #javascript #typescript #reactjs #fullstack

Posted 1 day ago

Apply

3.0 - 4.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Job Description:
• Involvement in the full software development life cycle within broadly defined parameters, providing software solutions while keeping software quality needs in consideration.
• Design and define the interaction between the different component pieces.
• Write efficient code based on the brief given by the team lead.
• Fast prototyping of proof-of-concept features/applications based on the brief.
• Develop and maintain new features on the Java stack.
• Own the delivery of an entire piece of a system or application.
• Management and execution against project plans and delivery commitments.
• Work closely with peers and leads to develop the best technical design and approach for new product development.
• Build software solutions for complex problems.
• Compliance with build/release and configuration management processes.
• Responsible for developing unit test cases for his/her project module.
• Execution of appropriate quality plans, project plans, test strategies, and processes for development activities in concert with business and project management efforts.

Required Skills:
• Good understanding of Object-Oriented Programming concepts; hands-on knowledge of the Java stack (Spring/Hibernate).
• Development across multiple browsers/platforms on the website.
• Good understanding of SQL/NoSQL data stores.
• Fair understanding of responsive high-level designs.
• Work experience in a product/start-up company is a plus.
• Familiarity with MVC, SOA, and RESTful web services.
• Ability to work with other teams and manage time across multiple projects and tasks in a deadline-driven, team environment.
• Good to have: knowledge of JavaScript (AngularJS/ReactJS)/HTML/CSS/jQuery front-end code across a broad array of interactive web applications.
• Understand agile methodology and instill best practices into the process.
• Good understanding of Data Structures and Algorithms (DSA) in Java.

Qualification & Experience:
• 3-4 years of experience in software development.
• B.E. / B.Tech. / M.E. / M.Tech. / M.S. in Computer Science, Electronics, or a related field.

Posted 1 day ago

Apply

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

The Sleep Company is Hiring: Backend Developer II (Node.js + GraphQL)

Are you passionate about building scalable backend systems and modern APIs? Join The Sleep Company, a fast-growing D2C brand pioneering comfort tech. We're looking for a Backend Developer II who thrives in high-performance environments and loves solving complex engineering challenges.

Tech Stack You'll Work With:
• Languages: TypeScript, JavaScript
• Frameworks: Node.js, NestJS, ExpressJS
• APIs: GraphQL (Apollo Server), REST
• Databases: MySQL, MongoDB, DynamoDB
• Cloud: AWS
• Testing: Jest, Mocha

Your Role:
• Build and maintain server-side applications using Node.js, NestJS, and ExpressJS
• Develop high-performance GraphQL APIs, including schema design and resolver logic
• Integrate SQL and NoSQL databases like MySQL, MongoDB, and DynamoDB
• Write clean, scalable, and testable code aligned with industry best practices
• Contribute to architecture, participate in code reviews, and collaborate across teams
• Stay current with evolving backend technologies and GraphQL tools

Requirements:
• 2-5 years of experience in backend development with Node.js
• Proficient in TypeScript, with hands-on experience in NestJS and ExpressJS
• Deep understanding of GraphQL APIs, including schema definition and performance optimization
• Strong knowledge of SQL and NoSQL databases
• Experience with unit testing frameworks such as Jest or Mocha
• Bonus: Familiarity with Apollo Server, Prisma, or similar tools

Why The Sleep Company?
• Be part of a mission-led brand redefining the future of comfort
• Work in a fast-paced, product-first, and innovation-driven team
• Competitive salary, modern tech stack, and meaningful ownership

Ready to build the future of comfort tech? Apply now and make an impact.

#BackendDeveloper #NodeJSJobs #GraphQL #NestJS #TheSleepCompany #TechJobsMumbai #NowHiring
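This role's core deliverable, GraphQL schema design and resolver logic, runs on Apollo Server in Node.js here; purely as a language-neutral illustration, the sketch below uses Python's graphene library, and the type, field, and data are hypothetical rather than anything from the posting.

```python
import graphene

# Hypothetical in-memory "orders" data standing in for MySQL/MongoDB.
ORDERS = {"42": "shipped", "43": "processing"}

class Query(graphene.ObjectType):
    # Schema design: a typed field with a required ID argument.
    order_status = graphene.String(order_id=graphene.ID(required=True))

    # Resolver logic: how the field's value is actually fetched.
    def resolve_order_status(root, info, order_id):
        return ORDERS.get(order_id, "unknown")

schema = graphene.Schema(query=Query)

# graphene exposes snake_case fields as camelCase, per GraphQL convention.
result = schema.execute('{ orderStatus(orderId: "42") }')
print(result.data)  # {'orderStatus': 'shipped'}
```

The same shape carries over to Apollo Server: the schema declares what can be asked, and each resolver declares how one field is answered.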

Posted 1 day ago

Apply

5.0 - 10.0 years

15 - 18 Lacs

Mumbai Metropolitan Region

On-site

Sinch is a global leader in the growing market for Communication Platforms as a Service (CPaaS) and mobile customer engagement. We are specialists in allowing businesses to reach everyone on the planet, in seconds or less, through mobile messaging, email, voice, and video. We reach every phone on earth. From the life-changing to the time-saving, we're helping our customers to interact with people like never before. For you, that means working in an environment that offers an incredible variety of exciting challenges, and the chance to impact how billions of people engage with their favourite brands. The dream of personalizing content to all 15 billion phones on the planet is no fairy tale! More than 150,000 businesses, including many of the world's largest companies and mobile operators, use Sinch's advanced technology platform to engage with their customers. Moreover, Sinch has been profitable and fast-growing since its foundation.

Sinch's core values are Make it Happen, Dream Big, Keep it Simple, and Win Together. These values describe how our global organization works and inspire each of our more than 5,000 employees across 60 different countries.

We are seeking an experienced Mid-Level Database Expert to join our dynamic IT team. The successful candidate will be responsible for the administration, optimization, and security of our database environments. This role requires a deep understanding of database architecture, performance tuning, and data integrity to support mission-critical business applications. The ideal candidate will collaborate closely with development, infrastructure, and analytics teams to deliver robust and scalable data solutions.

Key Responsibilities

Database Architecture & Design:
• Design and implement scalable, high-performance MySQL database solutions (InnoDB preferred).
• Assist in schema design, indexing strategies, and query optimization.
• Participate in data modeling discussions and decisions with engineering teams.

Operations & Maintenance:
• Monitor and tune database performance (query performance, slow logs, deadlocks, I/O, memory).
• Work confidently with table partitions, including operations such as partition exchange.
• Manage database provisioning, configuration, capacity planning, and estimation.
• Fully own the MySQL backup and restore strategy using tools such as (but not limited to) mysqldump, xtrabackup, or MySQL Enterprise Backup.

High Availability & Disaster Recovery:
• Set up and manage MySQL High Availability using InnoDB Cluster, Group Replication, or Percona XtraDB Cluster.
• Implement disaster recovery plans, including backup validation and offsite replication.
• Maintain InnoDB ClusterSet or other DR replication frameworks, where applicable.

Security & Compliance:
• Enforce database security best practices: user management, encryption, SSL, auditing.
• Work towards closure of database SCD points as per requirements.
• Ensure databases are hardened and comply with internal and external security standards.

Automation:
• Automate routine tasks like purging and archival using SQL procedures; shell scripts, Python, or Ansible are good to have.
• Work with CI/CD pipelines to support DB migrations and deployments.

Support & Troubleshooting:
• Provide production support during business and off-hours for critical incidents.
• Act as the go-to person for MySQL-related issues, advising developers and IT on best practices.

Requirements
• 5 to 10 years of hands-on experience as a MySQL DBA in production environments.
• Strong experience with MySQL 5.7/8.0, especially InnoDB, replication, and performance tuning.
• Proficiency in setting up HA/DR using MySQL InnoDB Cluster, ProxySQL, MySQL Router, or alternatives.
• Experience working with MySQL Enterprise Edition specific features and related packages.
• Experience with backup/restore tools, including logical and physical backups.
• Working knowledge of Linux/Unix systems (RedHat/CentOS-based distros preferred).
• Knowledge of MySQL Enterprise features such as Firewall, TDE, and the Audit Plugin.
• Experience with database migration (on-premises to cloud or vice versa).
• Excellent problem-solving skills, attention to detail, and communication abilities.

If you are a results-driven individual with a passion for team leadership and management, we would love to hear from you!

You'll Stand Out From The Crowd If You Have:
• Familiarity with cloud DB deployments (AWS RDS, Aurora, or GCP Cloud SQL).
• Exposure to NoSQL or secondary data systems (MongoDB, Redis).
• Experience with monitoring tools like PMM, Nagios, Zabbix, or custom Prometheus/Grafana setups.
• Experience working in agile, DevOps-driven teams.

Being You At Sinch
We're a worldwide group of people, committed to diversity. We're working to offer an increasingly inclusive workplace wherever you are. No matter who you are, you'll be able to explore new career and growth options - sharing your voice, building your path, and making it happen with us. We're proud to be an equal opportunity employer, and all qualified applicants will be considered to join our team regardless of race, colour, religion, gender identity or expression, sexual orientation, pregnancy, disability, age, veteran status, and more.

Your Life At Sinch
Being a Sincher is all about learning and being in pursuit of new challenges. Working in the offices, at home, or in a hybrid model, that means celebrating change and the unknown, rolling up your sleeves and seeing what impact you can have on the world. The only way is up, and you'll be reaching for the opportunities that match where you want to take your career. It's closer than you think. Our expert teams are built from some of the most experienced in the industry. We employ people from all over the world, from all walks of life and from all backgrounds. We work together, feeding on our diversity to make us stronger, and we encourage each other to be the best we can be. Innovation drives us, and we challenge ourselves every day. Are you ready? Join us on our journey! Know more about us: www.sinch.com

Benefits
• Private health insurance coverage, accidental coverage, optional parental health coverage
• Flexible and supportive working environment
• Paid time off, maternity and paternity leave, wellbeing programs
• Training & development, internal mobility
• Competitive salary and allowances
• Highly engaged, collaborative, and transparent work culture
• Constant skill upgradation through learning and career advancement opportunities in a high-growth environment
• Annual health checkup
• Global Mobility Program/Opportunities
• Engaging rewards & recognition programs
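The automation responsibilities above name mysqldump and Python together; as a hedged sketch only (paths and retention policy are invented, and real credentials belong in a login path or ~/.my.cnf, never on the command line), a nightly logical-backup job could look like this:

```python
import datetime
import pathlib
import subprocess
import time

BACKUP_DIR = pathlib.Path("/var/backups/mysql")  # hypothetical path
RETENTION_DAYS = 7                               # hypothetical policy

def nightly_dump(host: str, user: str, database: str) -> pathlib.Path:
    """Take a consistent logical backup of one InnoDB database."""
    stamp = datetime.date.today().isoformat()
    target = BACKUP_DIR / f"{database}-{stamp}.sql"
    # --single-transaction gives a consistent snapshot for InnoDB
    # tables without locking them during the dump.
    with target.open("w") as out:
        subprocess.run(
            ["mysqldump", f"--host={host}", f"--user={user}",
             "--single-transaction", "--routines", "--triggers", database],
            stdout=out, check=True,
        )
    return target

def purge_old_backups() -> None:
    """Drop dump files older than the retention window."""
    cutoff = time.time() - RETENTION_DAYS * 86400
    for dump in BACKUP_DIR.glob("*.sql"):
        if dump.stat().st_mtime < cutoff:
            dump.unlink()

if __name__ == "__main__":
    nightly_dump("db1.example.internal", "backup_user", "billing")
    purge_old_backups()
```

A production setup would add restore verification and offsite copies, which is exactly the "backup validation and offsite replication" the HA/DR bullets call for.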

Posted 1 day ago

Apply

3.0 years

0 Lacs

Surat, Gujarat, India

On-site

Job Title: MERN Stack Developer
Experience: 3+ Years (Minimum 2+ years in Node.js & 1 year as Full Stack Developer)
Location: Surat
Employment Type: Full-time

Job Summary:
We are looking for a skilled and passionate MERN Stack Developer to join our development team. The ideal candidate will have a strong background in Node.js (2+ years) and at least 1 year of hands-on experience as a Full Stack Developer using the MERN stack (MongoDB, Express.js, React.js, Node.js). You will be responsible for developing and maintaining scalable web applications, working across the full development lifecycle.

Key Responsibilities:
• Develop, test, and maintain high-quality web applications using the MERN stack.
• Write clean, scalable, and efficient code in JavaScript/TypeScript.
• Develop and integrate RESTful APIs using Node.js and Express.js.
• Build responsive and dynamic front-end interfaces using React.js.
• Design and manage NoSQL databases using MongoDB.
• Collaborate with cross-functional teams including UI/UX designers, product managers, and QA teams.
• Optimize applications for performance, scalability, and security.
• Participate in code reviews and maintain high coding standards.
• Debug and resolve technical issues across the stack.

Required Skills & Qualifications:
• Minimum 3+ years of total software development experience.
• 2+ years of strong experience in back-end development using Node.js.
• 1+ year of hands-on experience as a Full Stack Developer with MERN.
• Strong understanding of RESTful APIs, JSON, and HTTP protocols.
• Experience with version control systems like Git.
• Familiarity with Agile/Scrum development practices.
• Good understanding of web security, performance optimization, and deployment.

Nice to Have:
• Experience with cloud platforms (e.g., AWS, Azure).
• Familiarity with Docker, CI/CD pipelines.
• Knowledge of GraphQL, TypeScript.
• Prior experience with testing frameworks like Jest, Mocha.

Why Join Us?
• Opportunity to work on innovative projects with a modern tech stack.
• Collaborative and growth-focused work environment.
• Competitive salary and benefits.
• Flexible work culture.

Posted 1 day ago

Apply

1.0 years

1 - 1 Lacs

Pune, Maharashtra, India

On-site

🌟 Backend Engineering Intern
Location: Pune (In-office) | Duration: 12 Months | Type: Full-Time Internship

🚀 Excited to build scalable backend systems for real-world SaaS platforms?
As a Backend Engineering Intern at Bynry Inc., you'll join a focused 20-member team building enterprise-grade platforms that serve thousands of users across industries. You'll contribute to backend systems powering APIs, data infrastructure, and system integrations in a cloud-first, multi-tenant environment. If you're passionate about backend development, solving complex engineering challenges, and working in a fast-paced startup—this internship is your launchpad!

👥 Who Can Apply
We'd love to hear from you if you:
• Are available for a full-time, in-office internship
• Can start immediately and commit to a 1-year duration
• Are based in Pune or willing to relocate
• Have a basic understanding of backend development concepts
• Are comfortable working with APIs, databases, and backend logic
• Are excited to learn about SaaS system design and large-scale architecture
• Are curious, motivated, and quality-focused in your approach to engineering

📅 Day-to-Day Responsibilities
• Build and maintain RESTful APIs for real-world B2B use cases
• Design and model relational and/or NoSQL databases
• Work on multi-tenant architectures and data isolation strategies
• Optimize backend systems for performance and scalability
• Collaborate with frontend, product, and DevOps teams
• Integrate third-party APIs and internal microservices
• Participate in code reviews, unit testing, and technical discussions
• Own end-to-end development of small to medium-sized features

📚 What You'll Learn
• Real-world backend engineering in a B2B SaaS environment
• Enterprise-scale system design, API development, and data modeling
• Development workflows using Git, CI/CD, and deployment practices
• How to collaborate with product and DevOps teams for full delivery cycles
• Best practices in clean code, documentation, testing, and performance tuning

🎓 Qualifications
• Pursuing or completed a degree in Computer Science, IT, or a related field
• Familiarity with backend programming concepts in any language
• Understanding of databases (SQL or NoSQL) and data structures
• Some experience building or consuming REST APIs (projects, internships, etc.)
• Exposure to Git, HTTP protocols, and basic debugging
• Strong analytical and problem-solving skills
• Willingness to learn and thrive in a fast-paced startup environment

🧰 Skills You'll Use or Develop
Technical Skills:
• Backend development with frameworks like Express.js, Flask, or Spring
• API creation and integration
• Database modeling (PostgreSQL, MongoDB, etc.)
• Performance optimization
• Git and collaboration workflows
• Basic cloud and deployment understanding (AWS/GCP, Docker)

Soft Skills:
• Problem-solving and debugging
• Clear technical communication
• Time management and ownership
• Agile development and collaboration
• Documentation and clean code practices

💰 Compensation & Benefits
• Stipend: ₹10,000/month
• Learning & development budget
• Access to real project codebases and cloud environments
• Opportunity for full-time conversion based on performance

💡 Why Bynry?
At Bynry, we're modernizing the utility sector through Smart360—a powerful, cloud-based platform transforming how cities and businesses operate. As a Backend Engineering Intern, you'll work on meaningful challenges, grow alongside experienced engineers, and build systems that deliver impact at scale. Join a team that values ownership, learning, and innovation—and get real experience in solving enterprise-scale engineering problems.

Note: This is a paid internship.

Skills: system integration, RESTful APIs, Git, backend programming, performance optimization, debugging, backend frameworks (Express.js, Flask, Spring), database modeling (PostgreSQL, MongoDB, etc.), API creation and integration, SQL, clear technical communication, time management and ownership, agile development and collaboration, documentation and clean code practices, basic cloud and deployment understanding (AWS/GCP, Docker)
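Flask appears among the internship's example frameworks; as an illustrative sketch only (the resource, routes, and in-memory store are hypothetical), a minimal RESTful endpoint of the kind an intern would build might look like this:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical in-memory store; a real service would use PostgreSQL/MongoDB.
METERS = {1: {"id": 1, "tenant": "acme-utility", "status": "active"}}

@app.get("/api/meters/<int:meter_id>")
def get_meter(meter_id: int):
    meter = METERS.get(meter_id)
    if meter is None:
        return jsonify(error="not found"), 404
    return jsonify(meter)

@app.post("/api/meters")
def create_meter():
    payload = request.get_json(force=True)
    new_id = max(METERS) + 1
    METERS[new_id] = {"id": new_id, **payload}
    return jsonify(METERS[new_id]), 201

if __name__ == "__main__":
    app.run(debug=True)
```

In a multi-tenant setup like the one described, the tenant field would come from authentication and drive data isolation rather than being stored free-form as here.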

Posted 1 day ago

Apply

6.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Location: Kolkata
Experience Required: 6 to 8+ years
Employment Type: Full-time
CTC: 8 to 12 LPA

About Company:
At Gintaa, we're redefining how Indians order food. With our focus on affordability, exclusive restaurant partnerships, and hyperlocal logistics, we aim to scale across India's Tier 1 and Tier 2 cities. We're backed by a mission-driven team and expanding rapidly – now's the time to join the core tech leadership and build something impactful from the ground up.

Job Description:
We are seeking a talented and experienced Mid-Senior Level Software Engineer (Backend) to join our dynamic team. The ideal candidate will have strong expertise in backend technologies, microservices architecture, and cloud environments. You will be responsible for designing, developing, and maintaining high-performance backend systems to support scalable applications.

Responsibilities:
• Design, develop, and maintain robust, scalable, and secure backend services and APIs.
• Work extensively with Java, Spring Boot, Spring MVC, and Hibernate to build and optimize backend applications.
• Develop and manage microservices-based architectures.
• Implement and optimize RDBMS (MySQL, PostgreSQL) and NoSQL (MongoDB, Cassandra, etc.) solutions.
• Build and maintain RESTful services for seamless integration with frontend and third-party applications.
• Optimize system performance, security, and scalability.
• Deploy and manage applications in cloud environments (AWS, GCP, or Azure).
• Collaborate with cross-functional teams including frontend engineers, DevOps, and product teams.
• Convert business requirements into technical development items using critical thinking and analysis.
• Lead a team and manage activities, including task distribution.
• Write clean, maintainable, and efficient code following best practices.
• Participate in code reviews and technical discussions, and contribute to architectural decisions.
• A basic understanding of Node.js and Python is a bonus, along with the ability to learn and work with new technologies.

Required Skills:
• 6+ years of experience in backend development with Java and the Spring framework (Spring Boot, Spring MVC).
• Strong knowledge of Hibernate (ORM) and database design principles.
• Hands-on experience with microservices architecture and RESTful API development.
• Proficiency in RDBMS (MySQL, PostgreSQL) and NoSQL databases (MongoDB, Cassandra, etc.).
• Experience with cloud platforms such as AWS, GCP, or Azure.
• Experience with Kafka or an equivalent tool for messaging and stream processing.
• Basic knowledge of Node.js for backend services and APIs.
• Proven track record of working in fast-paced, Agile/Scrum environments.
• Proficient with Git; familiarity with IDE tools such as IntelliJ and VS Code.
• Strong problem-solving and debugging skills.
• Understanding of system security, authentication, and authorization best practices.
• Excellent communication and collaboration skills.

Preferred Skills (Nice to Have):
• Experience with Elasticsearch for search and analytics.
• Familiarity with Firebase tools for real-time database, Firestore, authentication, and notifications.
• Hands-on experience with Google Cloud Platform (GCP) services.
• Hands-on experience working with Node.js and Python.
• Exposure to containerization and orchestration tools like Docker and Kubernetes.
• Experience with CI/CD pipelines and basic DevOps practices.

Posted 1 day ago

Apply

25.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About Atobic:
Atobic was founded by ex-Amazonians with 25+ years of experience at Amazon and AWS. Our name - Atobic - blends Atoms, Bits, and Cells: the fundamental units of the physical, digital, and biological worlds. It reflects our vision to power the next generation of AI-native products that sit at the intersection of humans, software, and systems, transforming industries and elevating everyday experiences, reimagined through the lens of AI. We are pioneering a new era of AI-first product engineering, where human creativity is amplified by intelligent tools. Whether it's a startup looking to ship fast or an enterprise rethinking its core systems, our teams build future-ready products that learn, adapt, and scale.

Role Overview:
We're looking for curious and driven Software Product Engineers who want to kickstart their careers building software the AI-native way. As an early team member, you'll learn how to use AI tools not just to write code faster, but to engineer smarter, more resilient systems. You'll work closely with senior engineers to deliver real-world solutions to real-world problems. This role is ideal for those excited about the intersection of software, AI, and product thinking.

What You'll Do:
• Collaborate with product and engineering teams to design and develop scalable software solutions
• Leverage AI-assisted development tools (e.g., code generation) to accelerate delivery
• Write clean, maintainable, and well-documented code
• Participate in code reviews and receive structured mentorship
• Learn modern software architectures and AI-enablement techniques
• Contribute to continuous improvement in product and engineering workflows
• Embrace agile and iterative development processes
• Build and deploy production-grade software systems

Required Qualifications:
• Bachelor's degree in Computer Science, Engineering, or a related field
• Strong foundations in computer science (DSA, OOP, SDLC)
• Strong problem-solving skills
• Proficiency in at least one programming language (e.g., Python, Java, JavaScript)
• Familiarity with basic tools like Git, the command line, and IDEs
• Growth mindset and willingness to learn new tools and paradigms
• Strong analytical and communication skills

Preferred Skills:
• Exposure to web technologies (React, HTML/CSS, REST APIs)
• Experience with cloud environments (e.g., AWS, GCP)
• Familiarity with SQL/NoSQL databases
• Interest in AI/ML or experience with tools like GitHub Copilot, LangChain, or Hugging Face
• Basic understanding of DevOps or CI/CD pipelines

What We Offer:
• Direct mentorship from senior engineers and Amazon/AWS alumni
• Opportunity to work on diverse projects across startups and enterprises
• Training on how to use AI to enhance your engineering workflow
• Clear growth path with exposure to product thinking, architecture, and AI tooling
• Collaborative, low-hierarchy, high-trust environment
• Competitive salary

Join us in shaping the future of software engineering, where humans build and AI amplifies.

Posted 1 day ago

Apply

10.0 years

0 Lacs

Thiruvananthapuram, Kerala, India

On-site

Job Title: Solution Architect – Enterprise Applications with AI/ML Exposure
Experience: 8–10 Years
Location: Thiruvananthapuram
Employment Type: Full-time (WFO)
Salary Offered: Max 22 LPA

Job Summary
We're looking for a talented Solution Architect with a strong foundation in designing and developing large-scale enterprise applications, and a growing interest or experience in modern AI/ML-driven technologies. This role is ideal for someone who is confident in architecture, passionate about emerging trends like AI/ML, and eager to help shape intelligent systems in collaboration with engineering and business teams.

Key Responsibilities
• Design scalable, secure, and maintainable enterprise application architectures.
• Translate business needs into clear technical solutions and design patterns.
• Lead design discussions, code reviews, and solution planning with internal teams.
• Guide development teams by providing architectural direction and mentoring.
• Collaborate with DevOps for smooth deployment and CI/CD implementation.
• Participate in client meetings and technical solutioning discussions.
• Explore and propose the use of AI/ML capabilities where relevant, especially in areas like intelligent search, automation, and data insights.

Must-Have Skills & Qualifications
• 8–10 years of experience in software development and solution architecture.
• Hands-on expertise in either Python or C# .NET.
• Deep understanding of software architecture patterns: microservices, event-driven, layered designs.
• Experience with cloud platforms (AWS, Azure, or GCP).
• Solid knowledge of databases (SQL & NoSQL), APIs, and integration techniques.
• Exposure to or strong interest in AI/ML technologies, especially those involving intelligent automation or data-driven systems.
• Good interpersonal and communication skills; experience interfacing with clients.
• Capability to lead technical teams and ensure delivery quality.

Preferred Skills
• Awareness of LLMs, vector databases (e.g., Pinecone, FAISS), or RAG-based systems is a plus.
• Familiarity with Docker, Kubernetes, or DevOps workflows.
• Knowledge of MLOps or experience working alongside data science teams.
• Certifications in cloud architecture or AI/ML are a bonus.
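The preferred skills mention vector databases such as FAISS as the retrieval layer behind RAG systems. As a hedged sketch (random vectors stand in for real embeddings, and the dimension is an assumption), a minimal nearest-neighbour lookup looks like this:

```python
import faiss  # pip install faiss-cpu
import numpy as np

DIM = 384  # a common sentence-embedding size; an assumption here

# Hypothetical document embeddings; a real system would compute these
# with an embedding model and keep an id -> document text mapping.
rng = np.random.default_rng(0)
doc_vectors = rng.standard_normal((1000, DIM), dtype=np.float32)

index = faiss.IndexFlatL2(DIM)  # exact L2 search, no training needed
index.add(doc_vectors)          # ingest the corpus

query = rng.standard_normal((1, DIM), dtype=np.float32)
distances, ids = index.search(query, 5)  # top-5 nearest documents
print(ids[0])  # row indices of the documents to hand to the LLM prompt
```

In a RAG pipeline, those retrieved documents are appended to the prompt so the LLM answers from the organisation's own data rather than from memory alone.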

Posted 1 day ago

Apply

0.0 years

0 Lacs

Gurugram, Haryana

On-site

Immediate Hiring: Senior Frontend Developer - ReactJS
Manufac Analytics Pvt. Ltd. (India)

About Manufac:
Manufac Analytics Private Limited is a software solutions provider that operates on two fronts: services and products. On the services front, the company builds web, desktop, and mobile applications to solve challenges in industries such as healthcare, hospitality, education, finance, and manufacturing. On the product front, Manufac Analytics is building tools that help improve manufacturing plants' operational efficiency.

Job Profile:
The ideal candidate is a talented JavaScript/TypeScript developer who will be responsible for designing, developing, testing, and debugging responsive web and mobile applications for the company. Using JavaScript, HTML, and CSS, this candidate will be able to translate user and business needs into functional front-end design.

Responsibilities:
• Developing user interfaces for web applications, primarily via ReactJS, but occasionally via other frameworks like Angular/Vue too.
• Setting up REST & GraphQL-based APIs using SQL and NoSQL databases.
• Deploying and maintaining cloud infrastructure on AWS, GCP, and Azure.
• Enhancing and maintaining the product's test suite.
• Participating in the full lifecycle of software development: gathering product requirements, architecture, solution design, development, quality assurance, and maintenance.
• Mentoring others on best practices, effectively sharing knowledge, and delivering high-quality implementations while promoting engineering excellence within the team.

Skills and Qualifications:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• Minimum of 2+ years of experience in JavaScript/TypeScript and ReactJS.
• Good understanding of mobile-responsive design and cross-browser compatibility.
• Proficiency in at least one of the following cloud platforms: AWS, GCP, or Azure.
• Experience with version control systems such as Git.
• Strong problem-solving skills and attention to detail.
• Ability to work independently and as part of a team.
• Fresh graduates may also apply for an internship.

Salary: As per industry standards.
Job Type: Full-time
Pay: From ₹200,000.00 per year
Benefits: Flexible schedule, paid sick time
Ability to commute/relocate: Gurugram, Haryana: Reliably commute or plan to relocate before starting work (Required)
Application Question(s): What is your notice period? What is your current CTC? What is your expected CTC?
Education: Bachelor's (Preferred)
Work Location: In person

Posted 1 day ago

Apply

2.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Company Description
CodeChavo is a global provider of digital transformation solutions. Collaborating with top technology companies, CodeChavo drives real impact through innovation and agility. Our team is dedicated to transforming client operations from design to delivery. By outsourcing digital projects and building quality tech teams, we help clients thrive in a rapidly evolving tech landscape.

Role Description
Location: Gurgaon | 6 days working - hybrid (4 days in office)

This is a full-time role for a Python Developer (FastAPI) SDE 1 located in Gurugram. The Python Developer will be responsible for designing and developing back-end web applications using FastAPI, writing reusable and efficient code, and collaborating with cross-functional teams. You will build and improve the backend services that power our chatbots, telephony flows, CRM automations, lead-allocation engine, database APIs, FAQ pipelines, and third-party API integrations. You'll work in Python with FastAPI, AWS Lambda, message queues, and third-party supplier APIs, while shipping reliable, scalable features for real users.

Key Responsibilities:
• Design, code, and ship new Python services and API endpoints using FastAPI or similar frameworks
• Integrate third-party platforms and handle secure data flow
• Write unit and integration tests; troubleshoot and fix bugs quickly
• Monitor performance, optimise queries, and follow good logging and alerting practices
• Keep task lists updated and share progress in regular team meetings
• Learn and apply CI/CD, AWS/GCP/Azure basics, and other DevOps practices as you grow

Qualifications
• 1–2 years of software development experience
• Strong coding skills in Python and RESTful APIs
• Familiarity with FastAPI, Flask, or Django, plus SQL/NoSQL databases
• Basic knowledge of AWS Lambda (or any serverless stack), message queues, and Git
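As an illustrative sketch only, with an endpoint and model that are hypothetical rather than taken from the posting, a minimal FastAPI service of the kind described might look like this:

```python
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Lead(BaseModel):
    name: str
    phone: str
    source: str = "chatbot"

# Hypothetical in-memory store standing in for a real database API.
LEADS: dict[int, Lead] = {}

@app.post("/leads", status_code=201)
def create_lead(lead: Lead) -> dict:
    """Validate the payload and allocate an id for the new lead."""
    lead_id = len(LEADS) + 1
    LEADS[lead_id] = lead
    return {"id": lead_id, **lead.model_dump()}  # model_dump is pydantic v2

@app.get("/leads/{lead_id}")
def get_lead(lead_id: int) -> dict:
    if lead_id not in LEADS:
        raise HTTPException(status_code=404, detail="lead not found")
    return LEADS[lead_id].model_dump()

# Run locally with: uvicorn main:app --reload
```

FastAPI validates the request body against the pydantic model automatically, which is much of what makes it attractive for lead-allocation-style CRUD services.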

Posted 1 day ago

Apply

10.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Position: Senior Principal Data Engineer
Experience: Must have 10+ years of experience

About Role:
We are looking for experienced Data Engineers with excellent problem-solving skills to develop machine-learning-powered Data Products designed to enhance customer experiences.

About us:
Nurtured from the seed of a single great idea - to empower the traveler - MakeMyTrip went on to pioneer India's online travel industry. Founded in the year 2000 by Deep Kalra, MakeMyTrip has since transformed how India travels. One of our most memorable moments has been to ring the bell at NASDAQ in 2010. Post-merger with the Ibibo group in 2017, we created a stronger identity and traction for our portfolio of brands, increasing the pace of product and technology innovations. Ranked amongst the LinkedIn Top 25 companies in 2018. GO-MMT is the corporate entity of three giants in the Online Travel Industry—Goibibo, MakeMyTrip and RedBus. The GO-MMT family celebrates the compounded strengths of their brands. The group company is easily the most sought-after corporate in the online travel industry.

About the team:
MakeMyTrip, as India's leading online travel company, generates petabytes of raw data that feed business growth, analytical, and machine learning needs. The Data Platform Team is a horizontal function at MakeMyTrip supporting various LOBs (Flights, Hotels, Holidays, Ground) and works heavily on streaming datasets, which power personalized experiences for every customer, from recommendations to in-location engagement.

The Data Engineering team has two key responsibilities: first, to develop the platform for data capture, storage, processing, serving, and querying; second, to develop data products, including:
• a personalization & recommendation platform
• customer segmentation & intelligence
• a data insights engine for persuasions
• the customer engagement platform, which helps marketers craft contextual and personalized campaigns over multi-channel communications to users

We developed Feature Store, an internal unified data analytics platform that helps us build reliable data pipelines, simplify featurization, and accelerate model training. This gives us actionable insights into what customers want, at scale, and drives richer, personalized online experiences.

Technology experience:
• Extensive experience working with large data sets, with hands-on technology skills to design and build robust data architecture
• Extensive experience in data modeling and database design
• At least 6+ years of hands-on experience in the Spark/Big Data tech stack
• Stream processing engines: Spark Structured Streaming/Flink
• Analytical processing on Big Data using Spark
• At least 6+ years of experience in Scala
• Hands-on administration, configuration management, monitoring, and performance tuning of Spark workloads, distributed platforms, and JVM-based systems
• At least 2+ years of cloud deployment experience: AWS | Azure | Google Cloud Platform
• At least 2+ product deployments of big data technologies: Business Data Lake, NoSQL databases, etc.
• Awareness and decision-making ability to choose among various big data, NoSQL, and analytics tools and technologies
• Experience in architecting and implementing domain-centric big data solutions
• Ability to frame architectural decisions and provide technology leadership & direction
• Excellent problem-solving, hands-on engineering, and communication skills

Posted 1 day ago

Apply

6.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Position: Lead Data Engineer
Experience: Must have 6+ years of experience

About Role:
We are looking for experienced Data Engineers with excellent problem-solving skills to develop machine-learning-powered Data Products designed to enhance customer experiences.

About us:
Nurtured from the seed of a single great idea - to empower the traveler - MakeMyTrip went on to pioneer India's online travel industry. Founded in the year 2000 by Deep Kalra, MakeMyTrip has since transformed how India travels. One of our most memorable moments has been to ring the bell at NASDAQ in 2010. Post-merger with the Ibibo group in 2017, we created a stronger identity and traction for our portfolio of brands, increasing the pace of product and technology innovations. Ranked amongst the LinkedIn Top 25 companies in 2018. GO-MMT is the corporate entity of three giants in the Online Travel Industry—Goibibo, MakeMyTrip and RedBus. The GO-MMT family celebrates the compounded strengths of their brands. The group company is easily the most sought-after corporate in the online travel industry.

About the team:
MakeMyTrip, as India's leading online travel company, generates petabytes of raw data that feed business growth, analytical, and machine learning needs. The Data Platform Team is a horizontal function at MakeMyTrip supporting various LOBs (Flights, Hotels, Holidays, Ground) and works heavily on streaming datasets, which power personalized experiences for every customer, from recommendations to in-location engagement.

Our team's key responsibilities are:
• Design, construct, and maintain robust data systems and architectures
• Develop and optimize data capture, storage, processing, serving, and querying platforms
• Create data products for personalization, recommendation, customer segmentation, and intelligence
• Enhance our Measurement platform for A/B experimentation
• Contribute to our Feature Store, an internal unified data analytics platform
• Participate in the development of our next-generation Travel Planner using Generative AI and Multi-Agent frameworks
• Implement and optimize data solutions for travel-specific use cases, such as analyzing cross-city travel patterns to extend trip recommendations, and identifying correlations between hotel bookings in different areas to suggest complementary destinations

Required Skills and Experience:
• Extensive experience working with large data sets, with hands-on technology skills to design and build robust data architecture
• At least 6+ years of hands-on experience in the Spark/Big Data tech stack
• Expertise in stream processing engines like Spark Structured Streaming or Apache Flink
• Analytical processing on Big Data using Spark
• At least 4+ years of experience in Scala; experience with Python is a plus
• Hands-on administration, configuration management, monitoring, and performance tuning of Spark workloads, distributed platforms, and JVM-based systems
• At least 2+ years of cloud deployment experience: AWS | Azure | Google Cloud Platform
• 2 or more product deployments of big data technologies: Business Data Lake, NoSQL databases, etc.
• Awareness and decision-making ability to choose among various big data, NoSQL, and analytics tools and technologies
• Experience in architecting and implementing domain-centric big data solutions
• Ability to frame architectural decisions and provide technology leadership & direction
• Excellent problem-solving, hands-on engineering, and communication skills

At MakeMyTrip, we're committed to innovation and excellence in the travel industry. Join us in shaping the future of travel through data-driven solutions and advanced technologies. If you're passionate about leveraging data to create exceptional travel experiences, we want to hear from you!
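The requirements centre on Spark Structured Streaming over streaming datasets, and the posting counts Python as a plus; the hedged PySpark sketch below (broker, topic, and event schema are all hypothetical) shows a minimal Kafka-to-console windowed aggregation:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json, window
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("clickstream-demo").getOrCreate()

# Hypothetical event schema for a travel clickstream.
schema = StructType([
    StructField("user_id", StringType()),
    StructField("lob", StringType()),   # Flights / Hotels / Holidays / Ground
    StructField("event", StringType()),
    StructField("ts", TimestampType()),
])

# Read a Kafka topic as an unbounded streaming DataFrame.
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical
       .option("subscribe", "clickstream")                # hypothetical topic
       .load())

# Kafka values arrive as bytes; parse the JSON payload into typed columns.
events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), schema).alias("e"))
          .select("e.*"))

# Count events per line of business over 5-minute windows.
counts = events.groupBy(window(col("ts"), "5 minutes"), col("lob")).count()

query = (counts.writeStream
         .outputMode("update")  # emit only windows whose counts changed
         .format("console")
         .start())
query.awaitTermination()
```

A production job would add a watermark for late events and write to a sink such as a feature store rather than the console.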

Posted 1 day ago

Apply

1.5 years

0 Lacs

Pune, Maharashtra, India

On-site

About Improzo
At Improzo (Improve + Zoe, meaning Life in Greek), we believe in improving life by empowering our customers. Founded by seasoned industry leaders, we are laser-focused on delivering quality-led commercial analytical solutions to our clients. Our dedicated team of experts in commercial data, technology, and operations has been evolving and learning together since our inception. Here, you won't find yourself confined to a cubicle; instead, you'll be navigating open waters, collaborating with brilliant minds to shape the future. You will work with leading Life Sciences clients, seasoned leaders, and carefully chosen peers like you!

People are at the heart of our success, so we have defined our CARE values framework with a lot of effort, and we use it as our guiding light in everything we do. We CARE!
• Customer-Centric: Client success is our success. Prioritize customer needs and outcomes in every action.
• Adaptive: Agile and innovative, with a growth mindset. Pursue bold and disruptive avenues that push the boundaries of possibilities.
• Respect: Deep respect for our clients & colleagues. Foster a culture of collaboration and act with honesty, transparency, and ethical responsibility.
• Execution: Laser-focused on quality-led execution; we deliver! Strive for the highest quality in our services, solutions, and customer experiences.

About The Role
We are seeking a highly skilled Data and Reporting Developer (Improzo Level - Associate) to join our dynamic team. As a Big Data Developer, you will be responsible for designing, developing, and maintaining large-scale data processing systems using big data technologies. This is an exciting opportunity for a talented individual with a strong technical background and a passion for working with large datasets to deliver high-quality solutions.

Key Responsibilities

Big Data Application Development:
• Design, develop, and maintain scalable data pipelines and big data applications.
• Work with distributed processing frameworks (e.g., Apache Hadoop, Apache Spark) to process and analyze large datasets.
• Write optimized, high-performance code to handle data ingestion, processing, and analysis in real-time or batch processing environments.

Data Architecture:
• Collaborate with data architects and other stakeholders to design and implement data storage solutions using HDFS, NoSQL databases (e.g., Cassandra, HBase, MongoDB), and cloud data platforms (e.g., AWS, Azure, Google Cloud).
• Develop and maintain pipelines for data extraction, transformation, and loading (ETL) using ETL tools or Databricks.
• Work with data lakes and data warehousing solutions for large-scale data storage and processing.

Data Integration:
• Integrate various data sources into the big data ecosystem (e.g., data from relational databases, APIs, third-party tools, IoT devices).
• Ensure seamless data flow between systems while maintaining data quality and integrity.

Reporting Development:
• Design and build reports on tools like Power BI, Tableau, and MicroStrategy.
• Design basic UI/UX as per client needs.

Performance Optimization:
• Optimize big data workflows and queries to ensure high performance and scalability.
• Implement data partitioning, indexing, and other techniques to handle large datasets efficiently.

Collaboration and Communication:
• Collaborate with cross-functional teams (data scientists, analysts, engineers, etc.) to understand business requirements and deliver data solutions that meet those needs.
• Communicate complex technical concepts and data insights clearly to non-technical stakeholders.

Testing and Quality Assurance:
• Perform unit testing and troubleshooting of data pipelines to ensure data consistency and integrity.
• Implement data validation and error-checking mechanisms to maintain high-quality data.

Documentation:
• Maintain clear documentation of data pipelines, architecture, and workflows for ease of understanding, modification, and scaling.

Agile Methodology:
• Participate in agile development processes, including sprint planning, daily stand-ups, and code reviews.

Innovation and Research:
• Stay up to date with the latest trends and advancements in big data technologies and techniques.
• Continuously evaluate new tools, frameworks, and technologies to improve performance and capabilities.

Qualifications
• Bachelor's or master's degree in a quantitative field such as computer science, statistics, or mathematics.
• 1.5+ years of experience on data management or reporting projects involving big data technologies.
• Hands-on experience with, or thorough training on, technologies like AWS, Azure, GCP, Databricks, and Spark.
• Experience in a Pharma commercial setting or Pharma data management is an added advantage.
• General proficiency in programming languages such as Python; experience with data management (SQL, MDM, etc.) and visualization tools like Tableau, Power BI, etc.
• Excellent communication, presentation, and interpersonal skills.
• Attention to detail, with a bias for quality and client centricity.
• Ability to work independently and as part of a cross-functional team.

Benefits
• Competitive salary and benefits package.
• Opportunity to work on cutting-edge tech projects, transforming the life sciences industry.
• Collaborative and supportive work environment.
• Opportunities for professional development and growth.

Skills: MicroStrategy, MongoDB, AWS, Tableau, Google Cloud, SQL, ETL tools, Databricks, Cassandra, NoSQL databases, Apache Spark, Azure, Python, distributed processing frameworks, big data application development, HBase, HDFS, MDM, Power BI, Apache Hadoop
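To make the pipeline responsibilities above concrete, here is a minimal hedged sketch of a batch ETL step in PySpark; the paths, column names, and business rule are all hypothetical:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, to_date

spark = SparkSession.builder.appName("sales-etl-demo").getOrCreate()

# Extract: read raw CSV drops from a landing zone (hypothetical path).
raw = (spark.read
       .option("header", True)
       .option("inferSchema", True)
       .csv("s3://landing-zone/sales/*.csv"))

# Transform: normalize types, drop bad rows, derive a reporting column.
clean = (raw
         .withColumn("sale_date", to_date(col("sale_date"), "yyyy-MM-dd"))
         .filter(col("amount") > 0)
         .withColumn("net_amount", col("amount") - col("discount")))

# Load: write partitioned Parquet for downstream BI tools (Power BI, etc.).
(clean.write
 .mode("overwrite")
 .partitionBy("sale_date")
 .parquet("s3://curated-zone/sales/"))
```

Partitioning the output by date is the same indexing-style trick the performance bullets describe: downstream queries that filter on a date only touch the matching directories.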

Posted 1 day ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Nium, the Leader in Real-Time Global Payments
Nium, the global leader in real-time, cross-border payments, was founded on the mission to deliver the global payments infrastructure of tomorrow, today. With the onset of the global economy, its payments infrastructure is shaping how banks, fintechs, and businesses everywhere collect, convert, and disburse funds instantly across borders. Its payout network supports 100 currencies and spans 220+ markets, 100 of which are in real-time. Funds can be disbursed to accounts, wallets, and cards and collected locally in 35 markets. Nium's growing card issuance business is already available in 34 countries. Nium holds regulatory licenses and authorizations in more than 40 countries, enabling seamless onboarding, rapid integration, and compliance – independent of geography. The company is co-headquartered in San Francisco and Singapore.

About the Team:
The Tech Support team's goal is to offer better customer service and manage anything that happens in a live/production environment. Nium is among the best at using the latest tools for support functions. Tools like Kibana, Nagios, and CloudWatch give us greater visibility of the services offered to clients and keep our system available round the clock; our uptime is always greater than 99.95%.

About the Role:
As part of the Tech Support team, you will be responsible for resolving technical issues faced by users, whether related to software, hardware, or network systems. You will troubleshoot problems, offer solutions, and escalate complex cases to specialized teams when necessary. Using ticketing systems, you will manage and prioritize support requests to ensure timely and effective resolutions. This role requires strong problem-solving abilities, excellent communication skills, and a solid understanding of technical systems to help users maintain productivity.

Key Responsibilities:
• Based on customer insights and channel performance data, develop and execute on a content roadmap that engages key personas at each point in the customer journey, from top-funnel acquisition to nurture and ongoing customer education, both on Nium offerings as well as the industry
• Build, develop, and manage a high-performing team and culture to achieve breakthrough results; exceptionally high standards, holding self and others accountable
• Generate editorial ideas and concepts
• Work with regional Growth Marketing teams to ensure content development aligns with funnel-building objectives for each target segment
• Measure the impact of our content strategy as well as the performance of individual assets, and proactively refine our resource allocation and prioritization accordingly

Requirements:
• 5-7 years of experience supporting production applications on AWS or other cloud platforms
• Good knowledge of RDBMS (PostgreSQL or MSSQL) and NoSQL databases
• Willing to work in day/night shifts
• Understanding of troubleshooting and monitoring microservice and serverless architectures
• Working knowledge of containerization technology and orchestration platforms (e.g., Docker, Kubernetes) for troubleshooting and monitoring purposes
• Experience with build and deploy automation tools (Ansible/Jenkins/Chef)
• Experienced in release and change management, and incident and problem management, both from a technology and a process perspective
• Familiar with server log management using tools like ELK and Kibana
• Certification in ITIL, COBIT, or Microsoft Operations Framework is an added plus
• Experience with scripting languages or shell scripting to automate daily tasks is an added plus
• Ability to diagnose and troubleshoot technical issues
• Ability to work proactively to identify issues with the help of log monitoring
• Experience with monitoring tools, frameworks, and processes
• Excellent interpersonal skills
• Experience with one or more case-handling tools such as Freshdesk, Zendesk, or JIRA
• Skilled at triaging and root cause analysis
• Ability to provide step-by-step technical help, both written and verbal

What we offer at Nium
• We Value Performance: Through competitive salaries, performance bonuses, sales commissions, equity for specific roles, and recognition programs, we ensure that all our employees are well rewarded and incentivized for their hard work.
• We Care for Our Employees: The wellness of Nium'ers is our #1 priority. We offer medical coverage along with a 24/7 employee assistance program and generous vacation programs, including our year-end shutdown. We also provide a flexible hybrid working environment (3 days per week in the office).
• We Upskill Ourselves: We are curious and always want to learn more, with a focus on upskilling ourselves. We provide role-specific training, internal workshops, and a learning stipend.
• We Constantly Innovate: Since our inception, Nium has received constant recognition and awards for how we approach both our business and talent opportunities: 2022 Great Place To Work Certification, 2023 CB Insights Fintech 100 List of Most Promising Fintech Companies, CNBC World's Top Fintech Companies 2024.
• We Celebrate Together: We recognize that work is also about creating great relationships with each other. We celebrate together with company-wide social events, team bonding activities, happy hours, team offsites, and much more!
• We Thrive with Diversity: Nium is truly a global company, with more than 33 nationalities, based in 18+ countries and more than 10 office locations. As an equal opportunity employer, we are committed to providing a safe and welcoming environment for everyone.

For more detailed region-specific benefits: https://www.nium.com/careers#careers-perks
For more information visit www.nium.com

Depending on your location, certain laws may regulate the way Nium manages the data of candidates. By submitting your job application, you are agreeing and acknowledging that you have read and understand our Candidate Privacy Notice located at www.nium.com/privacy/candidate-privacy-notice.
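The requirements stress proactive log monitoring and scripting to automate daily tasks; purely as an illustration (the log path and line format are hypothetical), a small Python triage script might look like this:

```python
import collections
import re
from pathlib import Path

LOG_FILE = Path("/var/log/payouts/app.log")  # hypothetical path
# Hypothetical format: "2024-05-01T10:00:00 ERROR payout-service Timeout ..."
LINE_RE = re.compile(r"^\S+\s+(?P<level>\w+)\s+(?P<service>\S+)\s+(?P<msg>.*)")

def summarize_errors(path: Path) -> collections.Counter:
    """Count ERROR lines per service so spikes stand out at a glance."""
    counts: collections.Counter = collections.Counter()
    with path.open() as handle:
        for line in handle:
            match = LINE_RE.match(line)
            if match and match["level"] == "ERROR":
                counts[match["service"]] += 1
    return counts

if __name__ == "__main__":
    for service, n in summarize_errors(LOG_FILE).most_common():
        print(f"{service}: {n} errors")
```

In practice a stack like ELK/Kibana does this aggregation continuously; a script of this shape is useful for ad hoc triage on a single host.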

Posted 1 day ago

Apply

10.0 years

0 Lacs

Bangalore Urban, Karnataka, India

On-site

Position: Senior Principal Data Engineer
Experience: Must have 10+ years of experience

About Role:
We are looking for experienced Data Engineers with excellent problem-solving skills to develop machine-learning-powered Data Products designed to enhance customer experiences.

About us:
Nurtured from the seed of a single great idea - to empower the traveler - MakeMyTrip went on to pioneer India's online travel industry. Founded in the year 2000 by Deep Kalra, MakeMyTrip has since transformed how India travels. One of our most memorable moments has been to ring the bell at NASDAQ in 2010. Post-merger with the Ibibo group in 2017, we created a stronger identity and traction for our portfolio of brands, increasing the pace of product and technology innovations. Ranked amongst the LinkedIn Top 25 companies in 2018. GO-MMT is the corporate entity of three giants in the Online Travel Industry—Goibibo, MakeMyTrip and RedBus. The GO-MMT family celebrates the compounded strengths of their brands. The group company is easily the most sought-after corporate in the online travel industry.

About the team:
MakeMyTrip, as India's leading online travel company, generates petabytes of raw data that feed business growth, analytical, and machine learning needs. The Data Platform Team is a horizontal function at MakeMyTrip supporting various LOBs (Flights, Hotels, Holidays, Ground) and works heavily on streaming datasets, which power personalized experiences for every customer, from recommendations to in-location engagement.

The Data Engineering team has two key responsibilities: first, to develop the platform for data capture, storage, processing, serving, and querying; second, to develop data products, including:
• a personalization & recommendation platform
• customer segmentation & intelligence
• a data insights engine for persuasions
• the customer engagement platform, which helps marketers craft contextual and personalized campaigns over multi-channel communications to users

We developed Feature Store, an internal unified data analytics platform that helps us build reliable data pipelines, simplify featurization, and accelerate model training. This gives us actionable insights into what customers want, at scale, and drives richer, personalized online experiences.

Technology experience:
• Extensive experience working with large data sets, with hands-on technology skills to design and build robust data architecture
• Extensive experience in data modeling and database design
• At least 6+ years of hands-on experience in the Spark/Big Data tech stack
• Stream processing engines: Spark Structured Streaming/Flink
• Analytical processing on Big Data using Spark
• At least 6+ years of experience in Scala
• Hands-on administration, configuration management, monitoring, and performance tuning of Spark workloads, distributed platforms, and JVM-based systems
• At least 2+ years of cloud deployment experience: AWS | Azure | Google Cloud Platform
• At least 2+ product deployments of big data technologies: Business Data Lake, NoSQL databases, etc.
• Awareness and decision-making ability to choose among various big data, NoSQL, and analytics tools and technologies
• Experience in architecting and implementing domain-centric big data solutions
• Ability to frame architectural decisions and provide technology leadership & direction
• Excellent problem-solving, hands-on engineering, and communication skills

Posted 1 day ago

Apply

5.0 years

0 Lacs

Bangalore Urban, Karnataka, India

On-site

Java Software Developer
Job Location: Bangalore (Whitefield) – Onsite
Duration: 6 months

Responsibilities
• Designing and implementing software using Java.
• Ensuring code quality by implementing unit, integration, and end-to-end tests.
• Optimising applications for maximum performance.
• Working on DevOps-related activities (CI/CD, infrastructure, etc.).
• Working in a distributed team and cooperating with other teams on cross-team deliveries.
• Troubleshooting, analysing, and solving integration and production issues.

Skills
• 5+ years of professional Java software development experience.
• Strong knowledge of Java 11+.
• Strong system design and programming skills.
• Experience with Spring Framework 5, Spring Boot 3, REST, CI, and Kanban.
• Familiarity with common algorithms, data structures, and multithreading.
• Familiarity with Git/Gradle, Docker, Kubernetes, Continuous Delivery, and DevOps.
• Experience with RDBMS (MySQL, etc.) and NoSQL (Apache Cassandra, etc.) databases.
• Comfortable making technical and architectural decisions autonomously.
• Communicative, able to explain concepts well to both technical and non-technical audiences.

Notice Period: Immediate to 30 days
Email to: vaishnavi.yelgulwar@aptita.com

Posted 1 day ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Position: Senior Data Engineer II
Experience: Must have 4+ years of experience

About the role: We are looking for experienced data engineers with excellent problem-solving skills to develop machine-learning-powered data products designed to enhance customer experiences.

About us: Nurtured from the seed of a single great idea - to empower the traveler - MakeMyTrip went on to pioneer India's online travel industry. Founded in 2000 by Deep Kalra, MakeMyTrip has since transformed how India travels. One of our most memorable moments was ringing the bell at NASDAQ in 2010. Post-merger with the Ibibo Group in 2017, we created a stronger identity and greater traction for our portfolio of brands, increasing the pace of product and technology innovation, and were ranked amongst the LinkedIn Top 25 Companies in 2018. GO-MMT is the corporate entity behind three giants of the online travel industry: Goibibo, MakeMyTrip and redBus. The GO-MMT family celebrates the compounded strengths of its brands, and the group is easily the most sought-after corporate in the online travel industry.

About the team: As India's leading online travel company, MakeMyTrip generates petabytes of raw data that fuels business growth, analytics and machine learning. The Data Platform Team is a horizontal function at MakeMyTrip that supports all LOBs (Flights, Hotels, Holidays, Ground) and works heavily on streaming datasets, which power personalized experiences for every customer, from recommendations to in-location engagement.

The Data Engineering team has two key responsibilities. The first is to develop the platform for data capture, storage, processing, serving and querying. The second is to develop data products, including:
- a personalization & recommendation platform
- customer segmentation & intelligence
- a data insights engine for persuasions
- a customer engagement platform that helps marketers craft contextual, personalized campaigns over multi-channel communications to users

We developed Feature Store, an internal unified data analytics platform that helps us build reliable data pipelines, simplify featurization and accelerate model training. It gives us actionable insights into what customers want, at scale, and drives richer, personalized online experiences.

Technology experience:
- Extensive experience working with large data sets, with the hands-on skills to design and build robust data architecture
- Extensive experience in data modeling and database design
- At least 4 years of hands-on experience with the Spark/big data tech stack
- Stream processing engines – Spark Structured Streaming/Flink
- Analytical processing on big data using Spark
- At least 4 years of experience in Java/Scala
- Hands-on administration, configuration management, monitoring and performance tuning of Spark workloads, distributed platforms and JVM-based systems
- At least 2 years of cloud deployment experience – AWS | Azure | Google Cloud Platform
- At least two product deployments of big data technologies – business data lakes, NoSQL databases, etc.
- The awareness and judgment to choose among the many big data, NoSQL and analytics tools and technologies
- Experience architecting and implementing domain-centric big data solutions
- The ability to frame architectural decisions and provide technology leadership and direction
- Excellent problem-solving, hands-on engineering and communication skills
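To illustrate the "analytical processing on big data using Spark" requirement, a minimal PySpark batch sketch (the role asks for Java/Scala, where the DataFrame API is equivalent; the data-lake path and column names are hypothetical):

```python
# Minimal PySpark batch-analytics sketch: per-LOB booking volume and average
# ticket size from a (hypothetical) parquet dataset.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bookings-analytics").getOrCreate()

# Read one day's bookings from a placeholder data-lake location.
bookings = spark.read.parquet("s3://example-lake/bookings/dt=2024-01-01/")

# Aggregate per line of business (e.g., Flights, Hotels, Holidays, Ground).
summary = (
    bookings.groupBy("lob")
    .agg(F.count("*").alias("bookings"), F.avg("amount").alias("avg_amount"))
    .orderBy(F.desc("bookings"))
)
summary.show()
```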

Posted 1 day ago

Apply

8.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Position: Principal Data Engineer
Experience: Must have 8+ years of experience

About the role: We are looking for experienced data engineers with excellent problem-solving skills to develop machine-learning-powered data products designed to enhance customer experiences.

About us: Nurtured from the seed of a single great idea - to empower the traveler - MakeMyTrip went on to pioneer India's online travel industry. Founded in 2000 by Deep Kalra, MakeMyTrip has since transformed how India travels. One of our most memorable moments was ringing the bell at NASDAQ in 2010. Post-merger with the Ibibo Group in 2017, we created a stronger identity and greater traction for our portfolio of brands, increasing the pace of product and technology innovation, and were ranked amongst the LinkedIn Top 25 Companies in 2018. GO-MMT is the corporate entity behind three giants of the online travel industry: Goibibo, MakeMyTrip and redBus. The GO-MMT family celebrates the compounded strengths of its brands, and the group is easily the most sought-after corporate in the online travel industry.

About the team: As India's leading online travel company, MakeMyTrip generates petabytes of raw data that fuels business growth, analytics and machine learning. The Data Platform Team is a horizontal function at MakeMyTrip that supports all LOBs (Flights, Hotels, Holidays, Ground) and works heavily on streaming datasets, which power personalized experiences for every customer, from recommendations to in-location engagement.

The Data Engineering team has two key responsibilities. The first is to develop the platform for data capture, storage, processing, serving and querying. The second is to develop data products, including:
- a personalization & recommendation platform
- customer segmentation & intelligence
- a data insights engine for persuasions
- a customer engagement platform that helps marketers craft contextual, personalized campaigns over multi-channel communications to users

We developed Feature Store, an internal unified data analytics platform that helps us build reliable data pipelines, simplify featurization and accelerate model training. It gives us actionable insights into what customers want, at scale, and drives richer, personalized online experiences.

Technology experience:
- Extensive experience working with large data sets, with the hands-on skills to design and build robust data architecture
- Extensive experience in data modeling and database design
- At least 6 years of hands-on experience with the Spark/big data tech stack
- Stream processing engines – Spark Structured Streaming/Flink
- Analytical processing on big data using Spark
- At least 6 years of experience in Scala
- Hands-on administration, configuration management, monitoring and performance tuning of Spark workloads, distributed platforms and JVM-based systems
- At least 2 years of cloud deployment experience – AWS | Azure | Google Cloud Platform
- At least two product deployments of big data technologies – business data lakes, NoSQL databases, etc.
- The awareness and judgment to choose among the many big data, NoSQL and analytics tools and technologies
- Experience architecting and implementing domain-centric big data solutions
- The ability to frame architectural decisions and provide technology leadership and direction
- Excellent problem-solving, hands-on engineering and communication skills

Posted 1 day ago

Apply

3.0 years

0 Lacs

Thiruvananthapuram Taluk, India

On-site

Job Summary
We are seeking a highly motivated and skilled Data Engineer with 3+ years of experience to join our growing data team. In this role, you will be instrumental in designing, building and maintaining robust, scalable and efficient data pipelines and infrastructure. You will work closely with data scientists, analysts and other engineering teams to ensure data availability, quality and accessibility for various analytical and machine learning initiatives.

Key Responsibilities
Design and Development:
○ Design, develop and optimize scalable ETL/ELT pipelines to ingest, transform and load data from diverse sources into data warehouses/lakes.
○ Implement data models and schemas that support analytical and reporting requirements.
○ Build and maintain robust data APIs for data consumption by various applications and services.
Data Infrastructure:
○ Contribute to the architecture and evolution of our data platform, leveraging cloud services (AWS, Azure, GCP) or on-premise solutions.
○ Ensure data security, privacy and compliance with relevant regulations.
○ Monitor data pipelines for performance, reliability and data quality, implementing alerting and anomaly detection.
Collaboration & Optimization:
○ Collaborate with data scientists, business analysts and product managers to understand data requirements and translate them into technical solutions.
○ Optimize existing data processes for efficiency, cost-effectiveness and performance.
○ Participate in code reviews, contribute to documentation and uphold best practices in data engineering.
Troubleshooting & Support:
○ Diagnose and resolve data-related issues, ensuring minimal disruption to data consumers.
○ Provide support and expertise to teams consuming data from the data platform.

Required Qualifications
- Bachelor's degree in Computer Science, Engineering or a related quantitative field.
- 3+ years of hands-on experience as a Data Engineer or in a similar role.
- Strong proficiency in at least one programming language commonly used for data engineering (e.g., Python, Java, Scala).
- Extensive experience with SQL and relational databases (e.g., PostgreSQL, MySQL, SQL Server).
- Proven experience with ETL/ELT tools and concepts.
- Experience with data warehousing concepts and technologies (e.g., Snowflake, Redshift, BigQuery, Azure Synapse, Databricks).
- Familiarity with cloud platforms (AWS, Azure or GCP) and their data services (e.g., S3, EC2, Lambda, Glue, Data Factory, Blob Storage, BigQuery, Dataflow).
- Understanding of data modeling techniques (e.g., dimensional modeling, Kimball, Inmon).
- Experience with version control systems (e.g., Git).
- Excellent problem-solving, analytical and communication skills.

Preferred Qualifications
- Master's degree in a relevant field.
- Experience with Apache Spark (PySpark, Scala Spark) or other big data processing frameworks.
- Familiarity with NoSQL databases (e.g., MongoDB, Cassandra).
- Experience with data streaming technologies (e.g., Kafka, Kinesis).
- Knowledge of containerization technologies (e.g., Docker, Kubernetes).
- Experience with workflow orchestration tools (e.g., Apache Airflow, Azure Data Factory, AWS Step Functions).
- Understanding of DevOps principles as applied to data pipelines.
- Prior experience in telecom is a plus.

Skills: Python, Java, Scala, SQL, PostgreSQL, MySQL, SQL Server, ETL/ELT, data modeling, data warehousing, data streaming, NoSQL databases, Apache Spark, cloud services (AWS, Azure, GCP), workflow orchestration, containerization, version control, data science
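As a concrete, deliberately simplified picture of the ETL/ELT work described above, a pandas + SQLAlchemy sketch; the connection strings, table and column names are placeholders, and a production pipeline would add incremental loading, retries and data-quality checks:

```python
# Minimal ETL sketch: extract from an operational database, transform in
# pandas, load into a warehouse table. All names are placeholders.
import pandas as pd
from sqlalchemy import create_engine

source = create_engine("postgresql://user:pass@source-db/app")        # placeholder
warehouse = create_engine("postgresql://user:pass@warehouse-db/dwh")  # placeholder

# Extract: pull orders from the operational database.
orders = pd.read_sql(
    "SELECT order_id, customer_id, amount, created_at FROM orders", source
)

# Transform: enforce types and derive a reporting-friendly date column.
orders["amount"] = orders["amount"].astype(float)
orders["order_date"] = pd.to_datetime(orders["created_at"]).dt.date

# Load: append into a warehouse fact table.
orders.to_sql("fact_orders", warehouse, if_exists="append", index=False)
```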

Posted 1 day ago

Apply

3.0 years

5 - 8 Lacs

Thiruvananthapuram Taluk, India

On-site

Job Title: Data Engineer
📍 Location: Trivandrum (Hybrid)
💼 Experience: 3+ Years
💰 Salary: Up to ₹8 LPA

Job Summary
We are hiring a skilled Data Engineer with 3+ years of experience to design, build and optimize data pipelines and infrastructure. You will work closely with data scientists, analysts and engineers to ensure reliable, scalable and secure data delivery across cloud and on-prem systems.

Key Responsibilities
- Design and develop ETL/ELT pipelines and implement data models for analytics/reporting.
- Build and maintain data APIs, ensuring data availability, security and compliance.
- Develop and optimize data infrastructure on AWS, Azure or GCP.
- Collaborate with stakeholders to gather requirements and deliver scalable data solutions.
- Monitor pipelines for performance and data quality; implement alerting and issue resolution.
- Participate in code reviews and documentation, and enforce engineering best practices.

Mandatory Skills & Qualifications
- Bachelor's degree in Computer Science, Engineering or a related field.
- 3+ years in Data Engineering or a similar role.
- Strong knowledge of Python/Java/Scala, SQL and relational databases (PostgreSQL, MySQL, etc.).
- Experience with ETL/ELT and data warehousing (Snowflake, Redshift, BigQuery, Synapse).
- Cloud experience on AWS, Azure or GCP (e.g., S3, Glue, Data Factory, BigQuery).
- Knowledge of data modeling (Kimball/Inmon).
- Version control using Git.
- Strong problem-solving, communication and collaboration skills.

Preferred (Nice to Have)
- Master's degree.
- Experience with Apache Spark, Kafka/Kinesis, NoSQL (MongoDB/Cassandra).
- Familiarity with Docker/Kubernetes, Airflow and DevOps for data pipelines.
- Experience in the telecom domain is a plus.

Skills: Python, Java, Scala, SQL, ETL, ELT, data modeling, data warehousing, Apache Spark, Kafka, Kinesis, NoSQL, Docker, Kubernetes, Airflow, DevOps, AWS, Azure, GCP, Git, version control
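Since the posting lists Airflow among the orchestration tools, here is a minimal DAG sketch showing how the extract → transform → load steps above would be wired together; the task bodies are stubs standing in for real pipeline code:

```python
# Minimal Apache Airflow DAG sketch (Airflow 2.4+ `schedule` argument).
# Task bodies are placeholders for real extract/transform/load logic.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pulling data from source systems")  # placeholder step

def transform():
    print("cleaning and modeling the data")    # placeholder step

def load():
    print("writing to the warehouse")          # placeholder step

with DAG(
    dag_id="daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3  # run the steps in sequence
```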

Posted 1 day ago

Apply

2.0 - 6.0 years

0 Lacs

bhopal, madhya pradesh

On-site

As an AI + Backend Developer at AskGalore, you will be responsible for developing and maintaining backend systems and APIs for AI-powered applications. Your role will involve designing, implementing and optimizing machine learning models and algorithms for real-world applications, and collaborating with cross-functional teams to integrate AI models into scalable, production-ready systems.

Your primary tasks will include building and managing robust, secure and efficient server-side architectures using Python frameworks such as Django, Flask and FastAPI. You will also optimize application performance, troubleshoot backend issues, and work with SQL and NoSQL databases to store and manage large datasets for AI applications. In addition, you will implement CI/CD pipelines for seamless deployment of AI and backend solutions and stay up to date with the latest advancements in AI and backend development technologies.

The role requires a Bachelor's or Master's degree in Computer Science, Engineering or a related field, along with proven experience in backend development using Python and frameworks like Django, Flask or FastAPI. To excel in this role, you should have a strong understanding of AI/ML concepts, libraries and tools such as TensorFlow, PyTorch and scikit-learn. Hands-on experience deploying and scaling AI models in production environments, proficiency with RESTful APIs and microservices architecture, and solid knowledge of database management systems like PostgreSQL, MySQL and MongoDB are essential.

Preferred skills include knowledge of Natural Language Processing (NLP) or Computer Vision (CV) techniques, experience with big data processing tools like Apache Spark and Hadoop, and familiarity with DevOps practices for AI/ML pipelines. An understanding of data security and privacy best practices, experience with cloud platforms such as AWS, Google Cloud or Azure, and familiarity with containerization technologies like Docker and Kubernetes will be advantageous.

In this role you will design and develop scalable backend systems while integrating cutting-edge AI technologies, building robust APIs, deploying AI models, and ensuring efficient, secure system operations that deliver impactful solutions for business and client needs. If you are passionate about AI and ready to lead impactful projects, join us at AskGalore, where innovation meets excellence.

The working hours for this full-time position are 9:30 AM to 6:30 PM at Maple Highstreet, Bhopal. The position offers a competitive salary and compensation package in line with industry standards. Apply now to be considered for one of the two vacancies available for this role.
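For a flavor of the "integrate AI models into production-ready systems" requirement, a minimal sketch of serving a model behind a FastAPI endpoint; the toy model trained inline and the single feature are stand-ins for a real persisted model artifact:

```python
# Minimal model-serving sketch: a scikit-learn classifier behind a FastAPI
# route. Run with: uvicorn app:app
from fastapi import FastAPI
from pydantic import BaseModel
from sklearn.linear_model import LogisticRegression
import numpy as np

app = FastAPI()

# Stand-in model fit on toy data; in practice, load a persisted artifact.
model = LogisticRegression().fit(
    np.array([[0.0], [1.0], [2.0], [3.0]]), [0, 0, 1, 1]
)

class Features(BaseModel):
    value: float  # single illustrative feature

@app.post("/predict")
def predict(features: Features):
    # Return the positive-class probability for the submitted feature.
    proba = model.predict_proba([[features.value]])[0, 1]
    return {"probability": float(proba)}
```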

Posted 1 day ago

Apply

6.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Skills: Python (Django, Flask, or FastAPI), RESTful APIs
Experience: 6 to 15 years
Location: Gurgaon

Key Responsibilities:
- Design, develop and maintain RESTful APIs using Python (preferably with FastAPI, Flask or Django REST Framework)
- Implement server-side logic, caching mechanisms and background task processing
- Work with SQL and NoSQL databases such as PostgreSQL, MySQL and MongoDB
- Integrate third-party APIs and services securely and efficiently
- Write clean, scalable and testable code with proper unit testing and API documentation
- Collaborate with front-end developers, DevOps and QA teams to ensure seamless system integration
- Optimize API performance, monitor errors and ensure uptime/reliability
- Follow agile practices and participate in code reviews, sprint planning and standups

Requirements:
- 6+ years of hands-on backend development experience in Python
- Strong experience in API development and integration
- Solid understanding of object-oriented programming and design patterns
- Experience with FastAPI, Flask or Django (REST Framework preferred)
- Proficiency with database design, ORMs (SQLAlchemy/Django ORM) and complex queries
- Knowledge of API security, OAuth2, JWT and session management
- Familiarity with Docker, CI/CD and cloud platforms (AWS/GCP/Azure) is a plus
- Experience with message brokers and task queues like RabbitMQ, Kafka or Celery is a plus
- Strong debugging, performance tuning and unit testing skills

Preferred Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering or a related field
- Contributions to open-source projects or GitHub repositories are a bonus
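To ground the "API security, OAuth2, JWT" requirement, a minimal sketch of a JWT-protected FastAPI route using PyJWT; the secret, claim names and token issuance are placeholders, and a real service would also handle expiry and refresh:

```python
# Minimal JWT-protected FastAPI route using PyJWT and OAuth2 bearer tokens.
import jwt  # PyJWT
from fastapi import FastAPI, Depends, HTTPException
from fastapi.security import OAuth2PasswordBearer

SECRET = "change-me"  # placeholder; load from configuration in production
app = FastAPI()
oauth2 = OAuth2PasswordBearer(tokenUrl="token")  # token endpoint not shown

def current_user(token: str = Depends(oauth2)) -> str:
    # Decode and validate the bearer token, returning the subject claim.
    try:
        payload = jwt.decode(token, SECRET, algorithms=["HS256"])
        return payload["sub"]
    except jwt.PyJWTError:
        raise HTTPException(status_code=401, detail="Invalid token")

@app.get("/me")
def me(user: str = Depends(current_user)):
    return {"user": user}
```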

Posted 1 day ago

Apply

4.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Total Experience: 4 to 12 years
Location: NCR/Mumbai/Bangalore

Responsibilities:
- Design and develop agentic automation workflows using frameworks such as LangGraph, AutoGen, CrewAI and related multi-agent standards (e.g., MCP, A2A) to automate complex business processes.
- Build and optimize Retrieval-Augmented Generation (RAG) pipelines for enhanced contextual understanding and accurate response generation in automation tasks.
- Integrate open-source LLMs (e.g., LLaMA) and closed-source LLMs (e.g., OpenAI, Gemini, Vertex AI) to power agentic systems and generative AI applications.
- Develop robust Python-based solutions using libraries like LangChain, Transformers, Pandas and PyTorch for automation and AI model development.
- Implement and manage CI/CD pipelines, Git workflows and software development best practices to ensure seamless deployment of automation solutions.
- Work with structured and unstructured data, applying prompt engineering and fine-tuning techniques to enhance LLM performance for specific use cases.
- Query and manage databases (SQL and NoSQL) for data extraction, transformation and integration into automation workflows.
- Collaborate with stakeholders to translate technical solutions into business value, delivering clear presentations and documentation.
- Stay updated on advancements in agentic automation, generative AI and LLM technologies to drive innovation and maintain a competitive edge.
- Ensure scalability, security and performance of deployed automation solutions in production environments.

Experience:
- 4+ years of hands-on experience in AI/ML, generative AI or automation development.
- Proven expertise in agentic frameworks like LangGraph, AutoGen, CrewAI and multi-agent systems.
- Experience building and deploying RAG-based solutions for automation or knowledge-intensive applications.
- Hands-on experience with open-source LLMs (Hugging Face) and closed-source LLMs (OpenAI, Gemini, Vertex AI).

Technical Skills:
- Advanced proficiency in Python and relevant libraries (LangChain, Transformers, Pandas, PyTorch, scikit-learn).
- Strong SQL skills for querying and managing databases (e.g., PostgreSQL, MongoDB).
- Familiarity with CI/CD tools (e.g., Jenkins, GitHub Actions), Git workflows and containerization (e.g., Docker, Kubernetes).
- Experience with Linux (Ubuntu) and cloud platforms (AWS, Azure, Google Cloud) for deploying automation solutions.
- Knowledge of automation tools (e.g., UiPath, Automation Anywhere) and workflow orchestration platforms.

Soft Skills:
- Exceptional communication skills to articulate technical concepts to non-technical stakeholders.
- Strong problem-solving and analytical skills to address complex automation challenges.
- Ability to work collaboratively in a fast-paced, client-facing environment.
- Proactive mindset with a passion for adopting emerging technologies.

Preferred Qualifications:
- Experience with the Model Context Protocol (MCP) and agent-to-agent (A2A) communication systems.
- Familiarity with advanced generative AI techniques such as prompt chaining, tool-augmented LLMs and model distillation.
- Exposure to enterprise-grade automation platforms or intelligent process automation (IPA) solutions.
- Contributions to open-source AI/automation projects or publications in relevant domains.
- Certification in AI, cloud platforms or automation technologies (e.g., AWS Certified AI Practitioner, RPA Developer).
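For illustration of the RAG loop named above, a deliberately library-free sketch: embed documents, retrieve the most similar ones, then prompt a model with the retrieved context. The embed() and generate() functions are stubs standing in for a real embedding model and LLM call (LangChain, OpenAI, LLaMA, etc.):

```python
# Minimal Retrieval-Augmented Generation loop with stubbed embed/generate.
import numpy as np

def embed(text: str) -> np.ndarray:
    # Stub: hash characters into a fixed-size unit vector. Real systems use
    # a trained embedding model here.
    v = np.zeros(64)
    for i, ch in enumerate(text.lower()):
        v[(i + ord(ch)) % 64] += 1.0
    norm = np.linalg.norm(v)
    return v / norm if norm else v

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    # Rank documents by cosine similarity to the query embedding.
    q = embed(query)
    return sorted(docs, key=lambda d: -float(np.dot(q, embed(d))))[:k]

def generate(prompt: str) -> str:
    # Stub for a real LLM call.
    return f"[LLM response to: {prompt[:60]}...]"

docs = ["Refund policy: refunds within 7 days.", "Baggage: 15 kg free."]
context = "\n".join(retrieve("How do refunds work?", docs))
print(generate(f"Answer using only this context:\n{context}\n\nQ: How do refunds work?"))
```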

Posted 1 day ago

Apply

3.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

JOB DESCRIPTION
Role Expectations:
- Design, develop and maintain scalable backend services using Node.js & Express
- Build dynamic and responsive UI components with React and Angular
- Collaborate cross-functionally with product and design teams to deliver seamless user experiences
- Write clean, testable and maintainable code following industry best practices
- Integrate RESTful APIs and work with third-party services
- Optimize application performance, security and scalability
- Conduct code reviews, mentor junior engineers and uphold engineering excellence
- Troubleshoot, debug and efficiently resolve production issues

Qualifications:
- Bachelor's degree in Electronics/Electrical Engineering/Computer Science or equivalent experience.
- 3+ years of experience in full-stack development.
- Strong proficiency in JavaScript (ES6+), Node.js and Express.js.
- Hands-on experience with React.js and Angular for front-end development.
- Good understanding of RESTful APIs, WebSockets and asynchronous programming.
- Knowledge of database systems (SQL & NoSQL) and ORM frameworks.
- Familiarity with Git, CI/CD pipelines, Docker and cloud platforms (AWS/Azure/GCP).
- Strong problem-solving and debugging skills.
- Understanding of unit testing, integration testing and code quality tools.
- Experience with TypeScript.
- Knowledge of microservices architecture.
- Exposure to GraphQL or serverless frameworks.
- Familiarity with Agile/Scrum methodologies.

Posted 1 day ago

Apply