6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Summary

Position Summary: AI & Data

In this age of disruption, organizations need to navigate the future with confidence, embracing decision-making based on clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science, and cognitive technologies to uncover hidden relationships in vast troves of data, generate insights, and inform decision-making. Together with its offering portfolio, the team helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to: implement large-scale data ecosystems, including data management, governance, and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms; leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe actions; and drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise, and providing as-a-service offerings for continuous insights and improvements.

Education And Experience

Bachelor's or Master's degree in Computer Science, Engineering, Information Technology, or a related field. 3–6 years of hands-on experience in Scala development, preferably in a data engineering or data pipeline context.

Key Responsibilities

Collaborate with business analysts and stakeholders to gather and analyze requirements for data pipeline solutions. Design, develop, and maintain scalable data pipelines using Scala and related technologies. Write clean, efficient, and well-documented Scala code for data ingestion, transformation, and processing. Develop and execute unit, integration, and end-to-end tests to ensure data quality and pipeline reliability. Orchestrate and schedule data pipelines using tools such as Apache Airflow, Oozie, or similar workflow schedulers (a minimal Airflow sketch follows this posting). Monitor, troubleshoot, and optimize data pipelines for performance and reliability. Participate in code reviews, provide constructive feedback, and adhere to best practices in software development. Document technical solutions, data flows, and pipeline architectures. Work closely with DevOps and Data Engineering teams to deploy and maintain solutions in production environments. Stay current with emerging technologies and industry trends in big data and Scala development.

Required Skills & Qualifications

Strong proficiency in Scala, including functional programming concepts. Experience building and maintaining ETL/data pipelines. Solid understanding of data structures, algorithms, and software engineering principles. Experience with workflow orchestration/scheduling tools (e.g., Apache Airflow, Oozie, Luigi, or similar). Familiarity with distributed data processing frameworks (e.g., Apache Spark, Kafka, Flink). Proficiency in writing unit and integration tests for data pipelines. Experience with version control systems (e.g., Git). Strong problem-solving skills and attention to detail. Excellent communication and collaboration skills.

Preferred Skills & Qualifications

Experience with cloud platforms (AWS, Azure, or GCP) and related data services. Knowledge of SQL and NoSQL databases (e.g., PostgreSQL, Cassandra, MongoDB). Familiarity with containerization and orchestration tools (Docker, Kubernetes). Exposure to CI/CD pipelines and DevOps practices. Experience with data modeling and data warehousing concepts. Knowledge of other programming languages (e.g., Python, Java) is a plus. Experience working in Agile/Scrum environments.

Our purpose

Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients, enabling impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture

Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development

At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, is a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits To Help You Thrive

At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips

From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 308597
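As referenced in the responsibilities above: although the pipeline code for this role is Scala, Airflow DAGs that schedule such pipelines are authored in Python (often wrapping a spark-submit of a Scala jar). A minimal sketch, with hypothetical DAG id and placeholder task bodies, not Deloitte's actual setup:

```python
# A minimal Airflow sketch of a daily ingest -> transform schedule.
# The DAG id and task bodies are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    print("pull raw data from the source system")  # placeholder step

def transform():
    print("apply transformations and write curated output")  # placeholder step

with DAG(
    dag_id="example_scala_pipeline_wrapper",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+ spelling; older versions use schedule_interval
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    ingest_task >> transform_task  # run transform only after ingest succeeds
```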
Posted 7 hours ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Key Responsibilities:
• Architect & Build Scalable Systems: Design and implement petabyte-scale lakehouse architectures (Apache Iceberg, Delta Lake) to unify data lakes and warehouses.
• Real-Time Data Engineering: Develop and optimize streaming pipelines using Kafka, Pulsar, and Flink to process structured/unstructured data with low latency (a small sketch follows this posting).
• High-Performance Applications: Leverage Java to build scalable, high-throughput data applications and services.
• Modern Data Infrastructure: Leverage modern data warehouses and query engines (Trino, Spark) for sub-second operations and analytics on real-time data.
• Database Expertise: Work with RDBMS (PostgreSQL, MySQL, SQL Server) and NoSQL (Cassandra, MongoDB) systems to manage diverse data workloads.
• Data Governance: Ensure data integrity, security, and compliance across multi-tenant systems.
• Cost & Performance Optimization: Manage production infrastructure for reliability, scalability, and cost efficiency.
• Innovation: Stay ahead of trends in the data ecosystem (e.g., open table formats, stream processing) to drive technical excellence.
• API Development (Optional): Build and maintain web APIs (REST/GraphQL) to expose data services internally and externally.

Qualifications:
• 8+ years of data engineering experience with large-scale (petabyte-level) systems.
• Expert proficiency in Java for data-intensive applications.
• Hands-on experience with lakehouse architectures, stream processing (Flink), and event streaming (Kafka/Pulsar).
• Strong SQL skills and familiarity with RDBMS/NoSQL databases.
• Proven track record in optimizing query engines (e.g., Spark, Presto) and data pipelines.
• Knowledge of data governance, security frameworks, and multi-tenant systems.
• Experience with cloud platforms (AWS, GCP, Azure) and infrastructure-as-code (Terraform).

What we offer:
• A unique experience in the fintech industry with a leading, fast-growing company.
• A good atmosphere at work and a comfortable working environment.
• Group Health Insurance and OPD health insurance, with coverage for self plus family (spouse and up to two children).
• Attractive leave benefits, including maternity and paternity benefit, vacation leave, and leave encashment.
• Rewards & recognition: monthly, quarterly, half-yearly, and yearly.
• Loyalty benefits.
• Employee referral program.
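As a flavor of the streaming work described above, here is a minimal consume-transform-republish loop. The posting centres on Java; Python's confluent-kafka client is used purely for brevity, and the broker address and topic names are hypothetical:

```python
# Illustrative sketch only: a minimal low-latency Kafka consumer that applies
# a transformation and republishes to a downstream topic. Broker and topic
# names are placeholders, and the "enrichment" is a stand-in.
import json

from confluent_kafka import Consumer, Producer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # assumed broker
    "group.id": "enrichment-service",
    "auto.offset.reset": "earliest",
})
producer = Producer({"bootstrap.servers": "localhost:9092"})

consumer.subscribe(["raw-events"])  # hypothetical input topic

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None or msg.error():
            continue
        event = json.loads(msg.value())
        event["processed"] = True  # stand-in for real enrichment logic
        producer.produce("clean-events", json.dumps(event).encode("utf-8"))
        producer.poll(0)  # serve delivery callbacks without blocking
finally:
    consumer.close()
    producer.flush()
```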
Posted 7 hours ago
6.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Summary

Position Summary: AI & Data

In this age of disruption, organizations need to navigate the future with confidence, embracing decision-making based on clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science, and cognitive technologies to uncover hidden relationships in vast troves of data, generate insights, and inform decision-making. Together with its offering portfolio, the team helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to: implement large-scale data ecosystems, including data management, governance, and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms; leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe actions; and drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise, and providing as-a-service offerings for continuous insights and improvements.

Education And Experience

Bachelor's or Master's degree in Computer Science, Engineering, Information Technology, or a related field. 3–6 years of hands-on experience in Scala development, preferably in a data engineering or data pipeline context.

Key Responsibilities

Collaborate with business analysts and stakeholders to gather and analyze requirements for data pipeline solutions. Design, develop, and maintain scalable data pipelines using Scala and related technologies. Write clean, efficient, and well-documented Scala code for data ingestion, transformation, and processing. Develop and execute unit, integration, and end-to-end tests to ensure data quality and pipeline reliability (a minimal unit-test sketch follows this posting). Orchestrate and schedule data pipelines using tools such as Apache Airflow, Oozie, or similar workflow schedulers. Monitor, troubleshoot, and optimize data pipelines for performance and reliability. Participate in code reviews, provide constructive feedback, and adhere to best practices in software development. Document technical solutions, data flows, and pipeline architectures. Work closely with DevOps and Data Engineering teams to deploy and maintain solutions in production environments. Stay current with emerging technologies and industry trends in big data and Scala development.

Required Skills & Qualifications

Strong proficiency in Scala, including functional programming concepts. Experience building and maintaining ETL/data pipelines. Solid understanding of data structures, algorithms, and software engineering principles. Experience with workflow orchestration/scheduling tools (e.g., Apache Airflow, Oozie, Luigi, or similar). Familiarity with distributed data processing frameworks (e.g., Apache Spark, Kafka, Flink). Proficiency in writing unit and integration tests for data pipelines. Experience with version control systems (e.g., Git). Strong problem-solving skills and attention to detail. Excellent communication and collaboration skills.

Preferred Skills & Qualifications

Experience with cloud platforms (AWS, Azure, or GCP) and related data services. Knowledge of SQL and NoSQL databases (e.g., PostgreSQL, Cassandra, MongoDB). Familiarity with containerization and orchestration tools (Docker, Kubernetes). Exposure to CI/CD pipelines and DevOps practices. Experience with data modeling and data warehousing concepts. Knowledge of other programming languages (e.g., Python, Java) is a plus. Experience working in Agile/Scrum environments.

Our purpose

Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients, enabling impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture

Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development

At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, is a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits To Help You Thrive

At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips

From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 308597
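As referenced in the testing bullet above, a minimal unit-test sketch for a pipeline transformation. Work in this role would use a Scala test framework such as ScalaTest; a pytest-style Python example is shown only to keep the illustration compact, and the transform itself is hypothetical:

```python
# Illustrative only: unit-testing a tiny, pure pipeline transformation.
def dedupe_and_normalize(records):
    """Drop duplicate ids and lowercase emails (hypothetical transform)."""
    seen, out = set(), []
    for rec in records:
        if rec["id"] in seen:
            continue
        seen.add(rec["id"])
        out.append({**rec, "email": rec["email"].lower()})
    return out

def test_dedupe_and_normalize():
    raw = [
        {"id": 1, "email": "A@X.COM"},
        {"id": 1, "email": "a@x.com"},  # duplicate id, should be dropped
        {"id": 2, "email": "B@Y.com"},
    ]
    result = dedupe_and_normalize(raw)
    assert [r["id"] for r in result] == [1, 2]
    assert all(r["email"] == r["email"].lower() for r in result)
```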
Posted 7 hours ago
6.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Role: Oracle Database Engineer
Location: Gurgaon
Experience: 6+ years

Role Overview: We are looking for a highly skilled senior Oracle database resource to join our innovative team. The ideal candidate will possess significant experience in database design, development, and administration, with a strong emphasis on Oracle databases, alongside familiarity with ETL tools and Snowflake. Experience in migrating databases from Oracle to PostgreSQL is an added advantage.

Key Responsibilities:

Database Design & Development: Design, implement, and maintain robust Oracle database solutions. Develop and manage ETL processes to ensure data accuracy and quality. Collaborate with business stakeholders to gather requirements and develop scalable database solutions.

Performance Tuning & Optimisation: Monitor and enhance database performance and efficiency. Implement industry best practices for database management and security compliance.

Migration Expertise: Lead projects on database migration from Oracle to PostgreSQL (a small reconciliation sketch follows this posting). Assist in establishing strategies and methodologies for successful database transitions.

Documentation & Training: Produce and maintain comprehensive documentation for database architecture, processes, and procedures. Provide training and support to team members and end users.

Quality Assurance: Ensure adherence to data governance and regulatory compliance. Conduct regular database backups and develop disaster recovery plans.

Required Qualifications: Extensive experience with Oracle Database (specific version if necessary). Proficiency in ETL tools (e.g., Informatica, Talend, Apache NiFi). Solid understanding of Snowflake and its integration with existing systems. Proven experience in designing and implementing complex database solutions. Familiarity with database migration processes, especially from Oracle to PostgreSQL.

Desired Skills: Strong analytical and problem-solving abilities. Excellent verbal and written communication skills. Ability to work collaboratively in a team environment and mentor junior members.
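One small, hedged illustration of the migration work described above: after copying data from Oracle to PostgreSQL, a reconciliation script can compare row counts per table. The driver choices (python-oracledb, psycopg2), connection details, and table names below are assumptions, not the employer's actual setup:

```python
# Illustrative sketch: per-table row-count reconciliation after an
# Oracle -> PostgreSQL copy. All credentials/DSNs are placeholders.
import oracledb  # python-oracledb, the successor to cx_Oracle
import psycopg2

TABLES = ["customers", "orders"]  # hypothetical, trusted table list

ora = oracledb.connect(user="app", password="secret", dsn="oradb/orclpdb1")
pg = psycopg2.connect("dbname=appdb user=app password=secret host=pgdb")

for table in TABLES:
    with ora.cursor() as oc, pg.cursor() as pc:
        oc.execute(f"SELECT COUNT(*) FROM {table}")
        pc.execute(f"SELECT COUNT(*) FROM {table}")
        ora_count, pg_count = oc.fetchone()[0], pc.fetchone()[0]
        status = "OK" if ora_count == pg_count else "MISMATCH"
        print(f"{table}: oracle={ora_count} postgres={pg_count} {status}")

ora.close()
pg.close()
```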
Posted 7 hours ago
2.0 - 4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Summary: We are seeking a detail-oriented full stack engineer with strong debugging and performance optimization skills. The primary responsibility of this role is to maintain existing systems, fix bugs, resolve production issues, and continuously enhance application performance. The ideal candidate should be proficient in React, Java Spring Boot, and PostgreSQL, and must be an expert in debugging across the stack.

Key Responsibilities: Investigate, analyze, and fix bugs across frontend and backend codebases. Debug and resolve production issues with quick turnaround and root-cause analysis. Improve performance of existing systems (both backend APIs and frontend UI). Collaborate with development teams to implement sustainable technical solutions. Optimize queries and ensure database efficiency using PostgreSQL (a plan-capture sketch follows this posting). Participate in code reviews and suggest performance improvements. Contribute to documentation related to bug fixes and improvements.

Required Skills & Qualifications: 2–4 years of experience in full stack development and system maintenance. Strong proficiency in React.js, JavaScript, HTML, and CSS. Solid backend development experience in Java Spring Boot. In-depth knowledge of PostgreSQL and query optimization. Expertise in debugging production systems and troubleshooting real-time issues. Good understanding of performance tuning techniques and tools. Familiarity with version control (Git) and CI/CD pipelines.

Nice to Have: Experience with unit/integration testing tools (JUnit, Jest). Experience with monitoring tools (Site24x7, Grafana, Prometheus). Background in microservices and distributed systems. Experience with automated testing tools and frameworks.

Educational Qualification: Bachelor's degree in Computer Science, IT, or a related field.
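A brief illustration of the query-optimization duty above: capturing a PostgreSQL execution plan is usually the first step when debugging a slow endpoint. The DSN and query in this sketch are placeholders:

```python
# Hedged illustration: fetch an EXPLAIN (ANALYZE, BUFFERS) plan to spot
# sequential scans or missing indexes. Connection string is hypothetical.
import psycopg2

conn = psycopg2.connect("dbname=appdb user=app host=localhost")
with conn.cursor() as cur:
    cur.execute(
        "EXPLAIN (ANALYZE, BUFFERS) "
        "SELECT * FROM orders WHERE customer_id = %s",
        (42,),  # psycopg2 binds the parameter client-side, so EXPLAIN works
    )
    for (line,) in cur.fetchall():  # each plan line is a one-column row
        print(line)
conn.close()
```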
Posted 7 hours ago
6.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Summary

Position Summary: AI & Data

In this age of disruption, organizations need to navigate the future with confidence, embracing decision-making based on clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science, and cognitive technologies to uncover hidden relationships in vast troves of data, generate insights, and inform decision-making. Together with its offering portfolio, the team helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to: implement large-scale data ecosystems, including data management, governance, and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms; leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe actions; and drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise, and providing as-a-service offerings for continuous insights and improvements.

Education And Experience

Bachelor's or Master's degree in Computer Science, Engineering, Information Technology, or a related field. 3–6 years of hands-on experience in Scala development, preferably in a data engineering or data pipeline context.

Key Responsibilities

Collaborate with business analysts and stakeholders to gather and analyze requirements for data pipeline solutions. Design, develop, and maintain scalable data pipelines using Scala and related technologies. Write clean, efficient, and well-documented Scala code for data ingestion, transformation, and processing. Develop and execute unit, integration, and end-to-end tests to ensure data quality and pipeline reliability. Orchestrate and schedule data pipelines using tools such as Apache Airflow, Oozie, or similar workflow schedulers. Monitor, troubleshoot, and optimize data pipelines for performance and reliability. Participate in code reviews, provide constructive feedback, and adhere to best practices in software development. Document technical solutions, data flows, and pipeline architectures. Work closely with DevOps and Data Engineering teams to deploy and maintain solutions in production environments. Stay current with emerging technologies and industry trends in big data and Scala development.

Required Skills & Qualifications

Strong proficiency in Scala, including functional programming concepts. Experience building and maintaining ETL/data pipelines. Solid understanding of data structures, algorithms, and software engineering principles. Experience with workflow orchestration/scheduling tools (e.g., Apache Airflow, Oozie, Luigi, or similar). Familiarity with distributed data processing frameworks (e.g., Apache Spark, Kafka, Flink; a compact Spark sketch follows this posting). Proficiency in writing unit and integration tests for data pipelines. Experience with version control systems (e.g., Git). Strong problem-solving skills and attention to detail. Excellent communication and collaboration skills.

Preferred Skills & Qualifications

Experience with cloud platforms (AWS, Azure, or GCP) and related data services. Knowledge of SQL and NoSQL databases (e.g., PostgreSQL, Cassandra, MongoDB). Familiarity with containerization and orchestration tools (Docker, Kubernetes). Exposure to CI/CD pipelines and DevOps practices. Experience with data modeling and data warehousing concepts. Knowledge of other programming languages (e.g., Python, Java) is a plus. Experience working in Agile/Scrum environments.

Our purpose

Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients, enabling impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture

Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development

At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, is a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits To Help You Thrive

At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips

From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 308597
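As referenced in the skills list above, a compact Spark batch job. Production pipelines for this role would likely be Scala; PySpark is shown purely for brevity, and the paths and column names are invented:

```python
# Purely illustrative: read a raw CSV, aggregate revenue per day, and write
# a curated parquet dataset. Paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-revenue").getOrCreate()

orders = spark.read.option("header", True).csv("/data/raw/orders.csv")

daily_revenue = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))  # CSV fields are strings
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
)

daily_revenue.write.mode("overwrite").parquet("/data/curated/daily_revenue")
spark.stop()
```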
Posted 7 hours ago
6.0 years
0 Lacs
Gandhinagar, Gujarat, India
On-site
Job Title: Odoo Tech Lead / Team Leader (6+ Years Experience)
Experience Required: Minimum 6 years in Odoo development and team management
Employment Type: Full-time

About the Role: We are seeking an experienced and proactive Odoo Technical Lead / Team Leader who will not only lead and mentor a team but also actively write code and build modules in the initial phase of projects (a minimal module sketch follows this posting). This is a hands-on leadership role for someone who is passionate about solving complex business challenges with scalable ERP solutions using Odoo. You will oversee the technical architecture, supervise code quality, and ensure timely delivery, while also working closely with clients and functional teams.

Key Responsibilities:
● Lead the technical planning, architecture, and end-to-end implementation of Odoo-based solutions
● Write and review code for custom modules, especially in the initial stages of the project
● Guide junior and mid-level developers through design decisions, reviews, and problem-solving
● Translate functional requirements into detailed technical solutions
● Manage project timelines, code quality, deployments, and documentation
● Collaborate with functional consultants and QA to ensure delivery accuracy and system performance
● Handle complex customizations and third-party API integrations
● Ensure adherence to coding standards, version control, and CI/CD practices
● Stay up to date with new features in Odoo and emerging ERP technologies

Required Skills & Qualifications:
● Bachelor's or Master's in Computer Science or a related field
● Minimum 6 years of Odoo development experience across multiple versions (v10 to latest)
● Ability to write Odoo modules from scratch and modify core functionality when needed
● Agile/Scrum experience with tools like Jira or ClickUp
● Strong command of Python, PostgreSQL, XML, JavaScript, QWeb, and the Odoo ORM
● Solid understanding of backend and frontend customization in both Community and Enterprise editions
● Experience with tools like Odoo.sh, GitHub, Docker, Jenkins, etc.
● Hands-on experience with REST APIs and third-party app integrations
● Familiarity with business domains like Sales, Purchase, Inventory, Manufacturing, HR, and Accounting
● Strong leadership, problem-solving, and communication skills
● Experience in performance tuning and large-database management in Odoo
● Capable of managing a team and delivering projects independently

Preferred Qualifications:
● Odoo Certification (Technical or Functional) is highly desirable
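For context on the module-writing requirement above, a minimal sketch of a custom Odoo model. Model and field names are hypothetical, and a real module would also ship a __manifest__.py, views, and access rules:

```python
# A minimal, hypothetical Odoo model showing the ORM basics: stored fields
# plus a computed field with a dependency declaration.
from odoo import api, fields, models

class LibraryBook(models.Model):
    _name = "library.book"  # hypothetical technical name
    _description = "Library Book"

    name = fields.Char(required=True)
    isbn = fields.Char(string="ISBN")
    page_count = fields.Integer()
    is_thick = fields.Boolean(compute="_compute_is_thick", store=True)

    @api.depends("page_count")
    def _compute_is_thick(self):
        # Recomputed automatically whenever page_count changes.
        for book in self:
            book.is_thick = (book.page_count or 0) > 500
```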
Posted 7 hours ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
About Us

Capco, a Wipro company, is a global technology and management consulting firm. Awarded Consultancy of the Year at the British Bank Awards, Capco has also been ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients in the banking, financial, and energy sectors, and we are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO? You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry: projects that will transform the financial services industry.

MAKE AN IMPACT: Innovative thinking, delivery excellence, and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK: Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT: With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION: We believe that diversity of people and perspective gives us a competitive advantage.

Job Title: Java Developer
Exp: 5+ years
Location: Pune

In this role, you will: Work towards priorities defined by product owners in the business to collaboratively build out the product/platform. Maintain a clear view of the technology strategy for the design and delivery of the technical aspects of the product, focusing not only on business delivery but also on continually remediating tech debt. Be responsible for delivering tasks end to end with high quality and in line with the design and architecture laid out, striving for zero post-implementation issues. Take part in production support, environment management, release support, and automation implementation as part of the day job. Ensure that quality (code/performance) and discipline (TDD, BDD, unit testing, JIRA usage, etc.) are always maintained. Maintain our Agile and delivery principles. Work with UX and Architecture to ensure that the design-driven ethos is upheld. Collaborate with the business and the team, with DevOps principles maintained at all times.

To be successful in this role, you should meet the following requirements: Demonstrable experience of continuous delivery software development methods, including TDD and automated testing (including non-functional performance testing). Experience of working on high-volume data integration and throughput requirements (profiling). Experience of microservice architecture and REST services. Experience of developing microservices and deploying them in a containerized environment. A background of solid architectural work is advantageous.

Technical specifics: Java 17 or above, Spring Boot components, and the Spring framework. Proficiency with ServiceNow development, such as scripting, workflows, and integrations. Oracle, PostgreSQL, MySQL. Some experience with NoSQL, Elastic, Google Cloud, Kubernetes, Ansible, and AI/ML is good to have.

Non-technical: Strong communication skills, including experience of interfacing with IT leads/delivery managers, architects, business product owners, and IT offshore teams. Strive to be a role model for peers.
Posted 7 hours ago
3.0 years
15 - 17 Lacs
Pune, Maharashtra, India
On-site
Role & Responsibilities

Design, implement, and maintain backend services and RESTful APIs using Python and frameworks like Django or Flask (a minimal sketch follows this posting). Collaborate with product owners and UI/UX designers to translate business requirements into technical solutions. Optimize application performance through code reviews, profiling, and effective caching strategies. Integrate with SQL/NoSQL databases, ensuring data integrity and efficient query performance. Develop and maintain automated tests (unit, integration) to ensure code quality and reliability. Participate in agile ceremonies, contribute to sprint planning, and drive continuous improvement initiatives.

Skills & Qualifications

Must-Have: 3+ years of hands-on experience in Python development with strong OOP and scripting skills. Proficiency in Django or Flask for building web applications and APIs. Solid experience with relational (PostgreSQL/MySQL) and NoSQL (MongoDB) databases. Hands-on knowledge of RESTful API design principles and microservices architecture. Familiarity with Git workflows, branching strategies, and code review tools. Strong problem-solving skills, debugging techniques, and command over Linux/Unix environments.

Preferred: Experience with containerization technologies such as Docker and orchestration using Kubernetes. Exposure to CI/CD pipelines and infrastructure as code (Jenkins, GitLab CI, Terraform). Knowledge of asynchronous task queues (Celery, RabbitMQ) and real-time messaging systems.

Benefits & Culture Highlights

Collaborative on-site environment with open communication and agile best practices. Continuous learning culture: access to training budgets, certifications, and tech workshops. Clear career progression paths and regular performance feedback to fuel professional growth.

Skills: Python, Git, OOP, Django, SQL, microservices, RESTful APIs, Flask, NoSQL, Linux/Unix
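As a small illustration of the backend work described above, a Flask REST sketch with an in-memory store standing in for a database. Routes and data are placeholders; a Django or FastAPI version would follow the same shape:

```python
# Minimal, hypothetical REST service: one GET and one POST route backed by
# an in-memory dict standing in for a real database.
from flask import Flask, jsonify, request

app = Flask(__name__)
TASKS = {1: {"id": 1, "title": "write docs", "done": False}}  # stand-in for a DB

@app.get("/api/tasks/<int:task_id>")
def get_task(task_id: int):
    task = TASKS.get(task_id)
    if task is None:
        return jsonify(error="not found"), 404
    return jsonify(task)

@app.post("/api/tasks")
def create_task():
    payload = request.get_json(force=True)
    task_id = max(TASKS) + 1
    TASKS[task_id] = {"id": task_id, "title": payload["title"], "done": False}
    return jsonify(TASKS[task_id]), 201

if __name__ == "__main__":
    app.run(debug=True)  # development server only
```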
Posted 7 hours ago
5.0 years
0 Lacs
Bengaluru East, Karnataka, India
Remote
Req ID: 336888

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a SQL Developer to join our team in Bangalore, Karnataka (IN-KA), India.

Once You Are Here, You Will:

Work as a developer across SQL (PostgreSQL/ETL), data analysis, and Agile processes. Act as the first point of escalation for daily service issues, along with the PM, and be a primary point of contact for stakeholders. Bring proficiency in SQL, data environments, and data transformation tools (Python). Apply a strong understanding of ETL data pipelines, including integration with APIs and databases (a small load sketch appears at the end of this posting). Bring hands-on experience with cloud-based data warehousing solutions (Snowflake). Apply knowledge of the SDLC and Agile development techniques, along with practical experience with source control (Git, SVN, etc.). Apply knowledge of design, development, and data linkages inside RDBMS and file data stores for MS SQL Server databases (CSV, XML, JSON, etc.). Bring a thorough understanding of development methods for batch and real-time system integration. Prepare and review test scripts and unit-test changes. Provide training, support, and leadership to the larger project team.

Required Qualifications: 5+ years of experience in SQL (PostgreSQL/ETL), data analysis, and Agile processes, including a consulting role with at least four projects completed in a developer role.

Preferred Experience: Prior experience with a software development methodology, Agile preferred. Experience with data migration using Data Loader.

Ideal Mindset: Problem solver: you are creative but also practical in finding solutions to problems that may arise in the project, avoiding potential escalations. Analytical: you like to dissect complex processes and can help forge a path based on your findings.

About NTT DATA

NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

Whenever possible, we hire locally to NTT DATA offices or client sites. This ensures we can provide timely and effective support tailored to each client's needs. While many positions offer remote or hybrid work options, these arrangements are subject to change based on client requirements. For employees near an NTT DATA office or client site, in-office attendance may be required for meetings or events, depending on business needs. At NTT DATA, we are committed to staying flexible and meeting the evolving needs of both our clients and employees. NTT DATA recruiters will never ask for payment or banking information and will only use @nttdata.com and @talent.nttdataservices.com email addresses.
If you are requested to provide payment or disclose banking information, please submit a contact us form: https://us.nttdata.com/en/contact-us. NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
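To make the SQL/ETL duties above concrete, here is a hedged sketch of a tiny extract-transform-load step into PostgreSQL. The file name, schema, table, and DSN are placeholders, not NTT DATA's actual pipeline:

```python
# Illustrative ETL step: read a CSV extract, normalize one field, and upsert
# into PostgreSQL. Assumes a unique index on staging.customers(customer_id).
import csv

import psycopg2

conn = psycopg2.connect("dbname=warehouse user=etl host=localhost")

with open("customers.csv", newline="") as f, conn.cursor() as cur:
    for row in csv.DictReader(f):
        cur.execute(
            """
            INSERT INTO staging.customers (customer_id, email)
            VALUES (%s, %s)
            ON CONFLICT (customer_id) DO UPDATE SET email = EXCLUDED.email
            """,
            (row["customer_id"], row["email"].strip().lower()),  # transform step
        )

conn.commit()
conn.close()
```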
Posted 8 hours ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Looking for immediate joiners (currently serving notice period)

Role: Data Engineer
Experience: 3–5 years
Location: Hyderabad
Work Mode: Hybrid
Interview Mode: Face-to-face

Experience, Qualifications, Knowledge and Skills: Bachelor's degree (B.A./B.S.) from a four-year college or university and two to four years of related experience and/or training, or an equivalent combination of education and experience. 2+ years of healthcare industry experience preferred. 3+ years of experience with SQL, database design, optimization, and tuning. 3+ years of experience with open-source relational databases (e.g., PostgreSQL). 3+ years of experience using GitHub. 3+ years of experience in shell scripting and one other object-oriented language such as Python or PHP. 3+ years of experience with continuous integration and development methodologies and tools such as Jenkins. 3+ years of experience in an Agile development environment. Time management skills, professionalism, programming skills (particularly SQL, shell scripting, and Python), attention to detail, conscientiousness, teamwork, and strong oral and written communication skills.

Note: Candidates must have hands-on experience with PostgreSQL, SQL, Python, shell scripting, and ETL.

If you are interested, please share an updated resume to prasanna@intellistaff.in
Posted 8 hours ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Key Responsibilities

System Architecture & Event-Driven Design
• Design and implement event-driven architectures using Apache Kafka to orchestrate distributed microservices and streaming pipelines.
• Define scalable message schemas (e.g., JSON/Avro), data contracts, and versioning strategies to support AI-powered services.
• Architect hybrid event and request-response systems to balance real-time streaming and synchronous business logic.

Backend & AI/ML Integration
• Develop Python-based microservices using FastAPI, enabling both standard business logic and AI/ML model inference endpoints (a minimal sketch follows this posting).
• Collaborate with AI/ML teams to operationalize ML models (e.g., classification, recommendation, anomaly detection) via REST APIs, batch processors, or event consumers.
• Integrate model-serving platforms such as SageMaker, MLflow, or custom Flask/ONNX-based services.

Cloud-Native & Serverless Deployment (AWS)
• Design and deploy cloud-native applications using AWS Lambda, API Gateway, S3, CloudWatch, and optionally SageMaker or Fargate.
• Build AI/ML-aware pipelines that automate retraining, inference triggers, or model selection based on data events.
• Implement autoscaling, monitoring, and alerting for high-throughput AI services in production.

Data Engineering & Database Integration
• Ingest and manage high-volume structured and unstructured data across MySQL, PostgreSQL, and MongoDB.
• Enable AI/ML feedback loops by capturing usage signals, predictions, and outcomes via event streaming.
• Support data versioning, feature store integration, and caching strategies for efficient ML model input handling.

Testing, Monitoring & Documentation
• Write unit, integration, and end-to-end tests for both standard services and AI/ML pipelines.
• Implement tracing and observability for AI/ML inference latency, success/failure rates, and data drift.
• Document ML integration patterns, input/output schemas, service contracts, and fallback logic for AI systems.

Preferred Qualifications
• 6+ years of backend software development experience, with 2+ years in AI/ML integration or MLOps.
• Strong experience in productionizing ML models for classification, regression, or NLP use cases.
• Experience with streaming data pipelines and real-time decision systems.
• AWS certifications (Developer Associate, Machine Learning Specialty) are a plus.
• Exposure to data versioning tools (e.g., DVC), feature stores, or vector databases is advantageous.
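A hedged sketch of the FastAPI pattern the posting describes: one service exposing both a plain business endpoint and an ML inference endpoint. The "model" below is a stand-in; in practice it might be loaded from MLflow, SageMaker, or an ONNX session as noted above:

```python
# Illustrative only: a FastAPI service with a health endpoint and a typed
# /predict route. fake_model is a placeholder for a real trained model.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class ScoreRequest(BaseModel):
    feature_a: float
    feature_b: float

class ScoreResponse(BaseModel):
    anomaly: bool
    score: float

def fake_model(a: float, b: float) -> float:
    # Placeholder "model": relative difference between two features.
    return abs(a - b) / (abs(a) + abs(b) + 1e-9)

@app.get("/health")
def health():
    return {"status": "ok"}  # ordinary request-response business endpoint

@app.post("/predict", response_model=ScoreResponse)
def predict(req: ScoreRequest):
    score = fake_model(req.feature_a, req.feature_b)
    return ScoreResponse(anomaly=score > 0.8, score=score)
```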
Posted 8 hours ago
10.0 - 15.0 years
0 Lacs
India
Remote
Job Role: Senior Lecturer
Subject: Data Science, with good knowledge of AWS, MLOps, and Big Data
Location: Remote

Responsibilities: Develop and manage a robust academic framework for the Data Science vertical. Collaborate with various departments to ensure efficient resource allocation and program delivery. Stay updated with the latest trends in Data Science and emerging technologies to keep the curriculum relevant. Represent the institution at academic and professional conferences, contributing to thought leadership in the Data Science field.

Qualifications: M.Sc. (Computer Science), MCA (Master in Computer Application), or B.Tech/M.Tech (Computer Engineering/IT). Doctor of Philosophy (optional). A minimum of 10–15 years of teaching experience in Data Science or related fields. Proven experience in managing large-scale academic programs or corporate training initiatives.

Technical Skills: Programming languages: Python. Database knowledge: experience with MySQL, Oracle, SQL Server, or PostgreSQL (any one). Data science expertise: NumPy, Pandas, Matplotlib, Seaborn, exploratory data analysis (EDA). Machine learning: proficiency with scikit-learn and experience with ML models for regression, classification, and clustering problems (a short example follows this posting). Big Data: PySpark ML, PySpark NLP, Apache Kafka. MLOps: Git, GitHub, Docker, PyCaret, MLflow. Additional knowledge: familiarity with Tableau or Power BI is advantageous.

Desired Skills: Strong client-facing and presentation skills. Ability to develop technical solutions tailored to client needs. Strong leadership and collaboration skills, with experience working in cross-functional teams. Exceptional communication and problem-solving abilities.

You can also email sadafa@regenesys.net
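As a taste of the machine-learning topics listed above, a minimal scikit-learn classification workflow on the bundled iris dataset. This is an illustrative example, not course material from the institution:

```python
# Minimal classification workflow: split, fit, evaluate.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

model = LogisticRegression(max_iter=1000)  # raise iteration cap for convergence
model.fit(X_train, y_train)

print(f"accuracy: {accuracy_score(y_test, model.predict(X_test)):.3f}")
```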
Posted 8 hours ago
3.0 years
0 Lacs
India
Remote
About Huzzle

At Huzzle, we connect high-performing professionals with global companies across the UK, US, Canada, Europe, and Australia. Our clients include startups, digital agencies, and tech platforms in industries like SaaS, MarTech, FinTech, and EdTech. We match top sales talent to full-time remote roles where they're hired directly into client teams and provided ongoing support by Huzzle. As part of our talent pool, you'll have access to exclusive SDR opportunities matched to your background and preferences.

About The Company

We're looking for an AI Engineer, or as we like to call it, a Vibe Coder. This isn't your typical engineering gig. You'll play a hybrid role, part engineer, part product visionary, part UX craftsman, pushing the boundaries of what's possible with AI. You'll work across the full stack, invent features that feel like magic, and co-create Olivia's future alongside the founding team. If you thrive in high-agency, zero-handholding environments and want to work on agentic, generative, and conversational AI systems, this role was built for you.

Key Responsibilities

Full-stack execution: Design, build, and ship core product features using React, Node.js, and our AI-first architecture. AI-first engineering: Prototype and deploy magical features using tools like Augment and Cursor. Design-forward mindset: Craft seamless user experiences; no design background needed, just good taste and intuition. Autonomous systems: Develop scalable, intelligent agents capable of brand-consistent, on-demand generation. Creative API orchestration: Combine tools like OpenAI, Google AI, Anthropic, and Bedrock into intelligent, unified pipelines (a hedged sketch follows this posting). Strategic input: Shape product roadmaps and infrastructure decisions as part of a small, founder-led team. Rapid iteration: Build fast, ship faster, and bring a founder's mindset to debugging, feature testing, and performance tuning.

Who You Are

A former founder, founding engineer, or technical operator with a deep ownership mentality. A creative problem-solver who codes with empathy and thinks in user workflows, not just code modules. A hands-on AI builder already using tools like Cursor or Augment to supercharge your dev flow. A startup native who thrives in ambiguity and builds structure from chaos. A UX-aware engineer who sweats the details and instinctively builds interfaces that just feel right. A clear communicator who knows when to loop in others, and when to sprint solo. A relentless learner excited by the future of AI and always hunting for better ways to build. A product thinker who treats features like micro-startups: own the vision, build the thing, ship and iterate.
Tech Stack

Languages: TypeScript, JavaScript, Python (bonus)
Frontend: React
Backend: Node.js, Wasp (easy to pick up)
Infra: Cloudflare Workers/R2, PostgreSQL, Docker
AI & APIs: OpenAI, Anthropic, Google AI, Bedrock, OpenRouter
Dev Tools: Cursor, Augment, Git, Linear

A Day in the Life

Jump into a fast, focused standup to align on goals. Prototype generative features that combine UX, backend, and AI orchestration. Share demos via Loom, jam with founders in Slack, and rapidly ship to prod. Ideate new user flows, sketch mockups, or dive deep into technical tradeoffs. End the day knowing you shipped real value and helped shape the future of design.

Requirements

3+ years of hands-on experience in full-stack development using JavaScript/TypeScript (Node.js, React). Strong understanding of modern backend architecture and scalable infrastructure (PostgreSQL, Docker, Cloudflare, AWS/GCP). Proven experience across product, engineering, and UX research. Proven track record of shipping production-ready products or meaningful side projects. Experience working with, or strong interest in, AI development tools (e.g., OpenAI, Anthropic, Cursor, Augment, Bedrock). Solid grasp of API orchestration and prompt engineering for generative/conversational AI systems. Natural product intuition with a UX-first mindset: you care about how it feels, not just how it works. Comfort working in high-autonomy, high-speed startup environments. Ability to balance speed, quality, and experimentation in an agile development cycle. Excellent communication skills, able to collaborate asynchronously and explain technical decisions clearly. Passionate about AI, startups, and the future of creative tooling.

Benefits

💰 Competitive compensation with equity potential at milestones
🌍 Fully remote, async-first culture with high flexibility
🚀 Zero bureaucracy, 100% impact environment
🎨 Creative ownership: you shape what gets built
⚙️ Cutting-edge AI stack and tools
📈 Be a foundational team member at a venture-scale company
🔥 Work on a product people feel when they use it
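To ground the "creative API orchestration" idea above, a hedged sketch of a provider-fallback wrapper. The openai and anthropic Python SDK calls are real, but the wrapper, model names, and error handling are simplified assumptions (and the product's own stack is TypeScript; Python is used here only because it is the listed bonus language):

```python
# Illustrative multi-provider fallback: try OpenAI first, fall back to
# Anthropic on any failure. Model names are examples and may change.
import anthropic
from openai import OpenAI

def complete(prompt: str) -> str:
    try:
        client = OpenAI()  # reads OPENAI_API_KEY from the environment
        resp = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[{"role": "user", "content": prompt}],
        )
        return resp.choices[0].message.content
    except Exception:
        fallback = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY
        resp = fallback.messages.create(
            model="claude-3-5-sonnet-latest",
            max_tokens=512,
            messages=[{"role": "user", "content": prompt}],
        )
        return resp.content[0].text

print(complete("Summarize the benefits of event-driven design in one line."))
```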
Posted 8 hours ago
0 years
0 Lacs
Trivandrum, Kerala, India
On-site
We are seeking an experienced Python Solution Architect to join our dynamic team. In this role, you will be responsible for designing and implementing scalable, high-performance software solutions that meet business requirements. You will collaborate with cross-functional teams to define architecture and best practices, and oversee the development process.

Job Responsibilities

Architect scalable, efficient, and high-performance Python-based applications. Design microservices architectures and cloud-native solutions using Python frameworks (e.g., Django, Flask, FastAPI). Ensure Python solutions align with business goals and enterprise architecture. Design and manage RESTful APIs and web services, leveraging Python's capabilities. Select the right Python frameworks, libraries, and tools for different use cases. Architect and optimize database interactions, including SQL and NoSQL databases. Ensure efficient data processing, ETL pipelines, and integrations with data analytics platforms (e.g., Pandas, NumPy, SQLAlchemy). Design seamless integrations with third-party services, APIs, and external systems using Python-based solutions. Ensure smooth data flow between Python applications and other enterprise systems. Architect solutions in cloud environments (AWS, GCP, Azure) using Python. Implement CI/CD pipelines for Python projects and manage infrastructure as code (Terraform, Ansible). Ensure security best practices in Python code (e.g., OWASP, cryptography, input validation). Lead efforts to comply with data protection and regulatory requirements in Python solutions. Provide guidance to Python developers on architectural decisions, design patterns, and code quality. Mentor teams on Python best practices, writing clean, maintainable, and efficient code. Work closely with customers, business analysts, project managers, and development teams to understand requirements. Communicate complex technical concepts to non-technical stakeholders. Ensure solutions address functional and non-functional requirements (e.g., performance, scalability, security).

Preferred Skills

Deep knowledge of Python frameworks like Django, Flask, or FastAPI. Proficiency with asynchronous programming in Python (e.g., asyncio, concurrent.futures; a short sketch follows this posting). Hands-on experience with designing and deploying microservices-based architectures. Understanding of containerization technologies like Docker and orchestration tools like Kubernetes. Strong experience with AWS, GCP, or Azure for deploying and scaling Python applications. Familiarity with cloud services like Lambda (AWS), Cloud Functions (GCP), or similar. Experience with CI/CD pipelines and automation tools (e.g., Jenkins, GitLab CI, CircleCI). Knowledge of Infrastructure-as-Code (IaC) tools like Terraform or Ansible. Proficiency with relational databases (PostgreSQL, MySQL) and NoSQL databases (MongoDB, Redis). Experience with database optimization, indexing, and query tuning. Strong understanding of RESTful APIs, GraphQL, and API documentation standards (e.g., OpenAPI/Swagger). Experience with integrating third-party services via APIs. Proficiency with Git, GitHub, or GitLab for version control and collaboration in Python projects. Familiarity with branching strategies (e.g., GitFlow) and code review practices. Experience with Python security tools and practices (e.g., PyJWT, OAuth2, secure coding). Familiarity with encryption, authentication, and data protection standards. Hands-on experience working in Agile environments, familiar with Scrum or Kanban. Ability to break down complex technical tasks into sprints and manage backlogs. Knowledge of popular Python AI/ML libraries such as TensorFlow, PyTorch, and scikit-learn. Experience with deploying machine learning models in production environments.
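A short sketch of the asynchronous style the posting highlights (asyncio, concurrent I/O): fan out several calls concurrently and gather the results. The fetch body is a placeholder standing in for a real HTTP or database call:

```python
# Illustrative only: concurrent fan-out with asyncio.gather.
import asyncio

async def fetch(service: str) -> str:
    await asyncio.sleep(0.1)  # stand-in for network latency
    return f"{service}: ok"

async def main() -> None:
    # All three "calls" run concurrently; total time is ~0.1s, not 0.3s.
    results = await asyncio.gather(
        fetch("billing"),
        fetch("inventory"),
        fetch("auth"),
    )
    for line in results:
        print(line)

asyncio.run(main())
```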
Posted 8 hours ago
8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Must-have: React (frontend), TypeScript/Node.js (backend), and PostgreSQL; Fargate is a plus. Everyone should think REST-API-first by default.

Tech stack: AWS CloudFront, S3 buckets, Node.js, TypeScript, React.js, Next.js, Stencil.js, Aurora PostgreSQL; uses REST APIs, Databricks as the data lake, and Auth0 for access/login, in a microservice-based architecture. Auth0 experience is important for the lead developer.

Tech lead hiring bar: the candidate should be able to show they have led complex exercises, can be put in front of any business person, and can explain the work well to the business; a profile who can articulate what the architect has done. A sample probe: how would you build a portal with an Auth0 mechanism for one million users? A strong answer presents an actual solution, not just a list of steps (an illustrative token-validation sketch follows this posting).

Technical Leadership: Lead a squad of developers in delivering high-quality, scalable backend solutions. Provide technical direction, review code, and ensure best practices in architecture, design, and implementation. Collaborate closely with other technical leads to align on standards, interfaces, and dependencies.

Hands-On Coding and Development: Design and implement microservices using Node.js and TypeScript. Build and maintain RESTful APIs. Optimize services hosted on AWS (CloudFront, S3, Aurora PostgreSQL, etc.).

System Architecture & Operations: Contribute to system architecture and component design in collaboration with other leads and architects. Leverage Databricks as a data lake backend (no data engineering needed). Ensure secure and reliable authentication/authorization using Auth0.

Agile Delivery: Contribute to agile ceremonies (stand-ups, sprint planning, retrospectives). Occasionally take on the role of Scrum Master to facilitate team delivery. Use Jira and Confluence to track and document work.

Cross-Team Collaboration: Work with front-end engineers (React.js, Next.js), DevOps, QA, and product managers. Ensure consistency and maintainability across services and teams.

Required Qualifications: 8+ years of backend development experience, including 6+ years in cloud-native development using AWS. Strong proficiency in Node.js, TypeScript, and REST APIs. Experience with AWS CloudFront, S3, and Aurora PostgreSQL. Demonstrated experience leading small teams or engineering squads. Deep understanding of microservice architectures. Familiarity with React.js, Next.js, and Auth0 integration. Experience working in agile environments using Jira and Confluence. Strong communication skills and the ability to influence cross-functional stakeholders. Develop and maintain REST APIs to support various applications and services. Ensure secure and efficient access/login mechanisms using Auth0. Collaborate with cross-functional teams to define, design, and ship new features. Mentor and guide junior developers, fostering a culture of continuous learning and improvement. Conduct code reviews and ensure adherence to best practices and coding standards. Troubleshoot and resolve technical issues, ensuring high availability and performance of applications. Stay updated with the latest industry trends and technologies to drive innovation within the team.
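As referenced above, an illustrative token-validation sketch: checking Auth0-issued JWTs statelessly against the tenant's published JWKS keys is one reason such a portal scales to a million users, since no session store is consulted per request. The posting's stack is Node.js/TypeScript; PyJWT is used below purely to keep the sketch short, and the domain, audience, and token are placeholders:

```python
# Illustrative stateless validation of an Auth0-issued RS256 JWT.
# Requires: pip install "pyjwt[crypto]". All identifiers are hypothetical.
import jwt  # PyJWT

AUTH0_DOMAIN = "example.eu.auth0.com"     # hypothetical tenant
API_AUDIENCE = "https://api.example.com"  # hypothetical audience

# Fetches and caches the tenant's public signing keys (JWKS).
jwks_client = jwt.PyJWKClient(f"https://{AUTH0_DOMAIN}/.well-known/jwks.json")

def validate(token: str) -> dict:
    signing_key = jwks_client.get_signing_key_from_jwt(token)
    return jwt.decode(
        token,
        signing_key.key,
        algorithms=["RS256"],
        audience=API_AUDIENCE,
        issuer=f"https://{AUTH0_DOMAIN}/",
    )

# claims = validate(bearer_token)  # raises jwt.InvalidTokenError on failure
```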
Posted 8 hours ago
6.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Job Title: Cloud Application Engineer (Backend - Python)
Location: Gurugram, Haryana, India
Job Type: Full-Time, Hybrid

Company Overview: Schneider Electric is a global leader in energy management and automation, committed to providing innovative solutions that ensure Life Is On everywhere, for everyone, and at every moment. We are expanding our team in Gurugram and looking for a backend developer to enhance our cloud capabilities and drive the integration of digital technologies in our operations. We expect the applicant to be technically proficient, with strong fundamentals and a good grasp of the latest tools and technologies.

Roles and Responsibilities:

Platform Scalability & Workflow Optimization: Enhance existing software services and tools to simplify workflows and ensure scalability to handle 10x the current IoT data traffic. Implement performance-optimized data pipelines, leveraging stream processing, efficient storage, and distributed systems.

Microservices Architecture Advancement: Leverage and evolve the current microservices architecture to support modular, maintainable, and extensible development of future applications. Promote containerization and orchestration best practices (e.g., Docker, Kubernetes) for deployment consistency and scalability.

Mentorship: Mentor junior engineers, fostering a culture of learning, ownership, and technical excellence. Conduct regular knowledge-sharing sessions and code walkthroughs to upskill the team.

Process Improvement & Engineering Culture: Continuously improve engineering processes around code reviews (focus on quality, readability, and maintainability), testing (strengthen unit, integration, and load testing coverage), documentation (ensure clarity and completeness for internal and external stakeholders), and hiring (participate in talent acquisition to build a high-performing team).

Technology Evaluation & Adoption: Evaluate emerging technologies (e.g., edge computing, AI/ML for anomaly detection, time-series databases) aligned with business goals. Conduct proofs of concept and technical feasibility studies to validate new tools and frameworks.

Cross-functional Collaboration & Delivery: Set aggressive yet achievable timelines for key initiatives. Collaborate closely with hardware, product, and business teams to ensure alignment and timely delivery. Drive end-to-end ownership of features, from ideation to production rollout. Serve as the tech lead for the development squad within an agile framework, fostering engineering best practices, mentoring team members, and ensuring smooth sprint execution without direct people-management responsibilities.

Qualifications & Experience

Educational Background: Bachelor's or Master's degree in Computer Science, Electronics & Communication Engineering, or a related field.

Core Competencies: Strong analytical, problem-solving, and communication skills. Proficient in presenting technical concepts to diverse audiences. Hands-on experience with agile methodologies such as Scrum and Kanban. Self-driven and comfortable working in fast-paced, dynamic environments with minimal supervision.

Technical Skills, Must-Have: 6-8 years of hands-on development experience in Python and frameworks like Django, Flask, or FastAPI. Expertise in building scalable data pipelines using tools like Kafka, Airflow, or Temporal. Solid understanding of distributed systems (e.g., Kafka, Cassandra, Druid, CouchDB). Experience with scalable time-series databases (e.g., InfluxDB, TimescaleDB, Druid, TDengine, Timestream, Bigtable).
Proficiency in relational databases (PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB).
Experience working on high-throughput systems handling at least 500 million transactions per day.
Skilled in designing scalable APIs (REST/GraphQL).
Experience with asynchronous task management using Celery (a short illustrative sketch follows this posting).
Strong grasp of algorithms and data structures with practical application for performance optimization.
Knowledge of SOLID principles and design patterns.
Deep understanding of architectural principles, including microservices and event-driven design.
Experience with unit testing and test-driven development (TDD).
Familiarity with Docker and containerized environments.
Experience with cloud platforms such as AWS, Azure, or GCP.
Mastery of Git for source control and collaboration.
Good-to-Have:
Working knowledge of JavaScript frameworks (ReactJS, Angular).
Exposure to DevOps practices, including CI/CD pipelines and container orchestration (Kubernetes).
Prior experience in developing IoT technology stacks.
Looking to make an IMPACT with your career? When you are thinking about joining a new team, culture matters. At Schneider Electric, our values and behaviors are the foundation for creating a great culture to support business success. We believe that our IMPACT values - Inclusion, Mastery, Purpose, Action, Curiosity, Teamwork - start with us. IMPACT is also your invitation to join Schneider Electric, where you can contribute to turning sustainability ambition into action, no matter what role you play. It is a call to connect your career with the ambition of achieving a more resilient, efficient, and sustainable world. We are looking for IMPACT Makers: exceptional people who turn sustainability ambitions into actions at the intersection of automation, electrification, and digitization. We celebrate IMPACT Makers and believe everyone has the potential to be one. Become an IMPACT Maker with Schneider Electric - apply today!
€36 billion global revenue
+13% organic growth
150,000+ employees in 100+ countries
#1 on the Global 100 World’s most sustainable corporations
You must submit an online application to be considered for any position with us. This position will be posted until filled.
Schneider Electric aspires to be the most inclusive and caring company in the world, by providing equitable opportunities to everyone, everywhere, and ensuring all employees feel uniquely valued and safe to contribute their best. We mirror the diversity of the communities in which we operate, and ‘inclusion’ is one of our core values. We believe our differences make us stronger, as a company and as individuals, and we are committed to championing inclusivity in everything we do.
At Schneider Electric, we uphold the highest standards of ethics and compliance, and we believe that trust is a foundational value. Our Trust Charter is our Code of Conduct and demonstrates our commitment to ethics, safety, sustainability, quality, and cybersecurity, underpinning every aspect of our business and our willingness to behave and respond respectfully and in good faith to all our stakeholders. You can find out more about our Trust Charter here.
Schneider Electric is an Equal Opportunity Employer.
It is our policy to provide equal employment and advancement opportunities in the areas of recruiting, hiring, training, transferring, and promoting all qualified individuals regardless of race, religion, color, gender, disability, national origin, ancestry, age, military status, sexual orientation, marital status, or any other legally protected characteristic or conduct.
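To illustrate the Celery requirement in the must-have list above: a minimal sketch of an asynchronous ingestion task with retries, assuming a Redis broker and hypothetical task and helper names (an illustration only, not this employer's actual stack):

    from celery import Celery

    # Hypothetical broker URL; RabbitMQ would work equally well.
    app = Celery("iot_backend", broker="redis://localhost:6379/0")

    @app.task(bind=True, max_retries=3)
    def ingest_reading(self, device_id: str, payload: dict) -> None:
        """Persist one IoT reading; retry with exponential backoff on transient failures."""
        try:
            store_reading(device_id, payload)
        except ConnectionError as exc:
            # Backoff of 2s, 4s, 8s before the task is marked failed.
            raise self.retry(exc=exc, countdown=2 ** (self.request.retries + 1))

    def store_reading(device_id: str, payload: dict) -> None:
        # Placeholder for a write to a time-series store (e.g., TimescaleDB).
        print(f"stored {device_id}: {payload}")

A producer would enqueue work with ingest_reading.delay(device_id, payload), keeping the request path fast while the broker absorbs bursts of traffic.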
Posted 8 hours ago
6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
We are looking for a highly skilled and motivated Java Full Stack Developer with 6+ years of hands-on experience in building scalable web applications using modern Java technologies and front-end frameworks. The ideal candidate will be responsible for both back-end and front-end development, with a strong understanding of best practices in software design, coding, testing, and deployment.
Key Responsibilities:
Develop and maintain robust and scalable web applications using Java (Spring Boot) and modern front-end frameworks.
Collaborate with cross-functional teams to define, design, and ship new features.
Write clean, maintainable, and efficient code across the entire stack.
Participate in code reviews, architectural discussions, and agile development processes.
Build RESTful APIs and integrate with external systems.
Optimize application performance, scalability, and security.
Troubleshoot and debug issues across the application stack.
Develop unit, integration, and automated tests.
Stay up to date with new technologies and industry trends to ensure optimal development practices.
Technical Skills & Requirements:
Back-End:
Strong experience with Java (Java 8 or above)
Proficient in the Spring Framework, especially Spring Boot, Spring MVC, and Spring Security
Experience with RESTful API development
Familiarity with ORM tools like Hibernate or JPA
Knowledge of Microservices Architecture is a plus
Front-End:
Solid experience with HTML5, CSS3, JavaScript, and TypeScript
Hands-on experience with React.js or Angular
Familiarity with Bootstrap, Material UI, or other UI libraries
Database & Tools:
Experience with relational databases like MySQL, PostgreSQL, or Oracle
Knowledge of NoSQL databases like MongoDB is a plus
Familiarity with version control tools like Git
Experience with Maven/Gradle, Jenkins, and CI/CD pipelines
Cloud & DevOps (Preferred):
Exposure to cloud platforms like AWS, Azure, or GCP
Understanding of Docker, Kubernetes, and containerized applications
Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field
Minimum of 5 years of experience in full stack software development
Strong problem-solving skills and the ability to work independently or in a team
Excellent verbal and written communication skills
Nice to Have:
Experience with Agile/Scrum methodologies
Exposure to testing frameworks like JUnit, Mockito, or Selenium
Knowledge of GraphQL, WebSockets, or message queues (Kafka, RabbitMQ)
Posted 8 hours ago
6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
This role is for one of Weekday's clients.
Min Experience: 6 years
Location: Pune
JobType: full-time
Requirements
Essential Duties and Responsibilities:
Work with development teams to ideate software solutions
Design and implement the overall web architecture
Develop and manage well-functioning databases and applications
Work with US counterparts to conduct scrums, sprint planning, and sprint retrospectives
Design and implement continuous integration and deployment
Build features and applications with a mobile-responsive design
Solve problems through alternative approaches and in consultation with stakeholders
Work as part of a team that encourages innovation and best practices
Required Qualifications:
5+ years of proven work experience in Ruby development
Deep expertise in object-oriented development, including strong design pattern knowledge
Good understanding of the syntax of Ruby and its nuances
Degree in Computer Science, Statistics, or a relevant field
Knowledge of multiple front-end languages and libraries (e.g., HTML/CSS, JavaScript, XML, jQuery) and JavaScript frameworks (e.g., Angular, React, Node.js)
Familiarity with databases (e.g., PostgreSQL, MySQL, MSSQL, Oracle, MongoDB), web servers (e.g., Apache), and UI/UX design
Thorough understanding of user experience and, ideally, product strategy
Experience implementing testing platforms and unit tests
Understanding of messaging concepts and technologies such as ActiveMQ/RabbitMQ
Posted 8 hours ago
7.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
We’re seeking a highly skilled Senior Fullstack Engineer with deep expertise in TypeScript, a strong preference for React on the frontend and NestJS on the backend, and rock-solid software-engineering fundamentals. You’ll balance frontend and backend work, contribute ideas when new technical challenges arise, and help deliver features quickly and reliably.
Key Responsibilities
Design, develop, and maintain scalable fullstack applications using TypeScript (React + NestJS preferred).
Integrate AI/ML capabilities into web applications using APIs or custom-trained models to enhance user experiences and automation.
Comfortably tackle complex sprint tickets and help teammates unblock issues, delivering high-quality solutions efficiently.
Propose and discuss technical approaches with the team when new problems surface.
Collaborate closely with designers, product managers, data scientists, and engineers to ship intelligent, high-quality features.
Write clean, testable, maintainable code and participate in code reviews.
Deploy and troubleshoot applications in AWS-based environments.
Qualifications
7+ years of professional experience across frontend and backend development.
A background in AI/ML integration, particularly deploying AI-powered features within fullstack applications.
Advanced proficiency in TypeScript with significant React and NestJS experience.
Hands-on experience integrating AI/ML APIs or services (e.g., OpenAI, AWS SageMaker, TensorFlow Serving, or similar).
Strong foundations in design patterns, automated testing, clean architecture, and SOLID principles.
Experience with relational databases (e.g., PostgreSQL) and ORMs such as Prisma or TypeORM.
Practiced in writing and maintaining automated tests (e.g., Jest, Playwright, Cypress).
Fluent English: clear, efficient verbal and written communication.
Experience deploying applications to AWS (Lambda, S3, DynamoDB, API Gateway, IAM).
Comfortable working in Agile environments, with a strong sense of ownership and accountability for quality and performance.
Preferred Qualifications
Familiarity with event-driven architectures, including tools and patterns such as Kafka / Amazon MSK, SNS + SQS fan-out, and Amazon EventBridge.
Experience building microservices or modular monoliths and understanding their trade-offs.
Familiarity with CI/CD pipelines (including GitHub Actions) and infrastructure-as-code tooling.
Awareness of application-security best practices and performance-tuning techniques.
Experience with GraphQL, WebSockets, or other real-time communication patterns.
Exposure to ML pipelines or MLOps workflows, even at a basic level, is a strong plus.
Demonstrated eagerness to learn new technologies, especially in the evolving AI/ML space.
About Us
TechAhead is a global digital transformation company with a strong presence in the USA and India. We specialize in AI-first product design thinking and bespoke development solutions. With over 15 years of proven expertise, we have partnered with Fortune 500 companies and leading global brands to drive digital innovation and deliver excellence. At TechAhead, we are committed to continuous learning, growth, and crafting tailored solutions that meet the unique needs of our clients. Join us to shape the future of digital innovation worldwide and drive impactful results with cutting-edge AI tools and strategies!
Posted 8 hours ago
0 years
0 Lacs
Mumbai Metropolitan Region
Remote
Checkmate is building advanced Voice AI systems for some of the largest restaurant and retail brands in the US, including several in the top 10. Unlike many companies still in the prototype phase, our AI solutions are live in production with real customers, achieving over 80% accuracy. This is a $1 billion market opportunity, and we're scaling to 3,000+ stores by the end of this year. Join us at this pivotal moment to shape AI products used daily by thousands of staff and customers, driving measurable impact at scale.
Requirements
Prompt Design & Evaluation - Develop, test, and refine prompts for tasks such as text generation, question answering, data classification, and structured data extraction to optimize Voice AI performance
Data-Driven Analysis & Quality Measurement - Design evaluation frameworks and analyze prompt outputs using quantitative metrics, human-in-the-loop evaluation, and user feedback to identify improvement opportunities
Experimentation & Iteration - Conduct experiments to test prompt variations, measure their business and operational impact, and iterate to enhance accuracy, consistency, and safety
Regression Testing & Compliance - Build principled regression test suites using tools like LangFuse and Galileo to ensure prompts remain compliant and high-performing as models and use cases evolve
Collaboration Across Teams - Work closely with data science, product, legal, engineering, and operations teams to align prompt designs with business goals, operational workflows, and compliance requirements
Model Adaptation & Strategy - Develop prompts across multiple LLMs (GPT, LLaMA, Gemini, and Checkmate's fine-tuned models), understanding model differences to optimize outputs effectively
Team Leadership & Mentorship - Lead a team of analysts focused on prompt evaluation and data quality analysis, guiding prioritization, experimentation, and reporting; collaborate with ops teams for seamless deployment and feedback loops
Research & Continuous Learning - Stay up to date on emerging prompting techniques, LLM behaviors, evaluation frameworks, and AI safety practices to keep Checkmate's AI solutions best-in-class
Minimum Qualifications
Strong analytical and data science skills, with hands-on experience in Python (pandas, NumPy, scikit-learn); a short illustrative sketch follows this posting
Experience designing and conducting experiments and evaluations in applied AI or NLP contexts
Proficiency in SQL and working with relational databases (e.g., MySQL, PostgreSQL, Oracle, MS SQL)
Good understanding of data processing, quality measurement, and testing fundamentals
Experience leading analyst or operations teams, with strong prioritization, mentorship, and collaboration skills
Strong problem-solving mindset with a drive to explore, optimize, and automate workflows
Excellent communication skills for presenting insights to technical and non-technical stakeholders
Bachelor's degree in Data Science, Computer Science, Statistics, Engineering, or a related field
Flexible to work US hours until at least 6 p.m. ET, with a strong remote setup
Preferred Qualifications
Experience with LLM evaluation and prompt engineering workflows
Familiarity with tools like LangFuse and Galileo for prompt evaluation and analysis
Knowledge of cloud platforms (AWS, GCP, Azure) and data pipeline tools
Familiarity with machine learning concepts and NLP workflows
Master's or PhD in Data Science, Statistics, Computer Science, or a related field
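As a concrete illustration of the data-driven prompt evaluation described above (a minimal sketch under stated assumptions, not Checkmate's actual pipeline): computing per-version exact-match accuracy with pandas, assuming a hypothetical prompt_eval_runs.csv with prompt_version, expected, and actual columns:

    import pandas as pd

    # Hypothetical evaluation log: one row per (prompt_version, test case).
    df = pd.read_csv("prompt_eval_runs.csv")

    # Exact-match accuracy is a cheap, deterministic regression signal.
    df["correct"] = (
        df["expected"].str.strip().str.lower() == df["actual"].str.strip().str.lower()
    )
    summary = (
        df.groupby("prompt_version")["correct"]
        .agg(["mean", "count"])
        .rename(columns={"mean": "accuracy", "count": "n_cases"})
    )

    # Flag prompt versions that fall below a chosen quality bar.
    THRESHOLD = 0.80
    failing = summary[summary["accuracy"] < THRESHOLD]
    print(summary.sort_values("accuracy", ascending=False))
    print(f"{len(failing)} prompt version(s) below {THRESHOLD:.0%} accuracy")

In practice, a team would layer richer metrics (semantic similarity, human ratings) on top of a simple exact-match signal like this one.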
Posted 8 hours ago
10.0 - 18.0 years
0 Lacs
Tamil Nadu, India
On-site
Exp Range: 10-18 Years
Location: Chennai / Hyderabad
Notice Period: Immediate - 30 days
Six or more years of relevant experience required, demonstrated through one or a combination of work experience or specialized training.
Extensive working experience with overall IT architecture, design, and systems development in the Telecom Equipment build, Network build, and Inventory & Topology domains.
Extensive understanding of Wireline Access, Edge & Core Network technologies and the respective network build and management processes.
Extensive knowledge of Layer 1, Layer 2, and Layer 3 provisioning.
Experience in distributed systems with data-intensive applications and data ingestion/processing frameworks.
Strong skills in system design and architecture, including feature definition and low-level design creation.
Experience working with relational and non-relational databases (e.g., Oracle DB, PostgreSQL, Neo4j, MongoDB, Cassandra).
Knowledge of cloud platforms (AWS, OCI, Google Cloud Platform, Kubernetes).
Experience with telecom tools, including the Ericsson and Oracle suites of systems such as TIRKS, WFA, SOAC, MARCH, and M6.
Experience with and solid understanding of broad business goals and direction, delivering solutions to complex business/ordering problems.
Quickly able to organize around a problem and engage various teams to solve the challenge.
Strong communication skills to coordinate with stakeholders and report to management.
Ability to collaborate effectively with global teams, including vendors and offshore resources.
Posted 8 hours ago
7.0 years
0 Lacs
Tamil Nadu, India
On-site
Exp Range: 7-10 Years
Location: Chennai
Summary: We are seeking a highly skilled and experienced Senior Full Stack Developer to join our dynamic software engineering team. The ideal candidate will have a strong command of both front-end and back-end technologies, with a proven track record of designing, developing, and deploying robust, scalable, and high-performance web applications. This role requires not only technical expertise but also strong problem-solving skills, leadership potential, and the ability to work collaboratively in an agile environment.
Required Skills and Qualifications:
Experience: 7+ years of professional experience in full-stack web development.
Front-End Expertise (Proficiency in at least one):
Languages: HTML5, CSS3, JavaScript (ES6+), TypeScript.
Frameworks/Libraries: React.js, Angular, Vue.js (React.js preferred).
State Management: Redux, MobX, Context API, NgRx, Vuex.
Styling: Styled-Components, CSS, Material-UI, Bootstrap.
Build Tools: Webpack.
Back-End Expertise (Proficiency in at least one language/framework):
Languages: Node.js (with Express.js/NestJS), Python (with Django/Flask), Java (with Spring Boot), Go, Ruby on Rails, C# (.NET Core).
Databases:
Relational: PostgreSQL, MySQL, SQL Server (strong SQL query writing and optimization skills).
NoSQL: MongoDB, Cassandra, DynamoDB.
API Development: RESTful APIs, GraphQL.
Version Control: Expert proficiency with Git and GitHub/GitLab/Bitbucket.
Testing: Experience with unit testing frameworks (e.g., Jest, React Testing Library, Mocha, JUnit, Pytest) and integration testing.
Cloud Platforms (Experience with at least one): AWS, Azure, Google Cloud Platform (GCP). Understanding of serverless architectures (Lambda, Azure Functions) is a plus.
Containerization & Orchestration (Preferred): Docker, Kubernetes.
CI/CD: Experience setting up and managing CI/CD pipelines (e.g., Jenkins, GitLab CI, GitHub Actions, CircleCI).
Operating Systems: Linux/Unix command-line proficiency.
Soft Skills:
Excellent problem-solving and analytical skills with a pragmatic approach.
Strong communication skills, both written and verbal, with the ability to articulate complex technical concepts to non-technical stakeholders.
Ability to work independently and as part of a highly collaborative, cross-functional agile team.
Strong leadership potential and a desire to mentor junior developers.
High attention to detail and commitment to producing high-quality, maintainable code.
Posted 8 hours ago
7.0 - 10.0 years
0 Lacs
Tamil Nadu, India
On-site
Location: Chennai, India
Workplace Type: Hybrid
About The Role
We are seeking a highly skilled and experienced PostgreSQL Database Administrator to join our dynamic team. In this role, you will be responsible for the administration, maintenance, and optimization of our PostgreSQL databases, ensuring their reliability, performance, and security. You will work closely with development and operations teams to support our critical business applications. The ideal candidate will have a strong background in PostgreSQL administration, AWS Aurora/RDS PostgreSQL, and scripting languages such as Shell and Python. Experience with SQL Server, Redshift, and OpenSearch/Elasticsearch is a plus. This is a fantastic opportunity to contribute to a growing organization and make a significant impact on our database infrastructure.
Key Responsibilities
Administer and maintain PostgreSQL databases, ensuring their availability, performance, and security
Implement and maintain database security measures, including user access controls and encryption
Monitor database performance and identify areas for optimization
Perform database backups and recovery procedures
Troubleshoot database issues and provide timely resolutions
Develop and maintain database documentation
Collaborate with development and operations teams to support application deployments and database changes
Automate database tasks using Shell and Python scripting (a short illustrative sketch follows this posting)
Manage and maintain AWS Aurora/RDS PostgreSQL instances
Participate in the on-call rotation for database support
Implement and maintain high availability and disaster recovery solutions
Perform database upgrades and patching
Conduct performance tuning and query optimization
Ensure compliance with data governance and security policies
Required Skills & Qualifications
Bachelor's degree in Computer Science or a related field
7-10 years of experience in PostgreSQL database administration
Expertise in PostgreSQL administration, including installation, configuration, and maintenance
Strong experience with AWS Aurora/RDS PostgreSQL
Proficiency in Shell and Python scripting for database automation
Solid understanding of database security principles and best practices
Experience with database backup and recovery procedures
Excellent troubleshooting and problem-solving skills
Strong communication and collaboration skills
Ability to work independently and as part of a team
Must be able to join within an immediate to 15-day notice period
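As referenced in the responsibilities above, a minimal sketch of the kind of Python database automation such a role involves, assuming psycopg2 and a hypothetical DSN; it lists sessions whose current query has been running longer than a threshold, using the standard pg_stat_activity view:

    import psycopg2

    DSN = "host=localhost dbname=appdb user=dba"  # hypothetical connection string
    THRESHOLD = "5 minutes"

    LONG_RUNNING_SQL = """
        SELECT pid, usename, state, now() - query_start AS runtime, query
        FROM pg_stat_activity
        WHERE state <> 'idle'
          AND now() - query_start > %s::interval
        ORDER BY runtime DESC;
    """

    with psycopg2.connect(DSN) as conn:
        with conn.cursor() as cur:
            cur.execute(LONG_RUNNING_SQL, (THRESHOLD,))
            for pid, user, state, runtime, query in cur.fetchall():
                # Truncate the query text so alerts stay readable.
                print(f"pid={pid} user={user} state={state} runtime={runtime} :: {query[:80]}")

A cron job or Airflow task could run a check like this periodically and page the on-call DBA whenever the result set is non-empty.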
Posted 8 hours ago
3.0 - 8.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Experience: 3 to 8 years
Location: Ahmedabad (WFO)
Key Responsibilities
Design and build microservices using Java (Spring Boot), following well-established patterns and practices.
Design and implement real-time features using WebSocket for low-latency, bidirectional communication.
Implement background and scheduled tasks using ScheduledExecutorService for precise, programmatic control.
Apply at least one microservice design pattern (e.g., Circuit Breaker, CQRS, Saga, API Gateway) effectively as part of system architecture.
Implement clean service boundaries, well-defined APIs, and asynchronous communication (REST/Kafka/etc.).
Contribute to decisions around service granularity, data consistency, and fault tolerance.
Write maintainable, testable, and secure code that meets business and technical requirements.
Participate in code reviews, design discussions, and production troubleshooting.
Collaborate with DevOps to ensure smooth deployment, monitoring, and observability of services.
Mentor junior engineers and share technical insights within the team.
Skills & Experience Required
Strong programming skills in Java (11 or higher) and experience with Spring Boot and Spring Cloud Gateway for building RESTful microservices.
Practical knowledge and application of at least one microservices design pattern (e.g., Circuit Breaker, API Gateway, Saga, CQRS, Service Mesh).
Solid hands-on experience with Spring Data JPA and Hibernate for data persistence.
Hands-on experience with WebSocket in Java (preferably using Spring WebSocket or Netty).
Proficiency in ScheduledExecutorService for scheduling and managing background jobs.
Experience with event-driven systems using Kafka or similar messaging platforms.
Proficiency in working with RDBMS (PostgreSQL/MS SQL) and optionally NoSQL (MongoDB/Redis).
Familiarity with containerized environments using Docker, with working knowledge of Kubernetes.
Understanding of authentication and authorization principles (OAuth2, JWT).
Hands-on experience with CI/CD pipelines and monitoring/logging tools such as Prometheus, Grafana, and the ELK stack.
Strong problem-solving mindset and experience in troubleshooting distributed systems.
Posted 9 hours ago