25 Job openings at Robosoft Technologies
Robosoft Technologies - Full Stack Engineer - React.js/AngularJS

Greater Kolkata Area

0 - 2 years

Not disclosed

On-site

Not specified

6-month contract period.

We are seeking a highly skilled Full Stack Developer to join our dynamic team. The ideal candidate will have a solid background in both front-end and back-end development, a keen eye for detail, and a passion for delivering high-quality software solutions. You will be responsible for designing, developing, and maintaining web applications that provide an excellent user experience and meet the evolving needs of our clients.

Requirements:
- Front-End Skills: Proficiency in HTML, CSS, JavaScript, and modern front-end frameworks (React, Angular, Vue.js, etc.).
- Back-End Skills: Strong knowledge of GraphQL and back-end frameworks (Node.js, TypeScript, Express, Django, Ruby on Rails, etc.).
- Database Skills: Experience with SQL and NoSQL databases (MySQL, PostgreSQL, MongoDB, etc.), plus strong knowledge of Databricks operation and query management.
- Tools: Familiarity with version control systems (Git), containerization (Docker), and CI/CD pipelines.
- Problem-Solving: Strong analytical and problem-solving skills, with the ability to troubleshoot complex issues.
- Communication: Excellent communication and teamwork skills, with the ability to collaborate effectively with cross-functional teams.
- Adaptability: Ability to work in a fast-paced environment and adapt to changing project requirements and deadlines.
- Deployment Experience: Hands-on experience deploying web applications using cloud services (AWS, Azure, Google Cloud) and CI/CD tools.

(ref:hirist.tech)
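The posting names GraphQL alongside frameworks such as Node.js and Django but does not prescribe a specific stack. As a minimal sketch of the GraphQL skill involved, here is a tiny schema in Python using the graphene library; the library choice, field, and resolver are our own illustration, not part of the posting.

```python
# Minimal GraphQL schema sketch using the graphene library (illustrative only;
# the posting does not specify a GraphQL framework).
import graphene

class Query(graphene.ObjectType):
    # One illustrative field; a real schema would model actual client data.
    greeting = graphene.String(name=graphene.String(default_value="world"))

    def resolve_greeting(self, info, name):
        return f"Hello, {name}!"

schema = graphene.Schema(query=Query)

if __name__ == "__main__":
    result = schema.execute('{ greeting(name: "Robosoft") }')
    print(result.data)  # {'greeting': 'Hello, Robosoft!'}
```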

DataOps Engineer

Udupi, Karnataka, India

2 - 5 years

INR 2.0 - 5.0 Lacs P.A.

On-site

Full Time

We are seeking a highly skilled and experienced DataOps Engineer to join our dynamic team. The ideal candidate will have a strong background in DataOps practices, with a focus on AWS and Azure DevOps, Databricks setup and management, PostgreSQL administration, Docker management, CI/CD setup, and Azure/AWS infrastructure management. This role is critical in ensuring the seamless integration and deployment of data infrastructure, enabling efficient and reliable data operations.

Responsibilities:
- DevOps Management: Design, implement, and manage DevOps pipelines for automated build, test, and deployment processes using tools such as Git, Azure DevOps, Jenkins, and GitHub Actions, mostly for data workloads. Build and manage Databricks, Snowflake, Kafka, and other cloud-native data services and tools. Collaborate with development teams to integrate code changes and ensure seamless delivery to production.
- Databricks Setup and Management: Set up and manage Azure Databricks environments for large-scale data processing and analytics. Optimize Databricks clusters and manage costs while ensuring high availability and performance.
- PostgreSQL Administration: Administer PostgreSQL databases, ensuring optimal performance, security, and reliability. Perform routine maintenance tasks such as backups, restoration, and query tuning.
- Docker Management: Develop and manage Docker containers to ensure consistency across development, testing, and production environments. Monitor containerized applications for performance and resolve issues related to container orchestration.
- CI/CD Setup: Design and implement CI/CD pipelines to automate software deployments and data pipelines. Ensure the pipelines are scalable, secure, and capable of handling large volumes of data.
- Azure/AWS Infrastructure Management: Manage Azure infrastructure components such as Virtual Networks, Storage Accounts, and Resource Groups. Monitor and optimize the performance of the Azure environment, ensuring scalability and reliability. Implement security best practices across Azure services to protect data and applications.
- Collaboration and Communication: Work closely with data engineers, software developers, and IT teams to integrate DataOps processes across the organization. Provide technical guidance and mentorship to junior team members on DataOps best practices.
- Monitoring and Optimization: Implement monitoring solutions to track the health and performance of data pipelines and infrastructure. Continuously optimize processes and infrastructure for cost-effectiveness and efficiency.

Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 7 years of hands-on experience in a DataOps or DevOps role, with a strong focus on data infrastructure and cloud platforms.
- Proficiency in Azure DevOps for managing code repositories, CI/CD pipelines, and build/release processes.
- Extensive experience in setting up and managing Databricks and Snowflake environments.
- Strong PostgreSQL administration skills, including performance tuning, backups, and security.
- Experience with Docker for containerizing applications and managing container orchestration.
- Hands-on experience in setting up and managing CI/CD pipelines.
- Expertise in Azure/AWS infrastructure management, including monitoring, security, and cost optimization.
- Exposure to SageMaker is nice to have.
- Certification in Azure DevOps, Databricks, PostgreSQL, or AWS (Solutions Architect or Developer) is preferred.
- Experience with other cloud platforms (GCP) is a plus.
- Knowledge of scripting languages (e.g., Python, PowerShell) for automation tasks.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration abilities.
- Ability to work independently and as part of a team in a fast-paced environment.
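As a hedged sketch of the Databricks management side of this role, the snippet below lists workspace clusters through the Databricks REST API (Clusters API 2.0) with Python's requests library. The workspace URL and token are placeholders, and the choice of endpoint and client library is our illustration rather than anything specified in the posting.

```python
# Sketch: list Databricks clusters via the Clusters API 2.0.
# DATABRICKS_HOST and DATABRICKS_TOKEN are hypothetical placeholders.
import os
import requests

host = os.environ.get("DATABRICKS_HOST", "https://adb-1234567890123456.7.azuredatabricks.net")
token = os.environ["DATABRICKS_TOKEN"]  # personal access token

resp = requests.get(
    f"{host}/api/2.0/clusters/list",
    headers={"Authorization": f"Bearer {token}"},
    timeout=30,
)
resp.raise_for_status()

# Print id, name, and state for each cluster in the workspace.
for cluster in resp.json().get("clusters", []):
    print(cluster["cluster_id"], cluster["cluster_name"], cluster["state"])
```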

DataOps Engineer

Navi Mumbai, Maharashtra, India

2 - 5 years

INR 2.0 - 5.0 Lacs P.A.

On-site

Full Time

We are seeking a highly skilled and experienced DataOps Engineer to join our dynamic team. The ideal candidate will have a strong background in DataOps practices, with a focus on AWS and Azure DevOps, Databricks setup and management, PostgreSQL administration, Docker management, CI/CD setup, and Azure/AWS infrastructure management. This role is critical in ensuring the seamless integration and deployment of data infrastructure, enabling efficient and reliable data operations.

Responsibilities:
- DevOps Management: Design, implement, and manage DevOps pipelines for automated build, test, and deployment processes using tools such as Git, Azure DevOps, Jenkins, and GitHub Actions, mostly for data workloads. Build and manage Databricks, Snowflake, Kafka, and other cloud-native data services and tools. Collaborate with development teams to integrate code changes and ensure seamless delivery to production.
- Databricks Setup and Management: Set up and manage Azure Databricks environments for large-scale data processing and analytics. Optimize Databricks clusters and manage costs while ensuring high availability and performance.
- PostgreSQL Administration: Administer PostgreSQL databases, ensuring optimal performance, security, and reliability. Perform routine maintenance tasks such as backups, restoration, and query tuning.
- Docker Management: Develop and manage Docker containers to ensure consistency across development, testing, and production environments. Monitor containerized applications for performance and resolve issues related to container orchestration.
- CI/CD Setup: Design and implement CI/CD pipelines to automate software deployments and data pipelines. Ensure the pipelines are scalable, secure, and capable of handling large volumes of data.
- Azure/AWS Infrastructure Management: Manage Azure infrastructure components such as Virtual Networks, Storage Accounts, and Resource Groups. Monitor and optimize the performance of the Azure environment, ensuring scalability and reliability. Implement security best practices across Azure services to protect data and applications.
- Collaboration and Communication: Work closely with data engineers, software developers, and IT teams to integrate DataOps processes across the organization. Provide technical guidance and mentorship to junior team members on DataOps best practices.
- Monitoring and Optimization: Implement monitoring solutions to track the health and performance of data pipelines and infrastructure. Continuously optimize processes and infrastructure for cost-effectiveness and efficiency.

Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 7 years of hands-on experience in a DataOps or DevOps role, with a strong focus on data infrastructure and cloud platforms.
- Proficiency in Azure DevOps for managing code repositories, CI/CD pipelines, and build/release processes.
- Extensive experience in setting up and managing Databricks and Snowflake environments.
- Strong PostgreSQL administration skills, including performance tuning, backups, and security.
- Experience with Docker for containerizing applications and managing container orchestration.
- Hands-on experience in setting up and managing CI/CD pipelines.
- Expertise in Azure/AWS infrastructure management, including monitoring, security, and cost optimization.
- Exposure to SageMaker is nice to have.
- Certification in Azure DevOps, Databricks, PostgreSQL, or AWS (Solutions Architect or Developer) is preferred.
- Experience with other cloud platforms (GCP) is a plus.
- Knowledge of scripting languages (e.g., Python, PowerShell) for automation tasks.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration abilities.
- Ability to work independently and as part of a team in a fast-paced environment.

IT Sales Lead

Navi Mumbai, Maharashtra, India

4 - 7 years

INR 4.0 - 7.0 Lacs P.A.

On-site

Full Time

We are looking for a passionate sales hacker with a proven track record of sales success in software solutions/services at Fortune 1000/500/100 companies. You will play a key role in driving our business to great heights and growing our revenue in parallel.

Responsibilities: As a sales lead, you will be:
- Forming strategic partnerships, building/hunting new logos, driving business development, building the team, and acquiring clients, adding successful accounts and managing long-term relationships.
- Managing existing accounts and building a network of new potential clients; creating a great impression, generating a solid stack of qualified leads, and growing them.
- Actively participating in conferences and meetups to network with talented individuals.
- Practicing consultative selling, facilitating solutioning, and leading discussions.

Requirements:
- Strong interpersonal, organisational, presentation, negotiation, and communication skills.
- Prior results-driven sales experience selling enterprise software services/solutions.
- Track record of successfully selling IT solutions/services to director- and C-level executives (CTO, CIO, CEO) at Fortune 1000/500/100 companies.
- Prior experience with a leading BFSI, Retail, Manufacturing, Consumer, or Mobile Technology organisation.
- Consistent track record of running a metrics-driven business target with proven results (quarterly/annually).
- Solid understanding of how customer acquisition impacts the profitability of the business.
- Experience driving the digital transformation of an organisation; should understand business processes, products, and digital services.

Quality Compliance: Compliance with quality and information security standards, ensuring the integrity, confidentiality, and availability of data and the consistent delivery of high-quality services, is an important aspect of hiring for this position.

IT Sales Lead

Udupi, Karnataka, India

4 - 7 years

INR 4.0 - 7.0 Lacs P.A.

On-site

Full Time

We are looking for a passionate sales hacker with a proven track record of sales success in software solutions/services at Fortune 1000/500/100 companies. You will play a key role in driving our business to great heights and growing our revenue in parallel.

Responsibilities: As a sales lead, you will be:
- Forming strategic partnerships, building/hunting new logos, driving business development, building the team, and acquiring clients, adding successful accounts and managing long-term relationships.
- Managing existing accounts and building a network of new potential clients; creating a great impression, generating a solid stack of qualified leads, and growing them.
- Actively participating in conferences and meetups to network with talented individuals.
- Practicing consultative selling, facilitating solutioning, and leading discussions.

Requirements:
- Strong interpersonal, organisational, presentation, negotiation, and communication skills.
- Prior results-driven sales experience selling enterprise software services/solutions.
- Track record of successfully selling IT solutions/services to director- and C-level executives (CTO, CIO, CEO) at Fortune 1000/500/100 companies.
- Prior experience with a leading BFSI, Retail, Manufacturing, Consumer, or Mobile Technology organisation.
- Consistent track record of running a metrics-driven business target with proven results (quarterly/annually).
- Solid understanding of how customer acquisition impacts the profitability of the business.
- Experience driving the digital transformation of an organisation; should understand business processes, products, and digital services.

Quality Compliance: Compliance with quality and information security standards, ensuring the integrity, confidentiality, and availability of data and the consistent delivery of high-quality services, is an important aspect of hiring for this position.

Tech Lead - Databricks

Udupi, Karnataka, India

4 - 7 years

INR 4.0 - 7.0 Lacs P.A.

On-site

Full Time

We are seeking a skilled Databricks Architect to design, implement, and optimize scalable data solutions within our cloud-based data platform. This role requires extensive knowledge of Databricks (Azure/AWS), data engineering, and a deep understanding of data architecture principles, with the ability to drive strategy, best practices, and hands-on implementation for high-performance data processing and analytics solutions.

Responsibilities:
- Solution Architecture: Design and architect end-to-end data solutions using Databricks and Azure/AWS, including data ingestion, processing, and storage.
- Delta Lake Implementation: Leverage Delta Lake and Lakehouse architecture to create robust, unified data structures that support advanced analytics and machine learning.
- Data Processing Development: Design, develop, and automate large-scale, high-performance data processing systems (batch and/or streaming) to drive business growth and enhance the product experience.
- Performance Tuning: Ensure optimal performance of data pipelines and workloads by implementing best practices for resource management, auto-scaling, and query optimization in Databricks.
- Engineering Best Practices: Advocate for high-quality software engineering practices in building scalable data infrastructure and pipelines.
- Architecture/Solution Development: Develop the architecture or solution for large data projects using Databricks.
- Project Leadership: Lead data engineering projects to ensure pipelines are reliable, efficient, testable, and maintainable.
- Data Modeling: Design data models optimized for storage, retrieval, and critical product and business requirements.
- Logging Architecture: Understand and influence logging to support data flow, implementing logging best practices as needed.
- Standardization and Tooling: Contribute to shared data engineering tools and standards to boost productivity and quality for data engineers across the company.
- Collaboration: Work closely with leadership, engineers, program managers, and data scientists to understand and meet data needs.
- Partner Education: Use data engineering expertise to identify gaps and improve existing logging and processes for partners.
- Data Governance: Collaborate with stakeholders to build data lineage, data governance, and data cataloging using Unity Catalog.
- Agile Project Management: Lead projects using agile methodologies.
- Communication: Communicate effectively with stakeholders at all organizational levels.
- Team Development: Recruit, retain, and develop team members, preparing them for increased responsibilities and challenges.

Requirements:
- 10+ years of relevant industry experience.
- ETL Expertise: Skilled in custom ETL design, implementation, and maintenance.
- Data Modeling: Experience in developing and designing data models for reporting systems.
- Databricks Proficiency: Hands-on experience with Databricks SQL workloads.
- Data Ingestion: Expertise in data ingestion from offline files (e.g., CSV, TXT, JSON) as well as API, DB, and CDC data ingestion; should have handled such projects in the past.
- Pipeline Observability: Skilled in setting up robust observability for complete pipelines and Databricks in Azure/AWS.
- Database Knowledge: Proficient in relational databases and SQL query authoring.
- Programming and Frameworks: Experience with Java, Scala, Spark, PySpark, Python, and Databricks.
- Cloud Platforms: Cloud experience required (Azure/AWS preferred).
- Data Scale Handling: Experience working with large-scale data.
- Pipeline Design and Operations: Proven experience in designing, building, and operating robust data pipelines.
- Performance Monitoring: Skilled in deploying high-performance pipelines with reliable monitoring and logging.
- Cross-Team Collaboration: Able to work effectively across teams to establish overarching data architecture and provide team guidance.
- ETL Optimization: Ability to optimize ETL pipelines to reduce data transfer and storage costs.
- Auto Scaling: Skilled in using Databricks SQL's auto-scaling feature to adjust worker numbers based on workload.

Tech Stack:
- Cloud Platform: Azure/AWS.
- Azure/AWS: Databricks SQL Serverless, Databricks SQL, Databricks workspaces, Databricks notebooks, Databricks job scheduling, Data Catalog.
- Data Architecture: Delta Lake, Lakehouse concepts.
- Data Processing: Spark Structured Streaming.
- File Formats: CSV, Avro, Parquet.
- CI/CD: CI/CD for ETL pipelines.
- Governance Model: Databricks SQL unified governance model (Unity Catalog) across clouds, supporting open formats and APIs.
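As a minimal sketch of the Delta Lake work this role centres on, the PySpark snippet below writes and then reads back a Delta table. It assumes a Databricks runtime (or a local Spark session configured with the delta-spark package); the path, schema, and data are illustrative only, not details from the posting.

```python
# Sketch: write and read a Delta table with PySpark.
# Assumes a Databricks runtime or a Spark session configured for Delta Lake.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta-sketch").getOrCreate()

events = spark.createDataFrame(
    [(1, "ingest"), (2, "transform")],
    ["event_id", "stage"],
)

# The Delta format layers ACID transactions and time travel over Parquet files.
events.write.format("delta").mode("overwrite").save("/tmp/delta/events")

spark.read.format("delta").load("/tmp/delta/events").show()
```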

Senior Engineer MLops

Navi Mumbai, Maharashtra, India

0 - 4 years

INR 0.5 - 4.0 Lacs P.A.

On-site

Full Time

As an MLOps Engineer, you will be responsible for building and optimizing our machine learning infrastructure. You will leverage AWS services, containerization, and automation to streamline the deployment and monitoring of ML models. Your expertise in MLOps best practices, combined with your experience in managing large ML operations, will ensure our models are effectively deployed, managed, and maintained in production environments.

Responsibilities:
- Machine Learning Operations (MLOps) & Deployment: Build, deploy, and manage ML models in production using AWS SageMaker, AWS Lambda, and other relevant AWS services. Develop automated pipelines for model training, validation, deployment, and monitoring to ensure high availability and low latency. Implement best practices for CI/CD in ML model deployment and manage versioning for seamless updates.
- Infrastructure Development & Optimization: Design and maintain scalable, efficient, and secure infrastructure for machine learning operations using AWS services (e.g., EC2, S3, SageMaker, ECR, ECS/EKS). Leverage containerization (Docker, Kubernetes) to deploy models as microservices, optimizing for scalability and resilience. Manage infrastructure as code (IaC) using tools like Terraform or AWS CloudFormation, ensuring reliable and reproducible environments.
- Model Monitoring & Maintenance: Set up monitoring, logging, and alerting for deployed models to track model performance, detect anomalies, and ensure uptime. Implement feedback loops to enable automated model retraining based on new data, ensuring models remain accurate and relevant over time. Troubleshoot and resolve issues in the ML pipeline and infrastructure to maintain seamless operations.
- Amazon Connect & Integration: Integrate machine learning models with Amazon Connect or similar services for customer interaction workflows, providing real-time insights and automation. Work closely with cross-functional teams to ensure models can be easily accessed and utilized by various applications and stakeholders.
- Collaboration & Stakeholder Engagement: Collaborate with data scientists, engineers, and DevOps teams to ensure alignment on project goals, data requirements, and model deployment standards. Provide technical guidance on MLOps best practices and educate team members on efficient ML deployment and monitoring processes. Actively participate in project planning, architecture decisions, and roadmapping sessions to improve our ML infrastructure.
- Security & Compliance: Implement data security and compliance measures, ensuring all deployed models meet organizational and regulatory standards. Apply appropriate data encryption and manage access controls to safeguard sensitive information used in ML models.

Requirements:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience as an MLOps Engineer, DevOps Engineer, or similar role focused on machine learning deployment and operations.
- Strong expertise in AWS services, particularly SageMaker, EC2, S3, Lambda, and ECR/ECS/EKS.
- Proficiency in Python, including ML-focused libraries like scikit-learn and data manipulation libraries like pandas.
- Hands-on experience with containerization tools such as Docker and Kubernetes.
- Familiarity with infrastructure as code (IaC) tools such as Terraform or AWS CloudFormation.
- Experience with CI/CD pipelines, Git, and version control for ML model deployment.
- MLOps & Model Management: Proven experience in managing large ML projects, including model deployment, monitoring, and maintenance.
- Amazon Connect & Integration: Understanding of Amazon Connect for customer interactions and integration with ML models.
- Soft Skills: Strong communication and collaboration skills, with the ability to explain technical concepts to non-technical stakeholders.
- Experience with data streaming and message queues (e.g., Kafka, AWS Kinesis).
- Familiarity with monitoring tools like Prometheus, Grafana, or CloudWatch for tracking model performance.
- Knowledge of data governance, security, and compliance requirements related to ML data handling.
- Certification in AWS or relevant cloud platforms.

Work Schedule: This role requires significant overlap with the CST time zone to ensure real-time collaboration with the team and stakeholders based in the U.S. Flexibility is key, and applicants should be available for meetings and work during U.S. business hours.
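As a hedged illustration of the deployment work described above, the boto3 snippet below registers a model and stands up a real-time SageMaker endpoint. The image URI, model artifact path, role ARN, and resource names are all hypothetical placeholders, not values from the posting; a production pipeline would typically drive these calls from CI/CD rather than a script.

```python
# Sketch: deploy a trained model to a real-time SageMaker endpoint with boto3.
# All names, ARNs, and S3 paths below are hypothetical placeholders.
import boto3

sm = boto3.client("sagemaker", region_name="us-east-1")

# Register the model artifact and its serving container.
sm.create_model(
    ModelName="churn-model",
    PrimaryContainer={
        "Image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/churn:latest",
        "ModelDataUrl": "s3://example-bucket/models/churn/model.tar.gz",
    },
    ExecutionRoleArn="arn:aws:iam::123456789012:role/SageMakerExecutionRole",
)

# Describe how the endpoint should host the model.
sm.create_endpoint_config(
    EndpointConfigName="churn-config",
    ProductionVariants=[{
        "VariantName": "primary",
        "ModelName": "churn-model",
        "InstanceType": "ml.m5.large",
        "InitialInstanceCount": 1,
    }],
)

# Create the live endpoint (provisioning runs asynchronously).
sm.create_endpoint(EndpointName="churn-endpoint", EndpointConfigName="churn-config")
```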

Roku Developer

Navi Mumbai, Maharashtra, India

2 - 6 years

INR 2.0 - 6.0 Lacs P.A.

Remote

Full Time

We are excited to announce an opening for a talented Roku Developer to join our vibrant development team. If you are enthusiastic about building scalable and efficient applications and thrive in a collaborative setting, we want to hear from you.

Key Responsibilities:
- Hands-on experience designing channels using Roku, understanding key UI and design principles, and handling remote-control buttons.
- Extensive hands-on experience creating Roku app custom UI controls, high-performing API integrations, advanced state management, cross-device support, and performance optimisation.
- Complete knowledge of SceneGraph components, XML configuration, handling application events, event loops, threads, and controlling screen layouts.

Requirements:
- Develop cutting-edge Roku video streaming apps for OTT.

Quality Compliance: Compliance with quality and information security standards, ensuring the integrity, confidentiality, and availability of data and the consistent delivery of high-quality services, is an important aspect of hiring for this position.

Roku Developer

Udupi, Karnataka, India

2 - 6 years

INR 2.0 - 6.0 Lacs P.A.

Remote

Full Time

We are excited to announce an opening for a talented Roku Developer to join our vibrant development team. If you are enthusiastic about building scalable and efficient applications and thrive in a collaborative setting, we want to hear from you.

Key Responsibilities:
- Hands-on experience designing channels using Roku, understanding key UI and design principles, and handling remote-control buttons.
- Extensive hands-on experience creating Roku app custom UI controls, high-performing API integrations, advanced state management, cross-device support, and performance optimisation.
- Complete knowledge of SceneGraph components, XML configuration, handling application events, event loops, threads, and controlling screen layouts.

Requirements:
- Develop cutting-edge Roku video streaming apps for OTT.

Quality Compliance: Compliance with quality and information security standards, ensuring the integrity, confidentiality, and availability of data and the consistent delivery of high-quality services, is an important aspect of hiring for this position.

Senior Engineer-IVA Chatbot

Navi Mumbai, Maharashtra, India

5 - 9 years

INR 5.0 - 9.0 Lacs P.A.

On-site

Full Time

As part of our digital transformation efforts, we are building an advanced Intelligent Virtual Assistant (IVA) to enhance customer interactions, and we are seeking a talented and motivated Machine Learning (ML) / Artificial Intelligence (AI) Engineer to join our dynamic team full time to support this effort.

Responsibilities:
- Design, develop, and implement AI-driven chatbots and IVAs to streamline customer interactions.
- Work on conversational AI platforms to create a seamless customer experience, with a focus on natural language processing (NLP), intent recognition, and sentiment analysis.
- Collaborate with cross-functional teams, including product managers and customer support, to translate business requirements into technical solutions.
- Build, train, and fine-tune machine learning models to enhance IVA capabilities and ensure high accuracy in responses.
- Continuously optimize models based on user feedback and data-driven insights to improve performance.
- Integrate IVA/chat solutions with internal systems such as CRM and backend databases.
- Ensure scalability, robustness, and security of IVA/chat solutions in compliance with industry standards.
- Participate in code reviews, testing, and deployment of AI solutions to ensure high quality and reliability.

Requirements:
- Bachelor's or Master's degree in Computer Science, Data Science, AI/ML, or a related field.
- 5+ years of experience in developing IVAs/chatbots, conversational AI, or similar AI-driven systems using AWS services.
- Expertise with Amazon Lex, Amazon Polly, AWS Lambda, and Amazon Connect; AWS Bedrock experience along with SageMaker is an added advantage.
- Solid understanding of API integration and experience working with RESTful services.
- Strong problem-solving skills, attention to detail, and the ability to work independently and in a team.
- Excellent communication skills, both written and verbal.
- Experience in financial services or fintech projects.
- Knowledge of data security best practices and compliance requirements in the financial sector.

This role requires significant overlap with the CST time zone to ensure real-time collaboration with the team and stakeholders based in the U.S. Flexibility is key, and applicants should be available for meetings and work during U.S. business hours.
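For a concrete sense of the Amazon Lex integration mentioned above, here is a hedged Python sketch that sends a single user utterance to a Lex V2 bot through boto3 and prints the bot's replies. The bot ID, alias, locale, and session ID are placeholders, not details from the posting.

```python
# Sketch: send one text utterance to an Amazon Lex V2 bot and print its replies.
# botId, botAliasId, and sessionId values are hypothetical placeholders.
import boto3

lex = boto3.client("lexv2-runtime", region_name="us-east-1")

response = lex.recognize_text(
    botId="ABCDEFGHIJ",
    botAliasId="TSTALIASID",
    localeId="en_US",
    sessionId="demo-session-001",
    text="I'd like to check my account balance",
)

# Lex returns zero or more messages plus the recognized intent in sessionState.
for message in response.get("messages", []):
    print(message["content"])
```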

Tech Lead - Databricks

Navi Mumbai, Maharashtra, India

4 - 7 years

INR 4.0 - 7.0 Lacs P.A.

On-site

Full Time

We are seeking a skilled Databricks Architect to design, implement, and optimize scalable data solutions within our cloud-based data platform. This role requires extensive knowledge of Databricks (Azure/AWS), data engineering, and a deep understanding of data architecture principles, with the ability to drive strategy, best practices, and hands-on implementation for high-performance data processing and analytics solutions.

Responsibilities:
- Solution Architecture: Design and architect end-to-end data solutions using Databricks and Azure/AWS, including data ingestion, processing, and storage.
- Delta Lake Implementation: Leverage Delta Lake and Lakehouse architecture to create robust, unified data structures that support advanced analytics and machine learning.
- Data Processing Development: Design, develop, and automate large-scale, high-performance data processing systems (batch and/or streaming) to drive business growth and enhance the product experience.
- Performance Tuning: Ensure optimal performance of data pipelines and workloads by implementing best practices for resource management, auto-scaling, and query optimization in Databricks.
- Engineering Best Practices: Advocate for high-quality software engineering practices in building scalable data infrastructure and pipelines.
- Architecture/Solution Development: Develop the architecture or solution for large data projects using Databricks.
- Project Leadership: Lead data engineering projects to ensure pipelines are reliable, efficient, testable, and maintainable.
- Data Modeling: Design data models optimized for storage, retrieval, and critical product and business requirements.
- Logging Architecture: Understand and influence logging to support data flow, implementing logging best practices as needed.
- Standardization and Tooling: Contribute to shared data engineering tools and standards to boost productivity and quality for data engineers across the company.
- Collaboration: Work closely with leadership, engineers, program managers, and data scientists to understand and meet data needs.
- Partner Education: Use data engineering expertise to identify gaps and improve existing logging and processes for partners.
- Data Governance: Collaborate with stakeholders to build data lineage, data governance, and data cataloging using Unity Catalog.
- Agile Project Management: Lead projects using agile methodologies.
- Communication: Communicate effectively with stakeholders at all organizational levels.
- Team Development: Recruit, retain, and develop team members, preparing them for increased responsibilities and challenges.

Requirements:
- 10+ years of relevant industry experience.
- ETL Expertise: Skilled in custom ETL design, implementation, and maintenance.
- Data Modeling: Experience in developing and designing data models for reporting systems.
- Databricks Proficiency: Hands-on experience with Databricks SQL workloads.
- Data Ingestion: Expertise in data ingestion from offline files (e.g., CSV, TXT, JSON) as well as API, DB, and CDC data ingestion; should have handled such projects in the past.
- Pipeline Observability: Skilled in setting up robust observability for complete pipelines and Databricks in Azure/AWS.
- Database Knowledge: Proficient in relational databases and SQL query authoring.
- Programming and Frameworks: Experience with Java, Scala, Spark, PySpark, Python, and Databricks.
- Cloud Platforms: Cloud experience required (Azure/AWS preferred).
- Data Scale Handling: Experience working with large-scale data.
- Pipeline Design and Operations: Proven experience in designing, building, and operating robust data pipelines.
- Performance Monitoring: Skilled in deploying high-performance pipelines with reliable monitoring and logging.
- Cross-Team Collaboration: Able to work effectively across teams to establish overarching data architecture and provide team guidance.
- ETL Optimization: Ability to optimize ETL pipelines to reduce data transfer and storage costs.
- Auto Scaling: Skilled in using Databricks SQL's auto-scaling feature to adjust worker numbers based on workload.

Tech Stack:
- Cloud Platform: Azure/AWS.
- Azure/AWS: Databricks SQL Serverless, Databricks SQL, Databricks workspaces, Databricks notebooks, Databricks job scheduling, Data Catalog.
- Data Architecture: Delta Lake, Lakehouse concepts.
- Data Processing: Spark Structured Streaming.
- File Formats: CSV, Avro, Parquet.
- CI/CD: CI/CD for ETL pipelines.
- Governance Model: Databricks SQL unified governance model (Unity Catalog) across clouds, supporting open formats and APIs.

Senior Engineer-IVA Chatbot

Udupi, Karnataka, India

5 - 9 years

INR 5.0 - 9.0 Lacs P.A.

On-site

Full Time

As part of our digital transformation efforts, we are building an advanced Intelligent Virtual Assistant (IVA) to enhance customer interactions, and we are seeking a talented and motivated Machine Learning (ML) / Artificial Intelligence (AI) Engineer to join our dynamic team full time to support this effort.

Responsibilities:
- Design, develop, and implement AI-driven chatbots and IVAs to streamline customer interactions.
- Work on conversational AI platforms to create a seamless customer experience, with a focus on natural language processing (NLP), intent recognition, and sentiment analysis.
- Collaborate with cross-functional teams, including product managers and customer support, to translate business requirements into technical solutions.
- Build, train, and fine-tune machine learning models to enhance IVA capabilities and ensure high accuracy in responses.
- Continuously optimize models based on user feedback and data-driven insights to improve performance.
- Integrate IVA/chat solutions with internal systems such as CRM and backend databases.
- Ensure scalability, robustness, and security of IVA/chat solutions in compliance with industry standards.
- Participate in code reviews, testing, and deployment of AI solutions to ensure high quality and reliability.

Requirements:
- Bachelor's or Master's degree in Computer Science, Data Science, AI/ML, or a related field.
- 5+ years of experience in developing IVAs/chatbots, conversational AI, or similar AI-driven systems using AWS services.
- Expertise with Amazon Lex, Amazon Polly, AWS Lambda, and Amazon Connect; AWS Bedrock experience along with SageMaker is an added advantage.
- Solid understanding of API integration and experience working with RESTful services.
- Strong problem-solving skills, attention to detail, and the ability to work independently and in a team.
- Excellent communication skills, both written and verbal.
- Experience in financial services or fintech projects.
- Knowledge of data security best practices and compliance requirements in the financial sector.

This role requires significant overlap with the CST time zone to ensure real-time collaboration with the team and stakeholders based in the U.S. Flexibility is key, and applicants should be available for meetings and work during U.S. business hours.

Android TV Developer

Udupi, Karnataka, India

5 - 8 years

INR 5.0 - 8.0 Lacs P.A.

On-site

Full Time

We are seeking a talented Android TV Developer to join our innovative development team. If you are passionate about creating high-quality, user-friendly TV applications, have strong expertise in Android TV development, and enjoy working in a collaborative environment, we want to hear from you. Your skills in building scalable and efficient TV apps will be key to delivering exceptional entertainment experiences to our users.

Responsibilities:
- Design, architect, and develop mobile solutions (native/hybrid/web) on leading mobile platforms, especially Android and Android TV.
- Work with TV libraries: the androidx libraries available for TV devices that provide widgets for building user interfaces, including building TV layouts, TV navigation, and managing TV controllers.
- Android Leanback experience for TV.
- Expertise with ExoPlayer for streaming videos; experience with video players and native ExoPlayer for streaming.

Requirements:
- Bachelor of Engineering in a CS/IT/ECE stream is preferred.
- Overall 5+ years of application development experience.
- Experienced in integrating mobile apps with web services and external APIs.
- Experience across a wide range of mobility platforms is a must, especially from an enterprise architecture integration design and implementation perspective.
- Ability to define, analyse, and document Enterprise Mobility Architecture.
- Good understanding of industry-standard design patterns such as MVC and MVVM, and good knowledge of SOA, REST/JSON, and SOAP/XML; Android / Android TV / player experience.
- Good communication and collaboration skills.
- Good analytical, problem-solving, and troubleshooting skills.
- Experience in developing reusable artifacts/frameworks, reusable assets, industry solutions, reference architectures, and design, development, and QA best practices.

Android TV Developer

Bengaluru / Bangalore, Karnataka, India

5 - 8 years

INR 5.0 - 8.0 Lacs P.A.

On-site

Full Time

We are seeking a talented Android TV Developer to join our innovative development team. If you are passionate about creating high-quality, user-friendly TV applications, have strong expertise in Android TV development, and enjoy working in a collaborative environment, we want to hear from you. Your skills in building scalable and efficient TV apps will be key to delivering exceptional entertainment experiences to our users.

Responsibilities:
- Design, architect, and develop mobile solutions (native/hybrid/web) on leading mobile platforms, especially Android and Android TV.
- Work with TV libraries: the androidx libraries available for TV devices that provide widgets for building user interfaces, including building TV layouts, TV navigation, and managing TV controllers.
- Android Leanback experience for TV.
- Expertise with ExoPlayer for streaming videos; experience with video players and native ExoPlayer for streaming.

Requirements:
- Bachelor of Engineering in a CS/IT/ECE stream is preferred.
- Overall 5+ years of application development experience.
- Experienced in integrating mobile apps with web services and external APIs.
- Experience across a wide range of mobility platforms is a must, especially from an enterprise architecture integration design and implementation perspective.
- Ability to define, analyse, and document Enterprise Mobility Architecture.
- Good understanding of industry-standard design patterns such as MVC and MVVM, and good knowledge of SOA, REST/JSON, and SOAP/XML; Android / Android TV / player experience.
- Good communication and collaboration skills.
- Good analytical, problem-solving, and troubleshooting skills.
- Experience in developing reusable artifacts/frameworks, reusable assets, industry solutions, reference architectures, and design, development, and QA best practices.

Senior Developer

Bengaluru / Bangalore, Karnataka, India

6 - 11 years

INR 6.0 - 11.0 Lacs P.A.

On-site

Full Time

We are seeking a highly skilled and detail-oriented Senior Developer with expertise in Model-Based Development (MBD) and embedded systems to join our team. The ideal candidate will have strong experience in designing, implementing, and validating software solutions using MATLAB/Simulink/TargetLink and other industry-standard tools. You will play a critical role in developing control algorithms, ensuring software quality, and collaborating across teams to deliver robust and innovative embedded software solutions.

Responsibilities:
- Develop, validate, and maintain models using MATLAB/Simulink/TargetLink and other tools like Embedded Coder for embedded software solutions.
- Perform detailed requirements analysis and develop comprehensive documentation, including design specifications, test plans, and validation reports.
- Design and implement control algorithms in a model-based environment.
- Conduct MIL (Model-in-the-Loop), SIL (Software-in-the-Loop), and PIL (Processor-in-the-Loop) testing to validate models and software.
- Execute manual and automated testing to identify software defects and ensure compliance with quality standards.
- Leverage tools like GitLab, Jenkins, and BTC EmbeddedTester for continuous integration, automated testing, and debugging.
- Streamline back-to-back testing workflows using tools like CoverageMaster winAMS.
- Develop and test software components based on Classic AUTOSAR standards.
- Investigate and resolve complex issues during the integration and deployment phases.

Requirements:
- Minimum 8 years of hands-on experience in Model-Based Development and testing.
- Hands-on experience with MATLAB / Simulink / TargetLink / Embedded Coder / winAMS / TASKING VX / BTC.
- Requirements analysis, functional specification documents, and test specification creation.
- Design, implement, document, test, and debug.
- Good knowledge of testing types: MIL, SIL, PIL.
- Knowledge of and experience in software QA, manual testing, and test automation.
- Performing static analysis, unit testing, and actual device testing of software developed using MBD.
- Experience working with Classic AUTOSAR.
- Testing tool environments: GitLab, Jenkins.

Senior Data Analyst

Udupi, Karnataka, India

3 - 6 years

INR 3.0 - 6.0 Lacs P.A.

On-site

Full Time

Act as the customer-facing contact to understand data requirements and analyse data. Build solutions and systems to manage data and analytics workloads. Improve the reliability, quality, and time-to-market of our suite of software solutions.

Responsibilities:
- Use SQL/Python to extract data from primary and secondary sources.
- Perform analysis to assess the quality and meaning of data.
- Work with programmers, engineers, and management heads to identify process improvement opportunities, propose system modifications, and provide all required inputs for the required change.
- Make manual and automated adjustments to new systems if required.
- Perform data quality checks and testing.
- Work closely with account directors and program managers.
- Monitor and support new implementations by analysing Splunk, AppDynamics, and Grafana dashboards.
- Reverse engineer current PL/SQL modules and manage complex SQL queries.
- Automate the current workload for better throughput.

Required skills and qualifications:
- Excellent written and verbal communication skills.
- Great analytical, critical-thinking, and problem-solving abilities.
- Excellent PL/SQL skills and good exposure to a data transformation language like Python; basic knowledge of AWS, Java, APIs, and databases.
- 5 or more years of data analyst experience.
- Bachelor's degree or higher in business analysis, computer science, or a related field.
- Ability to work with customers and product owners to understand key requirements and turn them into reports or data insights.
- Experience with Jira and Confluence.
- Accuracy and attention to detail.
- Adept at queries and writing reports.
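As a small, hedged illustration of the SQL/Python extraction and data quality checks listed above, the snippet below runs a null-rate check against an in-memory SQLite table. The table and column names are invented for the example; a real workload would point at PostgreSQL or another production source instead.

```python
# Sketch: a simple SQL-driven data quality check from Python.
# Uses an in-memory SQLite table; table/column names are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer_email TEXT);
    INSERT INTO orders VALUES (1, 'a@example.com'), (2, NULL), (3, 'c@example.com');
""")

row = conn.execute("""
    SELECT COUNT(*) AS total,
           SUM(CASE WHEN customer_email IS NULL THEN 1 ELSE 0 END) AS missing
    FROM orders
""").fetchone()

total, missing = row
print(f"{missing}/{total} rows missing customer_email ({100.0 * missing / total:.1f}%)")
```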

Delivery Manager - Data & Analytics

Remote, India

10 - 12 years

INR 10.0 - 12.0 Lacs P.A.

On-site

Full Time

We are looking for an experienced Delivery Manager with a strong background in Data & Analytics to manage client communications, analyze requirements, define and allocate tasks, and track project progress. The ideal candidate should have hands-on experience in data engineering and advanced analytics projects, enabling them to effectively lead and deliver high-quality solutions. This role demands a mix of technical acumen and project management expertise to ensure the successful execution of data-driven initiatives.

Responsibilities:
- Client Communication: Serve as the primary point of contact for clients, ensuring clear and consistent communication throughout the project lifecycle. Understand client needs and translate them into actionable tasks for the team.
- Requirement Analysis: Engage with clients and stakeholders to gather and analyze requirements for data and analytics projects. Ensure all requirements are documented and understood by the team.
- Task Definition & Allocation: Break down high-level project requirements into detailed, low-level tasks. Allocate tasks to appropriate resources based on skill sets and project needs. Ensure all team members have a clear understanding of their responsibilities.
- Project Planning & Roadmap: Define the project roadmap, including milestones, deliverables, and timelines. Develop a Work Breakdown Structure (WBS) to organize tasks and responsibilities. Estimate project timelines and the resources required to achieve project goals.
- Resource Management: Identify and allocate the right resources to project tasks, ensuring optimal utilization. Monitor resource availability and adjust allocations as necessary to meet project demands.
- Progress Tracking & Reporting: Track the progress of tasks and the overall project, ensuring alignment with the roadmap. Identify and address any roadblocks or issues that may impact project delivery. Provide regular updates to stakeholders on project status, risks, and outcomes.
- Technical Oversight: Offer guidance and support to the team on data engineering and analytics projects. Ensure the technical quality and integrity of deliverables. Stay up to date with industry trends and technologies to drive innovation within the team.
- Project Execution: Lead the team in executing project tasks, ensuring adherence to quality standards and deadlines. Coordinate with cross-functional teams to integrate efforts and achieve project objectives.

Requirements:
- Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field; a Master's degree is a plus.
- Minimum of 10 years of experience in project management, with at least 7 years managing data engineering and advanced analytics projects.
- Proven experience in client management and requirement analysis.
- Strong background in data engineering, including knowledge of ETL processes, data warehousing, and data modeling.
- Experience with advanced analytics techniques, including machine learning, statistical analysis, and data visualization.
- Excellent communication and interpersonal skills, with the ability to manage client relationships effectively.
- Strong organizational skills, with the ability to manage multiple tasks and projects simultaneously.
- Proficiency in project management tools such as Jira, Trello, or MS Project.
- Understanding of Agile and Waterfall methodologies.
- Technical proficiency in relevant tools and technologies (e.g., SQL, Python, Big Data technologies, Snowflake, Databricks, and cloud platforms like AWS and Azure).
- Certifications: PMP, PRINCE2, or Agile certification is a plus. AWS or Azure certification is a plus.

Pre-Sales Analyst - Data & Analytics

Remote, India

5 - 9 years

INR 5.0 - 9.0 Lacs P.A.

On-site

Full Time

We are looking for a highly skilled Pre-Sales Analyst - Data & Analytics with 7+ years of experience to join our dynamic team. If you have a proven track record of building growth strategies, vendor management, marketing analysis, product design, solution development, proposal writing, and partnership development within the Data & Analytics domain, we would like to hear from you.

Responsibilities:
- RFP Response and Proposal Writing: Work with the sales and technical teams to formulate RFP responses, proposals, SOWs, and other sales agreements.
- Exploratory Calls Management: Participate in exploratory calls with prospective clients to explain solution and service offerings, gauge client interest, define actionables, and build client confidence before the technical call takes place.
- Market Analysis: Conduct comprehensive market analysis to identify growth opportunities, market trends, and potential areas for expansion within domains like BFSI, B2C, OTT, and manufacturing.
- Solutioning: Drive project solutioning activities to address complex challenges and capitalize on emerging opportunities. Collaborate with cross-functional teams to implement effective solutions.
- Customer Acquisition: Develop and implement effective customer acquisition strategies, leveraging data-driven insights and market research to target and attract the right customer segments.
- Cross-Sell: Keep track of projects, prospects, and initiatives, and look for new opportunities.
- Collateral and Marketing Content: Work with the marketing and communication team to design collateral, sales decks, and social media content.
- Business Operations: Manage and track business KPIs and assist management in making data-driven decisions.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or Business Administration.
- 5+ years of experience in pre-sales, customer acquisition, etc. within the Data & Analytics domain.
- Immense resolve to achieve ambitious targets.
- Must have worked on data-related projects in the past as a developer, architect, or project manager.
- Basic understanding of cloud technologies and solutions.
- Excellent communication and interpersonal skills, with the ability to effectively collaborate with cross-functional teams and stakeholders.
- Strong problem-solving and analytical skills, with a focus on delivering innovative and scalable data solutions.
- Relevant certifications in data architecture, cloud computing, or related areas are a plus.

Software Developer

Bengaluru / Bangalore, Karnataka, India

1 - 5 years

INR 1.0 - 5.0 Lacs P.A.

On-site

Full Time

We are looking for a talented Software Developer (MBD) to join our dynamic team. The ideal candidate will have strong expertise in designing and debugging control algorithms, working with AUTOSAR standards, and leveraging automotive communication protocols to deliver robust and efficient embedded software solutions.

Responsibilities:
- Design, develop, and implement control algorithms using MATLAB, Simulink, Stateflow, and TargetLink.
- Generate and validate code using autocode generation tools such as Embedded Coder or BTC EmbeddedTester.
- Perform model design, optimization, and debugging at both the model and code levels.
- Develop and configure software components compliant with Classic AUTOSAR standards and ensure integration of AUTOSAR software modules.
- Develop and modify C/C++ code and scripting tools for custom implementations, automation, and enhanced functionality.
- Work with Ethernet, CAN, SPI, and LIN protocols to ensure robust communication between electronic control units (ECUs) in automotive systems.
- Verify system functionality through simulation, static analysis, and unit-level and system-level testing to validate models and generated code.

Requirements:
- 4+ years of experience in Model-Based Development.
- Hands-on experience with MATLAB / Simulink / Stateflow / TargetLink / Embedded Coder / BTC.
- Requirements analysis, functional specification documents, and test specification creation.
- Design, implement, document, test, and debug.
- Reasonable knowledge of testing types: MIL, SIL.
- Performing static analysis, unit testing, and actual device testing of software developed using MBD.
- Experience working with Classic AUTOSAR.

Software Developer (MBD & BTC)

Bengaluru / Bangalore, Karnataka, India

1 - 5 years

INR 1.0 - 5.0 Lacs P.A.

On-site

Full Time

We are looking for a proactive and skilled Software Developer (MBD) to join our team. The ideal candidate will have expertise in designing, developing, validating, and optimizing control models for embedded systems using tools like MATLAB, Simulink, and TargetLink. This role involves translating system requirements into efficient software designs, conducting thorough testing, and collaborating across teams to deliver high-quality solutions for safety-critical applications.

Responsibilities:
- Design and develop control models using MATLAB, Simulink, Stateflow, and TargetLink for embedded software systems. Validate generated code using tools like Embedded Coder.
- Perform requirements analysis and produce technical documentation to support the development lifecycle.
- Translate system requirements into software architecture and design using model-based approaches and adherence to coding standards.
- Conduct MIL (Model-in-the-Loop) and SIL (Software-in-the-Loop) testing to verify and validate software models. Perform static analysis, unit testing, and actual device testing.
- Identify and resolve issues during the development and integration phases by debugging and testing software models.
- Work with Classic AUTOSAR standards to design, integrate, and test AUTOSAR-compliant software components.
- Follow best practices in code generation, version control, and documentation to maintain project consistency.

Requirements:
- 4+ years of experience in Model-Based Development.
- Strong working knowledge of MATLAB/Simulink and autocode generation.
- Hands-on development experience, including design, debugging, and auto-code generation using MATLAB / Simulink / Stateflow / TargetLink / BTC.
- Experience working with Classic AUTOSAR.
- Experience with standard scripting languages or C/C++ programming.
- Good knowledge of automotive communication protocols: Ethernet, CAN, SPI, LIN.
- Knowledge of and experience in unit-level and system-level testing.
- Strong collaboration and communication skills.
