6 Job openings at Skan
About Skan

Skan is an AI-powered software company that specializes in process discovery and analytics, enabling enterprises to visualize and optimize their business operations.

Data Engineer - Databricks Specialist

Bengaluru

3 - 5 years

INR 4.0 - 8.0 Lacs P.A.

Work from Office

Full Time

Job Summary
We are seeking a skilled Data Engineer with 3 to 5 years of experience in building scalable data pipelines and solutions, with strong hands-on expertise in Databricks. The ideal candidate should be proficient in large-scale data processing frameworks and have a solid understanding of Delta Lake, PySpark, and cloud-based data platforms.

Key Responsibilities:
Design, build, and maintain robust ETL/ELT pipelines using Databricks (PySpark/SQL), as sketched at the end of this listing.
Develop and optimize data workflows and pipelines on Delta Lake and the Databricks Lakehouse architecture.
Integrate data from multiple sources, ensuring data quality, reliability, and performance.
Collaborate with data scientists, analysts, and business stakeholders to translate requirements into scalable data solutions.
Monitor and troubleshoot production data pipelines; ensure performance and cost optimization.
Work with DevOps teams for CI/CD integration and automation of Databricks jobs and notebooks.
Maintain metadata, documentation, and versioning for data pipelines and assets.

Required Skills:
3-5 years of experience in data engineering or big data development.
Strong hands-on experience with Databricks (Notebooks, Jobs, Workflows).
Proficiency in PySpark, Spark SQL, and Delta Lake.
Experience working with Azure or AWS (preferably Azure Data Lake, Blob Storage, Synapse, etc.).
Strong SQL skills for data manipulation and analysis.
Familiarity with Git, CI/CD pipelines, and job orchestration tools (e.g., Airflow, Databricks Workflows).
Understanding of data modeling, data warehousing, and data governance best practices.
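
For illustration only (not part of the original posting): a minimal sketch of the kind of Databricks pipeline work described above, using PySpark and Delta Lake. The paths, table names, and columns are hypothetical.

```python
# Minimal Databricks-style ETL step using PySpark and Delta Lake.
# Paths and column names (raw_orders, order_id, amount) are hypothetical examples.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Read raw data, apply basic quality filters, and derive a partition column.
raw = spark.read.json("/mnt/bronze/raw_orders/")
clean = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("amount") > 0)
       .withColumn("order_date", F.to_date("order_ts"))
)

# Write to a Delta table partitioned by date; Delta Lake provides ACID writes and time travel.
(clean.write.format("delta")
      .mode("overwrite")
      .partitionBy("order_date")
      .save("/mnt/silver/orders"))
```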

Customer Success Operations Lead

Bengaluru

7 - 12 years

INR 10.0 - 14.0 Lacs P.A.

Work from Office

Full Time

We are seeking a dynamic Customer Success and Services Operations Lead to drive customer satisfaction, retention, and growth while optimizing operational processes across our customer-facing teams. This role combines strategic customer relationship management with operational excellence to ensure seamless customer experiences and scalable business operations.

Key Responsibilities
Design and optimize customer success processes and workflows.
Establish and track key performance indicators (KPIs) for customer success metrics.
Help with the implementation and configuration of the customer success technology stack (CRM, CS platforms, analytics tools).
Create standardized processes for customer onboarding, support escalation, and renewal management.
Develop customer segmentation strategies and playbooks.
Manage resource allocation and capacity planning for the customer success team.

Data Analysis and Reporting
Locate, gather, and organize relevant data from various internal and external sources.
Ensure the accuracy, completeness, and quality of data by implementing data validation techniques and audits.
Create comprehensive reporting dashboards for leadership and stakeholders.
Develop and maintain dashboards, reports, and analytics to track KPIs for Customer Success.
Analyze data to provide actionable insights and recommendations to support customer retention and satisfaction initiatives.
Support ad-hoc reporting needs and provide analytical support for ongoing projects.

Cross-functional Collaboration
Partner with Customer Success Managers and other stakeholders to understand business needs and translate them into process improvements and reporting solutions.
Work closely with IT, Product, and other teams to ensure seamless integration of data systems and tools.

Process Documentation & Maintenance
Develop, document, and maintain standardized processes for the Customer Success team.
Continuously review and refine processes to ensure efficiency and alignment with company goals.
Create and update process documentation, manuals, and guides to ensure consistency and clarity.

Qualifications
Education: Bachelor's degree in Business, Finance, Data Analytics, Information Systems, or a related field.
Experience: 7+ years of experience in a similar role, preferably within Customer Success, Professional Services, Operations, or Data Analytics; experience with data management, reporting, and analysis tools (e.g., Excel, SQL, Tableau, Power BI); experience in process documentation and improvement initiatives.

Skills:
Proficiency with CRM systems (Salesforce, HubSpot, etc.).
Experience with customer success platforms (Gainsight, ChurnZero, Totango).
Strong analytical skills with proficiency in Excel/Google Sheets and data visualization tools.
Excellent attention to detail and commitment to data accuracy.
Proficiency in process documentation and workflow design.
Effective communication and collaboration skills with cross-functional teams.
Ability to manage multiple tasks and prioritize effectively in a fast-paced environment.

Senior Frontend Engineer

Bengaluru

4 - 8 years

INR 6.0 - 10.0 Lacs P.A.

Work from Office

Full Time

We are looking for a passionate and experienced Senior Frontend Engineer to join our engineering team. You will be instrumental in shaping the user experience of our products, building scalable UI components, and driving front-end best practices across the organization. Our platform enables Agent-to-Agent communication and autonomous decision-making, moving beyond traditional LLM applications into structured, enterprise-grade action flows. This is your chance to build the future of agentic AI systems and shape how intelligent agents collaborate, reason, and execute in real-world enterprise processes.

Responsibilities
Design and develop high-quality, scalable, and maintainable front-end features using modern JavaScript frameworks.
Collaborate with cross-functional teams including product, design, and backend engineers to deliver end-to-end solutions.
Write modular, testable, and clean code using component-based architecture.
Participate in code reviews and contribute to frontend architecture decisions.
Ensure high performance and responsiveness of UI across devices and platforms.
Implement unit tests and end-to-end tests to ensure code reliability.
Maintain and improve CI/CD workflows for frontend code.
Advocate for and implement frontend security and privacy best practices.

Required Skills
4-8 years of professional frontend development experience.
Strong proficiency in HTML, CSS, and JavaScript (ES6+).
Solid experience with React, Angular, or Svelte.
Working knowledge of TypeScript in a frontend environment.
Deep understanding of responsive and accessible design principles.
Experience integrating with RESTful APIs.
Proficient with Git and modern version control workflows (e.g., GitHub/GitLab).
Familiarity with modern build tools such as Vite, npm, or pnpm.
Hands-on experience with end-to-end testing tools like Playwright or Puppeteer.
Comfortable with unit testing frameworks (e.g., Vitest, Jest).
Understanding of CI/CD pipelines and frontend deployment processes.
Awareness of enterprise security and data privacy best practices.

Full-Stack Engineer

Bengaluru

3 - 6 years

INR 5.0 - 8.0 Lacs P.A.

Work from Office

Full Time

We are hiring a Full-Stack Engineer with a strong backend foundation to help us build scalable, secure, and intelligent systems. You'll be building at the core of AI innovation, working with Large Language Models (LLMs) and cutting-edge agent development frameworks like LangGraph, CrewAI, and the Model Context Protocol (MCP). Our platform enables Agent-to-Agent communication and autonomous decision-making, moving beyond traditional LLM applications into structured, enterprise-grade action flows. This is your chance to build the future of agentic AI systems and shape how intelligent agents collaborate, reason, and execute in real-world enterprise processes.

Key Responsibilities
Develop scalable backend systems using Python (Django, FastAPI).
Design and maintain secure RESTful APIs and backend services.
Model and manage data using relational databases and ORMs.
Build containerized services using Docker; deploy and debug on Linux.
Integrate AI features using LLMs or external AI APIs (see the illustrative sketch at the end of this listing).
Collaborate on frontend features using JavaScript/TypeScript and frameworks like Svelte, React, or Angular.
Implement unit and integration tests, CI/CD pipelines, and observability tools.
Ensure security, performance, and compliance across services.

Technical Skills
3-6 years of experience.
Python, Django, FastAPI.
REST API design and development.
SQL databases (PostgreSQL/MySQL) and ORMs (Django ORM, SQLAlchemy).
Docker, Git, CI/CD pipelines.
JavaScript/TypeScript (basic proficiency); modern frontend frameworks.

Bonus Points For
Asynchronous Python (e.g., asyncio, FastAPI async routes).
Event-driven architecture (Kafka, RabbitMQ).
Kubernetes, Helm, or IaC tools (Terraform, Ansible).
Security/auth (OAuth2, OpenID), RBAC.
Monitoring/observability (Prometheus, Grafana, ELK).
Experience with AI model serving, ETL pipelines, or compliance standards.
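
For illustration only (not part of the original posting): a minimal sketch of a FastAPI endpoint that forwards a prompt to an OpenAI-compatible LLM API. The route name and model id are assumptions.

```python
# Illustrative FastAPI service wrapping an LLM call; the /ask route and model name are hypothetical.
from fastapi import FastAPI
from pydantic import BaseModel
from openai import OpenAI

app = FastAPI()
client = OpenAI()  # reads OPENAI_API_KEY from the environment

class Ask(BaseModel):
    prompt: str

@app.post("/ask")
def ask(body: Ask) -> dict:
    # Forward the prompt to the model and return the first completion's text.
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": body.prompt}],
    )
    return {"answer": resp.choices[0].message.content}
```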

Lead Software Developer (Python)

Bengaluru

6 - 8 years

INR 9.0 - 14.0 Lacs P.A.

Work from Office

Full Time

We are seeking a skilled and motivated Lead Software Developer (Python) with 6-8 years of hands-on experience in designing, developing, and deploying Python-based microservices. The ideal candidate should have expertise working with cloud-native architectures using Docker and Kubernetes, and integrating services with Large Language Models (LLMs) via OpenAI APIs. You will lead the design, development, and deployment of scalable services while working in an Agile/Scrum environment.

Essential Duties and Responsibilities:
Lead the design and implementation of Python microservices hosted on Kubernetes or Docker environments.
Develop and maintain Python microservices that communicate with each other via RabbitMQ (see the illustrative sketch at the end of this listing).
Design and optimize database schemas; implement data access layers using PostgreSQL and MongoDB.
Integrate LLM capabilities via OpenAI or similar APIs into microservices.
Write unit, integration, and system tests; ensure code quality and maintainability.
Track work progress and maintain up-to-date tasks on Azure Boards (or similar work item tracking systems).
Manage source code repositories, branching strategies, pull requests, and reviews using Git-based tools (Azure Repos, GitHub, or similar).
Build, configure, and maintain CI/CD pipelines using Azure Pipelines for automated testing and deployments.
Participate in Agile ceremonies (sprint planning, stand-ups, retrospectives) and collaborate effectively with cross-functional teams.
Mentor and guide junior developers on coding standards, best practices, and architecture decisions.

Required experience:
Programming skills: Strong expertise in Python (3.x), with knowledge of best practices for building scalable services; application of proven programming principles and patterns; prompt engineering skill to generate code using Cursor, Copilot, or similar tools.
Design skills: Strong expertise in OOP with Python; strong code design and modelling skills using UML or similar tools.
Frameworks/libraries: Experience with popular Python frameworks and libraries such as SQLAlchemy and Alembic; experience building custom Python packages.
Containerization & orchestration: Hands-on experience with Docker and Kubernetes (AKS or self-hosted).
Messaging systems: Proven experience using RabbitMQ for asynchronous service communication.
Databases: Relational: PostgreSQL (schema design, performance tuning). NoSQL: MongoDB (data modelling, CRUD operations).
APIs & LLMs: Integration of microservices with LLMs or OpenAI APIs; handling authentication, request/response flows, and prompt engineering basics.

In-depth knowledge and experience in the following areas (high-level understanding of the following tech stack):
Source control with Git (Azure Repos, GitHub).
Building and deploying using Azure Pipelines or similar CI/CD tools.
Familiarity with container registries (ACR, Docker Hub).
Work item tracking and task management using Azure Boards (or Jira).
Experience working in Agile/Scrum methodologies, including sprint ceremonies, story point estimation, and continuous delivery.

Additional skills:
Excellent problem-solving and troubleshooting/debugging skills.
Strong understanding of RESTful API design principles.
Familiarity with observability tools (logging, metrics, tracing) is a plus.
Excellent verbal and written communication skills.
Ability to lead technical discussions and present solutions effectively.
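
For illustration only (not part of the original posting): a minimal sketch of RabbitMQ-based asynchronous communication between services, using the pika client. The queue name and message payload are hypothetical.

```python
# Minimal RabbitMQ publisher using pika; queue name and message shape are hypothetical.
import json
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()

# Declare a durable queue so messages survive a broker restart.
channel.queue_declare(queue="task_events", durable=True)

event = {"task_id": 42, "status": "completed"}
channel.basic_publish(
    exchange="",
    routing_key="task_events",
    body=json.dumps(event),
    properties=pika.BasicProperties(delivery_mode=2),  # mark the message as persistent
)
connection.close()
```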

Sr Software Development Engineer

Bengaluru

3 - 8 years

INR 5.0 - 10.0 Lacs P.A.

Work from Office

Full Time

What You'll Do
Develop and maintain backend APIs and services using FastAPI and Flask-RESTX (see the illustrative sketch at the end of this listing).
Design microservices-based solutions that are scalable, modular, and maintainable.
Work with PostgreSQL and MongoDB to build robust data models and efficient queries.
Implement messaging and task workflows using RabbitMQ.
Integrate secure authentication and authorization flows using Auth0.
Monitor and debug production systems using Elasticsearch and APM tools.
Write clean, testable code and participate in design/code reviews.
Collaborate with cross-functional teams across engineering, DevOps, and product.

Must-Have Skills
Strong hands-on experience in Python backend development.
Practical experience with FastAPI, Flask, or Flask-RESTX.
Solid understanding and real-world experience with microservices architecture.
Proficiency in either MongoDB or PostgreSQL (ideally both).
Experience with RabbitMQ for async messaging and job queues.
Familiarity with API security and integration using Auth0 or similar.
Understanding of observability practices using Elasticsearch and APM.
Strong debugging, performance tuning, and optimization skills.

Nice to Have
Experience with SQLAlchemy and Alembic for ORM and migrations.
Exposure to PostgREST or GraphQL APIs.
Knowledge of containerized development with Docker.
Familiarity with CI/CD workflows and Git-based version control.
Prior experience in event-driven, large-scale data processing systems.
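
For illustration only (not part of the original posting): a minimal Flask-RESTX resource of the kind this role builds. The route, model fields, and stubbed lookup are hypothetical.

```python
# Illustrative Flask-RESTX API; the Task model and /tasks route are hypothetical.
from flask import Flask
from flask_restx import Api, Resource, fields

app = Flask(__name__)
api = Api(app, title="Tasks API")

task_model = api.model("Task", {
    "id": fields.Integer(readonly=True),
    "name": fields.String(required=True),
})

@api.route("/tasks/<int:task_id>")
class TaskResource(Resource):
    @api.marshal_with(task_model)
    def get(self, task_id):
        # Stubbed lookup; a real service would query PostgreSQL or MongoDB here.
        return {"id": task_id, "name": "example task"}

if __name__ == "__main__":
    app.run(debug=True)
```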

Skan

Information Technology / Process Analytics

Mountain View

51-200 Employees

6 Jobs
