
5 Sigma Computing Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

We are looking for a Mid-Level Full Stack Developer specializing in Data Visualization & Reporting, with strong skills in tools such as Power BI, SSIS, and Snowflake. The role is based in Bangalore under a hybrid work model and requires 7-10 years of experience, with an immediate to 15 days' notice period.

As a Full Stack Developer, you will analyze, design, and develop online dashboards, visualizations, and offline reports in an agile environment, working across the complete secure software development life cycle from concept to deployment.

Mandatory skills include proficiency in reporting tools such as Sigma Computing, Power BI, and Microsoft SQL Server, along with Microservices and Event-driven architecture using C#/.NET. You should also have a strong understanding of Artificial Intelligence (AI) and GenAI tools to accelerate development, and experience in Data Modeling and Data Engineering using Snowflake is crucial for this role. Knowledge of Agile methodologies and GenAI is considered a bonus.

The interview process for this position will be conducted virtually. If you are interested in this opportunity, please share your resume with netra.s@twsol.com.

Posted 2 days ago

Apply

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

You will work as a Mid-Level Full Stack Developer in Data Visualization & Reporting, with expertise in tools such as Power BI, SSIS, and Snowflake. Your primary responsibilities will include the analysis, design, and development of online dashboards, visualizations, and offline reports in an agile environment, and you will be involved in the full software development life cycle from conception to deployment.

Mandatory skills include proficiency in reporting tools such as Sigma Computing, Power BI, and Microsoft SQL Server, as well as Microservices and Event-driven architecture using C#/.NET. Strong familiarity with Artificial Intelligence (AI) and GenAI tools for development acceleration is essential, and you must have solid experience in Data Modeling and Data Engineering using Snowflake. Knowledge of Agile methodologies and GenAI is considered nice to have for this role.

The work model for this position is hybrid, and the work location is Bangalore. The ideal candidate has 7-10 years of relevant experience and can join immediately or within 15 days. The interview process will be conducted virtually. If this opportunity aligns with your expertise and interests, please share your resume with netra.s@twsol.com.

Posted 3 days ago

Apply

7.0 - 12.0 years

0 - 2 Lacs

Hyderabad, Pune

Hybrid

Role: Data Analytical Engineer
Experience: 7+ years
Work location: Hyderabad & Pune
Work mode: Hybrid (3 days in office)

Purpose of the Role
The Data Engineer will play a critical role in delivering key milestones of the Procurement Data Lake Plan. This includes ingesting and transforming data from procurement systems, cleaning and organizing it in Snowflake, and creating dashboard-ready datasets for Sigma Computing. You will help ensure data reliability, reduce manual work, and enable automated insights for stakeholders across procurement, legal, and operations.

Key Responsibilities
- Design, build, and maintain scalable ETL data pipelines.
- Ingest, clean, and standardize data from Coupa, NetSuite, IntelAgree, Zip, ProcessUnity, and Monday.com.
- Integrate data into Snowflake with appropriate schema design and performance optimization.
- Enable real-time and scheduled analytics through Sigma Computing dashboards.
- Collaborate with procurement, legal, and data teams to meet milestone reporting needs.
- Document workflows, datasets, and dashboard requirements.

Technical Requirements
- Advanced SQL for transformation and analytics use cases.
- Proficiency in Python or R for data wrangling and automation.
- Experience using Airflow or similar tools for orchestration.
- Strong understanding of Snowflake or an equivalent cloud data warehouse.
- Proficiency in Sigma Computing, Tableau, or similar BI tools: building dashboards, designing datasets, and user interactivity.
- Familiarity with Git and version control best practices.

Preferred Qualifications
- Background in procurement, finance, or legal analytics.
- Experience with procurement tools such as Coupa, IntelAgree, Zip, NetSuite, and ProcessUnity.
- Strong stakeholder engagement and communication skills.
- Agile and milestone-driven project delivery experience.

Expected Deliverables
- Automated data pipelines for spend, contract, intake, and travel & expense data.
- Cleaned, structured datasets stored in Snowflake.
- Sigma dashboards that support milestone and executive reporting.
- Documentation of data processes, schemas, and maintenance runbooks.
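The pipeline work this posting describes (ingest from procurement systems, clean and standardize, load into Snowflake, surface in Sigma) is typically orchestrated with a tool like Airflow, which the requirements list. Below is a minimal illustrative sketch of that pattern, not the employer's actual pipeline; the Coupa export path, the SPEND_RAW staging table, and the connection details are all hypothetical placeholders.

```python
# Minimal Airflow sketch of an ingest -> clean -> load pipeline of the kind
# described above. The export file, the SPEND_RAW staging table, and the
# Snowflake credentials are hypothetical placeholders, not a real setup.
from datetime import datetime

import pandas as pd
from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def procurement_spend_pipeline():
    @task
    def extract() -> str:
        # In practice this would pull from the Coupa API or an S3 drop;
        # here it just points at a hypothetical local export file.
        return "/tmp/coupa_spend_export.csv"

    @task
    def clean(path: str) -> str:
        # Standardize column names and drop rows missing a supplier ID.
        df = pd.read_csv(path)
        df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
        df = df.dropna(subset=["supplier_id"])
        cleaned = "/tmp/coupa_spend_clean.csv"
        df.to_csv(cleaned, index=False)
        return cleaned

    @task
    def load(path: str) -> None:
        # Load the cleaned file into a (hypothetical) Snowflake staging table
        # via the table stage; snowflake-connector-python is assumed installed.
        import snowflake.connector

        conn = snowflake.connector.connect(
            account="my_account", user="etl_user", password="***",
            warehouse="ETL_WH", database="PROCUREMENT", schema="STAGING",
        )
        try:
            cur = conn.cursor()
            cur.execute(f"PUT file://{path} @%SPEND_RAW")
            cur.execute(
                "COPY INTO SPEND_RAW FROM @%SPEND_RAW "
                "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
            )
        finally:
            conn.close()

    load(clean(extract()))


procurement_spend_pipeline()
```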

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

You will be responsible for designing, developing, and maintaining dashboards and reports using Sigma Computing. Your main focus will be on collaborating with business stakeholders to understand data requirements and deliver actionable insights. You will write and optimize SQL queries that run directly on cloud data warehouses, and enable self-service analytics for business users via Sigma's spreadsheet interface and templates. You will apply row-level security and user-level filters to ensure proper data access controls, work closely with data engineering teams to validate data accuracy and ensure model alignment, and troubleshoot performance or data issues in reports and dashboards. You will also train and support users on Sigma best practices, tools, and data literacy.

To excel in this role, you should have at least 5 years of experience in Business Intelligence, Analytics, or Data Visualization roles. Hands-on experience with Sigma Computing is highly preferred. Strong SQL skills and experience with cloud data platforms such as Snowflake, BigQuery, or Redshift are essential, along with familiarity with data modeling concepts and modern data stacks. The ability to translate business requirements into technical solutions is crucial, as is knowledge of data governance, security, and role-based access controls. Excellent communication and stakeholder management skills are necessary for effective collaboration. Experience with tools such as Looker, Tableau, or Power BI is beneficial for comparative insight, familiarity with dbt, Fivetran, or other ELT/ETL tools is a plus, and exposure to Agile or Scrum methodologies would be advantageous.
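Because Sigma pushes queries down to the warehouse, the row-level security and user-level filters mentioned above are often backed by a secure view keyed to the querying user (Sigma also offers its own user-attribute filters, configured in the product rather than in SQL). The following is a minimal sketch of the warehouse-side pattern only, assuming hypothetical ORDERS and REGION_ENTITLEMENTS tables in Snowflake.

```python
# Hedged sketch of a warehouse-side row-level security pattern that a Sigma
# dashboard could sit on top of. The ORDERS table, the REGION_ENTITLEMENTS
# mapping table, and all column names are hypothetical placeholders.
import snowflake.connector

ROW_LEVEL_VIEW = """
CREATE OR REPLACE SECURE VIEW ANALYTICS.ORDERS_RLS AS
SELECT o.*
FROM   RAW.ORDERS o
JOIN   ADMIN.REGION_ENTITLEMENTS e
       ON e.region = o.region
WHERE  e.user_name = CURRENT_USER()   -- each viewer only sees entitled regions
"""

def create_rls_view() -> None:
    # Connection parameters are placeholders; substitute real credentials.
    conn = snowflake.connector.connect(
        account="my_account", user="bi_admin", password="***",
        warehouse="BI_WH", database="ANALYTICS",
    )
    try:
        conn.cursor().execute(ROW_LEVEL_VIEW)
    finally:
        conn.close()

if __name__ == "__main__":
    create_rls_view()
```

Keeping the entitlement mapping in a table, rather than hard-coding users into the view, lets access changes be handled as data updates instead of DDL changes.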

Posted 1 week ago

Apply

4 - 9 years

20 - 27 Lacs

Pune, Delhi / NCR

Hybrid

Job Description:
- Design, develop, and maintain dashboards and reports using Sigma Computing.
- Collaborate with business stakeholders to understand data requirements and deliver actionable insights.
- Write and optimize SQL queries that run directly on cloud data warehouses.
- Enable self-service analytics for business users via Sigma's spreadsheet interface and templates.
- Apply row-level security and user-level filters to ensure proper data access controls.
- Partner with data engineering teams to validate data accuracy and ensure model alignment.
- Troubleshoot performance or data issues in reports and dashboards.
- Train and support users on Sigma best practices, tools, and data literacy.

Required Skills & Qualifications:
- 3+ years of experience in Business Intelligence, Analytics, or Data Visualization roles.
- Hands-on experience with Sigma Computing is highly preferred.
- Strong SQL skills and experience working with cloud data platforms (Snowflake, BigQuery, Redshift, etc.).
- Experience with data modeling concepts and modern data stacks.
- Ability to translate business requirements into technical solutions.
- Familiarity with data governance, security, and role-based access controls.
- Excellent communication and stakeholder management skills.
- Experience with Looker, Tableau, Power BI, or similar tools (for comparative insight).
- Familiarity with dbt, Fivetran, or other ELT/ETL tools.
- Exposure to Agile or Scrum methodologies.
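The "dashboard-ready dataset" idea that both Sigma postings emphasize usually amounts to a single aggregated query the BI tool reads directly from the warehouse. The sketch below shows one such query being previewed locally before it is exposed in a workbook; the ANALYTICS.INVOICES table, its columns, and the connection settings are hypothetical.

```python
# Minimal sketch of a dashboard-ready dataset: one aggregated query that a
# Sigma workbook (or Tableau/Power BI) can read directly. Table and column
# names are hypothetical placeholders.
import pandas as pd
import snowflake.connector

MONTHLY_SPEND_SQL = """
SELECT DATE_TRUNC('month', invoice_date) AS month,
       supplier_name,
       SUM(amount_usd)                   AS total_spend,
       COUNT(DISTINCT invoice_id)        AS invoice_count
FROM   ANALYTICS.INVOICES
GROUP  BY 1, 2
ORDER  BY 1, 3 DESC
"""

def preview_dataset() -> pd.DataFrame:
    # Quick local validation of the dataset before exposing it in a dashboard.
    conn = snowflake.connector.connect(
        account="my_account", user="bi_dev", password="***",
        warehouse="BI_WH", database="ANALYTICS",
    )
    try:
        cur = conn.cursor()
        cur.execute(MONTHLY_SPEND_SQL)
        return cur.fetch_pandas_all()  # needs the pandas extra of the connector
    finally:
        conn.close()

if __name__ == "__main__":
    print(preview_dataset().head())
```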

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies