
68 SnowSQL Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

6.0 - 10.0 years

20 - 25 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Snowflake Developer for a reputed US-based IT MNC. If you are a Snowflake/Matillion Developer, email your CV to jagannaath@kamms.net.
Experience: 5+ years (only 100% real-time experience will be considered). Role: Snowflake Developer. Preferred: Snowflake certifications (SnowPro Core/Advanced). Position Type: Full-time/Permanent. Location: Hyderabad, Bengaluru and Chennai (hybrid; local candidates). Notice Period: Immediate to 15 days. Salary: As per experience.
Responsibilities: 5+ years of experience in data engineering, ETL, and Snowflake development. Strong expertise in Snowflake SQL scripting, performance tuning, and data warehousing concepts. Strong knowledge of cloud platforms (AWS/Azure/GCP) and cloud-based data architecture. Proficiency in SQL, Python, or scripting languages for automation and transformation. Experience with API integrations and data ingestion frameworks. Understanding of data governance, security policies, and access control in Snowflake. Excellent communication skills and the ability to interact with business and technical stakeholders. A self-starter who can work independently and drive projects to completion.
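
For candidates brushing up on the SQL scripting and performance-tuning skills this posting lists, a minimal sketch follows; the warehouse name, size, and timeout are illustrative, not taken from the posting:

```sql
-- Create a small, cost-aware warehouse for ETL work (names are hypothetical).
CREATE WAREHOUSE IF NOT EXISTS etl_wh
  WAREHOUSE_SIZE = 'XSMALL'
  AUTO_SUSPEND   = 60      -- suspend after 60 seconds idle to save credits
  AUTO_RESUME    = TRUE;

-- Surface the slowest queries of the last week as tuning candidates.
SELECT query_id, total_elapsed_time / 1000 AS elapsed_s, query_text
FROM   snowflake.account_usage.query_history
WHERE  start_time > DATEADD(day, -7, CURRENT_TIMESTAMP())
ORDER  BY total_elapsed_time DESC
LIMIT  10;
```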

Posted 2 days ago

Apply

5.0 - 10.0 years

5 - 10 Lacs

Bengaluru, Karnataka, India

On-site

Job Summary
Role & responsibilities: Outline the day-to-day responsibilities for this role.
Preferred candidate profile: Specify required role expertise, previous job experience, or relevant certifications.
Role: Security Engineer / Analyst. Industry Type: IT Services & Consulting. Department: IT & Information Security. Employment Type: Full Time, Permanent. Role Category: IT Security.

Posted 2 days ago

Apply

6.0 - 11.0 years

35 - 50 Lacs

Pune, Gurugram, Delhi / NCR

Hybrid

Role: Snowflake Data Engineer. Mandatory Skills: #Snowflake, #Azure, #DataFactory, SQL, Python, #DBT / #Databricks. Location (Hybrid): Bangalore, Hyderabad, Chennai, Pune, Gurugram & Noida. Budget: Up to 50 LPA. Notice: Immediate to 30 days (serving notice). Experience: 6-11 years.
Key Responsibilities: Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT. Build and maintain data integration workflows from various data sources to Snowflake. Write efficient and optimized SQL queries for data extraction and transformation. Work with stakeholders to understand business requirements and translate them into technical solutions. Monitor, troubleshoot, and optimize data pipelines for performance and reliability. Maintain and enforce data quality, governance, and documentation standards. Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment.
Must-Have Skills: Strong experience with Azure cloud platform services. Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines. Proficiency in SQL for data analysis and transformation. Hands-on experience with Snowflake and SnowSQL for data warehousing. Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse. Experience working in cloud-based data environments with large-scale datasets.
Good-to-Have Skills: Experience with Azure Data Lake, Azure Synapse, or Azure Functions. Familiarity with Python or PySpark for custom data transformations. Understanding of CI/CD pipelines and DevOps for data workflows. Exposure to data governance, metadata management, or data catalog tools. Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus.
Qualifications: Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field. 5+ years of experience in data engineering roles using Azure and Snowflake. Strong problem-solving, communication, and collaboration skills.
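
As context for the ADF-to-Snowflake ingestion this role describes, here is a minimal loading sketch; the storage account, stage, table, and file-format names are hypothetical, and the SAS token is a placeholder:

```sql
-- External stage over an Azure Blob container (names and token are placeholders).
CREATE OR REPLACE STAGE raw_stage
  URL = 'azure://myaccount.blob.core.windows.net/landing'
  CREDENTIALS = (AZURE_SAS_TOKEN = '<sas-token>');

CREATE OR REPLACE FILE FORMAT csv_fmt TYPE = CSV SKIP_HEADER = 1;

-- Bulk-load staged files into a raw table for DBT to transform downstream.
COPY INTO raw.orders
FROM @raw_stage/orders/
FILE_FORMAT = (FORMAT_NAME = csv_fmt)
ON_ERROR = 'ABORT_STATEMENT';
```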

Posted 5 days ago

Apply

6.0 - 11.0 years

35 - 50 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Role: Snowflake Data Engineer. Mandatory Skills: #Snowflake, #Azure, #DataFactory, SQL, Python, #DBT / #Databricks. Location (Hybrid): Bangalore, Hyderabad, Chennai, Pune, Gurugram & Noida. Budget: Up to 50 LPA. Notice: Immediate to 30 days (serving notice). Experience: 6-11 years.
Key Responsibilities: Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT. Build and maintain data integration workflows from various data sources to Snowflake. Write efficient and optimized SQL queries for data extraction and transformation. Work with stakeholders to understand business requirements and translate them into technical solutions. Monitor, troubleshoot, and optimize data pipelines for performance and reliability. Maintain and enforce data quality, governance, and documentation standards. Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment.
Must-Have Skills: Strong experience with Azure cloud platform services. Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines. Proficiency in SQL for data analysis and transformation. Hands-on experience with Snowflake and SnowSQL for data warehousing. Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse. Experience working in cloud-based data environments with large-scale datasets.
Good-to-Have Skills: Experience with Azure Data Lake, Azure Synapse, or Azure Functions. Familiarity with Python or PySpark for custom data transformations. Understanding of CI/CD pipelines and DevOps for data workflows. Exposure to data governance, metadata management, or data catalog tools. Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus.
Qualifications: Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field. 5+ years of experience in data engineering roles using Azure and Snowflake. Strong problem-solving, communication, and collaboration skills.

Posted 5 days ago

Apply

4.0 - 9.0 years

12 - 22 Lacs

Pune, Bengaluru, Delhi / NCR

Work from Office

We urgently need Snowflake Developer profiles with Python experience and hands-on expertise in Snowflake coding, advanced SQL, and Python.
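
As a taste of the advanced SQL in Snowflake this posting asks for, a small deduplication sketch; the table and column names are hypothetical:

```sql
-- Keep only the latest row per order_id using Snowflake's QUALIFY clause.
SELECT *
FROM   raw.orders
QUALIFY ROW_NUMBER() OVER (PARTITION BY order_id ORDER BY updated_at DESC) = 1;
```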

Posted 5 days ago

Apply

0.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.
Inviting applications for the role of Lead Consultant - Snowflake Data Engineer (Python + Cloud)! In this role, the Snowflake Data Engineer is responsible for providing technical direction and leading a group of one or more developers toward a goal.
Job Description: Experience in the IT industry. Working experience building productionized data ingestion and processing pipelines in Snowflake. Strong understanding of Snowflake architecture. Fully well-versed in data warehousing concepts. Expertise in and excellent understanding of Snowflake features and the integration of Snowflake with other data processing systems. Able to create data pipelines for ETL/ELT. Excellent presentation and communication skills, both written and verbal. Ability to problem-solve and architect in an environment with unclear requirements. Able to create high-level and low-level design documents based on requirements. Hands-on experience in configuring, troubleshooting, testing, and managing data platforms, on premises or in the cloud. Awareness of data visualisation tools and methodologies. Able to work independently on business problems and generate meaningful insights. Some experience/knowledge of Snowpark, Streamlit, or GenAI is good to have but not mandatory. Should have experience implementing Snowflake best practices. Snowflake SnowPro Core Certification is a must.
Roles and Responsibilities: Requirement gathering, creating design documents, providing solutions to customers, working with the offshore team, etc. Writing SQL queries against Snowflake and developing scripts to extract, load, and transform data. Hands-on experience with Snowflake utilities such as SnowSQL, bulk copy, Snowpipe, Tasks, Streams, Time Travel, cloning, the query optimizer, Metadata Manager, data sharing, stored procedures and UDFs, and Snowsight. Experience with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for integrating data from multiple source systems. Some experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF). Good experience in Python/PySpark integration with Snowflake and cloud (AWS/Azure), with the ability to leverage cloud services for data processing and storage. Proficiency in the Python programming language, including knowledge of data types, variables, functions, loops, conditionals, and other Python-specific concepts. Knowledge of ETL (Extract, Transform, Load) processes and tools, and the ability to design and develop efficient ETL jobs using Python and PySpark. Some experience with Snowflake RBAC and data security. Good experience implementing CDC or SCD type-2. Good experience implementing Snowflake best practices. In-depth understanding of data warehouses, ETL concepts, and data modelling. Experience in requirement gathering, analysis, design, development, and deployment. Experience building data ingestion pipelines. Optimize and tune data pipelines for performance and scalability. Able to communicate with clients and lead a team. Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs. Good to have: experience in deployment using CI/CD tools and experience with repositories like Azure Repos, GitHub, etc.
Qualifications we seek in you! Minimum qualifications: B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or an equivalent degree with good IT experience relevant to the Snowflake Data Engineer role. Skill matrix: Snowflake, Python/PySpark, AWS/Azure, ETL concepts, and data warehousing concepts.
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. For more information, visit Genpact's website and follow us on Twitter, Facebook, LinkedIn, and YouTube. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
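
Since the posting highlights Snowpipe among the Snowflake utilities, a minimal auto-ingest sketch follows; the stage, pipe, and table names are hypothetical, and AUTO_INGEST assumes S3 event notifications are wired to the pipe's queue:

```sql
-- Continuously load JSON files landing in an external stage (names hypothetical).
CREATE OR REPLACE PIPE events_pipe
  AUTO_INGEST = TRUE
AS
COPY INTO raw.events
FROM @s3_stage/events/
FILE_FORMAT = (TYPE = JSON);

-- Check the pipe's load status.
SELECT SYSTEM$PIPE_STATUS('events_pipe');
```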

Posted 6 days ago

Apply

4.0 - 6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description: YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth - bringing real positive change in an increasingly virtual world - and it drives us beyond generational gaps and disruptions of the future. We are looking forward to hiring Snowflake professionals in the following areas:
Senior Snowflake Developer
Job description: Responsible for designing and implementing data pipelines, ETL processes, and data modeling in Snowflake. Responsible for translating business requirements into ELT pipelines using data replication tools and data transformation tools (such as DBT) or advanced SQL scripting (views, Snowflake stored procedures, UDFs). Deep understanding of Snowflake architecture and processing. Experience with performance tuning of the Snowflake data warehouse. Experience working with Snowflake functions; hands-on experience with Snowflake utilities, stage and file-upload features, Time Travel, and Fail-safe. Responsible for development, deployment, code reviews, and production support. Maintain and implement best practices for Snowflake infrastructure. Hands-on with complex SQL and parsing complex data sets.
Primary Skills: Must have 4 to 6 years in IT, 3+ years working as a Snowflake Developer, and 5+ years in data warehouse, ETL, and BI projects. Must have experience in at least one complex implementation of a Snowflake data warehouse, plus hands-on DBT experience. Expertise in Snowflake data modeling, ELT using Snowflake SQL or modern data replication tools, Snowflake stored procedures / UDFs / advanced SQL scripting, and standard data lake / data warehouse concepts. Expertise in advanced Snowflake concepts such as setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy clone, and Time Travel. Expertise in deploying Snowflake features such as data sharing, events, and lakehouse patterns. Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and big data modelling techniques. Deep understanding of relational data stores, methods, and approaches (star and snowflake schemas, dimensional modelling). Hands-on experience with DBT Core or DBT Cloud, including dev and prod deployment using CI/CD (Bitbucket), is a plus. Should be able to develop and maintain documentation of the data architecture, data flow, and data models of the data warehouse. Good communication skills. Python and API experience is a plus.
At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and an ethical corporate culture.
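
Given the stored-procedure and UDF scripting this role emphasizes, a minimal Snowflake Scripting sketch follows; the procedure, table, and column names are hypothetical:

```sql
-- A small maintenance procedure in Snowflake Scripting (names hypothetical).
CREATE OR REPLACE PROCEDURE purge_old_rows(days_to_keep INT)
RETURNS STRING
LANGUAGE SQL
AS
$$
DECLARE
  cutoff TIMESTAMP;
BEGIN
  cutoff := DATEADD(day, -days_to_keep, CURRENT_TIMESTAMP());
  DELETE FROM raw.events WHERE event_ts < :cutoff;   -- :var binds inside SQL
  RETURN 'purged rows older than ' || days_to_keep || ' days';
END;
$$;

CALL purge_old_rows(90);
```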

Posted 1 week ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.
Inviting applications for the role of Lead Consultant - Snowflake!
Responsibilities: Ability to design and implement effective analytics solutions and models with Snowflake. Hands-on experience in Snowflake SQL, writing SQL queries against Snowflake, and developing scripts (Unix, Python, etc.) to extract, load, and transform data. Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, Tasks, Streams, Time Travel, the query optimizer, Metadata Manager, data sharing, and stored procedures. Should be able to implement Snowpipe and stage and file upload to the Snowflake database. Hands-on experience with any RDBMS/NoSQL database with strong SQL writing skills. In-depth understanding of data warehouse/ODS, ETL concepts, and modeling structure principles. Experience in data warehousing: OLTP, OLAP, dimensions, facts, and data modeling. Hands-on experience with Azure Blob.
Qualifications we seek in you! Minimum qualifications/skills: SnowSQL, Snowpipe, Tasks, Streams, Time Travel. Certified SnowPro Core. Good understanding of data warehousing and reporting tools. Able to work on own initiative and as a team player. Good organizational skills with cultural awareness and sensitivity. Education: ME/M.Tech/MS (Engg/Sciences) or BE/B.Tech (Engineering). Industry: Manufacturing/Industrial.
Behavioral Requirements: Lives the client's core values of courage and curiosity to deliver the best business solutions for EL-Business. Ability to work in diversified teams, convey messages and ideas clearly to users and project members, and listen, understand, appreciate, and appropriately respond to users. Excellent team player with strong oral and written communication skills. Possesses strong time management skills. Keeps up to date and informed on the client technology landscape and planned or ad-hoc changes to the client IS strategy.
Preferred Skills/Qualifications: Azure storage services such as Blob, Data Lake, Cosmos DB, and SQL Server.
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
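
As a quick illustration of the Time Travel and cloning utilities named above, a short sketch; table names and the one-hour offset are hypothetical:

```sql
-- Query a table as it looked one hour ago (Time Travel).
SELECT COUNT(*) FROM analytics.orders AT(OFFSET => -3600);

-- Zero-copy clone for a safe experiment; no data is physically duplicated.
CREATE TABLE analytics.orders_sandbox CLONE analytics.orders;

-- Recover an accidentally dropped table within the retention period.
UNDROP TABLE analytics.orders;
```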

Posted 1 week ago

Apply

5.0 - 9.0 years

25 - 30 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Looking for a Snowflake Data Engineer with the following technical skills: able to write SnowSQL queries and stored procedures; good understanding of Snowflake warehouse architecture and design; sound troubleshooting skills; knows how to fix query performance issues in Snowflake; familiar with AWS services (S3, Lambda functions, Glue jobs, etc.); hands-on in PySpark. The candidate must also have the right attitude, quick learning ability, and analytical skills, and should be a good team player. Candidates with Insurance (Claims & Settlements) domain knowledge will be preferred.
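
For the query-performance troubleshooting this role calls out, one common sketch is to inspect and set a clustering key; the table and column names are hypothetical:

```sql
-- How well is the table clustered on claim_date? (names are hypothetical)
SELECT SYSTEM$CLUSTERING_INFORMATION('analytics.claims', '(claim_date)');

-- Define a clustering key so partition pruning can skip micro-partitions.
ALTER TABLE analytics.claims CLUSTER BY (claim_date);

-- Disable the result cache while benchmarking, so timings are honest.
ALTER SESSION SET USE_CACHED_RESULT = FALSE;
```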

Posted 1 week ago

Apply

2.0 - 5.0 years

2 - 5 Lacs

Bengaluru

Work from Office

Job Description: Tietoevry Create is seeking a skilled Snowflake Developer to join our team in Bengaluru, India. In this role, you will be responsible for designing, implementing, and maintaining data solutions using Snowflake's cloud data platform. You will work closely with cross-functional teams to deliver high-quality, scalable data solutions that drive business value.
7+ years of experience in the design and development of data warehouse and data integration projects (SSE/TL level). Experience working in an Azure environment. Developing ETL pipelines in and out of the data warehouse using a combination of Python and Snowflake's SnowSQL, and writing SQL queries against Snowflake. Good understanding of database design concepts: transactional, datamart, data warehouse, etc. Expertise in loading from disparate data sets and translating complex functional and technical requirements into detailed designs; will also perform analysis of vast data stores and uncover insights. Snowflake data engineers will be responsible for architecting and implementing substantial-scale data intelligence solutions around the Snowflake data warehouse. Solid experience and understanding of architecting, designing, and operationalizing large-scale data and analytics solutions on the Snowflake cloud data warehouse is a must. Very good articulation skills; flexible and ready to learn new skills.
Additional Information: At Tietoevry, we believe in the power of diversity, equity, and inclusion. We encourage applicants of all backgrounds, genders (m/f/d), and walks of life to join our team, as we believe that this fosters an inspiring workplace and fuels innovation. Our commitment to openness, trust, and diversity is at the heart of our mission to create digital futures that benefit businesses, societies, and humanity. Diversity, equity and inclusion (tietoevry.com).
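
Since the role pairs Python with Snowflake's SnowSQL for pipelines in and out of the warehouse, here is a minimal load sketch run from the SnowSQL CLI; the file path and table name are hypothetical:

```sql
-- Upload a local file to the table's internal stage, then load it.
PUT file:///tmp/customers.csv @%customers;

COPY INTO customers
FROM @%customers
FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
PURGE = TRUE;   -- delete staged files after a successful load
```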

Posted 1 week ago

Apply

5.0 - 10.0 years

0 - 3 Lacs

Noida

Work from Office

• Act as a data domain expert for Snowflake in a collaborative environment, providing a demonstrated understanding of data management best practices and patterns.
• Design and implement robust data architectures to meet and support business requirements, leveraging Snowflake platform capabilities.
• Develop and enforce data modelling standards and best practices for Snowflake environments.
• Develop, optimize, and maintain Snowflake data warehouses.
• Leverage Snowflake features such as clustering, materialized views, and semi-structured data processing to enhance data solutions.
• Ensure data architecture solutions meet performance, security, and scalability requirements.
• Stay current with the latest developments and features in Snowflake and related technologies, continually enhancing our data capabilities.
• Collaborate with cross-functional teams to gather business requirements, translate them into effective data solutions in Snowflake, and provide data-driven insights.
• Stay updated on the latest trends and advancements in data architecture and Snowflake technologies.
• Provide mentorship and guidance to junior data engineers and architects.
• Troubleshoot and resolve data architecture-related issues effectively.
Skills Requirement:
• 5+ years of proven experience as a Data Engineer, with 3+ years as a Data Architect.
• Proficiency in Snowflake, with hands-on experience in features such as clustering, materialized views, and semi-structured data processing.
• Experience designing and building manual or auto-ingestion data pipelines using Snowpipe.
• Design and develop automated monitoring processes on Snowflake using a combination of Python, PySpark, and Bash with SnowSQL.
• SnowSQL experience in developing stored procedures and writing queries to analyse and transform data.
• Working experience with ETL tools such as Fivetran, dbt Labs, and MuleSoft.
• Expertise in Snowflake concepts such as setting up resource monitors, RBAC controls, scalable virtual warehouses, SQL performance tuning, zero-copy clone, and Time Travel, and automating them.
• Excellent problem-solving skills and attention to detail.
• Effective communication and collaboration abilities.
• Relevant certifications (e.g., SnowPro Core/Advanced) are a must-have.
• Must have expertise in the AWS, Azure, and Salesforce platform-as-a-service (PaaS) models and their integration with Snowflake to load/unload data.
• Strong communication; an exceptional team player with effective problem-solving skills.
Educational Qualification Required:
• Master's degree in Business Management (MBA/PGDM) or Bachelor's degree in Computer Science, Information Technology, or a related field.
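
To illustrate the materialized-view and semi-structured processing mentioned above, a small sketch; the table, view, and JSON field names are hypothetical:

```sql
-- Precompute typed columns from raw JSON events (names are hypothetical).
CREATE MATERIALIZED VIEW mv_events AS
SELECT payload:user_id::STRING   AS user_id,
       payload:ts::TIMESTAMP_NTZ AS event_ts
FROM   raw.events;

-- Explode a JSON array with LATERAL FLATTEN.
SELECT e.payload:user_id::STRING AS user_id,
       f.value::STRING           AS tag
FROM   raw.events e,
       LATERAL FLATTEN(input => e.payload:tags) f;
```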

Posted 1 week ago

Apply

8.0 - 13.0 years

12 - 22 Lacs

Hyderabad, Bengaluru

Hybrid

Skills: Snowflake, AWS, SQL, PL/SQL / T-SQL, DWH, Python, PySpark.
• Experience with Snowflake utilities, SnowSQL, and Snowpipe; able to administer and monitor the Snowflake computing platform.
• Good in cloud computing (AWS).
Notice period: Immediate. Email: sachin@assertivebs.com
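
For the administer-and-monitor side of this role, one minimal sketch is a resource monitor capping warehouse spend; the quota and names are illustrative:

```sql
-- Cap monthly credit burn and suspend the warehouse at the limit.
CREATE RESOURCE MONITOR etl_monitor
  WITH CREDIT_QUOTA = 100
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 80 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;

ALTER WAREHOUSE etl_wh SET RESOURCE_MONITOR = etl_monitor;
```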

Posted 1 week ago

Apply

8.0 - 14.0 years

8 - 14 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Roles & Responsibilities:
Analyze requirements: Collaborate with stakeholders to scrutinize requirements for new software applications or enhancements, guaranteeing that the developed software fulfills customer needs.
Design software solutions: Contribute to crafting detailed design specifications for data models, schemas, views, and stored procedures, using Snowflake features such as Time Travel, zero-copy cloning, and secure data sharing, and steer the development process so the resulting software meets functional and technical demands.
Develop and deploy scalable software: Write clean, maintainable, and well-documented data pipelines using Snowflake SQL, Snowpipe, and other tools to ingest, transform, and deliver data from various sources, leveraging your expertise to ensure scalability and efficiency, and lead deployment of that code across multiple environments.
Integrate software components: Seamlessly integrate software components into a fully functional system, ensuring compatibility and interoperability with existing systems for smooth communication and data exchange.
Perform unit testing: Conduct thorough unit testing of developed queries and components, ensuring data quality and accuracy by implementing data validation, testing, and monitoring frameworks and tools that adhere to quality standards and expected performance levels.
Debug and troubleshoot: Skillfully debug and troubleshoot software applications, swiftly identifying and resolving issues encountered during development or deployment to ensure uninterrupted operation and minimal downtime for end users.
Provide technical support: Offer expert technical support and guidance to end users by applying Snowflake best practices such as partitioning, clustering, caching, and compression, empowering them to use the software effectively and troubleshoot any issues they encounter.
Stay updated with technology: Remain abreast of emerging technologies, trends, and best practices in Snowflake and the data domain, integrating relevant advancements into our software solutions.
Collaborate with the team: Foster effective communication and coordination throughout the software development lifecycle by collaborating with IT team members, data engineers, project managers, and end users, ensuring a collaborative work environment and successful project delivery. Mentor and lead junior developers.
Document processes: Document processes, procedures, and technical specifications related to software development and deployment, facilitating knowledge sharing within the team and streamlining future development efforts.
Experience Requirement: 8-14 years of experience with software development tools, including integrated development environments (IDEs), version control systems (e.g., Git), issue tracking systems (e.g., Jira), DevOps principles, and CI/CD pipelines. Experience providing technical support and guidance to end users during the implementation and deployment of software applications. Strong analytical thinking skills to understand complex requirements and design software solutions accordingly. Ability to read and understand other developers' code. Proficiency in industry-standard testing methodologies and debugging techniques to ensure software quality and to identify and resolve issues. Ability to document processes, procedures, and technical specifications related to software development and deployments.
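
As a pointer for the secure data sharing feature this role names, a provider-side sketch; the share, database, and consumer account identifiers are hypothetical:

```sql
-- Share a table with a consumer account without copying any data.
CREATE SHARE sales_share;
GRANT USAGE  ON DATABASE analytics               TO SHARE sales_share;
GRANT USAGE  ON SCHEMA   analytics.public        TO SHARE sales_share;
GRANT SELECT ON TABLE    analytics.public.orders TO SHARE sales_share;
ALTER SHARE sales_share ADD ACCOUNTS = xy12345;   -- consumer account locator
```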

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

, India

On-site

Job Description: YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth - bringing real positive change in an increasingly virtual world - and it drives us beyond generational gaps and disruptions of the future. We are looking forward to hiring Snowflake professionals in the following areas:
JD as below: Snowflake, SnowSQL, PL/SQL, any ETL tool.
Job Description: 3+ years of IT experience in the analysis, design, development, and unit testing of data warehousing applications using industry-accepted methodologies and procedures. Write complex SQL queries to implement ETL (Extract, Transform, Load) processes and for business intelligence reporting. Strong experience with Snowpipe execution and the Snowflake data warehouse, with a deep understanding of Snowflake architecture and processing. Creating and managing automated data pipelines for both batch and streaming data using DBT. Designing and implementing data models and schemas to support data warehousing and analytics within Snowflake. Writing and optimizing SQL queries for efficient data retrieval and analysis. Deliver robust solutions through query optimization, ensuring data quality. Should have experience writing functions and stored procedures. Strong understanding of the principles of data warehousing using fact tables, dimension tables, and star and snowflake schema modelling. Analyse and translate functional specifications/user stories into technical specifications. Good to have: design/development experience in any ETL tool. Good interpersonal skills; experience in handling communication and interactions between different teams.
At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and an ethical corporate culture.
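
To ground the fact-table/dimension-table star schema modelling this JD stresses, a tiny sketch; all table and column names are hypothetical:

```sql
-- A one-fact, one-dimension star schema (names are hypothetical).
CREATE TABLE dim_customer (
  customer_sk   INT AUTOINCREMENT,
  customer_id   STRING,
  customer_name STRING
);

CREATE TABLE fact_sales (
  sale_id     INT,
  customer_sk INT,            -- foreign key to dim_customer
  sale_date   DATE,
  amount      NUMBER(12,2)
);

-- A typical BI rollup across the star.
SELECT d.customer_name, SUM(f.amount) AS total_sales
FROM   fact_sales f
JOIN   dim_customer d ON d.customer_sk = f.customer_sk
GROUP  BY d.customer_name;
```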

Posted 2 weeks ago

Apply

6.0 - 8.0 years

0 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Job Description: YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth - bringing real positive change in an increasingly virtual world - and it drives us beyond generational gaps and disruptions of the future. We are looking forward to hiring Snowflake professionals in the following areas:
JD as below: Snowflake, SnowSQL, PL/SQL, any ETL tool.
Job Description: A minimum of 6-7 years of IT experience in the analysis, design, development, and unit testing of data warehousing applications using industry-accepted methodologies and procedures. Write complex SQL queries to implement ETL (Extract, Transform, Load) processes and for business intelligence reporting. Strong experience with Snowpipe execution and the Snowflake data warehouse, with a deep understanding of Snowflake architecture and processing. Creating and managing automated data pipelines for both batch and streaming data using DBT. Designing and implementing data models and schemas to support data warehousing and analytics within Snowflake. Writing and optimizing SQL queries for efficient data retrieval and analysis. Deliver robust solutions through query optimization, ensuring data quality. Should have experience writing functions and stored procedures. Strong understanding of the principles of data warehousing using fact tables, dimension tables, and star and snowflake schema modelling. Analyse and translate functional specifications/user stories into technical specifications. Good to have: design/development experience in an ETL tool such as SnapLogic, Informatica, or DataStage. Good interpersonal skills; experience in handling communication and interactions between different teams.
At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and an ethical corporate culture.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

3 - 6 Lacs

Ahmedabad

Work from Office

Roles and Responsibility:
1. Database and data warehouse expertise: Demonstrate an excellent understanding of database and data warehouse concepts, with strong proficiency in writing SQL queries.
2. Snowflake cloud data warehouse: Design and implement the Snowflake cloud data warehouse; develop and implement cloud-related architecture and data modeling.
3. Migration projects: Manage migration projects, specifically migrating from on-prem to Snowflake.
4. Snowflake capabilities: Utilize comprehensive knowledge of Snowflake capabilities such as Snowpipe, stages, SnowSQL, Streams, and Tasks.
5. Advanced Snowflake concepts: Implement advanced Snowflake concepts such as setting up resource monitors, RBAC controls, virtual warehouse sizing, and zero-copy clone.
6. Data migration expertise: In-depth knowledge and experience in data migration from RDBMS to the Snowflake cloud data warehouse.
7. Snowflake feature deployment: Deploy Snowflake features such as data sharing, events, and lakehouse patterns.
8. Incremental extraction loads: Execute incremental extraction loads, both batched and streaming.
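
Since RBAC controls appear in point 5 above, a minimal grant sketch follows; the role, warehouse, and user names are hypothetical:

```sql
-- A read-only analyst role wired up with the usual grants.
CREATE ROLE analyst_role;
GRANT USAGE  ON WAREHOUSE query_wh        TO ROLE analyst_role;
GRANT USAGE  ON DATABASE analytics        TO ROLE analyst_role;
GRANT USAGE  ON SCHEMA   analytics.public TO ROLE analyst_role;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics.public TO ROLE analyst_role;
GRANT ROLE analyst_role TO USER jdoe;
```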

Posted 2 weeks ago

Apply

9.0 - 14.0 years

15 - 30 Lacs

Pune, Chennai, Bengaluru

Work from Office

The Snowflake Data Specialist will manage projects in Data Warehousing, focusing on Snowflake and related technologies. The role requires expertise in data modeling, ETL processes, and cloud-based data solutions.

Posted 2 weeks ago

Apply

4.0 - 9.0 years

6 - 9 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Snowflake Developer. Experience: 5+ years. Location: Pan India. Work Mode: Work from Office.

Posted 2 weeks ago

Apply

5.0 - 7.0 years

5 - 7 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Job Summary
Role & responsibilities: Outline the day-to-day responsibilities for this role.
Preferred candidate profile: Specify required role expertise, previous job experience, or relevant certifications.
Role: Security Engineer / Analyst. Industry Type: IT Services & Consulting. Department: IT & Information Security. Employment Type: Full Time, Permanent. Role Category: IT Security.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Hyderabad, Ahmedabad

Hybrid

Key Responsibilities: Lead the end-to-end Snowflake platform implementation, including architecture, design, data modeling, and governance. Oversee the migration of data and pipelines from legacy platforms to Snowflake, ensuring quality, reliability, and business continuity. Design and optimize Snowflake-specific data models, including the use of clustering keys, materialized views, Streams, and Tasks. Build and manage scalable ELT/ETL pipelines using modern tools and best practices. Define and implement standards for Snowflake development, testing, and deployment, including CI/CD automation. Collaborate with cross-functional teams including data engineering, analytics, DevOps, and business stakeholders. Establish and enforce data security, privacy, and governance policies using Snowflake's native capabilities. Monitor and tune system performance and cost efficiency through appropriate warehouse sizing and usage patterns. Lead code reviews, technical mentoring, and documentation for Snowflake-related processes.
Required Snowflake Expertise:
Snowflake Architecture – Deep understanding of virtual warehouses, data sharing, multi-cluster warehouses, and zero-copy cloning, with the ability to enhance the architecture and implement solutions that conform to it.
Performance Optimization – Proficient in tuning queries, clustering, caching, and workload management.
Data Engineering – Experience with batch and real-time processing using Snowflake features such as Snowpipe, Streams and Tasks, stored procedures, and standard data ingestion patterns.
Data Security & Governance – Strong experience with RBAC, dynamic data masking, row-level security, and tagging; experience enabling these capabilities in Snowflake and in at least one enterprise product solution.
Advanced SQL – Expertise in writing, analyzing, and optimizing complex SQL queries, transformations, and semi-structured data handling (JSON, XML).
Cloud Integration – Experience with at least one major cloud platform (AWS/GCP/Azure) and services like S3, Lambda, Step Functions, etc.
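
To illustrate the dynamic data masking listed under Data Security & Governance, a short sketch; the policy, table, and role names are hypothetical:

```sql
-- Mask email addresses for everyone except a privileged role.
CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val
    ELSE REGEXP_REPLACE(val, '.+@', '*****@')
  END;

ALTER TABLE analytics.customers
  MODIFY COLUMN email SET MASKING POLICY email_mask;
```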

Posted 2 weeks ago

Apply

5.0 - 10.0 years

0 - 2 Lacs

Pune, Chennai, Mumbai (All Areas)

Hybrid

Role & responsibilities: Snowflake Developer. Skill set: Snowflake, IICS, ETL, cloud. Experience: 5-12 years. Location: Pune/Mumbai/Chennai/Bangalore/Hyderabad/Delhi. Notice period: immediate to 30 days. If the above criteria match your profile, please share your updated CV along with all the details below: total experience, relevant experience, current CTC, expected CTC, notice period (if serving, what is your LWD?), PAN card number (mandatory), and a passport-size photo (attachment mandatory). Please send all the above details to sneha.joshi@alikethoughts.com.

Posted 2 weeks ago

Apply

7.0 - 12.0 years

0 Lacs

Kochi

Work from Office

Greetings from the TCS Recruitment Team! Role: Snowflake Lead / Snowflake Solution Architect / Snowflake ML Engineer. Years of experience: 7 to 18 years. Walk-in drive location: Kochi. Walk-in location details: Tata Consultancy Services, TCS Centre SEZ Unit, Infopark Kochi Phase 1, Infopark Kochi P.O., Kakkanad, Kochi - 682042, Kerala, India. Drive time: 9 AM to 1:00 PM. Date: 21-Jun-25.
Must have: Deep knowledge of Snowflake's architecture, SnowSQL, Snowpipe, Streams, Tasks, and stored procedures. Strong understanding of cloud platforms (AWS, Azure, GCP). Proficiency in SQL, Python, or scripting languages for data operations. Experience with ETL/ELT tools, data integration, and performance tuning. Familiarity with data security, governance, and compliance standards (GDPR, HIPAA, SOC 2).
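
As a compact illustration of the Streams and Tasks combination in the must-have list, a sketch; the stream, task, table, and warehouse names are hypothetical:

```sql
-- A stream captures row changes; a scheduled task applies them downstream.
CREATE OR REPLACE STREAM orders_stream ON TABLE raw.orders;

CREATE OR REPLACE TASK apply_orders
  WAREHOUSE = etl_wh
  SCHEDULE  = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('orders_stream')
AS
  INSERT INTO analytics.orders
  SELECT order_id, amount, updated_at
  FROM   orders_stream
  WHERE  METADATA$ACTION = 'INSERT';

ALTER TASK apply_orders RESUME;   -- tasks are created suspended
```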

Posted 2 weeks ago

Apply

6.0 - 11.0 years

17 - 30 Lacs

Kolkata, Hyderabad/Secunderabad, Bangalore/Bengaluru

Hybrid

Inviting applications for the role of Lead Consultant - Snowflake Data Engineer (Snowflake + Python + Cloud)! In this role, the Snowflake Data Engineer is responsible for providing technical direction and leading a group of one or more developers toward a goal.
Job Description: Experience in the IT industry. Working experience building productionized data ingestion and processing pipelines in Snowflake. Strong understanding of Snowflake architecture. Fully well-versed in data warehousing concepts. Expertise in and excellent understanding of Snowflake features and the integration of Snowflake with other data processing systems. Able to create data pipelines for ETL/ELT. Excellent presentation and communication skills, both written and verbal. Ability to problem-solve and architect in an environment with unclear requirements. Able to create high-level and low-level design documents based on requirements. Hands-on experience in configuring, troubleshooting, testing, and managing data platforms, on premises or in the cloud. Awareness of data visualisation tools and methodologies. Able to work independently on business problems and generate meaningful insights. Some experience/knowledge of Snowpark, Streamlit, or GenAI is good to have but not mandatory. Should have experience implementing Snowflake best practices. Snowflake SnowPro Core Certification will be an added advantage.
Roles and Responsibilities: Requirement gathering, creating design documents, providing solutions to customers, working with the offshore team, etc. Writing SQL queries against Snowflake and developing scripts to extract, load, and transform data. Hands-on experience with Snowflake utilities such as SnowSQL, bulk copy, Snowpipe, Tasks, Streams, Time Travel, cloning, the query optimizer, Metadata Manager, data sharing, stored procedures and UDFs, Snowsight, and Streamlit. Experience with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for integrating data from multiple source systems. Some experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF). Good experience in Python/PySpark integration with Snowflake and cloud (AWS/Azure), with the ability to leverage cloud services for data processing and storage. Proficiency in the Python programming language, including knowledge of data types, variables, functions, loops, conditionals, and other Python-specific concepts. Knowledge of ETL (Extract, Transform, Load) processes and tools, and the ability to design and develop efficient ETL jobs using Python or PySpark. Some experience with Snowflake RBAC and data security. Good experience implementing CDC or SCD type-2. Good experience implementing Snowflake best practices. In-depth understanding of data warehouses, ETL concepts, and data modelling. Experience in requirement gathering, analysis, design, development, and deployment. Experience building data ingestion pipelines. Optimize and tune data pipelines for performance and scalability. Able to communicate with clients and lead a team. Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs. Good to have: experience in deployment using CI/CD tools and experience with repositories like Azure Repos, GitHub, etc.
Qualifications we seek in you! Minimum qualifications: B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or an equivalent degree with good IT experience relevant to the Snowflake Data Engineer role.
Skill matrix: Snowflake, Python/PySpark, AWS/Azure, ETL concepts, and data warehousing concepts.
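
Because the posting asks for CDC / SCD type-2 experience, a simplified two-step SCD2 sketch follows; the dimension and staging table names and columns are hypothetical:

```sql
-- Step 1: close out current dimension rows whose attributes changed.
UPDATE dim_customer d
SET    is_current = FALSE,
       valid_to   = CURRENT_TIMESTAMP()
FROM   staging.customer_updates s
WHERE  d.customer_id = s.customer_id
  AND  d.is_current  = TRUE
  AND  d.customer_name <> s.customer_name;

-- Step 2: insert a new current version for changed or brand-new keys.
INSERT INTO dim_customer (customer_id, customer_name, valid_from, valid_to, is_current)
SELECT s.customer_id, s.customer_name, CURRENT_TIMESTAMP(), NULL, TRUE
FROM   staging.customer_updates s
LEFT   JOIN dim_customer d
  ON   d.customer_id = s.customer_id AND d.is_current = TRUE
WHERE  d.customer_id IS NULL;
```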

Posted 2 weeks ago

Apply

7.0 - 12.0 years

15 - 22 Lacs

Hyderabad, Pune

Work from Office

Role & responsibilities: Outline the day-to-day responsibilities for this role.
Preferred candidate profile: Specify required role expertise, previous job experience, or relevant certifications.
Perks and benefits: Mention available facilities and benefits the company is offering with this job.

Posted 2 weeks ago

Apply
Page 1 of 3 results

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
