4.0 - 8.0 years
0 - 1 Lacs
Gurugram, Bengaluru
Hybrid
Job Responsibilities: Experience: 4-9 years. Snowflake SnowPro certified professionals are the first priority (mandatory). At least 4 years of experience, with 4+ years on the Snowflake Data Cloud.
Must-Have Skills:
- Snowflake Cloud Platform: strong hands-on experience
- ETL/ELT tools: experience with one or more tools such as Azure Data Factory, AWS Glue, Informatica, Talend, or Qlik Replicate
- Workflow orchestration: proficiency with tools like Apache Airflow, Control-M, or Tidal Automation
- Programming: advanced SQL; Python, including working with dataframes using Pandas, PySpark, or Snowpark (see the sketch below)
- Data engineering concepts: strong knowledge of data pipelines, data wrangling, and optimization
Good-to-Have Skills:
- SQL scripting and procedural logic
- Data modeling tools (e.g., Erwin, dbt)
- Integration tools such as Fivetran and Stitch
Please share your updated resume at poonampal@kpmg.com
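For illustration, a minimal sketch of the "dataframes with Snowpark" skill this posting asks for, assuming the snowflake-snowpark-python package; the connection values and the SALES.PUBLIC.ORDERS table are hypothetical placeholders, not part of the posting:

```python
# Hedged sketch: a Snowpark dataframe aggregation pushed down to Snowflake.
# All connection values and object names below are hypothetical placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

session = Session.builder.configs({
    "account": "your_account",      # assumption: filled from your environment
    "user": "your_user",
    "password": "your_password",
    "warehouse": "ANALYTICS_WH",
    "database": "SALES",
    "schema": "PUBLIC",
}).create()

orders = session.table("ORDERS")
# Same shape as a Pandas groupby, but executed inside Snowflake as generated SQL.
by_region = (
    orders.filter(col("STATUS") == "SHIPPED")
          .group_by("REGION")
          .agg(sum_("AMOUNT").alias("TOTAL_AMOUNT"))
)
by_region.show()
session.close()
```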
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You should have at least 3 years of experience in designing, developing, and administering Snowflake data warehouse solutions, with a strong focus on scalability and performance. Your primary responsibilities will include writing and optimizing complex Snowflake SQL queries and scripts to ensure efficient data extraction, transformation, and loading (ETL/ELT). Additionally, you will be expected to develop and implement robust ETL/ELT pipelines using Snowflake and associated tools. Applying design patterns and best practices in data pipeline and system design will be crucial in this role.
You will work extensively with cloud platforms, preferably Azure, to integrate Snowflake solutions. Tuning Snowflake warehouses for optimal query performance, including sizing, clustering, and partitioning strategies, will also be part of your responsibilities (see the sketch below). Collaboration with the DataOps Live platform to orchestrate, automate, and monitor data workflows and pipelines is essential. You will need to review and interpret design documents, including UML diagrams, to ensure alignment with technical solutions. Implementing data security measures such as masking policies, role-based access control, and compliance standards within Snowflake and Azure environments will be required. You should have experience using version control systems like Git and participating in DevOps practices for continuous integration and deployment. Active engagement in Agile methodologies and effective collaboration with cross-functional teams will be expected. Clear and professional communication with clients and team members is necessary to ensure project alignment and success.
About Virtusa: Virtusa values teamwork, quality of life, and professional and personal development. When you join Virtusa, you become part of a global team of 27,000 people who care about your growth. Virtusa aims to provide you with exciting projects, opportunities, and work with state-of-the-art technologies throughout your career with the company. At Virtusa, great minds and great potential come together. The company values collaboration and a team environment, seeking to provide dynamic opportunities for great minds to nurture new ideas and foster excellence.
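As a rough illustration of the warehouse tuning and clustering duties above, a minimal sketch using the snowflake-connector-python package; the connection values and the ANALYTICS_WH / SALES.PUBLIC.ORDERS names are hypothetical placeholders, not the actual client environment:

```python
# Hedged sketch: warehouse sizing plus a clustering key, via snowflake-connector-python.
# All connection values and object names are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",     # assumption: filled from your environment
    user="your_user",
    password="your_password",
    warehouse="ANALYTICS_WH",
)
cur = conn.cursor()
try:
    # Resize the warehouse for a heavy batch window (scale back down afterwards).
    cur.execute("ALTER WAREHOUSE ANALYTICS_WH SET WAREHOUSE_SIZE = 'LARGE'")
    # Define a clustering key so Snowflake can prune micro-partitions on the
    # columns most queries filter by.
    cur.execute("ALTER TABLE SALES.PUBLIC.ORDERS CLUSTER BY (ORDER_DATE, REGION)")
    # Check how well the table is clustered on that key.
    cur.execute("SELECT SYSTEM$CLUSTERING_INFORMATION('SALES.PUBLIC.ORDERS')")
    print(cur.fetchone()[0])
finally:
    cur.close()
    conn.close()
```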
Posted 1 month ago
2.0 - 5.0 years
7 - 17 Lacs
Hyderabad
Work from Office
Key Responsibilities:
- Design and implement scalable data models using Snowflake to support business intelligence and analytics solutions
- Implement ETL/ELT solutions that involve complex business transformations
- Handle end-to-end data warehousing solutions
- Migrate data from legacy systems to Snowflake
- Write complex SQL queries for extracting, transforming, and loading data, ensuring high performance and accuracy
- Optimize SnowSQL queries for better processing speeds
- Integrate Snowflake with third-party applications
- Use any ETL/ELT technology
- Implement data security policies, including user access control and data masking, to maintain compliance with organizational standards
- Document solutions and data flows
Skills & Qualifications:
- Experience: 2+ years of experience in data engineering, with a focus on Snowflake
- Proficient in SQL and Snowflake-specific SQL functions
- Experience with ETL/ELT tools and cloud data integrations
Technical Skills:
- Strong understanding of Snowflake architecture, features, and best practices
- Experience using Snowpark, Snowpipe, and Streamlit
- Experience using Dynamic Tables is good to have
- Familiarity with cloud platforms (AWS, Azure, or GCP) and other cloud-based data technologies
- Experience with data modeling concepts like star schema, snowflake schema, and data partitioning
- Experience with Snowflake's Time Travel, Streams, and Tasks features (see the sketch below)
- Experience in data pipeline orchestration
- Knowledge of Python or Java for scripting and automation
- Knowledge of Snowflake pipelines is good to have
- Knowledge of data governance practices, including security, compliance, and data lineage
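A minimal sketch of the Streams and Tasks features named above, run through snowflake-connector-python; the connection values and object names (RAW_ORDERS, ORDERS_STREAM, CURATED_ORDERS) are hypothetical placeholders:

```python
# Hedged sketch: change capture with a Stream, applied on a schedule by a Task.
# All connection values and object names are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account", user="your_user", password="your_password",
    warehouse="ETL_WH", database="SALES", schema="PUBLIC",
)
statements = [
    # Track inserts/updates/deletes on the raw table.
    "CREATE OR REPLACE STREAM ORDERS_STREAM ON TABLE RAW_ORDERS",
    # Apply pending changes every 5 minutes, but only when the stream has data.
    """
    CREATE OR REPLACE TASK APPLY_ORDERS
      WAREHOUSE = ETL_WH
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
    AS
      INSERT INTO CURATED_ORDERS
      SELECT ORDER_ID, AMOUNT, REGION FROM ORDERS_STREAM
      WHERE METADATA$ACTION = 'INSERT'
    """,
    "ALTER TASK APPLY_ORDERS RESUME",  # tasks are created in a suspended state
]
cur = conn.cursor()
try:
    for stmt in statements:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()
```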
Posted 1 month ago
6.0 - 11.0 years
20 - 25 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
DBT - Designing and developing technical architecture, data pipelines, and performance scaling, using tools to integrate Talend data and ensure data quality in a big data environment. Very strong in PL/SQL: queries, procedures, JOINs. Snowflake SQL: writing SQL queries against Snowflake and developing scripts in Unix, Python, etc., to perform Extract, Load, and Transform operations. Talend knowledge and hands-on experience is good to have; candidates who have worked in PROD support would be preferred. Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures. Perform data analysis, troubleshoot data issues, and provide technical support to end-users. Develop and maintain data warehouse and ETL processes, ensuring data quality and integrity. Complex problem-solving capability and a continuous-improvement approach. Talend/Snowflake certification is desirable. Excellent SQL coding skills, with excellent communication and documentation skills. Familiar with the Agile delivery process. Must be analytical, creative, and self-motivated, and work effectively within a global team environment.
Posted 1 month ago
5.0 - 8.0 years
12 - 14 Lacs
Noida, Hyderabad, Bengaluru
Work from Office
Role & responsibilities
1. Mastery of SQL, especially within cloud-based data warehouses like Snowflake. Experience on Snowflake with data architecture, design, analytics, and development.
2. Detailed knowledge and hands-on working experience in Snowpipe / SnowProc / SnowSQL.
3. Technical lead with a strong development background, having 2-3 years of rich hands-on development experience in Snowflake.
4. Experience designing highly scalable ETL/ELT processes with complex data transformations and data formats, including error handling and monitoring. Good working knowledge of the ETL/ELT tool DBT for transformation.
5. Analysis, design, and development of traditional data warehouse and business intelligence solutions. Work with customers to understand and execute their requirements.
6. Working knowledge of software engineering best practices. Should be willing to work on implementation and support projects. Flexible for onsite and offshore travel.
7. Collaborate with other team members to ensure proper delivery of requirements. Ability to think strategically about the broader market and influence company direction.
8. Should have good communication skills, be a team player, and have good analytical skills. Snowflake certification is preferable.
Contact: Soniya, soniya05.mississippiconsultants@gmail.com
Posted 1 month ago
3.0 - 6.0 years
14 - 18 Lacs
Mumbai
Work from Office
Overview
The Data Technology group at MSCI is responsible for building and maintaining a state-of-the-art data management platform that delivers reference, market, and other critical datapoints to various products of the firm. The platform, hosted on the firm's data centers and the Azure and GCP public clouds, processes 100+ TB of data and is expected to run 24x7. With increased focus on automation around systems development and operations, data-science-based quality control, and cloud migration, several tech stack modernization initiatives are currently in progress. To accomplish these initiatives, we are seeking a highly motivated and innovative individual to join the Data Engineering team to support our next generation of developer tools and infrastructure. The team is the hub around which the Engineering and Operations teams revolve for automation, and it is committed to providing self-serve tools to our internal customers. The position is based in the Mumbai, India office.
Responsibilities
- Build and maintain ETL pipelines for Snowflake
- Manage Snowflake objects and data models
- Integrate data from various sources
- Optimize performance and query efficiency
- Automate and schedule data workflows
- Ensure data quality and reliability
- Collaborate with cross-functional teams
- Document processes and data flows
Qualifications
- Self-motivated, collaborative individual with a passion for excellence
- B.E. in Computer Science or equivalent, with 5+ years of total experience and at least 2 years of experience working with databases
- Good working knowledge of source control applications like Git, with prior experience building deployment workflows using this tool
- Good working knowledge of Snowflake, YAML, and Python
- Experience managing Snowflake databases, schemas, tables, and other objects
- Proficient in Snowflake SQL, including CTEs, window functions, and stored procedures
- Familiar with Snowflake performance tuning and cost optimization tools
- Skilled in building ETL/ELT pipelines using dbt, Airflow, or Python
- Able to work with various data sources, including RDBMS, APIs, and cloud storage
- Understanding of incremental loads, error handling, and scheduling best practices
- Strong SQL skills and intermediate Python proficiency for data processing
- Familiar with Git for version control and collaboration
- Basic knowledge of the Azure or GCP cloud platforms
- Capable of integrating Snowflake with APIs and cloud-native services
What we offer you
- Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing
- Flexible working arrangements, advanced technology, and collaborative workspaces
- A culture of high performance and innovation, where we experiment with new ideas and take responsibility for achieving results
- A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients
- A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro, and tailored learning opportunities for ongoing skills development
- Multi-directional career paths that offer professional growth and development through new challenges, internal mobility, and expanded roles
- An actively nurtured environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum
At MSCI we are passionate about what we do, and we are inspired by our purpose: to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards, and perform beyond expectations for yourself, our clients, and our industry.
MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process.
MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.
To all recruitment agencies: MSCI does not accept unsolicited CVs/resumes. Please do not forward CVs/resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/resumes.
Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try to elicit personal information from job seekers. Read our full note on careers.msci.com.
Posted 1 month ago
5.0 - 10.0 years
15 - 30 Lacs
Hyderabad
Hybrid
We are seeking a Lead Snowflake Engineer. The ideal candidate will bring deep technical expertise in Snowflake, hands-on experience with DBT (Data Build Tool), and a collaborative mindset for working across data, analytics, and business teams.
Posted 1 month ago
6.0 - 11.0 years
7 - 17 Lacs
Gurugram
Work from Office
We are heavily dependent on BigQuery/Snowflake, Airflow, Stitch/Fivetran, dbt, and Tableau/Looker for our business intelligence, and we embrace AWS with some GCP. As a Data Engineer, you will develop end-to-end ETL/ELT pipelines.
Posted 1 month ago
3.0 - 8.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Educational Qualification: Bachelor of Engineering, BCS, BBA, BCom, MCA, MSc
Service Line: Data & Analytics Unit
Responsibilities:
- Good knowledge of Snowflake architecture: virtual warehouses (multi-cluster warehouses, autoscaling), metadata and system objects (query history, grants to users, grants to roles, users), micro-partitions, table clustering and auto-reclustering, materialized views and their benefits, data protection with Time Travel (extremely important), analyzing queries using Query Profile / explain plan (extremely important), cache architecture, and named stages
- Direct loading, Snowpipe, data sharing, Streams, JavaScript procedures, and Tasks
- Strong ability to design and develop workflows in Snowflake on at least one cloud platform (preferably AWS)
- Apply Snowflake programming and ETL experience to write Snowflake SQL and maintain a complex, internally developed reporting system
- Preferably, knowledge of ETL activities such as processing data from multiple source systems
- Extensive knowledge of query performance tuning
- Apply knowledge of BI tools
- Manage time effectively: accurately estimate effort for tasks, meet agreed-upon deadlines, and effectively juggle ad-hoc requests and longer-term projects
Snowflake performance specialist:
- Familiar with zero-copy cloning and using Time Travel features to clone tables (see the sketch below)
- Familiar with the Snowflake query profile: what each step does and how to identify performance bottlenecks from it
- Understanding of when a table needs to be clustered, and choosing the right cluster key as part of table design to help query optimization
- Working with materialized views and the benefit-vs-cost trade-off
- How Snowflake micro-partitions are maintained and their performance implications with respect to micro-partitions, pruning, etc.
- Horizontal vs. vertical scaling and when to use each; the concept of multi-cluster warehouses and autoscaling
- Advanced SQL knowledge, including window functions and recursive queries, and the ability to understand and rewrite complex SQL as part of performance optimization
Additional Responsibilities: Domain: Data Warehousing, Business Intelligence. Precise work location: Bhubaneswar, Bangalore, Hyderabad, Pune
Technical and Professional Requirements: Mandatory skills: Snowflake. Desired skills: Teradata/Python (not mandatory)
Preferred Skills: Cloud Platform - Snowflake; Technology - OpenSystem - Python
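A minimal sketch of the zero-copy cloning and Time Travel items above, run through snowflake-connector-python; the connection values and table names are hypothetical placeholders:

```python
# Hedged sketch: zero-copy cloning combined with Time Travel.
# All connection values and object names are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account", user="your_user", password="your_password",
    warehouse="DEV_WH", database="SALES", schema="PUBLIC",
)
cur = conn.cursor()
try:
    # Zero-copy clone: shares micro-partitions with the source; no data is copied.
    cur.execute("CREATE OR REPLACE TABLE ORDERS_DEV CLONE ORDERS")
    # Time Travel clone: the table exactly as it looked one hour ago.
    cur.execute(
        "CREATE OR REPLACE TABLE ORDERS_1H_AGO CLONE ORDERS AT (OFFSET => -3600)"
    )
    # Time Travel query without cloning, e.g. to compare row counts over time.
    cur.execute("SELECT COUNT(*) FROM ORDERS AT (OFFSET => -3600)")
    print(cur.fetchone()[0])
finally:
    cur.close()
    conn.close()
```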
Posted 1 month ago
3.0 - 8.0 years
25 - 30 Lacs
Pune, Gurugram
Hybrid
Job Title: Snowflake Developer
Company: Xebia
Location: Gurgaon / Pune (hybrid/on-site as applicable)
Job Type: Full-Time / Contract
Department: Data & AI Center of Excellence (COE)
Notice Period: Immediate to 2 weeks maximum only
Join Xebia as a Snowflake Developer! Are you passionate about solving complex data problems using modern cloud data platforms? We're hiring a skilled and passionate Snowflake Developer to join our growing Data & AI COE team at Xebia! In this role, you'll design and implement robust Snowflake-based data solutions, engage in client-facing projects, and contribute to reusable frameworks, accelerators, and technical innovations.
Key Responsibilities
Project Delivery (50%)
- Develop scalable data pipelines using Snowflake
- Write and optimize advanced SQL, UDFs, views, and stored procedures
- Work with stakeholders to deliver performant, production-ready data solutions
COE Contributions (30%)
- Build accelerators, reusable components, and solution templates
- Research and integrate Snowflake with AWS, Azure, Databricks, etc.
- Participate in internal training and documentation efforts
Support & Collaboration (20%)
- Troubleshoot Snowflake dev and prod environments
- Collaborate across DevOps, QA, product, and architecture teams
- Contribute to pre-sales discussions, PoCs, and code reviews
What You Bring
- 3-6 years in data engineering, with 2+ years in Snowflake
- Strong expertise in SQL, Snowflake scripting, and performance tuning
- Experience with ETL/ELT pipelines, Snowpipe, Streams & Tasks, Snowpark, Time Travel, etc.
- Knowledge of AWS (S3, Glue, Lambda) or the Azure ecosystem
- Familiarity with Airflow, dbt, Jenkins, Python, and JSON/XML handling
- Understanding of data governance, security policies, and access controls
- Great team collaboration and communication skills
Nice to Have
- Snowflake certifications (Developer/Architect)
- BI tools like Power BI or Tableau
- Experience with Git, DataOps, or data cataloging
Why Join Xebia?
At Xebia, innovation meets execution. Join a passionate team driving data-led transformations across industries. Explore the latest in cloud, AI, data engineering, and automation, while being part of a culture that celebrates continuous learning, trust, and growth.
How to Apply
Only candidates who can join immediately or within 15 days will be considered. To apply, please share the following details along with your updated resume: updated CV, current CTC, expected CTC, notice period, current location, preferred location, total experience, relevant experience in Snowflake, experience with AWS/Azure (please specify), and reason for job change.
Send all the details to: Vijay.S@xebia.com
Subject line: Snowflake Developer - [Your Name]
Posted 1 month ago
5.0 - 10.0 years
15 - 30 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Work Location: Bangalore, Chennai, Hyderabad, Pune, Bhubaneshwar, Kochi
Experience: 5-10 years
Job Description: Hands-on experience in Snowflake; experience in Snowpipe and SnowSQL; strong data warehouse experience. Please share your updated profile to suganya@spstaffing.in if you are actively looking for a change.
Posted 1 month ago
6.0 - 10.0 years
13 - 23 Lacs
Hyderabad, Bengaluru
Work from Office
Senior Snowflake Developer - Bangalore/Hyderabad - 2nd shift (2-11 PM). Note: looking for someone who can start.
Required Experience:
- A minimum of 10 years of hands-on experience in the IT industry
- At least 5 years of experience in client invoicing and automation processes
- Strong communication skills, both verbal and written
- Proficient in using Jira for task tracking and project coordination
- Demonstrated project management experience
Technical Expertise: a minimum of 7 years of hands-on experience in the following areas:
- Snowflake cloud data platform
- SQL development (including Snowflake SQL and SQL Server)
- Data modelling and stored procedures
- DBT (Data Build Tool) for data transformation
- Apache Airflow for workflow orchestration (see the sketch below)
- Google Cloud Platform (GCP) services
- Strong understanding of Business Intelligence (BI) tools, especially Power BI
- HVR and Fivetran for data replication
- Apache Kafka for real-time data streaming
- Octopus Deploy and TeamCity for CI/CD and deployment automation
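For illustration, a minimal Airflow DAG sketch of the DBT-plus-Airflow orchestration this role pairs together, assuming Airflow 2.4+ with dbt installed; the DAG id, schedule, and project path are hypothetical placeholders:

```python
# Hedged sketch: orchestrate nightly dbt runs from Airflow.
# DAG id, schedule, and the dbt project path are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_snowflake_daily",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",   # run the transformation layer nightly at 02:00
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/project && dbt run --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/project && dbt test --target prod",
    )
    dbt_run >> dbt_test  # only test once the models have built
```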
Posted 1 month ago
6.0 - 10.0 years
20 - 25 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Snowflake Developer - Reputed US-based IT MNC. If you are a Snowflake/Matillion developer, email your CV to jagannaath@kamms.net.
Experience: 5+ years (only candidates with 100% real-time experience may apply)
Role: Snowflake Developer
Preferred: Snowflake certifications (SnowPro Core/Advanced)
Position Type: Full time / Permanent
Location: Hyderabad, Bengaluru, and Chennai (hybrid - local candidates)
Notice Period: Immediate to 15 days
Salary: As per your experience
Responsibilities:
- 5+ years of experience in data engineering, ETL, and Snowflake development
- Strong expertise in Snowflake SQL scripting, performance tuning, and data warehousing concepts
- Strong knowledge of cloud platforms (AWS/Azure/GCP) and cloud-based data architecture
- Proficiency in SQL, Python, or scripting languages for automation and transformation
- Experience with API integrations and data ingestion frameworks
- Understanding of data governance, security policies, and access control in Snowflake
- Excellent communication skills and the ability to interact with business and technical stakeholders
- Self-starter who can work independently and drive projects to completion
Posted 1 month ago
6.0 - 11.0 years
35 - 50 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Role: Snowflake Data Engineer
Mandatory Skills: #Snowflake, #Azure, #DataFactory, SQL, Python, #DBT / #Databricks
Location (hybrid): Bangalore, Hyderabad, Chennai, Pune, Gurugram, and Noida
Budget: Up to 50 LPA
Notice: Immediate to 30 days (serving notice)
Experience: 6-11 years
Key Responsibilities:
- Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT
- Build and maintain data integration workflows from various data sources to Snowflake
- Write efficient and optimized SQL queries for data extraction and transformation
- Work with stakeholders to understand business requirements and translate them into technical solutions
- Monitor, troubleshoot, and optimize data pipelines for performance and reliability
- Maintain and enforce data quality, governance, and documentation standards
- Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment
Must-Have Skills:
- Strong experience with Azure cloud platform services
- Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines
- Proficiency in SQL for data analysis and transformation
- Hands-on experience with Snowflake and SnowSQL for data warehousing
- Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse
- Experience working in cloud-based data environments with large-scale datasets
Good-to-Have Skills:
- Experience with Azure Data Lake, Azure Synapse, or Azure Functions
- Familiarity with Python or PySpark for custom data transformations
- Understanding of CI/CD pipelines and DevOps for data workflows
- Exposure to data governance, metadata management, or data catalog tools
- Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus
Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field
- 5+ years of experience in data engineering roles using Azure and Snowflake
- Strong problem-solving, communication, and collaboration skills
Posted 1 month ago
8.0 - 13.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Combine interface design concepts with digital design and establish milestones to encourage cooperation and teamwork. Develop overall concepts for improving the user experience within a business webpage or product, ensuring all interactions are intuitive and convenient for customers. Collaborate with back-end web developers and programmers to improve usability. Conduct thorough testing of user interfaces on multiple platforms to ensure all designs render correctly and systems function properly.
This role also involves converting jobs from Talend ETL to Python and converting Lead SQLs to Snowflake, so we need developers with Python and SQL skills. Developers should be proficient in Python (especially Pandas, PySpark, or Dask) for ETL scripting, with strong SQL skills to translate complex queries. They need expertise in Snowflake SQL for migrating and optimizing queries, as well as experience with data pipeline orchestration (e.g., Airflow) and cloud integration for automation and data loading (see the sketch below). Familiarity with data transformation, error handling, and logging is also essential.
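A minimal sketch of the kind of Talend-job-to-Python conversion described above, assuming Pandas and the snowflake-connector-python package; the file path, connection values, and the LEADS table are hypothetical placeholders:

```python
# Hedged sketch: extract a CSV, clean and transform with Pandas, load into
# Snowflake with write_pandas. Paths, connection values, and the target table
# are hypothetical placeholders; the LEADS table is assumed to already exist.
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Extract + transform (the part a Talend tMap/tFilter job would have done).
df = pd.read_csv("/data/leads_export.csv")
df = df.dropna(subset=["LEAD_ID"])                   # drop unusable rows
df["EMAIL"] = df["EMAIL"].str.strip().str.lower()    # normalize join keys
df["CREATED_AT"] = pd.to_datetime(df["CREATED_AT"])  # enforce types

# Load: write_pandas stages the dataframe and COPYs it into the target table.
conn = snowflake.connector.connect(
    account="your_account", user="your_user", password="your_password",
    warehouse="ETL_WH", database="CRM", schema="PUBLIC",
)
try:
    success, _, nrows, _ = write_pandas(conn, df, "LEADS")
    print(f"Loaded {nrows} rows, success={success}")
finally:
    conn.close()
```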
Posted 1 month ago
5.0 - 10.0 years
12 - 18 Lacs
Pune, Bengaluru, Mumbai (All Areas)
Work from Office
Role & responsibilities
1. Mastery of SQL, especially within cloud-based data warehouses like Snowflake. Experience on Snowflake with data architecture, design, analytics, and development.
2. Detailed knowledge and hands-on working experience in Snowpipe / SnowProc / SnowSQL.
3. Technical lead with a strong development background, having 2-3 years of rich hands-on development experience in Snowflake.
4. Experience designing highly scalable ETL/ELT processes with complex data transformations and data formats, including error handling and monitoring. Good working knowledge of the ETL/ELT tool DBT for transformation.
5. Analysis, design, and development of traditional data warehouse and business intelligence solutions. Work with customers to understand and execute their requirements.
6. Working knowledge of software engineering best practices. Should be willing to work on implementation and support projects. Flexible for onsite and offshore travel.
7. Collaborate with other team members to ensure proper delivery of requirements. Ability to think strategically about the broader market and influence company direction.
8. Should have good communication skills, be a team player, and have good analytical skills. Snowflake certification is preferable.
Contact: Soniya, soniya05.mississippiconsultants@gmail.com
Posted 1 month ago
10.0 - 20.0 years
12 - 22 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Detailed JD (roles and responsibilities):
- 8+ years of working experience in Power BI and its advanced components: Power BI Desktop and Service, Power View, Power Query, Power Pivot, Power BI dashboards, and the Power BI gateway
- Writing DAX queries; Power Automate in Power BI; Power BI AI; data analytics
- Creating reports and dashboards using BI tools such as MS Power BI to visualize data and key performance indicators (KPIs)
- Knowledge of cloud-based data platforms and services like Snowflake, AWS, GCP, and Azure
- Performance analysis, and suggesting the best approaches to the team to improve performance
- Good experience consuming data from different sources and designing Power BI data models
- Hands-on performance optimization of Power BI reports and dashboards
- Excellent knowledge of data modeling and ETL; expert in providing BI architecture solutions
- Collaborate with the team and mentor team members
Mandatory skills: Power BI (very good in advanced concepts) and BI solution architecture
Desired skills: Snowflake SQL, ETL, and data modelling
Posted 1 month ago
6.0 - 8.0 years
10 - 20 Lacs
Bengaluru
Work from Office
Job location: Bangalore
Job Title: Module Lead - Snowflake
Experience: 6-8 years
Job Description: Sr. Snowflake Developer
- Design and develop our Snowflake data platform, including data pipeline building, data transformation, and access management
- Minimum 4+ years of experience in Snowflake; strong in SQL
- Develop data warehouse and data mart solutions for business teams
- Accountable for designing robust, scalable database and data extraction, transformation, and loading (ETL) solutions
- Understand and evaluate business requirements that impact the Caterpillar enterprise
- Liaise with data creators to support project planning, training, guidance on standards, and the efficient creation/maintenance of high-quality data
- Contribute to policies, procedures, and standards as well as technical requirements
- Ensure compliance with the latest data standards supported by the company, including brand, legal, and information security (data security and privacy compliance)
- Document data models for domains to be deployed, including a logical data model, candidate source lists, and canonical formats
- Create, update, and enhance metadata policies, processes, and catalogs
- Good communication skills and experience interacting with client SMEs
- Should be capable of leading a team of 4-5 members
- Snowflake certification is mandatory
Posted 2 months ago
4.0 - 9.0 years
1 - 2 Lacs
Hyderabad
Hybrid
We are seeking an experienced Snowflake Developer to design, develop, and optimize data solutions on the Snowflake cloud data platform. The ideal candidate will have strong SQL skills, experience with ETL/ELT processes, and a deep understanding of cloud data warehousing concepts.
Key Responsibilities
- Develop and maintain scalable data pipelines and workflows using Snowflake
- Design and implement complex SQL queries, stored procedures, and views
- Optimize Snowflake performance, including query tuning and resource management
- Collaborate with data engineers, analysts, and business teams to understand data requirements
- Implement data security and governance best practices within Snowflake
- Integrate Snowflake with various ETL tools and data sources
- Monitor and troubleshoot data pipelines and Snowflake environments
Required Skills and Qualifications
- 3+ years of experience working with Snowflake or similar cloud data warehouses (Redshift, BigQuery)
- Expertise in writing advanced SQL queries, stored procedures, and scripts
- Hands-on experience with ETL/ELT tools such as Talend, Informatica, Matillion, dbt, or Apache Airflow
- Familiarity with cloud platforms like AWS, Azure, or GCP
- Knowledge of data modeling, data warehousing concepts, and performance tuning
- Strong analytical and problem-solving skills
- Experience with version control systems like Git
Preferred (Nice to Have)
- Experience with data visualization tools like Tableau, Power BI, or Looker
- Knowledge of scripting languages such as Python or JavaScript
- Understanding of DevOps and CI/CD pipelines for data engineering
Posted 2 months ago
3.0 - 8.0 years
9 - 16 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Work from Office
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education
Share CV at neha.mandal@mounttalent.com
Summary: As an Application Lead for Packaged Application Development, you will be responsible for leading the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve working with Snowflake Data Warehouse and collaborating with cross-functional teams to deliver high-quality solutions.
Roles & Responsibilities:
- Lead the design, development, and implementation of applications using Snowflake Data Warehouse
- Collaborate with cross-functional teams to ensure the delivery of high-quality solutions that meet business requirements
- Act as the primary point of contact for all application-related issues, providing technical guidance and support to team members
- Ensure that all applications are designed and developed in accordance with industry best practices and standards
- Provide technical leadership and mentorship to team members, ensuring that they have the necessary skills and knowledge to deliver high-quality solutions
Professional & Technical Skills:
- Must-have skills: strong experience in Snowflake Data Warehouse
- Good-to-have skills: experience in other data warehousing technologies such as Redshift, BigQuery, or Azure Synapse Analytics
- Experience designing and developing applications using Snowflake Data Warehouse
- Strong understanding of data warehousing concepts and best practices
- Experience working with cross-functional teams to deliver high-quality solutions
- Excellent communication and interpersonal skills
Posted 2 months ago
5.0 - 6.0 years
3 - 7 Lacs
Hyderabad, Bengaluru
Work from Office
Role & responsibilities
Job Title: Developer
Work Location: Hyderabad, TG and Bangalore, KA
Skill Required: Utilities - Digital: Snowflake
Experience Range in Required Skills: 4-6 years
Job Description: Snowflake
Essential Skills: Snowflake
Desirable Skills: Snowflake
Posted 2 months ago
8.0 - 13.0 years
14 - 24 Lacs
Bengaluru
Remote
Key Responsibilities:
- Design and implement data solutions using MS Fabric, including data pipelines, data warehouses, and data lakes
- Lead and mentor a team of data engineers, providing technical guidance and oversight
- Collaborate with stakeholders to understand data requirements and deliver data-driven solutions
- Develop and maintain large-scale data systems, ensuring data quality, integrity, and security
- Troubleshoot data pipeline issues and optimize data workflows for performance and scalability
- Stay up to date with MS Fabric features and best practices, applying that knowledge to improve data solutions
Requirements:
- 8+ years of experience in data engineering, with expertise in MS Fabric, Azure Data Factory, or similar technologies
- Strong programming skills in languages like Python, SQL, or C#
- Experience with data modeling, data warehousing, and data governance
- Excellent problem-solving skills, with the ability to troubleshoot complex data pipeline issues
- Strong communication and leadership skills, with experience leading teams
Posted 2 months ago
5.0 - 10.0 years
15 - 25 Lacs
Pune
Hybrid
Role & responsibilities
- Design and implement end-to-end data pipelines using DBT and Snowflake
- Create and structure DBT models (staging, transformation, marts), YAML configurations for models and tests, and dbt seeds
- Hands-on experience with DBT Jinja templating, macro development, dbt jobs, and snapshot management for slowly changing dimensions
- Develop Python scripts for data cleaning, transformation, and automation of repetitive tasks
- Load structured and semi-structured data from AWS S3 into Snowflake by designing file formats, configuring storage integration, and automating data loads using Snowpipe (see the sketch below)
- Design scalable incremental models for handling large datasets, reducing resource usage
Preferred candidate profile
Candidates must have 5+ years of experience and be early joiners who can join within a month.
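A minimal sketch of the S3-to-Snowflake load described above (storage integration, external stage, and a Snowpipe with auto-ingest), run through snowflake-connector-python; the ARN, bucket, and all object names are hypothetical placeholders:

```python
# Hedged sketch: automated S3 loads via storage integration + Snowpipe.
# The role ARN, bucket, connection values, and object names are hypothetical
# placeholders; EVENTS_RAW is assumed to exist with a single VARIANT column.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account", user="your_user", password="your_password",
    warehouse="LOAD_WH", database="RAW", schema="PUBLIC",
)
statements = [
    """
    CREATE STORAGE INTEGRATION IF NOT EXISTS S3_INT
      TYPE = EXTERNAL_STAGE
      STORAGE_PROVIDER = 'S3'
      ENABLED = TRUE
      STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake-loader'
      STORAGE_ALLOWED_LOCATIONS = ('s3://example-bucket/events/')
    """,
    "CREATE FILE FORMAT IF NOT EXISTS JSON_FMT TYPE = JSON",
    """
    CREATE STAGE IF NOT EXISTS EVENTS_STAGE
      URL = 's3://example-bucket/events/'
      STORAGE_INTEGRATION = S3_INT
      FILE_FORMAT = JSON_FMT
    """,
    # AUTO_INGEST lets S3 event notifications trigger the load automatically.
    """
    CREATE PIPE IF NOT EXISTS EVENTS_PIPE AUTO_INGEST = TRUE AS
      COPY INTO EVENTS_RAW FROM @EVENTS_STAGE
    """,
]
cur = conn.cursor()
try:
    for stmt in statements:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()
```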
Posted 2 months ago
7.0 - 12.0 years
19 - 34 Lacs
Hyderabad
Hybrid
Data Scientist Job Description
Responsibilities
- Work with team members across multiple disciplines to understand the data behind product features, user behaviors, the security landscape, and our goals
- Analyze data from several large sources, then automate solutions using scheduled processes, models, and alerts
- Work with partners to design and improve metrics that guide our decisions for the product
- Detect patterns associated with fraudulent accounts and anomalous behavior (see the sketch below)
- Solve scientific problems and create new methods independently
- Translate requirements and security questions into data insights
- Set up alerting mechanisms so our leadership is always aware of the security posture
Qualifications
- Postgraduate degree with specialization in machine learning, artificial intelligence, statistics, or related fields, or 2 years of equivalent work experience in applied machine learning and analytics
- Experience with SQL, Snowflake, and NoSQL databases
- Proficiency in Python programming
- Familiarity with statistics, modeling, and data visualization
Experience
- Experience building statistical and machine learning models, applying techniques such as regression, classification, clustering, and anomaly detection
- Time series and classical ML modeling
- Familiarity with Snowflake SQL
- Familiarity with cloud platforms such as AWS
- Some exposure to software development or data engineering
- Ability to analyze business problems or research questions, identify relevant data points, and extract meaningful insights
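For illustration, a minimal anomaly-detection sketch of the kind this role describes, assuming scikit-learn; the features and data below are synthetic placeholders, not a real account schema:

```python
# Hedged sketch: flag anomalous account behavior with an Isolation Forest.
# Feature names and data are synthetic placeholders, not a real schema.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
# Synthetic per-account features: [logins_per_day, failed_login_ratio].
normal = rng.normal(loc=[20.0, 0.02], scale=[5.0, 0.01], size=(500, 2))
suspicious = rng.normal(loc=[200.0, 0.40], scale=[20.0, 0.05], size=(5, 2))
X = np.vstack([normal, suspicious])

model = IsolationForest(contamination=0.01, random_state=42)
labels = model.fit_predict(X)  # -1 = anomaly, 1 = normal
print(f"Flagged {np.sum(labels == -1)} of {len(X)} accounts for review")
```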
Posted 2 months ago
5.0 - 8.0 years
5 - 8 Lacs
Chennai, Tamil Nadu, India
On-site
Talend: Designing, developing, and documenting existing Talend ETL processes, technical architecture, data pipelines, and performance scaling, using tools to integrate Talend data and ensure data quality in a big data environment.
AWS / Snowflake:
- Design, develop, and maintain data models using SQL and Snowflake / AWS Redshift-specific features
- Collaborate with stakeholders to understand the requirements of the data warehouse
- Implement data security, privacy, and compliance measures
- Perform data analysis, troubleshoot data issues, and provide technical support to end-users
- Develop and maintain data warehouse and ETL processes, ensuring data quality and integrity
- Stay current with new AWS/Snowflake services and features and recommend improvements to existing architecture
- Design and implement scalable, secure, and cost-effective cloud solutions using AWS / Snowflake services
- Collaborate with cross-functional teams to understand requirements and provide technical guidance
Posted 2 months ago