
90 Snowflake SQL Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

6.0 - 11.0 years

8 - 18 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Snowflake Developer | Experience: 5.5 to 12 years. Location: Mumbai/Bangalore/Pune/Chennai/Hyderabad/Indore/Kolkata/Noida/Coimbatore/Bhubaneswar. Mode of interview: L1 online test; L2 face-to-face technical (mandatory). Note: please share your date of birth and PAN number. Interested candidates, please send resumes to madhavi.naik@alikethoughts.com

Posted 1 hour ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Pune

Work from Office

What You'll Do
We are looking for an experienced and motivated Senior Data Engineer to join our Data Operations team. You have expertise in Python, Snowflake SQL, modern ETL tools, and business intelligence platforms such as Power BI, and you will need experience integrating SaaS applications such as Salesforce, Zuora, and NetSuite using REST APIs. You will develop data pipelines, build data models, and ensure seamless data integrations that support business reporting. Flexibility to collaborate in US time zones is expected.

What Your Responsibilities Will Be
- Design scalable data pipelines and workflows using modern ETL tools and Python.
- Build and optimize SQL queries and data models on Snowflake to support reporting needs (see the sketch after this listing).
- Integrate with SaaS platforms such as Salesforce, Zuora, and NetSuite using APIs or native connectors.
- Develop and support dashboards and reports using Power BI and other reporting tools.
- Work with data analysts, users, and other engineering teams to gather requirements and deliver high-quality solutions.
- Ensure data quality, accuracy, and consistency across systems and datasets.
- Write clean, well-documented, and testable code with a focus on performance and reliability.
- Participate in peer code reviews and contribute to data engineering best practices.
- Be available for meetings and collaboration in US time zones.
- You will report to a manager.

What You'll Need to Be Successful
- 5+ years of experience in data engineering, with deep SQL knowledge.
- Required: Snowflake SQL, Python, AWS services, Power BI, and ETL tools (dbt, Airflow).
- Proficiency in Python for data transformation and scripting.
- Proficiency in writing complex SQL queries and stored procedures.
- Experience with data warehouse, data modeling, and ETL design concepts.
- Experience integrating SaaS systems such as Salesforce, Zuora, and NetSuite, along with relational databases, REST APIs, and FTP/SFTP.
- Knowledge of AWS technologies (EC2, S3, RDS, Redshift, etc.), with the ability to translate technical issues for non-technical partners.
- Flexibility to work during US business hours for team meetings and collaboration.
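To make the Snowflake side of this concrete, here is a minimal, hypothetical sketch of the kind of ELT upsert such a pipeline might run; the schema, table, and column names (staging.salesforce_account, analytics.dim_account) are illustrative assumptions, not details from the posting:

```sql
-- Upsert a staged Salesforce extract into a reporting dimension (all names hypothetical).
MERGE INTO analytics.dim_account AS tgt
USING staging.salesforce_account AS src
    ON tgt.account_id = src.id
WHEN MATCHED THEN UPDATE SET
    tgt.account_name = src.name,
    tgt.updated_at   = src.last_modified_date
WHEN NOT MATCHED THEN INSERT (account_id, account_name, updated_at)
    VALUES (src.id, src.name, src.last_modified_date);
```

In practice, an orchestrator such as Airflow or dbt would schedule a statement like this after each API extract lands in the staging schema.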

Posted 21 hours ago

Apply

7.0 - 12.0 years

18 - 30 Lacs

Coimbatore

Work from Office

- Must have at least 7 years of experience in data warehouse, ETL, and BI projects
- Must have at least 5 years of experience in Snowflake
- Expertise in Snowflake architecture is a must
- Must have at least 3 years of experience and a strong command of Python/PySpark
- Must have experience implementing complex stored procedures and standard DWH and ETL concepts (see the sketch below)
- Proficient in Oracle database, complex PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Good to have: experience with AWS services and creating DevOps templates for various AWS services
- Experience using GitHub and Jenkins
- Good communication and analytical skills
- Snowflake certification is desirable
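Because the listing calls for complex stored procedures on top of standard DWH/ETL concepts, here is a minimal Snowflake Scripting sketch; the procedure, tables, and load logic are illustrative assumptions, not details from the posting:

```sql
-- Hypothetical daily load wrapped in a Snowflake Scripting procedure.
CREATE OR REPLACE PROCEDURE load_daily_sales(run_date DATE)
RETURNS VARCHAR
LANGUAGE SQL
AS
$$
DECLARE
    rows_loaded INTEGER DEFAULT 0;
BEGIN
    -- Move one day's staged rows into the warehouse fact table.
    INSERT INTO dwh.fact_sales
        SELECT * FROM staging.sales WHERE sale_date = :run_date;
    rows_loaded := SQLROWCOUNT;
    RETURN 'Loaded ' || rows_loaded || ' rows for ' || run_date;
END;
$$;

CALL load_daily_sales(CURRENT_DATE());
```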

Posted 1 day ago

Apply

7.0 - 12.0 years

18 - 30 Lacs

Pune

Work from Office

- Must have at least 7 years of experience in data warehouse, ETL, and BI projects
- Must have at least 5 years of experience in Snowflake
- Expertise in Snowflake architecture is a must
- Must have at least 3 years of experience and a strong command of Python/PySpark
- Must have experience implementing complex stored procedures and standard DWH and ETL concepts
- Proficient in Oracle database, complex PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Good to have: experience with AWS services and creating DevOps templates for various AWS services
- Experience using GitHub and Jenkins
- Good communication and analytical skills
- Snowflake certification is desirable

Posted 1 day ago

Apply

7.0 - 12.0 years

18 - 30 Lacs

Bengaluru

Work from Office

- Must have at least 7 years of experience in data warehouse, ETL, and BI projects
- Must have at least 5 years of experience in Snowflake
- Expertise in Snowflake architecture is a must
- Must have at least 3 years of experience and a strong command of Python/PySpark
- Must have experience implementing complex stored procedures and standard DWH and ETL concepts
- Proficient in Oracle database, complex PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Good to have: experience with AWS services and creating DevOps templates for various AWS services
- Experience using GitHub and Jenkins
- Good communication and analytical skills
- Snowflake certification is desirable

Posted 1 day ago

Apply

5.0 - 7.0 years

5 - 15 Lacs

Chennai

Work from Office

Experience: 5+ years of relevant experience.

We are seeking a highly skilled and experienced Snowflake Lead responsible for leading the design, development, and implementation of Snowflake-based data warehousing solutions. You will leverage your deep understanding of ETL and data warehousing concepts to build robust and scalable data pipelines. A key aspect of this role involves direct interaction with business users to gather and clarify requirements, ensuring that the delivered solutions meet their analytical needs.

Responsibilities:

Leadership & Delivery:
- Lead a module or a team of developers in the design, development, and deployment of Snowflake solutions.
- Take ownership of end-to-end delivery of Snowflake modules, ensuring adherence to timelines and quality standards.
- Provide technical guidance and mentorship to team members, fostering a collaborative and high-performing environment.
- Contribute to project planning, estimation, and risk management activities.

Snowflake Expertise:
- Utilize in-depth knowledge of Snowflake architecture, features, and best practices to design efficient and scalable data models and ETL/ELT processes.
- Develop and optimize complex SQL queries and Snowflake scripting for data manipulation and transformation.
- Implement Snowflake utilities such as SnowSQL, Snowpipe, Tasks, Streams, Time Travel, and Cloning as needed (see the sketch after this listing).
- Ensure data security and implement appropriate access controls within the Snowflake environment.
- Monitor and optimize the performance of Snowflake queries and data pipelines.
- Integrate PySpark with Snowflake for data ingestion and processing; apply PySpark best practices and performance tuning techniques.
- Experience with Spark architecture and its components (e.g., Spark Core, Spark SQL, DataFrames).

ETL & Data Warehousing:
- Apply a strong understanding of ETL/ELT concepts, data warehousing principles (including dimensional modeling and star/snowflake schemas), and data integration techniques.
- Design and develop data pipelines to extract data from various source systems, transform it according to business rules, and load it into Snowflake.
- Work with both structured and semi-structured data, including JSON and XML.
- Experience with ETL tools (e.g., Informatica, Talend, PySpark) is a plus, particularly in the context of integrating with Snowflake.

Requirements Gathering & Clarification:
- Actively participate in requirement gathering sessions with business users and stakeholders.
- Translate business requirements into clear and concise technical specifications and design documents.
- Collaborate with business analysts and users to clarify ambiguities and ensure a thorough understanding of data and reporting needs.
- Validate proposed solutions with users to ensure they meet expectations.

Collaboration & Communication:
- Work closely with other development teams, data engineers, and business intelligence analysts to ensure seamless integration of Snowflake solutions with other systems.
- Communicate effectively with both technical and non-technical stakeholders; provide regular updates on progress and any potential roadblocks.

Best Practices & Continuous Improvement:
- Adhere to and promote best practices in Snowflake development, data warehousing, and ETL processes.
- Stay up to date with the latest Snowflake features and industry trends.
- Identify opportunities for process improvement and optimization.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 5 years of relevant experience in data warehousing and ETL development, with a significant focus on Snowflake.
- Strong proficiency in SQL and experience working with large datasets.
- Solid understanding of data modeling concepts (dimensional modeling, star/snowflake schemas).
- Experience designing and developing ETL or ELT pipelines.
- Proven ability to gather and document business and technical requirements.
- Excellent communication, interpersonal, and problem-solving skills.
- Snowflake certifications (e.g., SnowPro Core) are a plus.
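As a hedged illustration of the Streams and Tasks utilities named above (the warehouse, table, and column names here are hypothetical, not from the posting):

```sql
-- Capture changes on a raw table with a stream, then apply them on a schedule with a task.
CREATE OR REPLACE STREAM raw_orders_stream ON TABLE raw.orders;

CREATE OR REPLACE TASK merge_orders_task
    WAREHOUSE = transform_wh
    SCHEDULE  = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
AS
    MERGE INTO dwh.orders AS t
    USING raw_orders_stream AS s
        ON t.order_id = s.order_id
    WHEN MATCHED THEN UPDATE SET t.status = s.status
    WHEN NOT MATCHED THEN INSERT (order_id, status) VALUES (s.order_id, s.status);

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK merge_orders_task RESUME;
```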

Posted 1 day ago

Apply

5.0 - 7.0 years

5 - 15 Lacs

Pune

Work from Office

Experience: 5+ years of relevant experience.

We are seeking a highly skilled and experienced Snowflake Lead responsible for leading the design, development, and implementation of Snowflake-based data warehousing solutions. You will leverage your deep understanding of ETL and data warehousing concepts to build robust and scalable data pipelines. A key aspect of this role involves direct interaction with business users to gather and clarify requirements, ensuring that the delivered solutions meet their analytical needs.

Responsibilities:

Leadership & Delivery:
- Lead a module or a team of developers in the design, development, and deployment of Snowflake solutions.
- Take ownership of end-to-end delivery of Snowflake modules, ensuring adherence to timelines and quality standards.
- Provide technical guidance and mentorship to team members, fostering a collaborative and high-performing environment.
- Contribute to project planning, estimation, and risk management activities.

Snowflake Expertise:
- Utilize in-depth knowledge of Snowflake architecture, features, and best practices to design efficient and scalable data models and ETL/ELT processes.
- Develop and optimize complex SQL queries and Snowflake scripting for data manipulation and transformation.
- Implement Snowflake utilities such as SnowSQL, Snowpipe, Tasks, Streams, Time Travel, and Cloning as needed.
- Ensure data security and implement appropriate access controls within the Snowflake environment.
- Monitor and optimize the performance of Snowflake queries and data pipelines.
- Integrate PySpark with Snowflake for data ingestion and processing; apply PySpark best practices and performance tuning techniques.
- Experience with Spark architecture and its components (e.g., Spark Core, Spark SQL, DataFrames).

ETL & Data Warehousing:
- Apply a strong understanding of ETL/ELT concepts, data warehousing principles (including dimensional modeling and star/snowflake schemas), and data integration techniques.
- Design and develop data pipelines to extract data from various source systems, transform it according to business rules, and load it into Snowflake.
- Work with both structured and semi-structured data, including JSON and XML.
- Experience with ETL tools (e.g., Informatica, Talend, PySpark) is a plus, particularly in the context of integrating with Snowflake.

Requirements Gathering & Clarification:
- Actively participate in requirement gathering sessions with business users and stakeholders.
- Translate business requirements into clear and concise technical specifications and design documents.
- Collaborate with business analysts and users to clarify ambiguities and ensure a thorough understanding of data and reporting needs.
- Validate proposed solutions with users to ensure they meet expectations.

Collaboration & Communication:
- Work closely with other development teams, data engineers, and business intelligence analysts to ensure seamless integration of Snowflake solutions with other systems.
- Communicate effectively with both technical and non-technical stakeholders; provide regular updates on progress and any potential roadblocks.

Best Practices & Continuous Improvement:
- Adhere to and promote best practices in Snowflake development, data warehousing, and ETL processes.
- Stay up to date with the latest Snowflake features and industry trends.
- Identify opportunities for process improvement and optimization.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 5 years of relevant experience in data warehousing and ETL development, with a significant focus on Snowflake.
- Strong proficiency in SQL and experience working with large datasets.
- Solid understanding of data modeling concepts (dimensional modeling, star/snowflake schemas).
- Experience designing and developing ETL or ELT pipelines.
- Proven ability to gather and document business and technical requirements.
- Excellent communication, interpersonal, and problem-solving skills.
- Snowflake certifications (e.g., SnowPro Core) are a plus.

Posted 1 day ago

Apply

5.0 - 7.0 years

5 - 15 Lacs

Bengaluru

Work from Office

Experience: 5+ years of relevant experience.

We are seeking a highly skilled and experienced Snowflake Lead responsible for leading the design, development, and implementation of Snowflake-based data warehousing solutions. You will leverage your deep understanding of ETL and data warehousing concepts to build robust and scalable data pipelines. A key aspect of this role involves direct interaction with business users to gather and clarify requirements, ensuring that the delivered solutions meet their analytical needs.

Responsibilities:

Leadership & Delivery:
- Lead a module or a team of developers in the design, development, and deployment of Snowflake solutions.
- Take ownership of end-to-end delivery of Snowflake modules, ensuring adherence to timelines and quality standards.
- Provide technical guidance and mentorship to team members, fostering a collaborative and high-performing environment.
- Contribute to project planning, estimation, and risk management activities.

Snowflake Expertise:
- Utilize in-depth knowledge of Snowflake architecture, features, and best practices to design efficient and scalable data models and ETL/ELT processes.
- Develop and optimize complex SQL queries and Snowflake scripting for data manipulation and transformation.
- Implement Snowflake utilities such as SnowSQL, Snowpipe, Tasks, Streams, Time Travel, and Cloning as needed.
- Ensure data security and implement appropriate access controls within the Snowflake environment.
- Monitor and optimize the performance of Snowflake queries and data pipelines.
- Integrate PySpark with Snowflake for data ingestion and processing; apply PySpark best practices and performance tuning techniques.
- Experience with Spark architecture and its components (e.g., Spark Core, Spark SQL, DataFrames).

ETL & Data Warehousing:
- Apply a strong understanding of ETL/ELT concepts, data warehousing principles (including dimensional modeling and star/snowflake schemas), and data integration techniques.
- Design and develop data pipelines to extract data from various source systems, transform it according to business rules, and load it into Snowflake.
- Work with both structured and semi-structured data, including JSON and XML.
- Experience with ETL tools (e.g., Informatica, Talend, PySpark) is a plus, particularly in the context of integrating with Snowflake.

Requirements Gathering & Clarification:
- Actively participate in requirement gathering sessions with business users and stakeholders.
- Translate business requirements into clear and concise technical specifications and design documents.
- Collaborate with business analysts and users to clarify ambiguities and ensure a thorough understanding of data and reporting needs.
- Validate proposed solutions with users to ensure they meet expectations.

Collaboration & Communication:
- Work closely with other development teams, data engineers, and business intelligence analysts to ensure seamless integration of Snowflake solutions with other systems.
- Communicate effectively with both technical and non-technical stakeholders; provide regular updates on progress and any potential roadblocks.

Best Practices & Continuous Improvement:
- Adhere to and promote best practices in Snowflake development, data warehousing, and ETL processes.
- Stay up to date with the latest Snowflake features and industry trends.
- Identify opportunities for process improvement and optimization.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 5 years of relevant experience in data warehousing and ETL development, with a significant focus on Snowflake.
- Strong proficiency in SQL and experience working with large datasets.
- Solid understanding of data modeling concepts (dimensional modeling, star/snowflake schemas).
- Experience designing and developing ETL or ELT pipelines.
- Proven ability to gather and document business and technical requirements.
- Excellent communication, interpersonal, and problem-solving skills.
- Snowflake certifications (e.g., SnowPro Core) are a plus.

Posted 1 day ago

Apply

11.0 - 18.0 years

0 Lacs

Chennai, Coimbatore, Bengaluru

Hybrid

Open & Direct Walk-in Drive | Hexaware Technologies | Snowflake & Snowpark Data Engineer/Architect | Chennai, Tamil Nadu | Saturday, 23rd AUG 2025 | Snowflake / Snowpark / SQL & PySpark

Dear Candidate,

I hope this email finds you well. We are thrilled to announce an exciting opportunity for talented professionals like yourself to join our team as a Data Engineer/Architect. We are hosting an open walk-in drive in Chennai, Tamil Nadu on Saturday, 23rd AUG 2025, and we believe your skills in Snowflake, Snowpark, Snowpipe, SQL, and PySpark align well with what we are seeking. Experience level: 4 to 18 years.

Details of the walk-in drive:
- Date: Saturday, 23rd AUG 2025
- Time: 9:30 AM to 4:00 PM
- Experience: 5 to 15 years
- Point of contact: Azhagu Kumaran Mohan, +91-9789518386
- Venue: Hexaware Technologies, H-5, SIPCOT IT Park, Post, Navalur, Siruseri, Tamil Nadu 603103
- Work location: open (Bangalore/Pune/Mumbai/Noida/Dehradun/Chennai/Coimbatore)

Key skills and experience. As a Data Engineer, we are looking for candidates with expertise in:
- Snowflake
- Snowpark & Snowpipe
- SQL
- PySpark/Spark

Roles and responsibilities:
- 4-6 years of total IT experience with any ETL/Snowflake cloud tool
- Minimum 3 years of experience in Snowflake
- Minimum 1 year of experience querying and processing data using Snowpark (Python)
- Strong SQL, with experience using analytical functions, materialized views, and stored procedures
- Experience with Snowflake data loading features such as Stages, Streams, Tasks, and Snowpipe (see the sketch below)
- Working knowledge of processing semi-structured data

What to bring:
1. Updated resume
2. Photo ID and a passport-size photo

How to register: To express your interest and confirm your participation, please reply to this email with your updated resume attached. Walk-ins are also welcome on the day of the event. This is an excellent opportunity to showcase your skills, network with industry professionals, and explore the possibilities that await you at Hexaware Technologies. If you have any questions or require further information, please reach out at AzhaguK@hexaware.com or +91-9789518386. We look forward to meeting you.
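For candidates brushing up, here is a minimal sketch of the Stages and Snowpipe loading features the drive calls out; the bucket, file format, and table names are assumptions, and credentials and event-notification setup are omitted:

```sql
-- External stage over cloud storage (credentials/storage integration omitted for brevity).
CREATE OR REPLACE STAGE sales_stage
    URL = 's3://example-bucket/sales/'
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- Snowpipe: continuously load new files from the stage as they arrive.
CREATE OR REPLACE PIPE sales_pipe
    AUTO_INGEST = TRUE
AS
    COPY INTO raw.sales FROM @sales_stage;
```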

Posted 2 days ago

Apply

6.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

BI Specialist

Primary skills: Tableau, SQL, Tableau Prep, Advanced SQL. Specialization: Tableau development, data specialist.

Roles & responsibilities:
- Engage with stakeholders, analyze and interpret requirements, and build foundation models, dashboards, and reports
- Design, build, and maintain data pipelines for collecting and processing sales data
- Integrate data from sources such as CRM, ERP, and other custom systems into centralized systems for analysis
- Build sales-specific transformations such as pipeline snapshots, sales funnel metrics, conversion, sales productivity, MQLs, meetings, and pipeline metrics (see the sketch below)
- Work with cross-functional teams, collaborate, and follow architectural standards and guidelines
- Ensure data accuracy, consistency, and integrity across systems
- Implement data integrity, security, and governance within the data warehouse

Qualifications:
- At least 6-8 years of experience in one of the functional areas of sales and marketing
- Strong reporting skills with Tableau Desktop
- Strong skills with Snowflake SQL or other data platforms
- Expertise in using the dbt framework and building foundation models
- Strong SQL skills with any RDBMS database
- Expertise in real-time data warehouse model design, star and snowflake modeling, slowly changing dimensions, snapshotting, etc.
- Experience with shell scripting, Python scripting, and API frameworks is a plus
- Experience with GitHub, JIRA, Agile methodologies, and other productivity tools
- Experience with CI/CD DevOps, including tools such as GitHub and Jira
- Experience working directly with product owners and business partners
- Comfortable with ambiguity and effective with minimal guidance
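As a small, hypothetical example of the sales-funnel transformations this role describes (the crm.opportunities table and its columns are assumptions, not from the posting):

```sql
-- Funnel rollup: opportunity count, pipeline value, and each stage's share of the funnel.
SELECT
    stage,
    COUNT(*)                          AS opportunities,
    SUM(amount)                       AS pipeline_value,
    RATIO_TO_REPORT(COUNT(*)) OVER () AS share_of_funnel
FROM crm.opportunities
GROUP BY stage
ORDER BY pipeline_value DESC;
```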

Posted 2 days ago

Apply

5.0 - 10.0 years

0 Lacs

Chennai, Coimbatore, Bengaluru

Hybrid

Open & Direct Walk-in Drive | Hexaware Technologies | Snowflake & Snowpark Data Engineer/Architect | Chennai, Tamil Nadu | Saturday, 23rd AUG 2025 | Snowflake / Snowpark / SQL & PySpark

Dear Candidate,

I hope this email finds you well. We are thrilled to announce an exciting opportunity for talented professionals like yourself to join our team as a Data Engineer/Architect. We are hosting an open walk-in drive in Chennai, Tamil Nadu on Saturday, 23rd AUG 2025, and we believe your skills in Snowflake, Snowpark, Snowpipe, SQL, and PySpark align well with what we are seeking. Experience level: 4 to 18 years.

Details of the walk-in drive:
- Date: Saturday, 23rd AUG 2025
- Time: 9:30 AM to 4:00 PM
- Experience: 5 to 15 years
- Point of contact: Azhagu Kumaran Mohan, +91-9789518386
- Venue: Hexaware Technologies, H-5, SIPCOT IT Park, Post, Navalur, Siruseri, Tamil Nadu 603103
- Work location: open (Bangalore/Pune/Mumbai/Noida/Dehradun/Chennai/Coimbatore)

Key skills and experience. As a Data Engineer, we are looking for candidates with expertise in:
- Snowflake
- Snowpark & Snowpipe
- SQL
- PySpark/Spark

Roles and responsibilities:
- 4-6 years of total IT experience with any ETL/Snowflake cloud tool
- Minimum 3 years of experience in Snowflake
- Minimum 1 year of experience querying and processing data using Snowpark (Python)
- Strong SQL, with experience using analytical functions, materialized views, and stored procedures
- Experience with Snowflake data loading features such as Stages, Streams, Tasks, and Snowpipe
- Working knowledge of processing semi-structured data

What to bring:
1. Updated resume
2. Photo ID and a passport-size photo

How to register: To express your interest and confirm your participation, please reply to this email with your updated resume attached. Walk-ins are also welcome on the day of the event. This is an excellent opportunity to showcase your skills, network with industry professionals, and explore the possibilities that await you at Hexaware Technologies. If you have any questions or require further information, please reach out at AzhaguK@hexaware.com or +91-9789518386. We look forward to meeting you.

Posted 2 days ago

Apply

3.0 - 8.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Educational requirements: MCA, MSc, Bachelor of Engineering, BBA, BCom, BCS. Service line: Data & Analytics unit.

Responsibilities:
- Good knowledge of Snowflake architecture: virtual warehouses (multi-cluster warehouses, autoscaling), metadata and system objects (query history, grants to users, grants to roles, users), micro-partitions, table clustering and auto-reclustering, and materialized views and their benefits
- Data protection with Time Travel in Snowflake (extremely important)
- Analyzing queries using Query Profile, i.e., explain plans (extremely important)
- Cache architecture, virtual warehouses, named stages, direct loading, Snowpipe, data sharing, Streams, JavaScript procedures, and Tasks
- Strong ability to design and develop workflows in Snowflake on at least one cloud technology (preferably AWS)
- Apply Snowflake programming and ETL experience to write Snowflake SQL and maintain a complex, internally developed reporting system
- Preferably, knowledge of ETL activities such as data processing from multiple source systems
- Extensive knowledge of query performance tuning
- Apply knowledge of BI tools
- Manage time effectively: accurately estimate effort for tasks, meet agreed-upon deadlines, and effectively juggle ad-hoc requests and longer-term projects

Snowflake performance specialist:
- Familiar with zero-copy cloning and using Time Travel features to clone a table (see the sketch below)
- Familiar with the Snowflake query profile, what each step does, and identifying performance bottlenecks from the query profile
- Understanding of when a table needs to be clustered, and choosing the right cluster key as part of table design to help query optimization
- Working with materialized views and weighing benefits vs. cost
- How Snowflake micro-partitions are maintained, and the performance implications with respect to micro-partitions and pruning
- Horizontal vs. vertical scaling and when to do which; the concept of multi-cluster warehouses and autoscaling
- Advanced SQL knowledge, including window functions and recursive queries, and the ability to understand and rewrite complex SQL as part of performance optimization

Additional responsibilities: Domain: data warehousing, business intelligence. Work location: Bhubaneswar, Bangalore, Hyderabad, Pune.

Technical and professional requirements: Mandatory skills: Snowflake. Desired skills: Teradata/Python (not mandatory). Preferred skills: Cloud Platform->Snowflake; Technology->OpenSystem->Python - OpenSystem.
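Since this listing leans heavily on Time Travel, zero-copy cloning, and clustering, here is a brief sketch of those features in use; the table and key names are illustrative assumptions:

```sql
-- Zero-copy clone of a table as it looked 30 minutes ago (Time Travel; offset in seconds).
CREATE TABLE orders_debug CLONE orders AT (OFFSET => -1800);

-- Choose a cluster key aligned with common filter predicates.
ALTER TABLE orders CLUSTER BY (order_date, region);

-- Inspect clustering quality for that key (useful when deciding whether to recluster).
SELECT SYSTEM$CLUSTERING_INFORMATION('orders', '(order_date, region)');
```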

Posted 2 days ago

Apply

5.0 - 10.0 years

20 - 30 Lacs

Pune

Remote

Role & responsibilities: Snowflake, dbt, and AWS, in a lead role.

Posted 2 days ago

Apply

5.0 - 9.0 years

0 - 0 Lacs

Pune, Maharashtra

On-site

As a Sr. ETL Developer at Madison Logic, you will drive data integration solutions on Snowflake's cloud platform. Your primary role will involve extracting data from various sources, transforming it to align with business requirements, and loading it into Snowflake for analysis. You will write complex queries for applications and BI dashboards, optimize data retrieval and visualization processes, and work with the architecture and data teams to achieve business objectives through efficient data management and analysis. This is an individual contributor (non-management) role.

Responsibilities:
- Craft and optimize sophisticated Snowflake SQL queries; diagnose and resolve query-related issues (see the sketch below)
- Design and implement queries for applications and business intelligence reporting needs
- Address data quality challenges; apply data aggregation and modeling techniques
- Conduct in-depth data analysis and select optimal data sources to meet project requirements
- Review existing SQL queries for performance tuning and produce clear documentation
- Work in a team environment with a strong sense of accountability, adaptability, flexibility, and urgency

Basic qualifications:
- On-site work at the ML physical office 5 days per week during the probation period
- University degree with 5+ years of practical experience, or 7+ years of practical experience
- 5+ years of experience writing SQL queries, functions, and procedures
- 4+ years of experience with cloud computing services (AWS)
- Experience with data cleaning and standardization processes
- Strong understanding of data modeling and the ability to handle multiple tasks and projects simultaneously
- SnowPro certification is considered a plus

Other desired characteristics include being a collaborative team member with strong interpersonal and communication skills, professionalism and integrity, excellent organizational and project management capabilities, and a positive, results-focused attitude in fast-paced environments.

Expected compensation: fixed CTC of 17-20 LPA for Sr. ETL Developers and 13-16 LPA for ETL Developers. Madison Logic offers a mix of in-office and hybrid working environments, depending on the position; hybrid remote arrangements are not available for all roles. If you would like more information about perks and benefits or have further queries, feel free to reach out. Your application and the information provided will be processed in accordance with our privacy policy and solely for recruitment purposes. Madison Logic is committed to fostering a diverse and inclusive work environment where all employees are valued, respected, and provided with equal opportunities for growth and success.
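Given the posting's focus on diagnosing and tuning Snowflake queries, here is a minimal sketch of one common starting point; it assumes access to the shared ACCOUNT_USAGE views:

```sql
-- Find the slowest queries of the last day (requires ACCOUNT_USAGE privileges).
SELECT query_id,
       user_name,
       warehouse_name,
       total_elapsed_time / 1000 AS elapsed_seconds,
       bytes_scanned
FROM snowflake.account_usage.query_history
WHERE start_time >= DATEADD('day', -1, CURRENT_TIMESTAMP())
ORDER BY total_elapsed_time DESC
LIMIT 20;
```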

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

Karnataka

On-site

You will be working in a hybrid mode across multiple locations, including Bangalore, Chennai, Gurgaon, Pune, and Kolkata. With at least 6 years of experience in IT, you must hold a Bachelor's and/or Master's degree in computer science or an equivalent field. Your expertise should lie in Snowflake security, Snowflake SQL, and the design and implementation of various Snowflake objects. Practical experience with Snowflake utilities such as SnowSQL, Snowpipe, Snowsight, and Snowflake connectors is essential. You should have a deep understanding of star and snowflake dimensional modeling and strong knowledge of data management principles, along with familiarity with the Databricks Data & AI platform and Databricks Delta Lake architecture. Hands-on experience with SQL and Spark (PySpark), as well as building ETL/data warehouse transformation processes, will be a significant part of your role. Strong verbal and written communication skills are essential, along with analytical and problem-solving abilities and attention to detail. Mandatory skills: (Snowflake + ADF + SQL) or (Snowflake + SQL).

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

You are an experienced Senior Data Analyst with a minimum of 7-8 years in data analysis roles and significant exposure to Snowflake. Your primary responsibilities will include querying and analyzing data stored in Snowflake databases to derive meaningful insights that support business decision-making. You will also develop and maintain data models and schema designs within Snowflake to facilitate efficient analysis, create and maintain data visualizations and dashboards using tools like Tableau or Power BI with Snowflake as the underlying data source, and collaborate with business stakeholders to understand data requirements and translate them into analytical solutions. You will perform data validation, quality assurance, and data cleansing within Snowflake databases, and support the implementation and enhancement of ETL processes and data pipelines to ensure data accuracy and completeness.

A Bachelor's or Master's degree in Data Science, Statistics, Computer Science, Information Systems, or a related field is required. Certifications in data analytics, data visualization, or cloud platforms are desirable but not mandatory.

Primary skills:
- Strong proficiency in querying and analyzing data using Snowflake SQL and dbt
- Solid understanding of data modeling and schema design within Snowflake environments
- Experience with data visualization and reporting tools such as Power BI, Tableau, or Looker for analyzing and presenting insights derived from Snowflake
- Familiarity with ETL processes and data pipeline development
- Proven track record of using Snowflake for complex data analysis and reporting tasks
- Strong problem-solving and analytical skills, including the ability to derive actionable insights from data
- Experience with programming languages like Python or R for data manipulation and analysis is a plus

Secondary skills:
- Knowledge of cloud platforms and services such as AWS, Azure, or GCP
- Excellent communication and presentation skills
- Strong attention to detail and a proactive approach to problem-solving
- Ability to work collaboratively in a team environment

This role is for a Senior Data Analyst specializing in Snowflake, based in either Trivandrum or Bangalore. Working hours are 8 hours per day, 12:00 PM to 9:00 PM, with a few hours of overlap with the EST time zone for mandatory meetings. The close date for applications is 18-04-2025.
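To illustrate the kind of Snowflake SQL analysis this role involves, here is a minimal month-over-month revenue query using a window function; the fact_sales table is a hypothetical stand-in, not from the posting:

```sql
-- Month-over-month revenue and change, via LAG over a monthly rollup.
WITH monthly AS (
    SELECT DATE_TRUNC('month', sale_date) AS month,
           SUM(amount)                    AS revenue
    FROM fact_sales
    GROUP BY 1
)
SELECT month,
       revenue,
       LAG(revenue) OVER (ORDER BY month)           AS prev_revenue,
       revenue - LAG(revenue) OVER (ORDER BY month) AS mom_change
FROM monthly
ORDER BY month;
```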

Posted 3 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

Maharashtra

On-site

As a skilled Snowflake Developer with over 7 years of experience, you will design, develop, and optimize Snowflake data solutions. Your expertise in Snowflake SQL, ETL/ELT pipelines, and cloud data integration will be crucial in building scalable data warehouses, implementing efficient data models, and ensuring high-performance data processing in Snowflake.

Key responsibilities:
- Design and develop Snowflake databases, schemas, tables, and views following best practices
- Write complex SQL queries, stored procedures, and UDFs for data transformation
- Optimize query performance using clustering, partitioning, and materialized views
- Implement Snowflake features such as Time Travel, Zero-Copy Cloning, and Streams & Tasks
- Build and maintain ETL/ELT pipelines using Snowflake, Snowpark, Python, or Spark
- Integrate Snowflake with cloud storage (S3, Blob) and data ingestion tools (Snowpipe)
- Develop CDC (change data capture) and real-time data processing solutions
- Design star schema, snowflake schema, and data vault models in Snowflake
- Implement data sharing, secure views, and dynamic data masking (see the sketch below)
- Ensure data quality, consistency, and governance across Snowflake environments
- Monitor and optimize Snowflake warehouse performance (scaling, caching, resource usage)
- Troubleshoot data pipeline failures, latency issues, and query bottlenecks
- Collaborate with data analysts, BI teams, and business stakeholders to deliver data solutions
- Document data flows, architecture, and technical specifications
- Mentor junior developers on Snowflake best practices

Required skills & qualifications:
- 7+ years in database development, data warehousing, or ETL
- 4+ years of hands-on Snowflake development experience
- Strong SQL or Python skills for data processing
- Experience with Snowflake utilities (SnowSQL, Snowsight, Snowpark)
- Knowledge of cloud platforms (AWS/Azure) and data integration tools (Coalesce, Airflow, dbt)
- Certifications: SnowPro Core Certification (preferred)

Preferred skills:
- Familiarity with data governance and metadata management
- Familiarity with dbt, Airflow, SSIS, and IICS
- Knowledge of CI/CD pipelines (Azure DevOps)
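One item above, dynamic data masking, can be sketched in a few statements; the policy, role, table, and column names are assumptions for illustration only:

```sql
-- Mask the local part of an email for everyone except a privileged role.
CREATE OR REPLACE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
    CASE
        WHEN CURRENT_ROLE() IN ('PII_ANALYST') THEN val
        ELSE REGEXP_REPLACE(val, '.+@', '*****@')
    END;

-- Attach the policy to a column; queries then see masked values by role.
ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask;
```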

Posted 3 weeks ago

Apply

8.0 - 12.0 years

30 - 35 Lacs

Chennai

Remote

Job title: Sr. Python Data Engineer
Location: Chennai & Bangalore (remote)
Job type: permanent employee
Experience: 8 to 12 years
Shift: 2 PM to 11 PM

Responsibilities:
- Design and develop data pipelines and ETL processes
- Collaborate with data scientists and analysts to understand data needs
- Maintain and optimize data warehousing solutions
- Ensure data quality and integrity throughout the data lifecycle
- Develop and implement data validation and cleansing routines
- Work with large datasets from various sources
- Automate repetitive data tasks and processes
- Monitor data systems and troubleshoot issues as they arise

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field
- Proven experience as a Data Engineer or in a similar role (minimum 6+ years as a Data Engineer)
- Strong proficiency in Python and PySpark
- Excellent problem-solving abilities
- Strong communication skills to collaborate with team members and stakeholders
- Individual contributor

Technical skills required:
- Expert: Python, PySpark, and SQL/Snowflake
- Advanced: data warehousing and data pipeline design
- Advanced: data quality, data validation, and data cleansing
- Intermediate/basic: Microsoft Fabric, ADF, Databricks, master data management/data governance, data mesh, data lake/lakehouse architecture

Posted 3 weeks ago

Apply

5.0 - 8.0 years

10 - 16 Lacs

Pune, Chennai, Bengaluru

Hybrid

Snowflake Developer

Mandatory skills: Snowflake, SQL

Responsibilities:
- Design, develop, and maintain Snowflake databases and data warehouse solutions
- Build and optimize SQL queries for data processing and reporting
- Experience with Snowflake tools such as Snowpipe, Time Travel, and Cloning
- Collaborate with cross-functional teams to implement data models and pipelines
- Ensure data security, quality, and compliance in Snowflake environments
- Monitor and troubleshoot Snowflake and SQL systems for performance issues

Experience: 5-8 years
Location: Mumbai/Bangalore/Pune/Chennai/Hyderabad/Indore/Kolkata/Noida/Coimbatore/Bhubaneswar
Notice period: 0-30 days
Interested candidates, share your CV at Muktai.S@alphacom.in

Posted 3 weeks ago

Apply

5.0 - 10.0 years

6 - 16 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Responsibilities
A day in the life of an Infoscion:
- As part of the Infosys delivery team, your primary role is to ensure effective design, development, validation, and support activities, so that our clients are satisfied with high levels of service in the technology domain.
- You will gather requirements and specifications, understand client needs in detail, and translate them into system requirements.
- You will play a key role in the overall estimation of work requirements, providing the right information on project estimations to technology leads and project managers.
- You will be a key contributor to building efficient programs and systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional responsibilities:
- Knowledge of design principles and fundamentals of architecture
- Understanding of performance engineering
- Knowledge of quality processes and estimation techniques
- Basic understanding of the project domain
- Ability to translate functional/nonfunctional requirements into system requirements
- Ability to design and code complex programs
- Ability to write test cases and scenarios based on specifications
- Good understanding of SDLC and agile methodologies
- Awareness of the latest technologies and trends
- Logical thinking and problem-solving skills, along with an ability to collaborate

Technical and professional requirements:
- Primary skills: Technology->Data on Cloud-DataStore->Snowflake
Preferred skills:
- Technology->Data on Cloud-DataStore->Snowflake

Posted 3 weeks ago

Apply

7.0 - 10.0 years

4 - 8 Lacs

Pune, Maharashtra, India

On-site

Responsibilities:
- Data profiling to identify primary keys and issues with the data
- ETL to bring data onto the Cambia Data Platform, de-duplicate data, create or update dimensional data structures, and produce use-case-specific output
- Unit testing, functional testing, and performance testing and tuning
- Interacting with the Product team to understand and refine requirements
- Interacting with QA to address reported findings
- Working individually and as a team to achieve our goals
- Taking initiative to take on additional work if the present work stream slows down
- Other similar or related activities

Top 3-5 requirements (you don't want to see candidates without these):
- Expert-level knowledge of the Git CLI and managing Git-based repositories
- Previous CI/CD experience working with GitLab Runners, GitHub Actions, Circle CI, or Jenkins, and configuring them in GitLab repositories
- Intermediate to expert knowledge of Snowflake-related technologies
- Intermediate experience developing and managing Python code and Python-based web services

Top 3-5 desirements

Posted 4 weeks ago

Apply

5.0 - 10.0 years

11 - 16 Lacs

Nagpur, Pune

Work from Office

JOB DESCRIPTION
Off-shore contract data engineering role that MUST work out of an approved clean-room facility. The role is part of an Agile team supporting Financial Crimes data platforms and strategies, including but not limited to their use of SAS Grid and Snowflake.

JOB SUMMARY
Handle the design and construction of scalable data management systems, ensure that all data systems meet company requirements, and research new uses for data acquisition. Required to know and understand the ins and outs of the industry, such as data mining practices, algorithms, and how data can be used.

Primary responsibilities:
- Design, construct, install, test, and maintain data management systems
- Build high-performance algorithms, predictive models, and prototypes
- Ensure that all systems meet business/company requirements as well as industry practices
- Integrate up-and-coming data management and software engineering technologies into existing data structures
- Develop set processes for data mining, data modeling, and data production
- Create custom software components and analytics applications
- Research new uses for existing data
- Employ an array of technological languages and tools to connect systems together
- Collaborate with members of your team (e.g., data architects, the IT team, data scientists) on the project's goals
- Install/update disaster recovery procedures
- Recommend different ways to constantly improve data reliability and quality
- Maintain up-to-date knowledge, support, and training documentation

QUALIFICATIONS
- Technical degree or related work experience
- Proficiency and technical skills relating to SQL, MySQL, dbt, Snowflake, and SAS
- Exposure to and experience with ETL (DataStage), scripting (Python, JavaScript, etc.), version control (Git), and highly regulated environments (banking, health care, etc.)

Posted 1 month ago

Apply

2.0 - 7.0 years

5 - 15 Lacs

Mumbai, Mumbai (All Areas)

Work from Office

We are seeking a highly skilled Data Engineer with a strong background in Snowflake and Azure Data Factory (ADF) and solid experience in Python and SQL. The ideal candidate will play a critical role in designing and building robust, scalable data pipelines, enabling modern cloud-based data platforms, including data warehouses and data lakes.

Key responsibilities:
- Design, develop, and maintain scalable ETL/ELT pipelines using Snowflake, ADF, and Python to support data warehouse and data lake architectures
- Build and automate data ingestion pipelines from various structured and semi-structured sources (APIs, flat files, cloud storage, databases) into Snowflake-based data lakes and data warehouses (see the sketch below)
- Perform full-cycle data migration from on-premise and cloud databases (e.g., Oracle, SQL Server, Redshift, MySQL) to Snowflake
- Optimize Snowflake workloads: schema design, clustering, partitioning, materialized views, and query performance tuning
- Develop and orchestrate data workflows using Azure Data Factory pipelines, triggers, and dataflows
- Implement data quality checks, validation processes, and monitoring mechanisms for production pipelines
- Collaborate with cross-functional teams, including data scientists, analysts, and DevOps, to support diverse data needs
- Ensure data integrity, security, and governance throughout the data lifecycle
- Maintain comprehensive documentation of pipeline design, schema changes, and architectural decisions

Required skills & qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field
- 2+ years of hands-on experience with Snowflake, including Snowflake SQL, SnowSQL, Snowpipe, Streams, Tasks, and performance optimization
- 1+ year of experience with Azure Data Factory (ADF): pipeline design, linked services, datasets, triggers, and integration runtime
- Strong Python skills for scripting, automation, and data manipulation
- Advanced SQL skills: the ability to write efficient, complex queries, procedures, and analytical expressions
- Experience designing and implementing data lakes and data warehouses on cloud platforms
- Familiarity with Azure cloud services, including Azure Data Lake Storage (ADLS), Blob Storage, Azure SQL, and Azure DevOps
- Experience with orchestration tools such as Airflow, dbt, or Prefect is a plus
- Understanding of data modeling, data warehousing principles, and ETL/ELT best practices
- Experience building scalable data architectures for analytics and business intelligence use cases

Preferred qualifications (nice to have):
- Experience with CI/CD pipelines for data engineering (e.g., Azure DevOps, GitHub Actions)
- Familiarity with Delta Lake, Parquet, or other big data formats
- Knowledge of data security and governance tools such as Purview or Informatica
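As a hedged sketch of the Azure-to-Snowflake ingestion this role centers on (the account, container, table, and token are placeholders, not details from the posting):

```sql
-- External stage over Azure Blob/ADLS (SAS token placeholder; integration setup omitted).
CREATE OR REPLACE STAGE adls_stage
    URL = 'azure://exampleacct.blob.core.windows.net/landing/'
    CREDENTIALS = (AZURE_SAS_TOKEN = '<sas-token>')
    FILE_FORMAT = (TYPE = PARQUET);

-- Bulk load, mapping Parquet columns to table columns by name.
COPY INTO raw.events
FROM @adls_stage
MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;
```

In an ADF-driven design, a pipeline would typically land extracts in the container and then trigger a load like this (or a Snowpipe) as the final step.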

Posted 1 month ago

Apply

5.0 - 10.0 years

6 - 16 Lacs

Pune, Chennai, Bengaluru

Work from Office

Responsibilities
A day in the life of an Infoscion:
- As part of the Infosys delivery team, your primary role is to ensure effective design, development, validation, and support activities, so that our clients are satisfied with high levels of service in the technology domain.
- You will gather requirements and specifications, understand client needs in detail, and translate them into system requirements.
- You will play a key role in the overall estimation of work requirements, providing the right information on project estimations to technology leads and project managers.
- You will be a key contributor to building efficient programs and systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional responsibilities:
- Knowledge of design principles and fundamentals of architecture
- Understanding of performance engineering
- Knowledge of quality processes and estimation techniques
- Basic understanding of the project domain
- Ability to translate functional/nonfunctional requirements into system requirements
- Ability to design and code complex programs
- Ability to write test cases and scenarios based on specifications
- Good understanding of SDLC and agile methodologies
- Awareness of the latest technologies and trends
- Logical thinking and problem-solving skills, along with an ability to collaborate

Technical and professional requirements:
- Primary skills: Technology->Data on Cloud-DataStore->Snowflake
Preferred skills:
- Technology->Data on Cloud-DataStore->Snowflake

Posted 1 month ago

Apply
