11 Snowflake Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2.0 - 6.0 years

3 - 7 Lacs

Pune, Maharashtra

On-site

You are a skilled Data QA Engineer II tasked with developing and maintaining an automated regression suite to safeguard data integrity, ensure quality, and streamline logical processes in alignment with business needs.

Responsibilities: Actively participate in Agile ceremonies, analyze requirements, devise testing strategies, and facilitate the Agile process across cross-functional teams. Collaborate with stakeholders such as product owners, project managers, data analysts, data scientists, and developers to refine user stories and acceptance criteria and to provide feedback on business flows and strategies. Write complex SQL queries on large datasets to maintain data integrity throughout the ETL lifecycle. Build and maintain automated test coverage using PyTest/Python to validate business objectives and technical solutions for new and existing functionality within each sprint. Liaise with team leads from discovery to resolution, serving as the primary contact for all QA aspects of releases: delivering QA services, offering UAT guidance, and coordinating QA resources internally and externally for the squad.

Qualifications: Bachelor's degree in Computer Science, Data Science, or Information Systems, with at least 5 years of experience in agile software development testing, including a minimum of 3 years of database testing for relational databases (preferably Snowflake and MS SQL Server), ETL/ELT data solutions, and reporting and analytics tools. Proficiency in data warehouse concepts such as star schemas, snowflake schemas, facts, and dimensional modeling is crucial. Experience crafting complex SQL queries, determining the required testing types, evaluating the testability of requirements, and devising comprehensive test plans that align with business and technology solutions is essential.

Key skills: API testing, manual testing with tools such as Postman/Swagger, and automated testing with REST Assured/cURL. Proficiency with Git, GitHub, CI/CD processes, Jenkins, Docker, and Jira or other bug-tracking software is highly valued.

Location and terms: The job is located in Pune, with a required experience range of 2 to 5 years and an annual CTC between 300,000 and 700,000 INR. If you are passionate about ETL testing, automation, manual testing, agile software development, Snowflake, and SQL, we invite you to apply for this exciting opportunity. The engagement runs from 12th November 2024 to 30th April 2025 at the specified venue: Ground floor, IT-06 building, Qubix Business Park, Neopro SEZ, Near Cognizant Sys, Rajiv Gandhi Infotech Park - Phase I, Hinjewadi, Pune 400065. For more information, visit [https://www.itgurussoftware.com](https://www.itgurussoftware.com).
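To make the PyTest/SQL expectations above concrete, here is a minimal sketch of an automated data-quality regression test of the kind this role describes. It assumes the snowflake-connector-python package; the warehouse, database, and the STG_ORDERS/DW_ORDERS tables are hypothetical placeholders, not the employer's actual suite.

```python
# A hedged sketch: PyTest data-quality checks comparing a staging table
# against its warehouse target after an ETL load. All object names are
# hypothetical; assumes snowflake-connector-python is installed.
import os
import pytest
import snowflake.connector


@pytest.fixture(scope="module")
def conn():
    con = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="QA_WH",       # hypothetical warehouse
        database="ANALYTICS",    # hypothetical database
        schema="PUBLIC",
    )
    yield con
    con.close()


def scalar(conn, sql):
    """Run a query and return the single value in the first row."""
    with conn.cursor() as cur:
        cur.execute(sql)
        return cur.fetchone()[0]


def test_row_counts_match(conn):
    # Staging and warehouse tables are illustrative placeholders.
    src = scalar(conn, "SELECT COUNT(*) FROM STG_ORDERS")
    tgt = scalar(conn, "SELECT COUNT(*) FROM DW_ORDERS")
    assert src == tgt, f"Row count mismatch: staging={src}, warehouse={tgt}"


def test_no_duplicate_business_keys(conn):
    dupes = scalar(conn, """
        SELECT COUNT(*) FROM (
            SELECT ORDER_ID FROM DW_ORDERS
            GROUP BY ORDER_ID HAVING COUNT(*) > 1
        )
    """)
    assert dupes == 0, f"{dupes} duplicate ORDER_ID values found"
```

In a real suite, tests like these would run in CI (e.g., Jenkins) after each sprint's ETL changes, which is the regression-coverage workflow the posting outlines.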

Posted 3 days ago

Apply

8.0 - 13.0 years

20 - 35 Lacs

Pune

Work from Office

About the Role: The Director of Engineering at Abacus is a key leadership role spanning the US, India, and Nepal, responsible for shaping the technical direction of the organization, ensuring the efficient operation of engineering teams, and driving the development of cutting-edge products and solutions. The role requires a combination of technical expertise, leadership skills, and strategic vision to support the company's growth and success.

Specific duties include: Manage a group of Data Engineers in the US, India, and Nepal, providing technical guidance, mentorship, and performance management. Collaborate with engineering managers, tech leads, cross-functional teams, and stakeholders across the US, India, and Nepal to ensure engineering alignment and sound architectural principles. Stay current with industry trends, emerging technologies, and best practices in data engineering and software development. Maintain a broad technical understanding of cloud platforms (AWS, Azure, GCP) and data processing platforms/tools (Databricks, Snowflake, Airbyte, dbt). Understand the ecosystem and data lifecycle of data ingestion, transformation, MDM, data decoration, and distribution. Foster a culture of collaboration, accountability, innovation, and continuous learning. Drive innovation and identify opportunities to improve engineering processes. Collaborate closely with product management and other cross-functional teams to define technical requirements and project priorities. Oversee the planning, execution, and delivery of engineering projects, ensuring timely, high-quality completion. Monitor and communicate project progress, identify risks, and implement mitigation strategies.

What We're Looking For: Bachelor's degree, preferably in Computer Science/Engineering. A demonstrated track record of successfully managing and scaling engineering teams. Prior experience managing global teams across different time zones. 7+ years of experience managing software teams and 3+ years in data engineering. A go-getter with a self-starter mindset, strong project management skills, excellent oral and written communication skills, and strong analytical, problem-solving, organization, and prioritization skills. Able to work US Eastern hours 3 days a week (flexible to business needs at a given time), travel to the US at least 3 times a year, and travel to the India and Nepal campuses at least once per quarter.

Equal Opportunity Employer: As a mission-led technology company helping to drive better healthcare outcomes, Abacus Insights believes that the best innovation and value we can bring to our customers comes from diverse ideas, thoughts, experiences, and perspectives. We are dedicated to building diverse teams and providing equal employment opportunities to all applicants. Abacus prohibits discrimination and harassment of any type with regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws.

Posted 2 weeks ago

Apply

5.0 - 8.0 years

15 - 27 Lacs

Pune

Work from Office

We are seeking a skilled Data Engineer with hands-on experience in Azure Data Factory (ADF) and Snowflake development. The ideal candidate will have a solid background in SQL, data warehousing, and cloud data pipelines, with a keen ability to design, implement, and maintain robust data solutions that support business intelligence and analytics initiatives.

Key Responsibilities: Design and develop scalable data pipelines using ADF and Snowflake. Integrate data from various sources using SQL, GitHub, and cloud-native tools. Apply data warehousing best practices and ensure optimal data flow and data quality. Collaborate within Scrum teams and contribute to agile development cycles. Liaise effectively with stakeholders across the globe to gather requirements and deliver solutions. Support data modeling efforts and contribute to Python-based enhancements as needed.

Qualifications: Minimum 5 years of overall data engineering experience, including at least 2 years of hands-on experience with Snowflake and at least 2 years working with Azure Data Factory. Strong understanding of data warehousing concepts and methodologies. Experience with data modeling and proficiency in Python are a plus. Familiarity with version control systems such as GitHub. Experience working in an agile (Scrum) environment.
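As an illustration of the ADF-plus-Snowflake pipeline work described above, here is a hedged sketch of the incremental upsert a pipeline might run after ADF lands a change set into a staging table. All object and column names are hypothetical; it assumes snowflake-connector-python.

```python
# A hedged sketch of an incremental load step: merge a staged delta into
# a warehouse table. Object names are hypothetical placeholders.
import snowflake.connector

MERGE_SQL = """
MERGE INTO ANALYTICS.DW.CUSTOMER tgt
USING ANALYTICS.STG.CUSTOMER_DELTA src
    ON tgt.CUSTOMER_ID = src.CUSTOMER_ID
WHEN MATCHED AND src.UPDATED_AT > tgt.UPDATED_AT THEN UPDATE SET
    NAME = src.NAME,
    EMAIL = src.EMAIL,
    UPDATED_AT = src.UPDATED_AT
WHEN NOT MATCHED THEN INSERT (CUSTOMER_ID, NAME, EMAIL, UPDATED_AT)
    VALUES (src.CUSTOMER_ID, src.NAME, src.EMAIL, src.UPDATED_AT)
"""


def run_incremental_load(conn: snowflake.connector.SnowflakeConnection) -> int:
    """Apply the staged delta to the warehouse table; return rows affected."""
    with conn.cursor() as cur:
        cur.execute(MERGE_SQL)
        return cur.rowcount  # inserted + updated row count
```

In practice a step like this would be invoked from an ADF pipeline activity after the copy into staging completes, keeping the transform logic inside Snowflake.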

Posted 4 weeks ago

Apply

10.0 - 12.0 years

3 - 6 Lacs

Bengaluru, Karnataka, India

On-site

Required Qualifications: Experience preparing data warehouse design artifacts from given requirements, defining standards, building frameworks, and producing source-to-target mappings. Strong analytical and data modeling skills, including logical and physical schema design. Deep understanding of database technology and how data is used within analytic platforms. Strong programming skills in SQL and PySpark, and experience with AWS cloud services including EMR and Redshift/Postgres. Experience with other cloud/ETL technologies, including Databricks, Snowflake, and Ab Initio. Good understanding of other programming languages such as Python and Java. Proven experience with large, complex database projects in environments producing high-volume data. Demonstrated problem-solving skills, familiarity with various root-cause-analysis methods, and experience documenting identified problems and their resolutions. Experience analyzing and tuning performance issues.

Preferred Qualifications: Master's degree in Information Technology, Electrical Engineering, or a similar relevant field. Proven experience (10 years minimum) with data warehouse design, build, performance tuning, and optimization. Very good knowledge of data warehouse architecture approaches and trends, with a high interest in applying and further developing that knowledge, including an understanding of dimensional modelling and ERD design approaches. Experience developing streaming applications (Spark Streaming, Flink, Storm, etc.). Excellent conceptual abilities paired with very good technical documentation skills, e.g., the ability to understand and document complex data flows as part of business/production processes. Familiarity with SDLC concepts and processes.
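To illustrate the SQL-plus-PySpark skill set this listing names, here is a minimal sketch, assuming a Spark session on EMR and hypothetical S3 paths and columns, that deduplicates a raw feed to the latest record per business key before a dimensional load:

```python
# A hedged sketch: keep only the most recent row per business key in a
# raw feed, a common step before loading a dimension table. Paths and
# column names are hypothetical.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dedupe-customer-feed").getOrCreate()

raw = spark.read.parquet("s3://example-bucket/raw/customers/")  # hypothetical path

# Rank rows within each key by recency, then keep rank 1.
latest = Window.partitionBy("customer_id").orderBy(F.col("updated_at").desc())
deduped = (
    raw.withColumn("rn", F.row_number().over(latest))
       .filter(F.col("rn") == 1)
       .drop("rn")
)

deduped.write.mode("overwrite").parquet("s3://example-bucket/refined/customers/")
```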

Posted 1 month ago

Apply

3.0 - 8.0 years

3 - 8 Lacs

Pune, Maharashtra, India

On-site

About the Role: As a Manager, Software Engineering, you will be responsible for: Leading and managing small-scale development organizations (2+ teams), demonstrating thought leadership, cross-functional influence, and strong partnership. Driving the design and development of software solutions across multiple programming languages, adhering to secure coding standards (e.g., OWASP, CWE, SEI CERT) and implementing robust vulnerability-management practices. Guiding teams in building and supporting applications that use open frameworks (e.g., Spring Boot, Angular) to optimize for reuse and reduce development cycles. Leveraging a deep understanding of operating system internals (Windows, Linux) to ensure the delivery of interoperable, high-performing code. Providing expert-level debugging and troubleshooting support, including analysis of core, heap, and thread dumps to identify and resolve complex coding errors. Documenting and coaching development teams on best practices and coding guidelines, including branching strategies, peer reviews, library usage, logging standards, scanning rules, test-driven development, and error handling. Conducting technical code reviews across applications and their dependencies, identifying anti-patterns, and championing continuous refactoring initiatives. Understanding and articulating technical debt and operational issues, driving prioritization discussions with stakeholders to enhance the overall run experience of applications. Analyzing system architecture to plan platform and infrastructure capacity (e.g., database, compute, network, storage) and driving dependency prioritization to reduce delivery lead time. Understanding customer journeys and ensuring a superior Mastercard experience by continuously reducing Mean Time to Mitigate (MTTM) for incidents and maintaining high availability (starting at 99.95%). Simplifying deployment processes and eliminating software and infrastructure snowflakes through the adoption of standardized platforms, ephemeral instances, and automation. Orchestrating release workflows and pipelines, and applying standardized pipelines via APIs to achieve Continuous Integration (CI) and Continuous Delivery (CD) using industry-standard tools (e.g., Jenkins, Bamboo, AWS/Azure pipelines, XL Release). Managing projects within the banking domain; processing knowledge is a significant plus.

Technical Qualifications: Experience working on cross-functional and large-scale projects. Proven IT experience with a successful track record of managing small-scale development organizations (2+ teams), demonstrating thought leadership, cross-functional influence, and strong partnership. A progressively grown career with proven design and development experience in multiple languages, secure coding standards (e.g., OWASP, CWE, SEI CERT), and vulnerability management. Skills in building and supporting applications using open frameworks to achieve reuse and reduce development times (e.g., Spring Boot, Angular, and others). Understanding of operating system internals (Windows, Linux) to deliver interoperable and performant code. Ability to perform debugging and troubleshooting, analyzing core, heap, and thread dumps and removing coding errors. Skills to document and coach teams on development practices and coding guidelines (e.g., branching, peer reviews, library use, logging, scanning rules, test-driven development, error handling). Skills to undertake technical reviews of code across applications and their dependencies, looking for anti-patterns and promoting continuous refactoring. Ability to understand and elaborate on technical debt and operational issues, driving prioritization discussions with stakeholders to improve the run experience. Understanding of system architecture to plan platform and infrastructure capacity (e.g., database, compute, network, storage) and drive dependency prioritization to reduce delivery lead time. Skills to understand customer journeys and ensure a good Mastercard experience by continuously reducing Mean Time to Mitigate (MTTM) for incidents and ensuring high availability (99.95% as a starting point). Skills to simplify deployment and eliminate software and infrastructure snowflakes using standardized platforms, ephemeral instances, and automation. Skills to orchestrate release workflows and pipelines and apply standardized pipelines via APIs to achieve CI and CD using industry-standard tools (e.g., Jenkins, Bamboo, AWS/Azure pipelines, XL Release, and others). Experience handling projects in the banking domain; processing knowledge is a plus.

Corporate Security Responsibility: Every person working for, or on behalf of, Mastercard is responsible for information security. All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization. Therefore, the successful candidate for this position must: Abide by Mastercard's security policies and practices. Ensure the confidentiality and integrity of the information being accessed. Report any suspected information security violation or breach. Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.

Posted 1 month ago

Apply

2.0 - 7.0 years

3 - 8 Lacs

Hyderabad / Secunderabad, Telangana, India

On-site

Qualifications, Experience Required: 3-5 years being part of Agile teams; 3-5 years of scripting; 2+ years of hands-on AWS experience (S3, Lambda); 2+ years of experience with PySpark or Python; 2+ years of experience with cloud technologies such as AWS; 2+ years of hands-on SQL.

Experience Desired: GitHub; Teradata; AWS (Glue, Lambda); Databricks; Snowflake; Angular; REST APIs; Terraform; Jenkins (CloudBees, Jenkinsfile/Groovy, password vault).

Education and Training Required: Computer science. Knowledge of and/or experience with healthcare information domains is a plus.

Good-to-have Primary Skills: JavaScript, Python, PySpark, TDV, R, Ruby, Perl; AWS Lambdas, S3, EC2; Databricks, Snowflake, Jenkins, Kafka, APIs, Angular, Selenium, AI and machine learning.
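As a rough illustration of the hands-on AWS items above (S3, Lambda, Python), here is a minimal sketch of a Lambda handler triggered by S3 object notifications; the processing is a hypothetical placeholder, and boto3 is assumed (it ships with the Lambda Python runtime).

```python
# A hedged sketch: a Lambda handler that reads newly landed S3 objects
# and logs basic stats. The downstream processing is hypothetical.
import json
import boto3

s3 = boto3.client("s3")


def handler(event, context):
    # Standard S3 put-event structure: one record per object notification.
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        obj = s3.get_object(Bucket=bucket, Key=key)
        body = obj["Body"].read()
        print(f"Received s3://{bucket}/{key}: {len(body)} bytes")
    return {
        "statusCode": 200,
        "body": json.dumps({"processed": len(event["Records"])}),
    }
```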

Posted 1 month ago

Apply

2.0 - 7.0 years

40 - 45 Lacs

Chandigarh

Work from Office

Responsibilities: Design and develop complex data processes in coordination with business stakeholders to solve critical financial and operational processes. Design and develop ETL/ELT pipelines against traditional databases and distributed systems, flexibly producing data back to the business and analytics teams for analysis. Work in an agile, fail-fast environment directly with business stakeholders and analysts, while recognising data reconciliation and validation requirements. Develop data solutions in coordination with development teams across a variety of products and technologies. Build processes that analyse and monitor data to help maintain controls: correctness, completeness, and latency. Participate in design reviews and code reviews. Work with colleagues across global locations. Troubleshoot and resolve production issues. Drive performance enhancements.

Required Skills & Qualifications: Programming skills in Python, PySpark, or Scala. Database skills with analytical databases such as Snowflake, plus strong SQL. Good to have: Elasticsearch, Kafka, NiFi, Jupyter Notebooks. Good to have: knowledge of AWS services such as S3, Glue, Athena, EMR, and Lambda.
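To make the "correctness, completeness and latency" controls above concrete, here is a hedged sketch of a monitoring check against a Snowflake table. It assumes snowflake-connector-python, a LOAD_TS column stored in UTC, and hypothetical table names and thresholds.

```python
# A hedged sketch of a data-controls check: completeness (minimum row
# count) and latency (freshness of the last load). Names and thresholds
# are hypothetical placeholders.
import datetime as dt


def check_controls(conn, table: str, expected_min_rows: int, max_lag_minutes: int):
    """Return a dict of control results for one monitored table."""
    with conn.cursor() as cur:
        cur.execute(f"SELECT COUNT(*), MAX(LOAD_TS) FROM {table}")
        row_count, last_load = cur.fetchone()

    if last_load is None:  # empty table: both controls fail
        return {"table": table, "completeness_ok": False,
                "latency_ok": False, "row_count": 0, "last_load": None}

    # Assumes LOAD_TS is a TIMESTAMP_NTZ column stored in UTC.
    lag = dt.datetime.utcnow() - last_load
    return {
        "table": table,
        "completeness_ok": row_count >= expected_min_rows,
        "latency_ok": lag <= dt.timedelta(minutes=max_lag_minutes),
        "row_count": row_count,
        "last_load": last_load.isoformat(),
    }
```

A scheduler (Airflow, cron, a stored task) would run checks like this per table and alert when a control fails, which is the monitoring loop the posting describes.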

Posted 1 month ago

Apply

4.0 - 7.0 years

5 - 15 Lacs

Pune

Hybrid

Are you looking for a stable job with great benefits and pay? Consider becoming part of the Avient team! We know your time is valuable and you have a lot of job ads to review, so let us break down the important details.

Position Summary, Analytics Analyst: Our Avient IT team is seeking an Analyst to be part of a new analytics platform being developed for our global organization. In this role, you will focus on expanding our Snowflake data warehouse, leveraging your robust SQL skills to move analytics data from a raw state into a refined, analytics-ready state where it will be consumed by end users. This position supports a self-service analytics model by delivering well-defined, transformed datasets that provide consistency in reporting, facilitate efficient data mining, and help our business partners generate insights.

Essential Functions: Collaborate with business associates and other IT resources to understand analytic needs, including localizations required by our global business. Translate those needs into technical solutions that properly move data through the Snowflake landscape. Maintain relevant technical documentation to ensure streamlined support and knowledge transfer within areas of responsibility, and support those processes as needed going forward. Prepare and execute unit and integration testing. Support user acceptance testing. Provide hypercare support after each go-live. Troubleshoot and resolve analytic issues impacting the business. Other duties and projects as assigned.

Qualifications, Education and Experience: Bachelor of Science degree in Computer Science, Business, Information Systems, or a related business field required. 4+ years of experience in analytics technical roles. Snowflake and SQL experience required. Experience with Qlik Replicate to ingest data into Snowflake is preferred. Experience with Agile methodology and Jira software is preferred. Ability to work independently and as part of a global team is required. Ability to create and maintain robust documentation of IT processes is required. Strong collaboration skills within a team, across IT, and with business users are required. Strong troubleshooting and problem-solving skills are required. Successful candidates will be driven to succeed while fostering a great place to work.

Location: This vacancy is located in Kharadi, Pune.

About Us: Our purpose at Avient Corporation is to be an innovator of materials solutions that help our customers succeed, while enabling a sustainable world. Innovation goes far beyond materials science; it's powered by the passion, creativity, and diverse expertise of 9,000 professionals worldwide. Whether you're a finance wizard, a tech enthusiast, an operational powerhouse, an HR changemaker, or a trailblazer in materials development, you'll find your place at Avient. Join our global team and help shape the future with sustainable solutions that transform possibilities into realities. Your unique perspective could be the key to our next breakthrough!

Avient Leadership Behaviors: We believe that all of our global employees are leaders and that the six most important behaviors for driving our strategy and culture are the same whether an employee is a leader of self, a leader of others, or a leader of the business. By playing to win, acting customer-centric, driving innovation and profitable growth, collaborating seamlessly across Avient, and motivating, inspiring, and developing others and yourself, you will accelerate your ability to achieve Avient's strategic goals, meet our customer needs, and accomplish your career goals.
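As an illustration of the raw-to-refined Snowflake work this role centers on, here is a minimal sketch, with hypothetical object and column names, of one SQL step that standardises a raw landing table into an analytics-ready table:

```python
# A hedged sketch of a raw-to-refined transformation: clean and type a
# raw landing table into a refined table for self-service analytics.
# All object names are hypothetical placeholders.
REFINE_SQL = """
CREATE OR REPLACE TABLE ANALYTICS.REFINED.SALES AS
SELECT
    TRY_TO_DATE(order_date)       AS order_date,    -- tolerate bad dates
    UPPER(TRIM(country_code))     AS country_code,  -- normalise codes
    TRY_TO_NUMBER(amount, 18, 2)  AS amount,        -- enforce numeric type
    customer_id
FROM ANALYTICS.RAW.SALES
WHERE customer_id IS NOT NULL
"""


def refine(conn):
    """Rebuild the refined table from the raw landing table."""
    with conn.cursor() as cur:
        cur.execute(REFINE_SQL)
```

In a setup like the one described, Qlik Replicate would land data into the RAW schema and scheduled SQL steps like this would publish consistent, refined datasets to end users.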

Posted 1 month ago

Apply

10 - 15 years

45 - 60 Lacs

Bengaluru

Work from Office

You'll Get To: Provide technical expertise and leadership in technology direction, road-mapping, architecture definition, design, development, and delivery of enterprise-class solutions while adhering to timelines, coding standards, requirements, and quality. Architect, design, develop, test, troubleshoot, debug, optimize, scale, plan capacity for, deploy, maintain, and improve software applications, driving the delivery of high-quality value and features to BlackLine's customers. Work collaboratively across the company to design, communicate, and further assist with the adoption of best practices in architecture and implementation. Deliver robust architectural solutions for complex design problems. Implement, refine, and enforce data engineering best practices to ensure that delivered features meet performance, security, and maintainability expectations. Research, test, benchmark, and evaluate new tools and technologies, and recommend ways to implement them in the data platform. Identify and create solutions that are likely to contribute to the development of new company concepts, keeping in mind the business strategy, the short- and long-term roadmap, and the architectural considerations needed to support them in a highly scalable and easily extensible manner. Actively participate in research, development, support, management, and other company initiatives, designing solutions that optimally address current and future business requirements and infrastructure plans. Inspire a forward-thinking team of developers, acting as an agent of change and an evangelist for a quality-first culture within the organization. Mentor and coach key technical staff and guide them to solutions on complex design issues. Act as a conduit for questions and information flow when those outside of Engineering have ideas for new technology applications. Speak in terms relevant to the audience, translating technical concepts into non-technical language and vice versa. Facilitate consensus building while striving for win/win scenarios, and elicit value-add contributions from all team members in group settings. Maintain a strong sense of business value and return on investment in planning, design, and communication. Proactively identify issues, bottlenecks, gaps, or other areas of concern or opportunity, and work to either directly effect change or advocate for that change by working with peers and leadership to build consensus and act. Perform critical maintenance, deployment, and release support activities, including occasional off-hours support.

What You'll Bring: Bachelor's or Master's degree in Computer Science, Data Science, or a related field. 10+ years as a data engineer. 10+ years of experience using RDBMS, SQL, and NoSQL; Python, Java, or other programming languages are a plus. 10+ years of experience designing, developing, testing, and implementing Extract, Transform, and Load (ELT/ETL) solutions using enterprise and open-source ELT/ETL tools. 5+ years of working experience with SQL and familiarity with the Snowflake data warehouse, including strong working knowledge of stored procedures, CTEs, UDFs, and RBAC. Knowledge of data integration and data quality best practices. Familiarity with data security and privacy regulations. Experience working in a startup-type environment; a good team player who can also work independently with minimal supervision. Experience with cloud-native architecture and data solutions. Strong working knowledge of data modeling, data partitioning, and query optimization. Demonstrated knowledge of development processes and agile methodologies. Strong analytical and interpersonal skills, comfortable presenting complex ideas in simple terms. Proficiency in managing large volumes of data. Strong communication and collaboration skills, with the ability to work effectively with cross-functional teams. Experience providing technical support and troubleshooting for data-related issues. Expertise with at least one cloud environment and building cloud-native data services. Prior experience driving data governance, quality, and security initiatives.
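To illustrate the Snowflake CTE/UDF knowledge requested above, here is a hedged sketch of a SQL UDF plus a CTE that uses it; all object names are hypothetical, and the executor assumes a snowflake-connector-python connection.

```python
# A hedged sketch: define a simple Snowflake SQL UDF, then use it from a
# CTE-based aggregation. All names are hypothetical placeholders.
DDL = """
CREATE OR REPLACE FUNCTION ANALYTICS.UTIL.FISCAL_QUARTER(d DATE)
RETURNS VARCHAR
AS $$
    'Q' || CEIL(MONTH(d) / 3)
$$
"""

QUERY = """
WITH monthly AS (
    SELECT DATE_TRUNC('month', order_date) AS month_start,
           SUM(amount)                     AS revenue
    FROM ANALYTICS.REFINED.SALES
    GROUP BY 1
)
SELECT ANALYTICS.UTIL.FISCAL_QUARTER(month_start) AS quarter,
       SUM(revenue)                               AS quarterly_revenue
FROM monthly
GROUP BY 1
ORDER BY 1
"""


def quarterly_revenue(conn):
    """Create the UDF if needed, then return revenue rolled up by quarter."""
    with conn.cursor() as cur:
        cur.execute(DDL)
        cur.execute(QUERY)
        return cur.fetchall()
```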

Posted 2 months ago

Apply

8 - 13 years

40 - 45 Lacs

Noida, Gurugram

Work from Office

Responsibilities: Design and articulate enterprise-scale data architectures incorporating multiple platforms, including open-source and proprietary data platform solutions such as Databricks, Snowflake, and Microsoft Fabric, to address customer requirements in data engineering, data science, and machine learning use cases. Conduct technical discovery sessions with clients to understand their data architecture, analytics needs, and business objectives. Design and deliver proofs of concept (POCs) and technical demonstrations that showcase modern data platforms solving real-world problems. Create comprehensive architectural diagrams and implementation roadmaps for complex data ecosystems spanning cloud and on-premises environments. Evaluate and recommend appropriate big data technologies, cloud platforms, and processing frameworks based on specific customer requirements. Lead technical responses to RFPs (Requests for Proposals), crafting detailed solution architectures, technical approaches, and implementation methodologies. Create and review techno-commercial proposals, including solution scoping, effort estimation, and technology-selection justifications. Collaborate with sales and delivery teams to develop competitive, technically sound proposals with appropriate pricing models for data solutions. Stay current with the latest advancements in data technologies, including cloud services, data processing frameworks, and AI/ML capabilities.

Qualifications: Bachelor's or Master's degree in Computer Science, Data Science, or a related technical field. 8+ years of experience in data architecture, data engineering, or solution architecture roles. Proven experience responding to RFPs and developing techno-commercial proposals for data solutions. Demonstrated ability to estimate project efforts, resource requirements, and implementation timelines. Hands-on experience with multiple data platforms, including Databricks, Snowflake, and Microsoft Fabric. Strong understanding of big data technologies, including the Hadoop ecosystem, Apache Spark, and Delta Lake. Experience with modern data processing frameworks such as Apache Kafka and Airflow. Proficiency in cloud platforms (AWS, Azure, GCP) and their respective data services. Knowledge of system monitoring and observability tools. Experience implementing automated testing frameworks for data platforms and pipelines. Expertise in both relational databases (PostgreSQL, MySQL) and NoSQL databases (MongoDB). Understanding of AI/ML technologies and their integration with data platforms. Familiarity with data integration patterns, ETL/ELT processes, and data governance practices. Experience designing and implementing data lakes, data warehouses, and machine learning pipelines. Proficiency in programming languages commonly used in data processing (Python, Scala, SQL). Strong problem-solving skills and the ability to think creatively to address customer challenges. Relevant certifications such as Databricks, Snowflake, Azure Data Engineer, or AWS Data Analytics are a plus. Willingness to travel as required to meet with customers and attend industry events.

If interested, please contact Ramya: 9513487487 / 9342164917.

Posted 2 months ago

Apply

12 - 19 years

45 - 55 Lacs

Noida, Hyderabad, Gurugram

Work from Office

Responsibilities: Lead a team of data engineers, providing technical mentorship, performance management, and career development guidance. Design and oversee implementation of a modern data architecture leveraging cloud platforms (AWS, Azure, GCP) and industry-leading data platforms (Databricks, Snowflake, Microsoft Fabric). Establish data engineering best practices, coding standards, and technical documentation processes. Develop and execute data platform roadmaps aligned with business objectives and technical innovation. Optimize data pipelines for performance, reliability, and cost-effectiveness. Collaborate with data science, analytics, and business teams to understand requirements and deliver tailored data solutions. Drive adoption of DevOps and DataOps practices, including CI/CD and automated testing.

Qualifications: Bachelor's or Master's degree in Computer Science, Information Systems, or a related technical field. 12+ years of experience in data engineering roles, with at least 3 years in a leadership position. Expert knowledge of big data technologies (Hadoop ecosystem) and modern data processing frameworks (Apache Spark, Kafka, Airflow). Extensive experience with cloud platforms (AWS, Azure, GCP) and cloud-native data services. Hands-on experience with industry-leading data platforms such as Databricks, Snowflake, and Microsoft Fabric. Strong background in both relational (PostgreSQL, MySQL) and NoSQL (MongoDB) database systems. Experience implementing and managing data monitoring solutions (Grafana, Ganglia, etc.). Proven track record of implementing automated testing frameworks for data pipelines and applications. Knowledge of AI/ML technologies and how they integrate with data platforms. Excellent understanding of data modelling, ETL processes, and data warehousing concepts. Outstanding leadership, communication, and project management skills.

If interested, please contact Ramya: 9513487487 / 9342164917.

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
