1569 Snowflake Jobs - Page 8

Set Up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 10.0 years

7 - 12 Lacs

Pune

Work from Office

Process Manager - AWS Data Engineer | Mumbai/Pune | Full-time (FT) | Technology Services
Shift timings: EMEA (1 pm-9 pm) | Management level: PM | Travel: NA

The ideal candidate must possess in-depth functional knowledge of the process area and apply it to operational scenarios to provide effective solutions. The role requires identifying discrepancies and proposing optimal solutions using a logical, systematic, and sequential methodology. It is vital to be open-minded towards inputs and views from team members and to effectively lead, control, and motivate groups towards company objectives. Additionally, the candidate must be self-directed, proactive, and seize every opportunity to meet internal and external customer needs, achieving customer satisfaction by auditing processes, implementing best practices and process improvements, and utilizing the frameworks and tools available. Goals and thoughts must be clearly and concisely articulated and conveyed, verbally and in writing, to clients, colleagues, subordinates, and supervisors.

Process Manager roles and responsibilities:
- Understand client requirements and provide effective, efficient solutions in AWS using Snowflake.
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Use Snowflake/Redshift architecture and design to create data pipelines and consolidate data in the data lake and data warehouse (a minimal sketch of this loading pattern follows the posting).
- Demonstrated strength and experience in data modeling, ETL development, and data warehousing concepts.
- Understand data pipelines and modern, cloud-based approaches to automating them.
- Test and clearly document implementations so others can easily understand the requirements, implementation, and test conditions.
- Perform data quality testing and assurance while designing, building, and implementing scalable data solutions in SQL.

Technical and functional skills:
- AWS services: strong experience with AWS data services such as S3, EC2, Redshift, Glue, Athena, EMR, etc.
- Programming languages: proficiency in languages commonly used in data engineering, such as Python, SQL, Scala, or Java.
- Data warehousing: experience designing, implementing, and optimizing data warehouse solutions on Snowflake/Amazon Redshift.
- ETL tools: familiarity with ETL tools and frameworks (e.g., Apache Airflow, AWS Glue) for building and managing data pipelines.
- Database management: knowledge of database management systems (e.g., PostgreSQL, MySQL, Amazon Redshift) and data lake concepts.
- Big data technologies: understanding of Hadoop, Spark, Kafka, etc., and their integration with AWS.
- Version control: proficiency in tools like Git for managing code and infrastructure as code (e.g., CloudFormation, Terraform).
- Problem-solving skills: ability to analyze complex technical problems and propose effective solutions.
- Communication skills: strong verbal and written communication for documenting processes and collaborating with team members and stakeholders.
- Education and experience: typically a bachelor's degree in Computer Science, Engineering, or a related field, along with 5+ years of experience in data engineering and AWS cloud environments.

About eClerx
eClerx is a global leader in productized services, bringing together people, technology, and domain expertise to amplify business results. Our mission is to set the benchmark for client service and success in our industry. Our vision is to be the innovation partner of choice for technology, data analytics, and process management services. Since our inception in 2000, we've partnered with top companies across various industries, including financial services, telecommunications, retail, and high-tech. Our innovative solutions and domain expertise help businesses optimize operations, improve efficiency, and drive growth. With over 18,000 employees worldwide, eClerx is dedicated to delivering excellence through smart automation and data-driven insights. At eClerx, we believe in nurturing talent and providing hands-on experience.

About eClerx Technology
eClerx's Technology Group collaboratively delivers Analytics, RPA, AI, and Machine Learning digital technologies that enable our consultants to help businesses thrive in a connected world. Our consultants and specialists partner with our global clients and colleagues to build and implement digital solutions through a broad spectrum of activities. To know more about us, visit https://eclerx.com

eClerx is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, or any other legally protected basis, per applicable law.
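The consolidation work this posting describes typically lands raw files in S3 and bulk-loads them into the warehouse. A minimal sketch of that pattern with snowflake-connector-python, assuming a pre-created external stage and table (all connection parameters and object names here are placeholders, not from the posting):

import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="etl_user",           # placeholder
    password="***",
    warehouse="LOAD_WH",
    database="RAW_DB",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # COPY INTO pulls new files from the external stage into the table;
    # Snowflake tracks already-loaded files, so reruns are idempotent.
    cur.execute("""
        COPY INTO events
        FROM @S3_STAGE/events/
        FILE_FORMAT = (TYPE = 'JSON')
        ON_ERROR = 'SKIP_FILE'
    """)
    print(cur.fetchall())  # per-file load results
finally:
    conn.close()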

Posted 4 days ago

Apply

4.0 - 5.0 years

9 - 19 Lacs

Hyderabad

Work from Office

Hi All, we have immediate openings for the below requirement.

Role: Hadoop Administrator (with EMR, Spark, Kafka, HBase, OpenSearch, Snowflake, Neo4j, AWS)
Experience: 4 to 9 years
Work location: Hyderabad
Interview mode: 1st round virtual, 2nd round F2F
Notice period: 15 days to immediate joiners only
Interested candidates can share your CV to sravani.vommi@sonata-software.com | Contact: 7075751998

Job summary: We are seeking a highly skilled Hadoop Administrator with hands-on experience managing distributed data platforms such as Hadoop EMR, Spark, Kafka, HBase, OpenSearch, Snowflake, and Neo4j.

Key responsibilities:
- Cluster management: administer, manage, and maintain Hadoop EMR clusters, ensuring optimal performance, high availability, and resource utilization. Handle provisioning, configuration, and scaling of Hadoop clusters, with a focus on EMR, ensuring seamless integration with other ecosystem tools (e.g., Spark, Kafka, HBase). Oversee HBase configurations, performance tuning, and integration within the Hadoop ecosystem. Manage OpenSearch (formerly Elasticsearch) for log analytics and large-scale search applications.
- Data integration & processing: oversee the performance and optimization of Apache Spark workloads across distributed data environments. Design and manage efficient data pipelines between Snowflake, Kafka, and the Hadoop ecosystem, ensuring seamless data movement and transformation. Implement data storage solutions in Snowflake and manage seamless data transfers to/from Hadoop (EMR) and other environments.
- Cloud & AWS services: work closely with AWS services such as EC2, S3, ECS, Lambda, IAM, RDS, and CloudWatch to build scalable, cost-efficient solutions for data management and processing. Manage AWS EMR clusters, ensuring they are optimized for big data workloads and integrated with other AWS services (a minimal monitoring sketch follows the posting).
- Security & compliance: manage and configure Kerberos authentication and access control mechanisms within the Hadoop ecosystem (HDFS, YARN, Spark) to ensure data security. Implement encryption and secure data transfer policies within Hadoop clusters, Kafka, HBase, and OpenSearch to meet compliance and regulatory requirements. Manage user roles and permissions for access to Snowflake and ensure seamless integration of security policies across platforms.
- Monitoring & troubleshooting: set up and manage monitoring solutions to ensure the health of the Hadoop ecosystem and related components. Actively monitor and troubleshoot issues with Spark, Kafka, HBase, OpenSearch, and other distributed systems. Provide proactive support to address performance issues, bottlenecks, and failures.
- Automation & optimization: automate the deployment, scaling, and management of Hadoop and other big data systems using scripting languages (Bash, Python). Optimize the configurations and performance of EMR, Spark, Kafka, HBase, and OpenSearch. Develop scripts and utilities for backup, job monitoring, and performance tuning.
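As a small illustration of the EMR monitoring duties above, a sketch that checks cluster health and lists failed steps with boto3 (the cluster id is a placeholder; AWS credentials are assumed to be configured in the environment):

import boto3

emr = boto3.client("emr", region_name="ap-south-1")

cluster_id = "j-XXXXXXXXXXXXX"  # placeholder
desc = emr.describe_cluster(ClusterId=cluster_id)["Cluster"]
print(desc["Name"], desc["Status"]["State"])

# List any failed steps so they can be investigated or resubmitted.
steps = emr.list_steps(ClusterId=cluster_id, StepStates=["FAILED"])
for step in steps["Steps"]:
    print(step["Name"], step["Status"]["State"])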

Posted 4 days ago

Apply

11.0 - 18.0 years

30 - 45 Lacs

Hyderabad

Work from Office

Role & responsibilities
We are looking for an experienced Data Architect with deep expertise in Snowflake technologies to lead the design, development, and deployment of scalable data architectures. This role involves building robust data pipelines, optimizing data warehouses, and supporting complex data migrations, ensuring data quality, security, and governance across all layers (an illustrative security sketch follows the posting).

Preferred candidate profile
- Data modeling: Star, Snowflake, Data Vault, hybrid schemas, partitioning, clustering
- Databases: Snowflake, Oracle, SQL Server, Greenplum, PostgreSQL
- ETL/ELT tools: Informatica IDMC, DataStage
- Big data tools: Hadoop, Hive
- Cloud integration: AWS services (S3, EC2, Lambda, Glue)
- Programming languages: Python, PySpark
- Schedulers: Control-M
- Data security: RBAC, data masking, encryption, audit trails, compliance (HIPAA, GDPR)
- Automation: advanced SQL, API integration, DevOps practices
- Data governance: data quality, lineage, cataloging, MDM, metadata management
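A minimal sketch of the RBAC and data-masking controls this profile names, expressed as Snowflake SQL issued through the Python connector. Role, policy, table, and column names are illustrative only, and the required privileges are assumed to be in place:

import snowflake.connector

conn = snowflake.connector.connect(account="my_account", user="admin",
                                    password="***", role="SECURITYADMIN",
                                    database="EDW", schema="MART")
cur = conn.cursor()

# Role-based access: analysts get read-only access to one schema.
cur.execute("CREATE ROLE IF NOT EXISTS ANALYST_RO")
cur.execute("GRANT USAGE ON DATABASE EDW TO ROLE ANALYST_RO")
cur.execute("GRANT SELECT ON ALL TABLES IN SCHEMA EDW.MART TO ROLE ANALYST_RO")

# Dynamic data masking: non-privileged roles see a redacted email column.
cur.execute("""
    CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING)
    RETURNS STRING ->
    CASE WHEN CURRENT_ROLE() IN ('PII_READER') THEN val
         ELSE '***MASKED***' END
""")
cur.execute("ALTER TABLE EDW.MART.CUSTOMERS "
            "MODIFY COLUMN email SET MASKING POLICY email_mask")
conn.close()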

Posted 4 days ago

Apply

4.0 - 9.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Skill: MSBI / MS SQL / SSIS
Location: Bangalore
Experience: 4 to 9 years

Must have: T-SQL, SSIS, SSRS or Informatica PC, and data warehousing
Good to have: Snowflake

- Good knowledge of T-SQL, including the ability to write stored procedures, views, functions, etc. (a minimal sketch follows the posting)
- Good experience in designing, developing, unit testing, and implementing data integration solutions using ETL in SSIS and the SSRS reporting platform
- Experience with data warehousing concepts and enterprise data modeling techniques
- Good knowledge of relational and dimensional database structures, theories, principles, and best practices
- Conduct thorough analysis of existing MSBI (Microsoft Business Intelligence) legacy applications and Informatica PC
- Identify and document the functionalities, workflows, and dependencies of legacy systems
- Create detailed mapping specifications for data integration and transformation processes
- Collaborate with business stakeholders/architects and data modelers to understand their needs and translate them into technical documentation
- Ensure accurate documentation of data sources, targets, and transformation rules
- Perform data validation, cleansing, and analysis to ensure data accuracy and integrity
- Update design documents after successful code changes and testing
- Provide deployment support
- Possess good knowledge of Agile and Waterfall methodologies

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field
- Highly skilled at handling complex technical situations, with exceptional verbal and written communication skills
- 5+ years of experience with the data lifecycle, governance, and migration processes
- 5+ years of experience with SSIS, SSRS (or Informatica PC) and MS SQL Server, T-SQL
- 5+ years of experience with data warehouse technologies
- 3+ years of experience with Agile methodologies (Scrum, Kanban, JIRA)
- Nice to have: experience in the wealth management domain
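A minimal sketch of calling a T-SQL stored procedure of the kind this role authors, from Python with pyodbc. The DSN, procedure, and parameter names are placeholders, not from the posting:

import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=EDW;Trusted_Connection=yes;"  # placeholder
)
cur = conn.cursor()

# Execute a parameterised stored procedure and read its result set.
cur.execute("EXEC dbo.usp_LoadDailySales @LoadDate = ?", "2024-01-31")
for row in cur.fetchall():
    print(row)
conn.commit()
conn.close()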

Posted 4 days ago

Apply

4.0 - 9.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Responsibilities:
- Design, develop, and maintain data pipelines using Snowflake, DBT, and AWS (a minimal dbt orchestration sketch follows the posting).
- Collaborate with cross-functional teams to understand data requirements and deliver solutions.
- Optimize and troubleshoot existing data workflows to ensure efficiency and reliability.
- Implement best practices for data management and governance.
- Stay updated with the latest industry trends and technologies to continuously improve our data infrastructure.

Required skills:
- Proficiency in Snowflake, DBT, and AWS.
- Experience with data modeling, ETL processes, and data warehousing.
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork abilities.

Preferred skills:
- Knowledge of Fivetran (HVR) and Python.
- Familiarity with data integration tools and techniques.
- Ability to work in a fast-paced, agile environment.

Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
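A minimal sketch of orchestrating dbt transformations from Python, one common way pipelines like these are wired up. The model selector and target names are placeholders; dbt reads Snowflake credentials from its own profiles.yml:

import subprocess

# Run the staging models and everything downstream against the prod target.
result = subprocess.run(
    ["dbt", "run", "--select", "staging+", "--target", "prod"],
    capture_output=True, text=True,
)
print(result.stdout)
# A non-zero return code means at least one model failed to build.
result.check_returncode()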

Posted 4 days ago

Apply

6.0 - 9.0 years

4 - 8 Lacs

Telangana

Work from Office

Senior Python Developer (6 years' experience)
Location: Pune/Bangalore
Employment type: Full-time/Contract
Experience: 6 years

Role overview: We are seeking a Senior Python Developer with 6 years of experience, specializing in database technologies like DB2 or Snowflake. The ideal candidate will have a strong background in backend development, data processing, and performance optimization. Experience with Java is a plus.

Key responsibilities:
- Design, develop, and maintain scalable and efficient Python applications (a minimal service sketch follows the posting).
- Work extensively with DB2 or Snowflake for data modeling, query optimization, and performance tuning.
- Develop and optimize SQL queries, stored procedures, and data pipelines.
- Collaborate with cross-functional teams to integrate backend services with frontend applications.
- Implement best practices for code quality, security, and performance.
- Write unit tests and participate in code reviews.
- Troubleshoot and resolve production issues.

Required skills:
- 6 years of experience in Python development.
- Strong experience with DB2 or Snowflake (SQL tuning, stored procedures, ETL workflows).
- Hands-on experience with Python frameworks such as Flask, Django, or FastAPI.
- Proficiency in writing complex SQL queries and database optimization.
- Experience with cloud platforms (AWS, Azure, or GCP) and CI/CD pipelines.
- Familiarity with version control (Git) and Agile methodologies.

Good to have:
- Experience in Java for backend services or microservices.
- Knowledge of Kafka, RabbitMQ, or other messaging systems.
- Exposure to containerization tools like Docker and Kubernetes.
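A minimal sketch of the kind of backend service this role describes: a FastAPI endpoint backed by Snowflake. Table, account, and credential values are placeholders, and connection pooling and error handling are omitted for brevity:

import snowflake.connector
from fastapi import FastAPI

app = FastAPI()

@app.get("/orders/{order_id}")
def get_order(order_id: int):
    conn = snowflake.connector.connect(account="my_account", user="svc",
                                        password="***", database="EDW")
    try:
        cur = conn.cursor(snowflake.connector.DictCursor)
        # Bind variables (%s) keep the query safe from SQL injection.
        cur.execute("SELECT * FROM orders WHERE order_id = %s", (order_id,))
        return cur.fetchone() or {"error": "not found"}
    finally:
        conn.close()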

Posted 4 days ago

Apply

6.0 - 11.0 years

5 - 9 Lacs

Hyderabad, Bengaluru

Work from Office

Skill: Snowflake Developer with Data Build Tool (dbt), ADF, and Python

Job description: We are looking for a Data Engineer with experience in data warehouse projects, strong expertise in Snowflake, and hands-on knowledge of Azure Data Factory (ADF) and dbt (Data Build Tool). Proficiency in Python scripting is an added advantage.

Key responsibilities:
- Design, develop, and optimize data pipelines and ETL processes for data warehousing projects.
- Work extensively with Snowflake, ensuring efficient data modeling and query optimization.
- Develop and manage data workflows using Azure Data Factory (ADF) for seamless data integration (a minimal trigger sketch follows the posting).
- Implement data transformations, testing, and documentation using dbt.
- Collaborate with cross-functional teams to ensure data accuracy, consistency, and security.
- Troubleshoot data-related issues.
- (Optional) Utilize Python for scripting, automation, and data processing tasks.

Required skills & qualifications:
- Experience in data warehousing with a strong understanding of best practices.
- Hands-on experience with Snowflake (data modeling, query optimization).
- Proficiency in Azure Data Factory (ADF) for data pipeline development.
- Strong working knowledge of dbt (Data Build Tool) for data transformations.
- (Optional) Experience in Python scripting for automation and data manipulation.
- Good understanding of SQL and query optimization techniques.
- Experience in cloud-based data solutions (Azure).
- Strong problem-solving skills and the ability to work in a fast-paced environment.
- Experience with CI/CD pipelines for data engineering.
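A minimal sketch of triggering an ADF pipeline run from Python with azure-identity and azure-mgmt-datafactory. All resource names and the subscription id are placeholders; the posting's actual pipelines would differ:

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

client = DataFactoryManagementClient(
    credential=DefaultAzureCredential(),
    subscription_id="00000000-0000-0000-0000-000000000000",  # placeholder
)

# Kick off the ingestion pipeline and record the run id for monitoring.
run = client.pipelines.create_run(
    resource_group_name="rg-data",        # placeholder
    factory_name="adf-dwh",               # placeholder
    pipeline_name="pl_load_snowflake",    # placeholder
    parameters={"load_date": "2024-01-31"},
)
print(run.run_id)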

Posted 4 days ago

Apply

4.0 - 8.0 years

8 - 12 Lacs

Hyderabad

Work from Office

AWS Datalake Lead (India - Lead, 8 to 10 years' experience):
- Lead the technical design and architecture of the AWS data lake and its related services, ensuring alignment with customer requirements, industry best practices, and project objectives.
- Conduct reviews, ensure adherence to coding standards and best practices, and drive continuous improvement in code quality and performance.
- Provide technical support, troubleshoot problems, and deliver timely resolution of Incidents, Service Requests, and Minor Enhancements for AWS platforms and services.
- Add, update, or delete datasets in the AWS Data Lake.
- Monitor storage usage and handle capacity planning (a minimal sketch follows the posting).
- Optimize schemas in Snowflake and the AWS Data Lake for query performance.
- Strong communication and presentation skills.
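As one simple reading of the storage-monitoring duty above, a sketch that sums object sizes under a data-lake prefix with boto3. The bucket and prefix are placeholders:

import boto3

s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")

total_bytes = 0
for page in paginator.paginate(Bucket="my-datalake", Prefix="raw/events/"):
    for obj in page.get("Contents", []):
        total_bytes += obj["Size"]

print(f"raw/events/ holds {total_bytes / 1024**3:.2f} GiB")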

Posted 4 days ago

Apply

5.0 - 10.0 years

5 - 8 Lacs

Hyderabad

Work from Office

Key responsibilities:
- Design, develop, and maintain QlikView applications and dashboards.
- Collaborate with business stakeholders to gather requirements and translate them into technical specifications.
- Perform data analysis and create data models to support business intelligence initiatives.
- Optimize QlikView applications for performance and scalability.
- Provide technical support and troubleshooting for QlikView applications.
- Ensure data accuracy and integrity in all QlikView applications.
- Integrate Snowflake with QlikView to enhance data processing and analytics capabilities.
- Stay updated with the latest QlikView features and best practices.
- Conduct training sessions for end-users to maximize the utilization of QlikView applications.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience of 2-5 years as a QlikView Developer.
- Strong knowledge of QlikView architecture, data modeling, and scripting.
- Proficiency in SQL and database management.
- Knowledge of Snowflake and its integration with QlikView.
- Excellent analytical and problem-solving skills.
- Ability to work independently and as part of a team.
- Strong communication and interpersonal skills.

Posted 4 days ago

Apply

6.0 - 9.0 years

4 - 7 Lacs

Mumbai, Hyderabad

Hybrid

Contract to hire. Looking for a candidate with 6+ years of experience working on Data and Analytics.
- Advanced working knowledge of and experience with relational database management systems: designing schemas, authoring SQL, and optimizing performance.
- Hands-on experience with Snowflake is highly desirable; able to write complex queries, views, and stored procedures.
- Must have hands-on experience with Python (NumPy, Pandas, etc.) - a minimal analysis sketch follows the posting.
- Must have working knowledge of ETL using any ETL tool, such as Informatica.
- Technical expertise and problem-solving skills.
- Excellent communication and presentation skills.
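A minimal sketch of pulling a Snowflake result set straight into pandas, the kind of SQL-plus-Python analysis this role combines. The query and connection details are placeholders; fetch_pandas_all() requires the connector's pandas extra (pip install "snowflake-connector-python[pandas]"):

import snowflake.connector

conn = snowflake.connector.connect(account="my_account", user="analyst",
                                    password="***", database="EDW")
cur = conn.cursor()
cur.execute("SELECT region, SUM(amount) AS revenue "
            "FROM sales GROUP BY region")
df = cur.fetch_pandas_all()
# Snowflake returns column names in uppercase by default.
print(df.sort_values("REVENUE", ascending=False).head())
conn.close()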

Posted 4 days ago

Apply

8.0 - 13.0 years

4 - 7 Lacs

Hyderabad

Hybrid

Snowflake Data Warehouse Lead (India - Lead, 8 to 10 years' experience):
- Lead the technical design and architecture of Snowflake platforms, ensuring alignment with customer requirements, industry best practices, and project objectives.
- Conduct code reviews, ensure adherence to coding standards and best practices, and drive continuous improvement in code quality and performance.
- Provide technical support, troubleshoot problems, and deliver timely resolution of Incidents, Service Requests, and Minor Enhancements for Snowflake platforms and related services.
- Data lake and storage management: add, update, or delete datasets in Snowflake.
- Monitor storage usage and handle capacity planning (a minimal sketch follows the posting).
- Strong communication and presentation skills.
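A minimal sketch of the storage-monitoring duty above, using the ACCOUNT_USAGE share that Snowflake accounts expose (requires a role with access to the SNOWFLAKE database; the view lags real usage by up to a couple of hours). Connection values are placeholders:

import snowflake.connector

conn = snowflake.connector.connect(account="my_account", user="admin",
                                    password="***", role="ACCOUNTADMIN")
cur = conn.cursor()
# Last seven days of account-wide storage, converted from bytes to TiB.
cur.execute("""
    SELECT usage_date,
           ROUND(storage_bytes / POWER(1024, 4), 2) AS storage_tib,
           ROUND(failsafe_bytes / POWER(1024, 4), 2) AS failsafe_tib
    FROM snowflake.account_usage.storage_usage
    ORDER BY usage_date DESC
    LIMIT 7
""")
for row in cur.fetchall():
    print(row)
conn.close()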

Posted 4 days ago

Apply

6.0 - 10.0 years

20 - 27 Lacs

Indore, Gurugram, Jaipur

Work from Office

Data pipelines. Hands-on with log analytics, user engagement metrics, and product performance metrics. Ability to identify patterns, trends, and anomalies in log data to generate actionable insights for product enhancements and feature optimization (a minimal anomaly-detection sketch follows the posting).
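One simple way to surface the "patterns, trends, and anomalies" the posting mentions is a rolling z-score over daily log volumes. A sketch on synthetic data (everything here is illustrative):

import numpy as np
import pandas as pd

rng = pd.date_range("2024-01-01", periods=60, freq="D")
counts = pd.Series(np.random.poisson(1000, 60), index=rng)
counts.iloc[45] = 5000  # injected spike to illustrate detection

# Flag days whose volume deviates more than 3 sigma from the trailing mean.
rolling = counts.rolling(window=14)
zscore = (counts - rolling.mean()) / rolling.std()
anomalies = counts[zscore.abs() > 3]
print(anomalies)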

Posted 4 days ago

Apply

2.0 - 5.0 years

18 - 30 Lacs

Noida, Bengaluru

Work from Office

Role & responsibilities
- Focus on designing and developing proofs of concept (PoCs) and demonstrating solutions on a tight schedule.
- Utilize GenAI no-code, low-code, and SDK tooling to build robust GenAI agents that automate business processes.
- Work with data platforms such as Microsoft Azure and Snowflake, and integration services like Azure Data Factory, to build agentic workflows.
- Embed/integrate GenAI agents (Copilot agents) into business platforms such as Workday, Teams, etc.
- Manage small to medium-sized projects with minimal supervision.
- Automate complex processes using GenAI agents, especially within the Azure GenAI ecosystem (a minimal sketch follows the posting).
- Advanced Python programming.
- Hands-on experience with data storage systems, especially Snowflake, Azure Data Factory, Azure Fabric, and Azure Synapse.
- Build Copilot agents and embed them into systems such as Workday, Teams, etc.

Preferred candidate profile
- Bachelor's or master's degree in AI, computer science, engineering, mathematics, or a related field.
- 2-4 years of experience in developing and deploying AI/ML solutions to production.
- Hands-on experience with no-code, low-code, and SDK approaches to AI system development.
- Proficiency in data platforms such as Microsoft Azure and Snowflake, and integration services like Azure Data Factory.
- Experience with Azure Cloud, Azure AI Foundry, Copilot Studio, and frameworks such as LangChain, LangGraph, and MCP for building agentic systems.
- Strong understanding of Agile methodology and project management.
- Ability to manage projects independently and make decisions under ambiguity.
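A minimal sketch of the smallest building block behind such agents: a single Azure OpenAI chat call via the openai SDK. The endpoint, key, and deployment name are placeholders; real agentic workflows would layer tool definitions and orchestration (e.g., LangGraph) on top of calls like this:

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://my-resource.openai.azure.com",  # placeholder
    api_key="***",
    api_version="2024-02-01",
)
resp = client.chat.completions.create(
    model="gpt-4o",  # the Azure *deployment* name, placeholder
    messages=[
        {"role": "system", "content": "You are a workflow-automation agent."},
        {"role": "user", "content": "Summarise yesterday's failed data loads."},
    ],
)
print(resp.choices[0].message.content)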

Posted 4 days ago

Apply

6.0 - 8.0 years

2 - 5 Lacs

Hyderabad

Work from Office

Urgent requirement for PySpark, SQL & AWS Glue.
Experience: 6+ years
Location: Pan India
Mandatory skills: PySpark, SQL & AWS Glue (a minimal Glue job sketch follows the posting)
Good-to-have skills: Snowflake & Snowpark.
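A minimal skeleton of an AWS Glue PySpark job matching the mandatory skills above. Catalog database, table, and bucket names are placeholders; the awsglue libraries are preinstalled in the Glue runtime, so this runs as a Glue job script rather than locally:

import sys

from awsglue.context import GlueContext
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue = GlueContext(SparkContext.getOrCreate())
spark = glue.spark_session

# Read from the Glue Data Catalog, filter with Spark, write back as parquet.
dyf = glue.create_dynamic_frame.from_catalog(database="raw", table_name="events")
df = dyf.toDF().filter("event_type = 'purchase'")
df.write.mode("overwrite").parquet("s3://my-bucket/curated/purchases/")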

Posted 4 days ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

WHAT YOU WILL WORK ON
- Serve as a liaison between product, engineering, and data consumers by analyzing the data, finding gaps, and helping drive the roadmap.
- Support and troubleshoot issues (data and process), identify root causes, and proactively recommend sustainable corrective actions by collaborating with engineering/product teams.
- Communicate actionable insights using data, often to stakeholders and non-technical audiences.
- Write technical specifications describing requirements for data movement, transformation, and quality checks.

WHAT YOU BRING
- Bachelor's degree in Computer Science, MIS, other quantitative disciplines, or related fields.
- 3-7 years of relevant analytical experience translating strategic vision into requirements and working with the best engineers, product managers, and data scientists.
- Ability to conduct data analysis, develop and test hypotheses, and deliver insights with minimal supervision.
- Experience identifying and defining KPIs using data for business areas such as Sales, Consumer Behaviour, Supply Chain, etc.
- Exceptional SQL skills.
- Experience with the modern visualization tool stack, such as Tableau, Power BI, Domo, etc.
- Knowledge of open-source, big data, and cloud infrastructure such as AWS, Hive, Snowflake, Presto, etc.
- Incredible attention to detail, with a structured problem-solving approach.
- Excellent communication skills (written and verbal).
- Experience with agile development methodologies.
- Experience with retail or ecommerce domains is a plus.

Posted 4 days ago

Apply

6.0 - 11.0 years

8 - 15 Lacs

Mumbai

Work from Office

Skills: Python Data Engineer / Python Developer
Experience: 6+ years
Location: Pan India
Job type: Contract to hire
Work model: Hybrid

Python data-side developer (Pandas, NumPy, SQL, data pipelines, etc.) with 5-7 years of experience as a Python Developer and the skills below (a minimal loading sketch follows the posting):
- Snowflake exposure
- Building APIs using Python
- Microservices, API Gateway, authentication (OAuth2 & mTLS)
- Web service development
- Unit testing, test-driven development
- Multi-tier web or desktop application development experience
- Application containers: Docker
- Linux experience, Python virtual environments
- Tools: Eclipse IDE/IntelliJ, Git, Jira
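A minimal sketch of the pandas-to-Snowflake path this role implies, using write_pandas from the connector's pandas extra. The table and connection details are placeholders:

import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

df = pd.DataFrame({"ID": [1, 2], "STATUS": ["ok", "failed"]})

conn = snowflake.connector.connect(account="my_account", user="svc",
                                    password="***", database="EDW",
                                    schema="PUBLIC", warehouse="LOAD_WH")
# write_pandas bulk-loads via an internal stage, far faster than row INSERTs.
success, nchunks, nrows, _ = write_pandas(conn, df, "JOB_RUNS",
                                          auto_create_table=True)
print(success, nrows)
conn.close()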

Posted 4 days ago

Apply

6.0 - 11.0 years

8 - 18 Lacs

Bengaluru

Work from Office

Working mode: Hybrid
Location: Pan India
PF detection is mandatory.

Snowflake administration experience:
- Managing user access, roles, and security protocols (a minimal sketch follows the posting)
- Setting up and maintaining database replication and failover procedures
- Setting up programmatic access

OpenSearch experience:
- Deploying and scaling OpenSearch domains
- Managing security and access controls
- Setting up monitoring and alerting

General AWS skills:
- Infrastructure as code (CloudFormation)
- Experience building cloud-native infrastructure, applications, and services on AWS and Azure
- Hands-on experience managing Kubernetes clusters (administrative knowledge), ideally AWS EKS and/or Azure AKS
- Experience with Istio or other service mesh technologies
- Experience with container technology and best practices, including container and supply chain security
- Experience with declarative infrastructure-as-code tools like Terraform and Crossplane
- Experience with GitOps tools like ArgoCD
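A minimal sketch of the user and role administration tasks listed above, as Snowflake SQL issued through the Python connector. User, role, and warehouse names are illustrative only:

import snowflake.connector

conn = snowflake.connector.connect(account="my_account", user="admin",
                                    password="***", role="SECURITYADMIN")
cur = conn.cursor()

# Provision a service user and bind it to a least-privilege role.
cur.execute("CREATE USER IF NOT EXISTS etl_svc PASSWORD = '***' "
            "DEFAULT_ROLE = ETL_ROLE MUST_CHANGE_PASSWORD = TRUE")
cur.execute("CREATE ROLE IF NOT EXISTS ETL_ROLE")
cur.execute("GRANT ROLE ETL_ROLE TO USER etl_svc")
cur.execute("GRANT USAGE ON WAREHOUSE LOAD_WH TO ROLE ETL_ROLE")
conn.close()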

Posted 4 days ago

Apply

6.0 - 9.0 years

8 - 11 Lacs

Hyderabad

Work from Office

Must have: T-SQL, SSIS, SSRS or Informatica PC, and data warehousing
Good to have: Snowflake

- Good knowledge of T-SQL, including the ability to write stored procedures, views, functions, etc.
- Good experience in designing, developing, unit testing, and implementing data integration solutions using ETL in SSIS and the SSRS reporting platform
- Experience with data warehousing concepts and enterprise data modeling techniques
- Good knowledge of relational and dimensional database structures, theories, principles, and best practices
- Conduct thorough analysis of existing MSBI (Microsoft Business Intelligence) legacy applications and Informatica PC
- Identify and document the functionalities, workflows, and dependencies of legacy systems
- Create detailed mapping specifications for data integration and transformation processes
- Collaborate with business stakeholders/architects and data modelers to understand their needs and translate them into technical documentation
- Ensure accurate documentation of data sources, targets, and transformation rules
- Perform data validation, cleansing, and analysis to ensure data accuracy and integrity
- Update design documents after successful code changes and testing
- Provide deployment support
- Possess good knowledge of Agile and Waterfall methodologies

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field
- Highly skilled at handling complex technical situations, with exceptional verbal and written communication skills
- 5+ years of experience with the data lifecycle, governance, and migration processes
- 5+ years of experience with SSIS, SSRS (or Informatica PC) and MS SQL Server, T-SQL
- 5+ years of experience with data warehouse technologies
- 3+ years of experience with Agile methodologies (Scrum, Kanban, JIRA)
- Nice to have: experience in the wealth management domain

Posted 4 days ago

Apply

5.0 - 10.0 years

12 - 18 Lacs

Pune, Bengaluru, Mumbai (All Areas)

Work from Office

Role & responsibilities
1. Mastery of SQL, especially within cloud-based data warehouses like Snowflake. Experience with Snowflake data architecture, design, analytics, and development.
2. Detailed knowledge and hands-on working experience with Snowpipe/SnowProc/SnowSQL (a minimal Snowpipe sketch follows the posting).
3. Technical lead with a strong development background and 2-3 years of rich hands-on development experience in Snowflake.
4. Experience designing highly scalable ETL/ELT processes with complex data transformations and data formats, including error handling and monitoring. Good working knowledge of the ETL/ELT tool DBT for transformation.
5. Analysis, design, and development of traditional data warehouse and business intelligence solutions. Work with customers to understand and execute their requirements.
6. Working knowledge of software engineering best practices. Should be willing to work on implementation and support projects. Flexible about onsite and offshore travel.
7. Collaborate with other team members to ensure proper delivery of the requirements. Ability to think strategically about the broader market and influence company direction.
8. Good communication skills, team player, and good analytical skills. Snowflake certification is preferable.

Contact: Soniya - soniya05.mississippiconsultants@gmail.com
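A minimal sketch of creating the kind of Snowpipe object the posting names, via the Python connector. The stage, pipe, and table names are placeholders, and the external stage is assumed to exist already:

import snowflake.connector

conn = snowflake.connector.connect(account="my_account", user="dev",
                                    password="***", database="EDW",
                                    schema="RAW")
cur = conn.cursor()
# AUTO_INGEST lets S3 event notifications trigger loads as files arrive.
cur.execute("""
    CREATE PIPE IF NOT EXISTS events_pipe AUTO_INGEST = TRUE AS
    COPY INTO raw_events
    FROM @s3_events_stage
    FILE_FORMAT = (TYPE = 'JSON')
""")
cur.execute("SHOW PIPES LIKE 'events_pipe'")
print(cur.fetchall())
conn.close()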

Posted 4 days ago

Apply

5.0 - 6.0 years

7 - 8 Lacs

Pune

Work from Office

Develop and manage data solutions using Snowflake, focusing on optimizing data storage, integration, and processing. Ensure data consistency and provide analytical insights through Snowflake’s cloud data platform.

Posted 4 days ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Hyderabad

Work from Office

Design and implement Snowflake data models, ensuring efficient data storage, retrieval, and processing. Work closely with data engineers and analysts to build robust, scalable data solutions.

Posted 4 days ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Mumbai

Work from Office

Design and optimize ETL workflows using Talend. Ensure data integrity and process automation.

Posted 4 days ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Bengaluru

Work from Office

Design and develop data solutions using Snowflake, while implementing continuous integration/continuous deployment (CI/CD) pipelines to streamline development and deployment processes.

Posted 4 days ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Hyderabad

Work from Office

Design and implement data architectures and models, focusing on data warehouses and Snowflake-based environments. Ensure that data is structured for efficient querying and analysis, aligning with business goals and performance requirements.

Posted 4 days ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Bengaluru

Work from Office

Develop and manage data pipelines using Snowflake. Optimize performance and data warehousing strategies.

Posted 4 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
