13.0 - 23.0 years
25 - 35 Lacs
Hyderabad
Work from Office
Role : Snowflake Practice Lead / Architect / Solution Architect
Exp : 13+ Years
Work Location : Hyderabad

Position Overview : We are seeking a highly skilled and experienced Snowflake Practice Lead to drive our data strategy, architecture, and implementation using Snowflake. This leadership role requires a deep understanding of Snowflake's cloud data platform, data engineering best practices, and enterprise data management. The ideal candidate will be responsible for defining best practices, leading a team of Snowflake professionals, and driving successful Snowflake implementations for clients.

Key Responsibilities :
Leadership & Strategy :
- Define and drive the Snowflake practice strategy, roadmap, and best practices.
- Act as the primary subject matter expert (SME) for Snowflake architecture, implementation, and optimization.
- Collaborate with stakeholders to understand business needs and align data strategies accordingly.
Technical Expertise & Solutioning :
- Design and implement scalable, high-performance data architectures using Snowflake.
- Develop best practices for data ingestion, transformation, modeling, and security within Snowflake.
- Guide clients on Snowflake migrations, ensuring a seamless transition from legacy systems.
- Optimize query performance, storage utilization, and cost efficiency in Snowflake environments.
Team Leadership & Mentorship :
- Lead and mentor a team of Snowflake developers, data engineers, and architects.
- Provide technical guidance, conduct code reviews, and establish best practices for Snowflake development.
- Train internal teams and clients on Snowflake capabilities, features, and emerging trends.
Client & Project Management :
- Engage with clients to understand business needs and design tailored Snowflake solutions.
- Lead end-to-end Snowflake implementation projects, ensuring quality and timely delivery.
- Work closely with data scientists, analysts, and business stakeholders to maximize data utilization.

Required Skills & Experience :
- 10+ years of experience in data engineering, data architecture, or cloud data platforms.
- 5+ years of hands-on experience with Snowflake in large-scale enterprise environments.
- Strong expertise in SQL, performance tuning, and cloud-based data solutions.
- Experience with ETL/ELT processes, data pipelines, and data integration tools (e.g., Talend, Matillion, dbt, Informatica).
- Proficiency in cloud platforms such as AWS, Azure, or GCP, particularly their integration with Snowflake.
- Knowledge of data security, governance, and compliance best practices.
- Strong leadership, communication, and client-facing skills.
- Experience in migrating from traditional data warehouses (Oracle, Teradata, SQL Server) to Snowflake.
- Familiarity with Python, Spark, or other big data technologies is a plus.
Preferred Qualifications :
- Snowflake SnowPro Certification (e.g., SnowPro Core, Advanced Architect, Data Engineer).
- Experience in building data lakes, data marts, and real-time analytics solutions.
- Hands-on experience with DevOps, CI/CD pipelines, and Infrastructure as Code (IaC) in Snowflake environments.
Why Join Us?
- Opportunity to lead cutting-edge Snowflake implementations in a dynamic, fast-growing environment.
- Work with top-tier clients across industries, solving complex data challenges.
- Continuous learning and growth opportunities in cloud data technologies.
- Competitive compensation, benefits, and a collaborative work culture.
Posted 3 months ago
5 - 10 years
3 - 7 Lacs
Chennai
Work from Office
Snowflake Developer
Mandatory skills : Snowflake DB developer + Python & Unix scripting + SQL queries
Location : Chennai
NP : 0 to 30 days
Exp : 5 to 10 Years
Skill set : Snowflake, Python, SQL and PBI developer.
- Understand and implement Data Security and Data Modelling.
- Write complex SQL queries; write JavaScript and Python stored procedure code in Snowflake.
- Use ETL (Extract, Transform, Load) tools to move and transform data into Snowflake and from Snowflake to other systems.
- Understand cloud architecture.
- Develop and design PBI dashboards, reports, and data visualizations.
- Communication skills.
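The posting above asks for JavaScript and Python stored procedure development in Snowflake. As a minimal sketch (the procedure, handler, and table names are hypothetical, not from the posting), a Python stored procedure created and invoked in Snowflake SQL might look like:

```sql
-- Hypothetical example: a minimal Python stored procedure in Snowflake.
CREATE OR REPLACE PROCEDURE clean_stage_rows(table_name STRING)
RETURNS STRING
LANGUAGE PYTHON
RUNTIME_VERSION = '3.10'
PACKAGES = ('snowflake-snowpark-python')
HANDLER = 'run'
AS
$$
def run(session, table_name):
    # Delete rows whose business key is NULL, then report completion.
    session.sql(f"DELETE FROM {table_name} WHERE id IS NULL").collect()
    return f"cleaned {table_name}"
$$;

-- Invoke it against a (hypothetical) raw table.
CALL clean_stage_rows('RAW.PUBLIC.ORDERS');
```

The same pattern works with LANGUAGE JAVASCRIPT, where the handler body is inline JavaScript instead of a Python function.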
Handle technical escalations through effective diagnosis and troubleshooting of client queries. Manage and resolve technical roadblocks/escalations as per SLA and quality requirements; if unable to resolve the issues, escalate them to TA & SES in a timely manner. Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions. Troubleshoot all client queries in a user-friendly, courteous and professional manner. Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business. Organize ideas and effectively communicate oral messages appropriate to listeners and situations. Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs.
Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client. Mentor and guide Production Specialists on improving technical knowledge. Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists. Develop and conduct trainings (triages) within products for Production Specialists as per target, and inform the client about the triages being conducted. Undertake product trainings to stay current with product features, changes and updates, and enroll in product-specific and any other trainings per client requirements/recommendations. Identify and document the most common problems and recommend appropriate resolutions to the team. Update job knowledge by participating in self-learning opportunities and maintaining personal networks.
Deliverables :
1. Process - No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2. Team Management - Productivity, efficiency, absenteeism
3. Capability Development - Triages completed, Technical Test performance
Posted 3 months ago
4 - 8 years
7 - 14 Lacs
Pune
Hybrid
Role : Snowflake Developer
Experience : 4 to 6 years
Key responsibilities :
- Perform development & support activities for the Data Warehousing domain.
- Understand High Level Design and Application Interface Design, and build Low Level Design.
- Perform application analysis & propose technical solutions for application enhancement, or resolve production issues.
- Perform development & deployment: code, unit test & deploy.
- Create necessary documentation for all project deliverable phases.
- Handle production issues (Tier 2 support, weekend on-call rotation) to resolve production issues & ensure SLAs are met.
Technical Skills :
Mandatory :
- In-depth knowledge of SQL, Unix & advanced Unix shell scripting.
- Very clear understanding of Snowflake architecture.
- At least 4+ years' hands-on experience with Snowflake: SnowSQL, COPY command, stored procedures, performance tuning, and other advanced features like Snowpipe, semi-structured data load, and types of tables.
- Hands-on experience with file transfer mechanisms (NDM, SFTP, Data Router, etc.)
- Knowledge of schedulers like TWS.
- Snowflake certification.
Good to have :
- Python.
- Experience loading AVRO and PARQUET files to Snowflake.
- Informatica.
Location : Magarpatta City, Pune (Hybrid, min 3 days work from office). Shift : 1 PM to 10 PM. 2 rounds of interview.
NP : Immediate joiners to 15 days (only NP-serving candidates).
Excellent communication skills.
Interested candidates, share resume at dipti.bhaisare@in.experis.com
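For the COPY command, Snowpipe, and semi-structured (AVRO/PARQUET) loading skills listed above, a minimal sketch (stage, table, format, and pipe names are hypothetical) of a Parquet load and the Snowpipe that automates it:

```sql
-- Hypothetical example: bulk-load Parquet files from a named stage,
-- then wrap the same COPY in a pipe for continuous ingestion.
CREATE OR REPLACE FILE FORMAT my_parquet_fmt TYPE = PARQUET;

COPY INTO analytics.raw_orders
FROM @my_stage/orders/
FILE_FORMAT = (FORMAT_NAME = my_parquet_fmt)
MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;

CREATE OR REPLACE PIPE analytics.orders_pipe AUTO_INGEST = TRUE AS
COPY INTO analytics.raw_orders
FROM @my_stage/orders/
FILE_FORMAT = (FORMAT_NAME = my_parquet_fmt)
MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;
```

With AUTO_INGEST = TRUE, cloud storage event notifications trigger the pipe so new files load without a manual COPY.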
Posted 3 months ago
6 - 10 years
15 - 27 Lacs
Pune, Chennai, Bengaluru
Work from Office
Snowflake Administration
Experience : 6-10 years
Key Responsibilities :
- Administer and manage the Snowflake data platform, including monitoring, configuration, and upgrades.
- Ensure the performance, scalability, and reliability of Snowflake databases and queries.
- Set up and manage user roles, access controls, and security policies to safeguard data integrity.
- Optimize database design and storage management to improve efficiency and reduce costs.
- Collaborate with data engineering and analytics teams to integrate data pipelines and support data workloads.
- Implement best practices for ETL/ELT processes, query optimization, and data warehouse management.
- Troubleshoot and resolve issues related to Snowflake platform operations.
- Monitor resource utilization and provide cost analysis for effective usage.
- Create and maintain documentation for Snowflake configurations, processes, and policies.
Skills and Qualifications :
- Proven experience in Snowflake administration and management.
- Strong understanding of Snowflake compute and storage management.
- Expertise in data governance, including column-level data security using secure views and dynamic data masking features.
- Proficiency in performing data definition language (DDL) operations.
- Ability to apply strategies for Snowflake performance tuning.
- Experience in designing and developing secure access controls using Role-Based Access Control (RBAC).
- Excellent troubleshooting and problem-solving skills.
- Strong collaboration and communication skills.
- Understanding of cost optimization approaches and their implementation on Snowflake.
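For the RBAC and dynamic data masking duties this role describes, a minimal sketch (role, database, table, and column names are hypothetical) of granting read access to a role and masking a PII column:

```sql
-- Hypothetical example: role-based access plus a dynamic masking policy.
CREATE ROLE IF NOT EXISTS analyst_role;
GRANT USAGE ON DATABASE sales TO ROLE analyst_role;
GRANT USAGE ON SCHEMA sales.public TO ROLE analyst_role;
GRANT SELECT ON ALL TABLES IN SCHEMA sales.public TO ROLE analyst_role;

-- Mask email addresses for everyone except a privileged role.
CREATE OR REPLACE MASKING POLICY mask_email AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val
    ELSE '***MASKED***'
  END;

ALTER TABLE sales.public.customers
  MODIFY COLUMN email SET MASKING POLICY mask_email;
```

The policy is evaluated at query time, so the same table serves masked and unmasked results depending on the caller's role.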
Posted 3 months ago
5 - 10 years
13 - 23 Lacs
Bengaluru, Delhi / NCR, Mumbai (All Areas)
Work from Office
Role & responsibilities : Urgent hiring for one of the reputed MNCs.
Exp : 5 or 5+ years. Immediate joiners only.
Location : Pan India except Hyderabad.
Mandate : Snowflake Development + Snowflake Administration (Snowflake DBA).
Posted 3 months ago
7 - 12 years
10 - 19 Lacs
Chandigarh
Remote
Role & responsibilities :
- Minimum 6+ years of experience in Snowflake DBA administration.
- Manage storage, SHIR, and Spark pool allocation.
- Experience working with managed private endpoints.
- Experience in creating, maintaining, and monitoring firewall rules.
- Good communication skills and experience working in a global delivery model involving onsite and offshore teams.
- Experience with Agile methodology.
Please share your resume @ Ravina.m@vhrsol.com
Posted 3 months ago
3 - 8 years
12 - 22 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Warm Greetings from SP Staffing!!
Role : Snowflake Developer
Experience Required : 3 to 8 yrs
Work Location : Bangalore/Hyderabad/Bhubaneswar/Pune
Required Skills : Snowflake
Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843
Posted 3 months ago
7 - 9 years
9 - 11 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
Key Responsibilities :
- Design and develop data models to support business intelligence and analytics solutions.
- Work with Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models.
- Implement dimensional modelling techniques for data warehousing and reporting.
- Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
- Ensure data integrity, consistency, and compliance with Banking domain standards.
- Work with Snowflake to develop and optimize cloud-based data models.
- Write and execute complex SQL queries for data analysis and validation.
- Identify and resolve data quality issues and inconsistencies.
Required Skills & Qualifications :
- 7+ years of experience in data modelling and data analysis.
- Strong expertise in Erwin or Erwin Studio for data modeling.
- Experience with dimensional modelling and data warehousing (DWH) concepts.
- Proficiency in Snowflake and SQL for data management and querying.
- Previous experience in the Banking domain is mandatory.
- Strong analytical and problem-solving skills.
- Ability to work independently in a remote environment.
- Excellent verbal and written communication skills.
Location : Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
Posted 3 months ago
3 - 6 years
10 - 20 Lacs
Hyderabad, Bangalore/Bengaluru
Work from Office
Snowflake Developer
Posted 3 months ago
5 - 7 years
5 - 15 Lacs
Pune, Chennai, Bengaluru
Work from Office
Experience : 5+ years of relevant experience

We are seeking a highly skilled and experienced Snowflake Lead responsible for leading the design, development, and implementation of Snowflake-based data warehousing solutions. You will leverage your deep understanding of ETL and data warehousing concepts to build robust and scalable data pipelines. A key aspect of this role involves direct interaction with business users to gather and clarify requirements, ensuring that the delivered solutions meet their analytical needs.

Responsibilities :
Leadership & Delivery :
- Lead a module or a team of developers in the design, development, and deployment of Snowflake solutions.
- Take ownership of the end-to-end delivery of Snowflake modules, ensuring adherence to timelines and quality standards.
- Provide technical guidance and mentorship to team members, fostering a collaborative and high-performing environment.
- Contribute to project planning, estimation, and risk management activities.
Snowflake Expertise :
- Utilize in-depth knowledge of Snowflake architecture, features, and best practices to design efficient and scalable data models and ETL/ELT processes.
- Develop and optimize complex SQL queries and Snowflake scripting for data manipulation and transformation.
- Implement Snowflake utilities such as SnowSQL, Snowpipe, Tasks, Streams, Time Travel, and Cloning as needed.
- Ensure data security and implement appropriate access controls within the Snowflake environment.
- Monitor and optimize the performance of Snowflake queries and data pipelines.
- Integrate PySpark with Snowflake for data ingestion and processing; understand and apply PySpark best practices and performance tuning techniques.
- Experience with Spark architecture and its components (e.g., Spark Core, Spark SQL, DataFrames).
ETL & Data Warehousing :
- Apply strong understanding of ETL/ELT concepts, data warehousing principles (including dimensional modeling, star/snowflake schemas), and data integration techniques.
- Design and develop data pipelines to extract data from various source systems, transform it according to business rules, and load it into Snowflake.
- Work with both structured and semi-structured data, including JSON and XML.
- Experience with ETL tools (e.g., Informatica, Talend, PySpark) is a plus, particularly in the context of integrating with Snowflake.
Requirements Gathering & Clarification :
- Actively participate in requirement gathering sessions with business users and stakeholders.
- Translate business requirements into clear and concise technical specifications and design documents.
- Collaborate with business analysts and users to clarify ambiguities and ensure a thorough understanding of data and reporting needs.
- Validate proposed solutions with users to ensure they meet expectations.
Collaboration & Communication :
- Work closely with other development teams, data engineers, and business intelligence analysts to ensure seamless integration of Snowflake solutions with other systems.
- Communicate effectively with both technical and non-technical stakeholders; provide regular updates on progress and any potential roadblocks.
Best Practices & Continuous Improvement :
- Adhere to and promote best practices in Snowflake development, data warehousing, and ETL processes.
- Stay up-to-date with the latest Snowflake features and industry trends.
- Identify opportunities for process improvement and optimization.

Qualifications :
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 5 years of relevant experience in data warehousing and ETL development, with a significant focus on Snowflake.
- Strong proficiency in SQL and experience working with large datasets.
- Solid understanding of data modeling concepts (dimensional modeling, star/snowflake schemas).
- Experience in designing and developing ETL or ELT pipelines.
- Proven ability to gather and document business and technical requirements.
- Excellent communication, interpersonal, and problem-solving skills.
- Snowflake certifications (e.g., SnowPro Core) are a plus.
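For the Streams, Tasks, and Time Travel utilities this posting names, a minimal sketch (table, stream, task, and warehouse names are hypothetical) of change capture with a scheduled merge:

```sql
-- Hypothetical example: a stream captures changes on a raw table,
-- and a task merges them into a core table every five minutes.
CREATE OR REPLACE STREAM raw_orders_stream ON TABLE raw.orders;

CREATE OR REPLACE TASK merge_orders
  WAREHOUSE = etl_wh
  SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('raw_orders_stream')
AS
MERGE INTO core.orders t
USING raw_orders_stream s ON t.order_id = s.order_id
WHEN MATCHED THEN UPDATE SET t.status = s.status
WHEN NOT MATCHED THEN INSERT (order_id, status) VALUES (s.order_id, s.status);

ALTER TASK merge_orders RESUME;  -- tasks are created suspended

-- Time Travel: query the table as it was one hour ago.
SELECT COUNT(*) FROM core.orders AT(OFFSET => -3600);
```

The WHEN clause keeps the task from spinning up a warehouse when the stream is empty, which is one of the cost levers such a role is expected to manage.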
Posted 3 months ago
3 - 8 years
15 - 25 Lacs
Bhubaneshwar, Bengaluru, Hyderabad
Hybrid
Warm Greetings from SP Staffing!!
Role : Snowflake Developer
Experience Required : 3 to 10 yrs
Work Location : Bangalore/Bhubaneswar/Hyderabad
Required Skills : Snowflake developer, Snowpipe, SQL
Interested candidates can send resumes to nandhini.spstaffing@gmail.com or ping me on 8148043843 (WhatsApp)
Posted 3 months ago
3 - 8 years
15 - 25 Lacs
Bengaluru, Hyderabad, Noida
Hybrid
Warm Greetings from SP Staffing!!
Role : Snowflake Developer
Experience Required : 3 to 10 yrs
Work Location : Noida/Gurgaon/Pune/Bangalore/Bhubaneswar/Kochi
Required Skills : Snowflake developer, Snowpipe
Interested candidates can send resumes to nandhini.spstaffing@gmail.com or ping me on 8148043843 (WhatsApp)
Posted 3 months ago
6.0 - 10.0 years
20 - 27 Lacs
hyderabad, pune, bengaluru
Hybrid
Exp : 5 to 8 yrs
Role : Permanent FTE
Location : Bangalore, Hyderabad, Pune
Mode : Hybrid
Must-Have Skills : Snowflake Administration, Snowflake Architecture & Performance Tuning, SQL, Data Platform Administration, Cloud Infrastructure (AWS/Azure/GCP)
Required Candidate profile : We are NOT looking for Snowflake Data Engineers or Snowflake Developers; we are only looking for Snowflake Database/Platform Administrators.
Posted Date not available
4.0 - 9.0 years
9 - 19 Lacs
bengaluru
Hybrid
Position : ETL Test Automation Engineer
Location : Pune or Bangalore
Work mode : Hybrid
Timings : 1 PM to 9 PM
Experience : 5 to 10 years
Company : Dataceria Software Solutions

We have an exciting opportunity in the Data Services team for an ETL Automation Test Engineer to develop and maintain test and data validation frameworks. The Test Engineer will ensure stability of data pipelines and metadata management, and will provide technical support and address data-related issues for Product, Data and Design teams.

Knowledge/Skills :
- Excellent written and verbal communication skills, along with an ability to effectively interface with both business partners and technical staff.
- Ability to prioritize multiple tasks and work within timelines to meet project expectations.
- Strong client management, interpersonal and decision-making skills and competencies; attention to detail and the ability to be proactive, prudent and confident in pressure situations are a must.

Qualifications : The candidate must have experience of the following:
- Bachelor's or associate degree in computer science or a related subject.
- 5 to 10 years of experience in Data Engineering and Testing roles.
- Create and validate database stored procedures along with possible outcomes.
- Proficiency in SQL testing, with a strong emphasis on Snowflake DB & API test automation.
- Experience with Microsoft SQL Server and Snowflake.
- Basic to intermediate Java/Python programming for test automation.
- Experience with test automation using UFT/Selenium.
- Execution of tests using Jenkins or any other CI/CD platform.
- Understanding of data-driven, keyword or hybrid test automation frameworks.
- Working knowledge of cloud computing architecture.
- Hands-on experience with Git and Power BI reporting.
- Familiarity with data pipeline and workflow management tools like Airflow.
- Experience communicating to both technical and business audiences.
- Ability to perform root cause analysis of existing models and propose effective solutions.
- Experience working with Agile methodologies.
- Ensuring application security through the Secure Software Development Lifecycle (SDLC).
- Exceptional problem solving / analytical thinking and skills.
- Teamwork and collaboration.

Responsibilities :
- Work in parallel with business SMEs and BA staff to create thorough, accurate and comprehensive SQL test queries and scripts.
- Update and maintain SQL scripts and test cases as application and business rules change.
- Create appropriate standards and processes to ensure quality and consistency of data in the platform.
- Analyze, organize and validate raw data to ensure accuracy and reliability.
- Collaborate with data analytics experts and SMEs to enhance functionality and efficiency within our data ecosystem.

Mandatory :
- Strong experience in SQL.
- DB Testing / ETL Testing.
- API test automation.
- Intermediate level of core Java/Python programming.
- At least 5 to 10 years of experience.
Good to have :
- UI automation using Selenium.
- CI/CD execution using any platform.
- Power BI reporting.

Screening skill ratings :
- SQL : 8/10
- DB/ETL Testing : 7/10
- Selenium : 7/10
- API : 6/10
- Testing Principles/Framework : 8/10
- CI/CD : 7/10
- Java/Python : 7/10

We are looking for an ETL Tester with Automation for our project in Pune or Bangalore, and for immediate joiners with 15-30 days of notice period. The role is strongest in SQL and ETL testing with automation. If you are interested, send your CV to careers@dataceria.com. For faster screening, include: Experience, CTC, ECTC, Notice period, Last working day, and Current work location, and attach the updated CV.
Posted Date not available
3.0 - 6.0 years
13 - 18 Lacs
pune, bengaluru, delhi / ncr
Work from Office
Roles and Responsibilities :
- Design, develop, and optimize data pipelines using Snowflake.
- Implement data models, transformations, and ETL processes.
- Work with stakeholders to understand requirements and translate them into scalable data solutions.
- Ensure data quality, performance tuning, and security compliance.
- Integrate Snowflake with other cloud services and BI tools.
Required Skills :
- 3-6 years of experience in data engineering or development roles.
- Strong expertise in Snowflake (warehousing, performance tuning, query optimization).
- Proficiency in SQL and ETL tools (e.g., Informatica, Talend, dbt).
- Familiarity with cloud platforms (AWS/Azure/GCP).
- Good understanding of data modeling and data governance.
- BE/BTech compulsory, any field.
Nice to Have :
- Experience with Python or Spark for data processing.
- Knowledge of CI/CD pipelines for data workflows.
Posted Date not available
6.0 - 11.0 years
6 - 12 Lacs
gurugram
Work from Office
6+ years of experience as a Cloud Database Administrator. Experience in Postgres/Snowflake/Erwin/MySQL on AWS; data modelling using tools like Erwin; AWS Aurora; data migration; Oracle, SQL, PL/SQL & shell scripting; ERDs.
Required Candidate profile : Experience designing data models (SRS/HLD/LLDs) using tools like Erwin. Experience in DB & production support tasks including replication, SQL optimization & tuning, monitoring & database security in large database environments.
Posted Date not available
4.0 - 8.0 years
15 - 30 Lacs
gurugram
Remote
About Straive : Straive is a market-leading Content and Data Technology company providing data services, subject matter expertise, and technology solutions to multiple domains. Data Analytics & AI Solutions, Data AI Powered Operations, and Education & Learning form the core pillars of the company's long-term vision. The company is a specialized solutions provider to business information providers in finance, insurance, legal, real estate, life sciences and logistics. Straive continues to be the leading content services provider to research and education publishers.

Data Analytics & AI Services : Our Data Solutions business has become critical to our clients' success. We use technology and AI with human experts in the loop to create data assets that our clients use to power their data products and their end customers' workflows. As our clients expect us to become their future-fit Analytics and AI partner, they look to us for help in building data analytics and AI enterprise capabilities for them. With a client base spanning 30 countries worldwide, Straive's multi-geographical resource pool is strategically located in eight countries - India, Philippines, USA, Nicaragua, Vietnam, United Kingdom, and the company headquarters in Singapore. Website : https://www.straive.com/

Role Overview : We are seeking a Data Platform Operations Engineer to join us in building, automating, and operating our Enterprise Data Platform. This role is ideal for someone with a unique combination of DataOps/DevOps, Data Engineering, and Database Administration expertise. As a key member of our Data & Analytics team, you will ensure our data infrastructure is reliable, scalable, secure, and high-performing, enabling data-driven decision-making across the business.

Key Responsibilities :
- Snowflake Administration : Own the administration, monitoring, configuration, and optimization of our Snowflake data warehouse. Implement and automate user/role management, resource monitoring, scaling strategies, and security policies.
- Fivetran Management : Configure, monitor, and troubleshoot Fivetran pipelines for seamless ingestion from SaaS applications, ERPs, and operational databases. Resolve connector failures and optimize sync performance and cost.
- DataOps/Automation : Build and improve CI/CD workflows using Git and other automation tools for data pipeline deployment, testing, and monitoring.
- Infrastructure as Code (IaC) : Implement and maintain infrastructure using tools like Terraform and Titan to ensure consistent, repeatable, and auditable environments.
- Platform Monitoring & Reliability : Implement automated checks and alerting across Snowflake, Fivetran, and dbt processes to ensure platform uptime, data freshness, and SLA compliance. Proactively identify and resolve platform issues and performance bottlenecks.
- Database Performance and Cost Optimization : Monitor and optimize database usage (queries, compute, storage) for speed and cost-effectiveness. Partner with data engineers and analysts to optimize SQL and refine warehouse utilization.
- Security & Compliance : Enforce security best practices across the data platform (access controls, encryption, data masking). Support audits and compliance requirements (e.g., SOC2).
- Data Quality Operations : Build and automate data health and quality checks (using dbt tests and/or custom monitors). Rapidly triage and resolve data pipeline incidents with root cause analyses.
- Documentation & Process : Ensure all operational procedures (runbooks, escalation paths, knowledge base) and infrastructure documentation are accurate, up-to-date, and easily accessible.
- Collaboration : Partner with Data Architects, Data Engineers, and DevOps Engineers to understand data flow requirements, troubleshoot issues, and continuously enhance platform capabilities.

Required Experience & Skills :
- 5+ years in a DataOps, DevOps, Data Engineering, or Database Administration role in cloud data environments.
- Hands-on experience administering Snowflake, including security, performance tuning, cost management, and automation.
- Strong expertise with Fivetran setup, management, and incident troubleshooting.
- Proficiency in dbt for ELT development, testing, and orchestration.
- Advanced SQL skills for troubleshooting, diagnostics, and optimization.
- Proficient with version control (Git) and experience designing/deploying data pipelines in a collaborative environment.
- Scripting skills (Python, Bash, etc.) for workflow automation, data operations tasks, and deployment pipelines.
- Experience with cloud platforms (AWS/Azure); knowledge of core services such as IAM, data storage, and data transfer.
- Strong understanding of platform reliability, monitoring, and observability (alerting, dashboarding, log analysis).
- Comfortable with Infrastructure as Code concepts and tools (Terraform).
- Experience working with business and analytics teams to translate ops support needs into scalable technical solutions.
Technical Stack : Required : Snowflake, Terraform, GitHub Actions, AWS, dbt, Fivetran. Preferred : Titan, Datacoves.
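For the cost-management and automation duties this role lists, a minimal sketch (the monitor and warehouse names are hypothetical) of a Snowflake resource monitor that caps monthly credit spend:

```sql
-- Hypothetical example: cap monthly credits, notify at 80%, suspend at 100%.
CREATE OR REPLACE RESOURCE MONITOR monthly_cap
  WITH CREDIT_QUOTA = 100
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS
    ON 80 PERCENT DO NOTIFY
    ON 100 PERCENT DO SUSPEND;

-- Attach the monitor to a warehouse so its spend counts against the quota.
ALTER WAREHOUSE etl_wh SET RESOURCE_MONITOR = monthly_cap;
```

In practice such statements would live in version control (e.g., as Terraform-managed resources) rather than being run ad hoc, in line with the IaC requirement above.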
Posted Date not available
7.0 - 10.0 years
15 - 20 Lacs
bengaluru
Work from Office
We are looking for a skilled Technology Lead with extensive experience in Snowflake, PL/SQL, and Azure to lead our cloud-based data solutions project. The ideal candidate will have a strong background in cloud data platform development, particularly with Snowflake, and a deep understanding of Azure cloud services. Experience in the financial industry and Agile environments is highly preferred.
Key Responsibilities :
- Lead design and development of cloud-based data pipelines and data solutions, primarily using Snowflake and Azure.
- Build end-to-end data workflows, including data ingestion, transformation, and extract generation within Snowflake.
- Write and optimize complex SQL and PL/SQL queries to support business requirements.
- Monitor and tune Snowflake performance for efficient query execution and scalability.
- Troubleshoot data issues, perform root cause analysis, and provide production support.
- Collaborate with cross-functional teams to deliver solutions in an Agile development environment.
- Provide technical leadership and mentorship to the development team.
- Ensure adherence to best practices in cloud data engineering and security standards.
Required Skills :
- Minimum 6 years of IT experience with at least 4 years working on cloud-based solutions.
- 4+ years hands-on experience with Snowflake development.
- Strong proficiency in PL/SQL and writing complex SQL queries.
- Solid experience in Azure cloud services and infrastructure.
- Proven ability to design and build scalable data pipelines on cloud platforms.
- Experience optimizing Snowflake performance, including query tuning and scaling strategies.
- Strong problem-solving skills related to data quality and production support.
- Familiarity with Agile methodologies.
- Experience in the Financial Industry is preferred.
Preferred Qualifications : Certifications in Snowflake and/or Azure cloud services.
Posted Date not available
6.0 - 11.0 years
18 - 33 Lacs
pune, chennai, bengaluru
Work from Office
Role & responsibilities: Outline the day-to-day responsibilities for this role. Preferred candidate profile: Specify required role expertise, previous job experience, or relevant certifications.
Posted Date not available