6.0 - 8.0 years
7 - 17 Lacs
Pune, Chennai, Bengaluru
Hybrid
Role & responsibilities: Primary skill: Airflow. Secondary skills: Snowflake, Python.
Preferred candidate profile: job location is flexible across Chennai/Mumbai/Pune/Bengaluru. If you are interested, share your CV at Muktai.S@alphacom.in
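Since this posting centres on Airflow with Snowflake and Python as secondary skills, here is a minimal, hedged sketch of the kind of pipeline it implies: a daily Airflow DAG that runs a Snowflake COPY INTO load. All names, credentials, and objects are invented for illustration, not taken from the employer.

```python
# A minimal, hypothetical Airflow DAG loading staged files into Snowflake daily.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def load_orders_to_snowflake():
    @task
    def copy_into_raw():
        import snowflake.connector  # pip install snowflake-connector-python

        conn = snowflake.connector.connect(
            account="my_account",  # placeholder credentials
            user="etl_user",
            password="***",
            warehouse="ETL_WH",
            database="ANALYTICS",
            schema="RAW",
        )
        try:
            conn.cursor().execute(
                "COPY INTO raw_orders FROM @orders_stage "
                "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
            )
        finally:
            conn.close()

    copy_into_raw()


load_orders_to_snowflake()  # instantiate the DAG at module level
```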
Posted 13 hours ago
5.0 - 10.0 years
20 - 30 Lacs
Hyderabad
Work from Office
About Client: Hiring for one of the top MNCs!
Job Description
Job Title: Snowflake Developer / Snowflake Data Engineer
Qualification: Any Graduate or above
Relevant Experience: 4 to 12 years
Skills: Snowflake, Python/PySpark, SQL, AWS services
Role description / Expectations from the role: Strong experience in building and designing data warehouses, data lakes, and data marts, with end-to-end implementation experience on large enterprise-scale Snowflake deployments on any of the hyperscalers. Strong experience building productionized data ingestion pipelines in Snowflake. Good experience with Snowflake RBAC and data security. Strong experience with Snowflake features, including newly released ones. Good experience in Python/PySpark. Experience with AWS services (S3, Glue, Lambda, Secrets Manager, DMS) and some Azure services (Blob Storage, ADLS, ADF). Experience/knowledge of orchestration and scheduling tools such as Airflow. Good understanding of ETL processes and ETL tools.
Location: Hyderabad
CTC Range: 20 LPA to 30 LPA
Notice period: Any
Shift Timing: N/A
Mode of Interview: Virtual
Mode of Work: Work from Office
Contact: Vardhani, IT Staffing Analyst, Black and White Business Solutions Pvt Ltd, Bangalore, Karnataka, India. 8686127477 | vardhani@blackwhite.in | www.blackwhite.in
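The ad calls out Snowflake RBAC and data security; as a hedged illustration of that pattern (all role, warehouse, and user names are invented, not the client's setup), here is a read-only role granted scoped access through the Python connector.

```python
# Hypothetical Snowflake RBAC setup: a read-only analyst role with future grants.
import snowflake.connector  # pip install snowflake-connector-python

RBAC_STATEMENTS = [
    "CREATE ROLE IF NOT EXISTS ANALYST_RO",
    "GRANT USAGE ON WAREHOUSE ANALYTICS_WH TO ROLE ANALYST_RO",
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE ANALYST_RO",
    "GRANT USAGE ON SCHEMA ANALYTICS.MARTS TO ROLE ANALYST_RO",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.MARTS TO ROLE ANALYST_RO",
    # Future grant: tables created later are covered automatically.
    "GRANT SELECT ON FUTURE TABLES IN SCHEMA ANALYTICS.MARTS TO ROLE ANALYST_RO",
    "GRANT ROLE ANALYST_RO TO USER JANE_DOE",
]

conn = snowflake.connector.connect(
    account="my_account", user="admin_user", password="***", role="SECURITYADMIN"
)
try:
    cur = conn.cursor()
    for stmt in RBAC_STATEMENTS:
        cur.execute(stmt)
finally:
    conn.close()
```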
Posted 15 hours ago
5.0 - 10.0 years
15 - 25 Lacs
Bengaluru
Work from Office
About Client: Hiring for one of the most prestigious multinational corporations!
Job Description
Job Title: Data Engineering
Qualification: Any Graduate or above
Relevant Experience: 4 to 10 years
Required Technical Skill Set: SQL, Snowflake, Python, Cloud (AWS, GCP, or Azure)
Location: Bangalore
CTC Range: 15 LPA to 30 LPA
Notice period: Immediate
Shift Timing: N/A
Mode of Interview: Virtual
Mode of Work: WFO (Work from Office)
Contact: Pooja Singh KS, IT Staffing Analyst, Black and White Business Solutions Pvt Ltd, Bangalore, Karnataka, India. pooja.singh@blackwhite.in | www.blackwhite.in
Posted 15 hours ago
9.0 - 12.0 years
1 - 2 Lacs
Hyderabad
Remote
Job Title: Data Architect
Location: Remote
Employment Type: Full-Time
Reports to: Lead Data Strategist
About Client / Project: The client is a specialist data strategy and AI consultancy that empowers businesses to unlock tangible value from their data assets. We specialize in developing comprehensive data strategies tailored to address core business and operational challenges. By combining strategic advisory with hands-on implementation, we ensure data becomes a true driver of business growth, operational efficiency, and competitive advantage for our clients. As a solutions-focused and forward-thinking consultancy, we help organizations transform their data capabilities using modern technology, reduce costs, and accelerate business growth by aligning every initiative directly with our clients' core business objectives.
Role Overview: We are seeking a highly experienced Data Architect to lead the design and implementation of scalable data architectures for global clients across industries. You will define enterprise-grade data platforms leveraging cloud-native technologies and modern data frameworks.
Key Responsibilities: Design and implement cloud-based data architectures (GCP, AWS, Azure, Snowflake, Redshift, Databricks, or Hadoop). Develop conceptual, logical, and physical data models. Define data flows, ETL/ELT pipelines, and ingestion strategies. Design and maintain data catalogs, metadata, and domain structures. Establish data architecture standards, reference models, and blueprints. Oversee data lineage, traceability, and audit readiness. Guide integration of AI/ML pipelines and analytics solutions. Ensure data privacy, protection, and compliance (e.g., GDPR, HIPAA). Collaborate closely with Engineers, Analysts, and Strategists.
Required Skills & Qualifications: 8+ years of experience in data architecture or enterprise data platform roles. Deep experience with at least two major cloud platforms (AWS, Azure, GCP). Proven hands-on work with modern data platforms: Snowflake, Databricks, Redshift, Hadoop. Strong understanding of data warehousing, data lakes, and lakehouse architecture. Advanced proficiency in SQL, Python, Spark, and/or Scala. Experience with data cataloging and metadata tools (e.g., Informatica, Collibra, Alation). Knowledge of data governance frameworks and regulatory compliance. Strong documentation, stakeholder communication, and architectural planning skills. Bachelor's degree in Computer Science, Engineering, or a related field (Master's preferred).
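As a small, hypothetical illustration of the "conceptual, logical, and physical data models" this role covers, the script below emits Snowflake-flavoured physical DDL for a one-fact, one-dimension star schema; every table and column name is invented.

```python
# Hypothetical physical model for a star schema: one dimension, one fact table.
# In practice this DDL would be generated and versioned by the architect's tooling.
DIM_CUSTOMER = """
CREATE TABLE IF NOT EXISTS dim_customer (
    customer_key  INTEGER IDENTITY PRIMARY KEY,
    customer_id   VARCHAR NOT NULL,   -- natural key from the source system
    customer_name VARCHAR,
    valid_from    TIMESTAMP_NTZ,      -- SCD2 validity window
    valid_to      TIMESTAMP_NTZ
)
"""

FACT_SALES = """
CREATE TABLE IF NOT EXISTS fact_sales (
    sale_id       INTEGER IDENTITY PRIMARY KEY,
    customer_key  INTEGER REFERENCES dim_customer (customer_key),
    sale_date     DATE,
    amount        NUMBER(12, 2)       -- additive measure
)
"""

if __name__ == "__main__":
    for ddl in (DIM_CUSTOMER, FACT_SALES):
        print(ddl.strip())  # or execute via a warehouse connection
```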
Posted 18 hours ago
7.0 - 9.0 years
1 - 1 Lacs
Pune
Remote
Position Details
Title: Data Engineer
Location: Remote (U.S. Pacific Time preferred at least half of the time)
Purpose of the Role: The Data Engineer (Contractor) will play a critical role in delivering key milestones of the Procurement Data Lake Plan. This includes ingesting and transforming data from procurement systems, cleaning and organising it in Snowflake, and creating dashboard-ready datasets for Sigma Computing. The contractor will help ensure data reliability, reduce manual work, and enable automated insights for stakeholders across procurement, legal, and operations.
Key Responsibilities: Design, build, and maintain scalable ETL data pipelines. Ingest, clean, and standardise data from Coupa, Netsuite, IntelAgree, Zip, ProcessUnity, and Monday.com. Integrate data into Snowflake with appropriate schema and performance optimization. Enable real-time and scheduled analytics through Sigma Computing dashboards. Collaborate with procurement, legal, and data teams to meet milestone reporting needs. Ensure documentation of workflows, datasets, and dashboard requirements.
Technical Requirements: Advanced SQL for transformation and analytics use cases. Proficiency in Python or R for data wrangling and automation. Experience using Airflow or similar tools for orchestration. Strong understanding of Snowflake or an equivalent cloud data warehouse. Proficiency in Sigma Computing/Tableau or similar BI tools: building dashboards, designing datasets, and user interactivity. Familiarity with Git and version control best practices.
Preferred Qualifications: Background in procurement, finance, or legal analytics. Experience with procurement tools like Coupa, IntelAgree, Zip, Netsuite, and ProcessUnity. Strong stakeholder engagement and communication skills. Agile and milestone-driven project delivery experience.
Expected Deliverables: Automated data pipelines for spend, contract, intake, and travel & expense data. Cleaned, structured datasets stored in Snowflake. Sigma dashboards that support milestone and executive reporting.
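To make the ingest-clean-load loop concrete, here is a hedged Python sketch of landing a procurement extract in Snowflake for Sigma to query; the source file, column names, and connection details are placeholders, not the client's real systems.

```python
# Hypothetical ingest: standardise a spend extract and land it in Snowflake.
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# 1. Ingest: in practice this might come from a Coupa/Netsuite export or API.
raw = pd.read_csv("spend_extract.csv")

# 2. Clean and standardise: snake_case columns, typed dates, deduplication.
raw.columns = [c.strip().lower().replace(" ", "_") for c in raw.columns]
raw["invoice_date"] = pd.to_datetime(raw["invoice_date"], errors="coerce")
clean = raw.drop_duplicates(subset=["invoice_id"])

# 3. Load to Snowflake, where Sigma dashboards read the curated table.
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="PROCUREMENT", schema="STAGING",
)
try:
    write_pandas(conn, clean, table_name="SPEND", auto_create_table=True)
finally:
    conn.close()
```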
Posted 18 hours ago
6.0 - 8.0 years
37 - 40 Lacs
Kochi, Hyderabad, Coimbatore
Work from Office
Key Responsibilities: Design and implement scalable Snowflake data warehouse solutions for structured and semi-structured data. Develop ETL/ELT pipelines using Informatica IICS, dbt, Matillion, Talend, Airflow, or equivalent tools. Optimize query performance and implement best practices for cost and efficiency. Work with cloud platforms (AWS, Azure, GCP) for data integration and storage. Implement role-based access control (RBAC), security policies, and encryption within Snowflake. Perform data modeling (Star Schema, Snowflake Schema, Data Vault) and warehouse design. Collaborate with data engineers, analysts, and business teams to ensure data consistency and availability. Automate Snowflake object creation, pipeline scheduling, and monitoring. Migrate existing on-premise databases (Oracle, SQL Server, Teradata, Redshift, etc.) to Snowflake. Implement data governance, quality checks, and observability frameworks.
Required Skills & Qualifications: 6+ years of experience in data engineering/warehousing, with at least 2+ years in Snowflake. Strong expertise in Snowflake features such as Virtual Warehouses, Streams, Tasks, Time Travel, and Cloning. Experience in SQL performance tuning, query optimization, and stored procedures (JavaScript UDFs/UDAFs). Hands-on experience with ETL/ELT tools like Informatica, dbt, Matillion, Talend, Airflow, or AWS Glue. Experience with Python, PySpark, or Scala for data processing. Knowledge of CI/CD pipelines, Git, Terraform, or Infrastructure as Code (IaC). Experience with semi-structured data (JSON, Parquet, Avro) and handling ingestion from APIs. Strong understanding of cloud platforms (AWS S3, Azure Data Lake, GCP BigQuery) and data lake architectures. Familiarity with BI/Analytics tools like Tableau, Power BI, Looker, or ThoughtSpot. Strong problem-solving skills and experience working in Agile/Scrum environments.
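Two of the Snowflake features named here, Streams/Tasks and semi-structured data, combine naturally into incremental pipelines. Below is a hedged sketch run through the Python connector; all object names are invented and the `payload` column is assumed to be VARIANT.

```python
# Hypothetical incremental pipeline: a Stream tracks changes on a raw table and
# a scheduled Task merges them downstream, extracting fields from VARIANT JSON.
import snowflake.connector

STATEMENTS = [
    "CREATE OR REPLACE STREAM raw_events_stream ON TABLE raw.events",
    """
    CREATE OR REPLACE TASK merge_events
      WAREHOUSE = etl_wh
      SCHEDULE = '15 MINUTE'
    AS
      INSERT INTO curated.events
      SELECT payload:user_id::STRING      AS user_id,    -- VARIANT path access
             payload:event_type::STRING   AS event_type,
             payload:ts::TIMESTAMP_NTZ    AS event_ts
      FROM raw_events_stream
    """,
    "ALTER TASK merge_events RESUME",  # tasks are created suspended
]

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***", database="ANALYTICS"
)
try:
    cur = conn.cursor()
    for stmt in STATEMENTS:
        cur.execute(stmt)
finally:
    conn.close()
```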
Posted 1 day ago
6.0 - 8.0 years
37 - 40 Lacs
Kochi, Hyderabad, Coimbatore
Work from Office
Required Skills & Qualifications: 6+ years of experience in Informatica ETL development, with at least 2+ years in Informatica IICS. Strong expertise in IICS CDI, CAI, CDI-Elastic, Taskflows, and REST/SOAP API integration. Experience with cloud platforms (AWS, Azure, GCP) and databases like Snowflake, Redshift, or Synapse. Proficiency in SQL, PL/SQL, and performance tuning techniques. Knowledge of PowerCenter migration to IICS is a plus. Hands-on experience with Data Quality, Data Governance, and Master Data Management (MDM) is desirable. Experience in developing and deploying APIs, microservices, and event-driven architectures. Strong problem-solving skills and the ability to work in an Agile/Scrum environment.
Preferred Qualifications: Informatica IICS Certification (CDI or CAI) is a plus. Exposure to Python, PySpark, or Big Data technologies is an advantage. Experience with CI/CD pipelines, DevOps practices, and Terraform for cloud deployments.
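IICS integrations are configured in Informatica's own tooling rather than hand-coded, so as a language-level stand-in only, here is a hedged Python sketch of the generic REST-ingestion pattern the ad alludes to: page through an API and stage the records for a warehouse load. The endpoint and fields are hypothetical.

```python
# Generic paginated REST ingestion, staged to a local JSON file for bulk load.
import json

import requests


def fetch_all(url: str, page_size: int = 100) -> list:
    """Collect every page from a paginated REST endpoint (hypothetical API)."""
    records, page = [], 1
    while True:
        resp = requests.get(
            url, params={"page": page, "per_page": page_size}, timeout=30
        )
        resp.raise_for_status()
        batch = resp.json()
        if not batch:            # an empty page marks the end
            return records
        records.extend(batch)
        page += 1


if __name__ == "__main__":
    rows = fetch_all("https://api.example.com/v1/orders")  # placeholder endpoint
    with open("orders_stage.json", "w") as f:
        json.dump(rows, f)       # staged file, ready for a COPY INTO / bulk load
```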
Posted 1 day ago
10.0 - 15.0 years
40 - 50 Lacs
Hyderabad
Hybrid
Envoy Global is a proven innovator in the global immigration space. Our mission combines our industry-leading tech platform with holistic service to streamline, simplify and expedite the immigration process for employers and individuals.
We are seeking a highly skilled Team Lead or Manager, Data Engineering within Envoy Global's tech team to join us on a full-time, permanent basis. This role is responsible for the end-to-end design, development, and documentation of data pipelines and ETL (Extract, Transform, Load) processes. It focuses on enabling data migration, integration, and warehousing, encompassing the creation of ETL jobs, reports, dashboards, and data pipelines.
As our Senior Data Engineering Lead or Manager, you will be required to: Lead and mentor a small team of data engineers, fostering a collaborative and innovative environment. Design, develop, and document robust data pipelines and ETL jobs. Engage in data modeling activities to ensure efficient and effective data structures. Ensure the seamless integration of data across various platforms and systems. Lead all aspects of the design, implementation, and maintenance of data engineering pipelines in our Azure environment, including integration with a variety of data sources. Collaborate with Data Analytics and DataOps teams and other partners in Architecture, Engineering and DevOps teams to deliver high-quality data platforms that enable analytics solutions for the business. Ensure data engineering standards are in line with established principles of data governance, data quality and data security. Monitor and optimize the performance of data pipelines, ensuring they meet SLAs in terms of data availability and quality. Hire, manage and mentor a team of Data Engineers and Data Quality Engineers. Communicate clearly and effectively with stakeholders.
To apply for this role, you should possess the following skills, experience and qualifications: Proven experience in data engineering, with a strong background in designing and developing ETL processes. Excellent collaboration skills, with the ability to work effectively with cross-functional teams. Leadership experience, with a track record of managing and mentoring a team of data engineers. 8+ years of experience as a Data Engineer, with 3+ years in a managerial role. Technical experience in one or more cloud-based data warehouse/data lake platforms such as AWS, Snowflake, or Azure Synapse. ETL experience using SSIS, ADF or another equivalent tool. Knowledge of data modeling and data warehouse concepts. Demonstrated ability to write SQL/T-SQL queries to retrieve and modify data. Know-how to troubleshoot potential issues, and experience with best practices around database operations. Ability to work in an Agile environment.
Should you have a deep passion for technology and a desire to thrive in a rapidly evolving and creative environment, we would be delighted to receive your application.
Posted 1 day ago
5.0 - 10.0 years
15 - 20 Lacs
Pune
Hybrid
Job Description: We are seeking an Analyst to be part of a new analytics platform being developed for our global organization. In this role, you will focus on expanding our Snowflake data warehouse, leveraging your strong SQL skills to move analytics data from a raw state into a refined, analytics-ready state where it will be consumed by end users.
Skills / Qualifications: Bachelor of Science degree in Computer Science, Business, Information Systems or a related business field is required. 5+ years of experience in technical analytics roles. Strong experience with Tableau, Snowflake, and SQL queries. Experience with Qlik Replicate to ingest data into Snowflake is preferred. Experience with Agile methodology and Jira software is preferred. Ability to work independently and as part of a global team is required. Ability to create and maintain robust documentation of IT processes is required. Strong collaboration skills within a team, across IT, and with business users are required. Strong troubleshooting and problem-solving skills are required.
Job Responsibilities: Collaborate with business associates and other IT resources to understand analytic needs, including localizations required by our global business. Translate those needs into technical solutions that properly move data through the Snowflake landscape. Maintain relevant technical documentation to ensure streamlined support and knowledge transfer within areas of responsibility. Support those processes as needed going forward. Prepare and execute unit and integration testing. Support user acceptance testing. Provide hypercare support after each go-live. Troubleshoot and resolve analytic issues impacting the business. Other duties and projects as assigned.
Benefits: Competitive market rate (depending on experience).
Posted 1 day ago
3.0 - 8.0 years
0 - 3 Lacs
Bengaluru
Remote
If you are passionate about Snowflake, data warehousing, and cloud-based analytics, we'd love to hear from you! Apply now to be a part of our growing team.
Perks and benefits: Interested candidates can go through the link below to apply directly and complete the first round of technical discussion: https://app.hyrgpt.com/candidate-job-details?jobId=67ecc88dda1154001cc8b88f
Job Summary: We are looking for a skilled Snowflake Engineer with 3-10 years of experience in designing and implementing cloud-based data warehousing solutions. The ideal candidate will have hands-on expertise in Snowflake architecture, SQL, ETL pipeline development, and performance optimization. This role requires proficiency in handling structured and semi-structured data, data modeling, and query optimization to support business intelligence and analytics initiatives. The candidate will work on a project for one of our key Big4 consulting customers and will have immense learning opportunities.
Key Responsibilities: Design, develop, and manage high-performance data pipelines for ingestion, transformation, and storage in Snowflake. Optimize Snowflake workloads, ensuring efficient query execution and cost management. Develop and maintain ETL processes using SQL, Python, and orchestration tools. Implement data governance, security, and access control best practices within Snowflake. Work with structured and semi-structured data formats such as JSON, Parquet, Avro, and XML. Design and maintain fact and dimension tables, ensuring efficient data warehousing and reporting. Collaborate with data analysts and business teams to support reporting, analytics, and business intelligence needs. Troubleshoot and resolve data pipeline issues, ensuring high availability and reliability. Monitor and optimize Snowflake storage and compute usage to improve efficiency and performance.
Required Skills & Qualifications: 3-10 years of experience in Snowflake, SQL, and data engineering. Strong hands-on expertise in Snowflake development, including data sharing, cloning, and time travel. Proficiency in SQL scripting for query optimization and performance tuning. Experience with ETL tools and frameworks (e.g., dbt, Airflow, Matillion, Talend). Familiarity with cloud platforms (AWS, Azure, or GCP) and their integration with Snowflake. Strong understanding of data warehousing concepts, including fact and dimension modeling. Ability to work with semi-structured data formats like JSON, Avro, Parquet, and XML. Knowledge of data security, governance, and access control within Snowflake. Excellent problem-solving and troubleshooting skills.
Preferred Qualifications: Experience in Python for data engineering tasks. Familiarity with CI/CD pipelines for Snowflake development and deployment. Exposure to streaming data ingestion and real-time processing. Experience with BI tools such as Tableau, Looker, or Power BI.
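Zero-copy cloning and Time Travel, both named in the requirements, are worth a concrete look. The hedged sketch below (placeholder account, credentials, and table names) clones a production table instantly and then queries its state from an hour earlier.

```python
# Hypothetical demo of Snowflake zero-copy cloning and Time Travel.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    database="ANALYTICS", schema="MARTS",
)
try:
    cur = conn.cursor()
    # Zero-copy clone: instant, shares micro-partitions with the source table.
    cur.execute("CREATE OR REPLACE TABLE orders_dev CLONE orders")
    # Time Travel: read the source table as it looked one hour ago.
    cur.execute("SELECT COUNT(*) FROM orders AT(OFFSET => -3600)")
    print("rows one hour ago:", cur.fetchone()[0])
finally:
    conn.close()
```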
Posted 1 day ago
7.0 - 12.0 years
15 - 18 Lacs
Pune
Work from Office
8+ years of experience in Python, SQL, Spark, and/or Scala. Deep experience with cloud data services: GCP, AWS, or Azure; Snowflake, Databricks, or the Hadoop ecosystem; Apache Airflow, Prefect, or Luigi; Kafka, Kinesis, or other streaming technologies.
Benefits: Flexi working, work from home, health insurance, life insurance, retention bonus, leave encashment, gratuity, provident fund, course reimbursements.
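Given the Kafka/Kinesis streaming requirement alongside Spark, here is a hedged PySpark Structured Streaming sketch; the broker address, topic, and paths are invented, and it assumes a Delta-enabled runtime with the spark-sql-kafka connector on the classpath.

```python
# Hypothetical streaming job: read a Kafka topic, persist it as a Delta table.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka_to_delta").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "orders")                     # placeholder topic
    .load()
    .select(
        col("key").cast("string"),
        col("value").cast("string"),                   # raw message payload
        "timestamp",
    )
)

query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/orders")
    .start("/tmp/delta/orders")
)
query.awaitTermination()
```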
Posted 1 day ago
7.0 - 11.0 years
20 - 30 Lacs
Pune
Hybrid
What You'll Do: The Global Analytics & Insights (GAI) team is looking for a Senior Data Engineer to lead our build of the data infrastructure for Avalara's core data assets, empowering us with accurate data to make data-backed decisions. As a Senior Data Engineer, you will help architect, implement, and maintain our data infrastructure using Snowflake, dbt (Data Build Tool), Python, Terraform, and Airflow. You will immerse yourself in our financial, marketing, and sales data to become an expert in Avalara's domain. You will have deep SQL experience, an understanding of modern data stacks and technology, a desire to build things the right way using modern software principles, and experience with data and all things data-related.
What Your Responsibilities Will Be: You will architect repeatable, reusable solutions to keep our technology stack DRY. Conduct technical and architecture reviews with engineers, ensuring all contributions meet quality expectations. You will develop scalable, reliable, and efficient data pipelines using dbt, Python, or other ELT tools. Implement and maintain scalable data orchestration and transformation, ensuring data accuracy and consistency. Collaborate with cross-functional teams to understand complex requirements and translate them into technical solutions. Build scalable, complex dbt models. Demonstrate ownership of complex projects and calculations of core financial metrics and processes. Work with Data Engineering teams to define and maintain scalable data pipelines. Promote automation and optimization of reporting processes to improve efficiency. You will report to a Senior Manager.
What You'll Need to be Successful: Bachelor's degree in Computer Science, Engineering, or a related field. 6+ years of experience in the data engineering field, with advanced SQL knowledge. 4+ years of working with Git, and demonstrated experience collaborating with other engineers across repositories. 4+ years of working with Snowflake. 3+ years working with dbt (dbt Core). 3+ years working with Infrastructure as Code (Terraform). 3+ years working with CI/CD, and demonstrated ability to build and operate pipelines. AWS certified. Terraform certified. Experience working with complex Salesforce data. Snowflake and dbt certified.
Posted 1 day ago
2.0 - 6.0 years
5 - 8 Lacs
Pune
Work from Office
Supports, develops, and maintains a data and analytics platform. Effectively and efficiently processes, stores, and makes data available to analysts and other consumers. Works with Business and IT teams to understand requirements and best leverage technologies to enable agile data delivery at scale.
Note: Although the role category in the GPP is listed as Remote, the requirement is for a Hybrid work model.
Key Responsibilities: Oversee the development and deployment of end-to-end data ingestion pipelines using Azure Databricks, Apache Spark, and related technologies. Design high-performance, resilient, and scalable data architectures for data ingestion and processing. Provide technical guidance and mentorship to a team of data engineers. Collaborate with data scientists, business analysts, and stakeholders to integrate various data sources into the data lake/warehouse. Optimize data pipelines for speed, reliability, and cost efficiency in an Azure environment. Enforce and advocate for best practices in coding standards, version control, testing, and documentation. Work with Azure services such as Azure Data Lake Storage, Azure SQL Data Warehouse, Azure Synapse Analytics, and Azure Blob Storage. Implement data validation and data quality checks to ensure consistency, accuracy, and integrity. Identify and resolve complex technical issues proactively. Develop reliable, efficient, and scalable data pipelines with monitoring and alert mechanisms. Use agile development methodologies, including DevOps, Scrum, and Kanban.
External Qualifications and Competencies
Technical Skills: Expertise in Spark, including optimization, debugging, and troubleshooting. Proficiency in Azure Databricks for distributed data processing. Strong coding skills in Python and Scala for data processing. Experience with SQL for handling large datasets. Knowledge of data formats such as Iceberg, Parquet, ORC, and Delta Lake. Understanding of cloud infrastructure and architecture principles, especially within Azure.
Leadership & Soft Skills: Proven ability to lead and mentor a team of data engineers. Excellent communication and interpersonal skills. Strong organizational skills with the ability to manage multiple tasks and priorities. Ability to work in a fast-paced, constantly evolving environment. Strong problem-solving, analytical, and troubleshooting abilities. Ability to collaborate effectively with cross-functional teams.
Competencies: System Requirements Engineering: Uses appropriate methods to translate stakeholder needs into verifiable requirements. Collaborates: Builds partnerships and works collaboratively to meet shared objectives. Communicates Effectively: Delivers clear, multi-mode communications tailored to different audiences. Customer Focus: Builds strong customer relationships and delivers customer-centric solutions. Decision Quality: Makes good and timely decisions to keep the organization moving forward. Data Extraction: Performs ETL activities and transforms data for consumption by downstream applications. Programming: Writes and tests computer code, version control, and build automation. Quality Assurance Metrics: Uses measurement science to assess solution effectiveness. Solution Documentation: Documents information for improved productivity and knowledge transfer. Solution Validation Testing: Ensures solutions meet design and customer requirements. Data Quality: Identifies, understands, and corrects data flaws. Problem Solving: Uses systematic analysis to address and resolve issues. Values Differences: Recognizes the value that diverse perspectives bring to an organization.
Preferred Knowledge & Experience: Exposure to Big Data open-source technologies (Spark, Scala/Java, MapReduce, Hive, HBase, Kafka, etc.). Experience with SQL and working with large datasets. Clustered compute cloud-based implementation experience. Familiarity with developing applications requiring large file movement in a cloud-based environment. Exposure to Agile software development and analytical solutions. Exposure to IoT technology.
Additional Responsibilities Unique to this Position
Qualifications: Education: Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
Experience: 3 to 5 years of experience in data engineering or a related field. Strong hands-on experience with Azure Databricks, Apache Spark, Python/Scala, CI/CD, Snowflake, and Qlik for data processing. Experience working with multiple file formats like Parquet, Delta, and Iceberg. Knowledge of Kafka or similar streaming technologies. Experience with data governance and data security in Azure. Proven track record of building large-scale data ingestion and ETL pipelines in cloud environments. Deep understanding of Azure Data Services. Experience with CI/CD pipelines, version control (Git), Jenkins, and agile methodologies. Familiarity with data lakes, data warehouses, and modern data architectures. Experience with Qlik Replicate (optional).
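Since the role pairs Spark ingestion with data validation, here is a hedged, employer-agnostic PySpark sketch: read raw Parquet from a lake path, apply a simple quality gate, and append to Delta. The paths and columns are invented, and a Delta-enabled runtime such as Databricks is assumed.

```python
# Hypothetical ingestion with a basic data quality gate, written for Databricks.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("ingest_with_checks").getOrCreate()

raw = spark.read.parquet("abfss://raw@examplelake.dfs.core.windows.net/sensors/")

# Quality gate: reject rows missing a key or holding impossible readings.
valid = raw.filter(
    col("device_id").isNotNull() & col("temperature").between(-50, 150)
)
rejected = raw.count() - valid.count()
if rejected:
    print(f"quarantined {rejected} rows")  # a real pipeline would alert/log this

valid.write.format("delta").mode("append").save(
    "abfss://curated@examplelake.dfs.core.windows.net/sensors/"
)
```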
Posted 1 day ago
5.0 - 7.0 years
8 - 10 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
Job Overview: We are seeking a highly skilled Technical Data Analyst for a remote contract position (6 to 12 months) to help build a single source of truth for our high-volume direct-to-consumer accounting and financial data warehouse. You will work closely with Finance & Accounting teams and play a pivotal role in dashboard creation, data transformation, and migration from Snowflake to Databricks.
Key Responsibilities:
1. Data Analysis & Reporting: Develop month-end accounting and tax dashboards using SQL in Snowflake (Snowsight). Migrate and transition reports/dashboards to Databricks. Gather, analyze, and transform business requirements from finance/accounting stakeholders into data products.
2. Data Transformation & Aggregation: Build transformation pipelines in Databricks to support balance sheet look-forward views. Maintain data accuracy and consistency throughout the Snowflake-to-Databricks migration. Partner with Data Engineering to optimize pipeline performance.
3. ERP & Data Integration: Support integration of financial data with NetSuite ERP. Validate transformed data to ensure correct ingestion and mapping into ERP systems.
4. Ingestion & Data Ops: Work with Fivetran for ingestion and resolve any pipeline or data accuracy issues. Monitor data workflows and collaborate with engineering teams on troubleshooting.
Required Skills & Qualifications: 5+ years of experience as a Data Analyst (preferably in the Finance/Accounting domain). Strong in SQL, with proven experience in Snowflake and Databricks. Experience in building financial dashboards (month-end close, tax reporting, balance sheets). Understanding of financial/accounting data: GL, journal entries, balance sheets, income statements. Familiarity with Fivetran or similar data ingestion tools. Experience with data transformation in a cloud environment. Strong communication and stakeholder management skills.
Nice to have: Experience working with NetSuite ERP.
Apply Now: Please share your updated resume with the following details: Full Name, Total Experience, Relevant Experience in SQL/Snowflake/Databricks, Experience in the Finance or Accounting domain, Current Location, Availability (Notice Period), Current and Expected Rate.
Location: Remote (Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad). Contract Duration: 6 months to 1 year.
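The core of this contract is the Snowflake-to-Databricks migration, so here is a hedged sketch of one building block: reading a Snowflake table into Databricks through the Spark Snowflake connector and persisting it as Delta. Every option value is a placeholder, not the client's configuration.

```python
# Hypothetical Snowflake-to-Databricks copy of a finance table.
# Intended for a Databricks cluster, where the "snowflake" source is built in.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

sf_options = {
    "sfUrl": "my_account.snowflakecomputing.com",  # placeholder account URL
    "sfUser": "etl_user",
    "sfPassword": "***",
    "sfDatabase": "FINANCE",
    "sfSchema": "GL",
    "sfWarehouse": "REPORTING_WH",
}

journal_entries = (
    spark.read.format("snowflake")
    .options(**sf_options)
    .option("dbtable", "JOURNAL_ENTRIES")
    .load()
)

# Persist as Delta so downstream dashboards can read from the new platform.
journal_entries.write.format("delta").mode("overwrite").saveAsTable(
    "finance.journal_entries"
)
```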
Posted 1 day ago
5.0 - 10.0 years
12 - 18 Lacs
Hyderabad, Bengaluru
Hybrid
Work Location: Hyderabad & Bangalore
Work Timings: 1 PM IST till 10.30 PM IST
Experience: 5 - 8 years
Key Skills: Expertise building solutions (not operations/BPO users) using any of the enterprise-level reconciliation platforms (e.g., TLM, Duco, IntelliMatch), ETL development, and strong SQL skills, preferably in SQL Server and Postgres.
Posted 1 day ago
5.0 - 10.0 years
18 - 27 Lacs
Noida
Hybrid
Develop interactive dashboards and reports using Tableau to visualize key business metrics. Write, optimize, and troubleshoot SQL queries to extract, clean, and analyze data. Work with Snowflake for data management, transformation, and analytics.
Posted 1 day ago
2.0 - 4.0 years
6 - 10 Lacs
Pune
Work from Office
The headlines
Job Title: Data Consultant (Managed Services & Support)
Location: Hybrid; 2 days a week on-site in our office in Creaticity Mall, Shashtrinagar, Yerawada
Salary: ₹700,000 - ₹2,100,000/annum
A bit about the role: We're looking for passionate Data Consultants who thrive in a fast-paced, problem-solving environment to join our global Managed Services & Support team spanning India and the UK. In this role, you'll help keep our live cloud data solutions operating as they should be, ensuring data pipelines run smoothly and reporting layers stay up to date. You'll take a proactive approach and help identify and resolve issues before they arise while optimising technical debt for long-term stability. This is perfect for someone who enjoys client interaction and is passionate about ensuring cloud data platforms perform at their best.
What you'll be doing: Monitoring and troubleshooting live data pipelines, ensuring smooth operations and up-to-date reporting layers. Managing a support queue, diagnosing and resolving issues related to ETL processes, Snowflake, Matillion, and data pipelines. Proactively optimising existing solutions, identifying areas for improvement, and reducing technical debt. Collaborating with senior consultants and engineers to escalate and resolve complex technical challenges. Engaging with clients, ensuring clear communication, managing expectations, and providing best-practice recommendations. Documenting and sharing knowledge, contributing to internal training and process improvements.
What you'll need to succeed: SQL knowledge and experience working with cloud data platforms (Snowflake, Matillion, or similar ETL tools). Strong problem-solving skills with the ability to troubleshoot pipeline failures and connectivity issues. Excellent communication skills, able to engage with both technical teams and business stakeholders. Experience with support queue management systems (e.g., JIRA, ServiceNow, FreshService) is a plus. A proactive mindset, comfortable working under pressure in a fast-paced, client-focused environment.
So, what's in it for you? The chance to work with cutting-edge cloud data technologies, solving real-world business challenges. You can fast-track your career in cloud data support and analytics with training and development opportunities. An opportunity to be part of a collaborative, international team, working across India and the UK. A competitive salary, exciting career progression, and a chance to make a real impact.
About Snap Analytics: We're a high-growth data analytics consultancy on a mission to help enterprise businesses unlock the full potential of their data. With offices in the UK, India, and South Africa, we specialise in cutting-edge cloud analytics solutions, transforming complex data challenges into actionable business insights. We partner with some of the biggest brands worldwide to modernise their data platforms, enabling smarter decision-making through Snowflake, Databricks, Matillion, and other cloud technologies. Our approach is customer-first, innovation-driven, and results-focused, delivering impactful solutions with speed and precision. At Snap, we're not just consultants, we're problem-solvers, engineers, and strategists who thrive on tackling complex data challenges. Our culture is built on collaboration, continuous learning, and pushing boundaries, ensuring our people grow just as fast as our business. Join us and be part of a team that's shaping the future of data analytics!
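Monitoring live Snowflake pipelines, as this support role describes, often starts with the warehouse's own metadata. Below is a hedged sketch (placeholder account and credentials) that lists task failures from the last 24 hours.

```python
# Hypothetical health check: list failed Snowflake tasks from the last 24 hours.
import snowflake.connector

FAILED_TASKS_SQL = """
SELECT name, state, error_message, scheduled_time
FROM TABLE(information_schema.task_history(
    scheduled_time_range_start => DATEADD('hour', -24, CURRENT_TIMESTAMP())))
WHERE state = 'FAILED'
ORDER BY scheduled_time DESC
"""

conn = snowflake.connector.connect(
    account="my_account", user="support_user", password="***", database="ANALYTICS"
)
try:
    for name, state, err, ts in conn.cursor().execute(FAILED_TASKS_SQL):
        print(f"{ts} {name}: {err}")  # in practice, open a ticket or page someone
finally:
    conn.close()
```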
Posted 1 day ago
3.0 - 5.0 years
14 - 18 Lacs
Gurugram
Work from Office
Responsibilities: Lead the development of a modern, modular, and flexible restaurant technology platform. Lead the development and co-manage the roadmap for our HutBot platform, our in-restaurant management app. Assess, build and support restaurant ordering platforms, integrating POS with third-party apps and aggregators. Oversee the integration of Kiosks, Mobile Tablets, smart kitchen, delivery management systems, and BOH applications such as inventory, labor, learning management, and other employee-facing apps. Develop and maintain Enterprise architecture by building integrations between different platforms and apps.
Minimum Requirements: 10+ years of development experience managing large projects and teams with progressive career growth. Development experience in TypeScript/NodeJS with the React framework preferred; however, we may consider strong candidates with proven experience in related technologies, e.g., Python, C#, etc. Familiarity with cloud technologies, with experience in AWS being a bonus, along with proficiency in infrastructure-as-code tools like Terraform. Strong understanding of modern database systems, including RDS (Postgres), NoSQL (DynamoDB, DocumentDB), and analytics tools like Snowflake, Domo (GDH), and Google Analytics. Experience in building and supporting restaurant ordering platforms, integration of POS with third-party apps and aggregators, Kiosks, Mobile Tablets, smart kitchen, delivery management systems, and BOH applications such as inventory, labor, learning management, and other employee-facing apps. Experience in managing and building Enterprise architecture by building integrations between different platforms and apps while managing long-term strategic focus and roadmaps. Experience in managing large teams across multiple time zones.
Preferred Requirements: Development experience in TypeScript/NodeJS with the React framework preferred; however, we may consider strong candidates with proven experience in related technologies, e.g., Python, C#, etc. Familiarity with cloud technologies, with experience in AWS being a bonus, along with proficiency in infrastructure-as-code tools like Terraform. Strong understanding of modern database systems, including RDS (Postgres), NoSQL (DynamoDB, DocumentDB), and analytics tools like Snowflake, Domo (GDH), and Google Analytics.
The Yum! Brands story is simple. We have four distinctive, relevant and easy global brands, KFC, Pizza Hut, Taco Bell and The Habit Burger Grill, born from the hopes and dreams, ambitions and grit of passionate entrepreneurs. And we want more of this to create our future! As the world's largest restaurant company, we have a clear and compelling mission: to build the world's most loved, trusted and fastest-growing restaurant brands. The key and not-so-secret ingredient in our recipe for growth is our unrivaled talent and culture, which fuels our results. We're looking for talented, motivated, visionary and team-oriented leaders to join us as we elevate and personalize the customer experience across our 48,000 restaurants, operating in 145 countries and territories around the world!
We put pizza, chicken and tacos in the hands of customers through customized ordering, unique delivery approaches, app experiences, click-and-collect services and consumer data analytics, creating unique customer dining experiences, and we are only getting started. Employees may work for a single brand and potentially grow to support all company-owned brands depending on their role. Regardless of where they work, as a company opening an average of 8 restaurants a day worldwide, the growth opportunities are endless. Taco Bell has been named one of the 10 Most Innovative Companies in the World by Fast Company; Pizza Hut delivers more pizzas than any other pizza company in the world; and KFC still uses its 75-year-old finger lickin' good recipe, including secret herbs and spices, to hand-bread its chicken every day. Yum! and its brands have offices in Chicago, IL, Louisville, KY, Irvine, CA, Plano, TX and other markets around the world. We don't just say we are a great place to work; our commitments to the world and our employees show it. Yum! has been named to the Dow Jones Sustainability North America Index and ranked among the top 100 Best Corporate Citizens by Corporate Responsibility Magazine, in addition to being named to the Bloomberg Gender-Equality Index. Our employees work in an environment where the value of "believe in all people" is lived every day, enjoying benefits including but not limited to: 4 weeks' vacation plus holidays, sick leave and 2 paid days to volunteer at the cause of their choice, and a dollar-for-dollar matching gift program; generous parental leave; competitive benefits including medical, dental, vision and life insurance, as well as a 6% 401(k) match, all encompassed in Yum!'s world-famous recognition culture.
Posted 1 day ago
2.0 - 6.0 years
5 - 9 Lacs
Pune
Work from Office
Join us as an MI Reporting Engineer at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences. As part of a team of developers, you will deliver the technology stack, using strong analytical and problem-solving skills to understand the business requirements and deliver quality solutions.
To be successful as an MI Reporting Engineer you should have experience with: Hands-on experience developing complex/medium/easy reports in Tableau, QlikView and SAP BO. Comfort extracting, transforming and loading data from multiple sources such as Teradata and Hive into BI tools. Experience in Snowflake / AWS QuickSight preferable. Creating performance-efficient data models and dashboards. Solid working knowledge of writing SQL queries in Teradata and Hive/Impala. Experience in writing PySpark queries and exposure to AWS Athena. Attention to detail with strong analytical and problem-solving skills. Exceptional communication and interpersonal skills. Comfortable working in a corporate environment; someone who has business acumen and an innovative mindset.
Some other highly valued skills include: High-level understanding of ETL processes. Banking domain experience. A quantitative mindset, with a desire to work in a data-intensive environment. Familiarity with Agile delivery methodologies and project management techniques.
You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking and digital and technology, as well as job-specific technical skills. This role is based out of Pune.
Purpose of the role: To design and develop compelling visualizations that effectively communicate data insights to stakeholders across the bank, influencing decision-making and improving business outcomes.
Accountabilities: Performing exploratory data analysis and data cleansing to prepare data for visualization. Translation of complex data into clear, concise, and visually appealing charts, graphs, maps, and other data storytelling formats. Utilisation of best practices in data visualization principles and design aesthetics to ensure clarity, accuracy, and accessibility. Documentation of visualization methodologies and findings in clear and concise reports. Presentation of data insights and visualizations to stakeholders at all levels, including executives, business users, and data analysts.
Analyst Expectations: To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement. Requires in-depth technical knowledge and experience in the assigned area of expertise, and a thorough understanding of the underlying principles and concepts within that area. They lead and supervise a team, guiding and supporting professional development, allocating work requirements and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others. For an individual contributor, they develop technical expertise in their work area, acting as an advisor where appropriate. They will have an impact on the work of related teams within the area, partner with other functions and business areas, and take responsibility for the end results of a team's operational processing and activities. Escalate breaches of policies/procedures appropriately. Take responsibility for embedding new policies/procedures adopted due to risk mitigation. Advise and influence decision-making within own area of expertise. Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulation and codes of conduct. Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation's products, services and processes within the function. Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function. Make evaluative judgements based on the analysis of factual information, paying attention to detail. Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents. Guide and persuade team members and communicate complex/sensitive information. Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organisation.
All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset to Empower, Challenge and Drive, the operating manual for how we behave.
Posted 1 day ago
8.0 - 13.0 years
2 - 30 Lacs
Pune
Work from Office
Step into the role of a Senior Data Engineer. At Barclays, innovation isn't encouraged, it's expected. As a Senior Data Engineer you will build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, to ensure that all data is accurate, accessible, and secure.
To be a successful Senior Data Engineer, you should have experience with: Hands-on experience working with large-scale data platforms and developing cloud solutions on the AWS data platform, with a proven track record of driving business success. Strong understanding of AWS and distributed computing paradigms, with the ability to design and develop data ingestion programs that process large data sets in batch mode using Glue, Lambda, S3, Redshift, Snowflake and Databricks. Ability to develop data ingestion programs that ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies. Hands-on programming experience in Python and PySpark. Understanding of DevOps pipelines using Jenkins and GitLab; strong in data modelling and data architecture concepts, and well versed in project management tools and Agile methodology. Sound knowledge of data governance principles and tools (Alation/Glue Data Quality, mesh); capable of suggesting solution architectures for diverse technology applications.
Additional relevant skills given below are highly valued: Experience working in the financial services industry and in various settlements and sub-ledger functions like PNS, stock record and settlements, and PNL. Knowledge of BPS, IMPACT and Gloss products from Broadridge, and of creating ML models using Python, Spark and Java.
You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking and digital and technology, as well as job-specific technical skills. This role is based in Pune.
Purpose of the role: To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, to ensure that all data is accurate, accessible, and secure.
Accountabilities: Build and maintenance of data architecture pipelines that enable the transfer and processing of durable, complete and consistent data. Design and implementation of data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures. Development of processing and analysis algorithms fit for the intended data complexity and volumes. Collaboration with data scientists to build and deploy machine learning models.
Vice President Expectations: To contribute or set strategy, drive requirements and make recommendations for change. Plan resources, budgets, and policies; manage and maintain policies/processes; deliver continuous improvements and escalate breaches of policies/procedures. If managing a team, they define jobs and responsibilities, plan for the department's future needs and operations, counsel employees on performance and contribute to employee pay decisions/changes. They may also lead a number of specialists to influence the operations of a department, in alignment with strategic as well as tactical priorities, while balancing short- and long-term goals and ensuring that budgets and schedules meet corporate requirements. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others. For an individual contributor, they will be a subject matter expert within their own discipline and will guide technical direction. They will lead collaborative, multi-year assignments, guide team members through structured assignments, and identify the need for the inclusion of other areas of specialisation to complete assignments. They will train, guide and coach less experienced specialists and provide information affecting long-term profits, organisational risks and strategic decisions. Advise key stakeholders, including functional leadership teams and senior management, on functional and cross-functional areas of impact and alignment. Manage and mitigate risks through assessment, in support of the control and governance agenda. Demonstrate leadership and accountability for managing risk and strengthening controls in relation to the work your team does. Demonstrate comprehensive understanding of the organisation's functions to contribute to achieving the goals of the business. Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and business strategies. Create solutions based on sophisticated analytical thought, comparing and selecting complex alternatives. In-depth analysis with interpretative thinking will be required to define problems and develop innovative solutions. Adopt and include the outcomes of extensive research in problem-solving processes. Seek out, build and maintain trusting relationships and partnerships with internal and external stakeholders in order to accomplish key business objectives, using influencing and negotiating skills to achieve outcomes.
All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset to Empower, Challenge and Drive, the operating manual for how we behave.
Posted 1 day ago
8.0 - 13.0 years
2 - 30 Lacs
Hyderabad
Work from Office
About Sanofi: We are an innovative global healthcare company, driven by one purpose: we chase the miracles of science to improve people's lives. Our team, across some 100 countries, is dedicated to transforming the practice of medicine by working to turn the impossible into the possible. We provide potentially life-changing treatment options and life-saving vaccine protection to millions of people globally, while putting sustainability and social responsibility at the center of our ambitions.
Sanofi has recently embarked on a vast and ambitious digital transformation program. A cornerstone of this roadmap is the acceleration of its data transformation and of the adoption of artificial intelligence (AI) and machine learning (ML) solutions that will accelerate Manufacturing & Supply performance and help bring drugs and vaccines to patients faster, to improve health and save lives.
Who You Are: You are a dynamic Data Engineer interested in challenging the status quo to design and develop globally scalable solutions that are needed by Sanofi's advanced analytics, AI and ML initiatives for the betterment of our global patients and customers. You are a valued influencer and leader who has contributed to making key datasets available to data scientists, analysts, and consumers throughout the enterprise to meet vital business needs. You have a keen eye for improvement opportunities while continuing to fully comply with all data quality, security, and governance standards.
Our vision for digital, data analytics and AI: Join us on our journey in enabling Sanofi's digital transformation through becoming an AI-first organization. This means: AI Factory - Versatile Teams Operating in Cross-Functional Pods: utilizing digital and data resources to develop AI products, bringing data management, AI and product development skills to products, programs and projects to create an agile, fulfilling and meaningful work environment. Leading-Edge Tech Stack: experience building products that will be deployed globally on a leading-edge tech stack. World-Class Mentorship and Training: working with renowned leaders and academics in machine learning to further develop your skillsets. There are multiple vacancies across our digital profiles and the NA region. Further assessments will be completed to determine the specific function and level of hired candidates.
Job Highlights: Propose and establish technical designs to meet business and technical requirements. Develop and maintain data engineering solutions based on requirements and design specifications using appropriate tools and technologies. Create data pipelines / ETL pipelines and optimize performance. Test and validate developed solutions to ensure they meet requirements. Create design and development documentation based on standards for knowledge transfer, training, and maintenance. Work with business and product teams to understand requirements, and translate them into technical needs. Adhere to and promote best practices and standards for code management, automated testing, and deployments. Leverage existing or create new standard data pipelines within Sanofi to bring value through business use cases. Develop automated tests for CI/CD pipelines. Gather and organize large and complex data assets, and perform relevant analysis. Conduct peer reviews for quality, consistency, and rigor for production-level solutions. Actively contribute to the Data Engineering community and define leading practices and frameworks. Communicate results and findings in a clear, structured manner to stakeholders. Remain up to date on the company's standards, industry practices and emerging technologies.
Key Functional Requirements & Qualifications: Experience working with cross-functional teams to solve complex data architecture and engineering problems. Demonstrated ability to learn new data and software engineering technologies in a short amount of time. Good understanding of agile/scrum development processes and concepts. Able to work in a fast-paced, constantly evolving environment and manage multiple priorities. Strong technical analysis and problem-solving skills related to data and technology solutions. Excellent written, verbal, and interpersonal skills, with the ability to communicate ideas, concepts and solutions to peers and leaders. Pragmatic and capable of solving complex issues, with technical intuition and attention to detail. Service-oriented, flexible, and approachable team player. Fluent in English (other languages a plus).
Key Technical Requirements & Qualifications: Bachelor's degree or equivalent in Computer Science, Engineering, or a relevant field. 4 to 5+ years of experience in data engineering, integration, data warehousing, business intelligence, business analytics, or a comparable role with relevant technologies and tools, such as Spark/Scala or Informatica/IICS/dbt. Understanding of data structures and algorithms. Working knowledge of scripting languages (Python, shell scripting). Experience in cloud-based data platforms (Snowflake is a plus). Experience with job scheduling and orchestration (Airflow is a plus). Good knowledge of SQL and relational database technologies/concepts. Experience working with data models and query tuning.
Nice To Haves: Experience working in the life sciences/pharmaceutical industry is a plus. Familiarity with data ingestion through batch, near-real-time, and streaming environments. Familiarity with data warehouse concepts and architectures (data mesh a plus). Familiarity with source code management tools (GitHub a plus).
Pursue Progress. Discover Extraordinary. Better is out there. Better medications, better outcomes, better science. But progress doesn't happen without people, people from different backgrounds, in different locations, doing different roles, all united by one thing: a desire to make miracles happen. So, let's be those people. Watch our ALL IN video and check out our Diversity, Equity and Inclusion actions at sanofi.com!
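The "automated tests for CI/CD pipelines" duty can be illustrated with a small pytest unit test for a transformation step; the function and sample data below are invented for the sketch, not Sanofi code.

```python
# Hypothetical pytest check for a pipeline transform: normalise column names,
# drop records with a missing key, then assert the data contract holds.
import pandas as pd


def standardise(df: pd.DataFrame) -> pd.DataFrame:
    """Toy transform under test: lower-case columns, drop null patient keys."""
    df = df.rename(columns=str.lower)
    return df.dropna(subset=["patient_id"])


def test_standardise_drops_null_keys():
    raw = pd.DataFrame({"PATIENT_ID": ["a", None, "b"], "DOSE": [1, 2, 3]})
    out = standardise(raw)
    assert list(out.columns) == ["patient_id", "dose"]
    assert out["patient_id"].notna().all()
    assert len(out) == 2  # exactly one null-key row removed
```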
Posted 1 day ago
2.0 - 4.0 years
6 - 10 Lacs
Pune
Work from Office
The headlines
Job Title: Data Consultant (Delivery)
Start Date: Mid-July 2025
Location: Hybrid; 2 days a week on-site in our office in Creaticity Mall, Shashtrinagar, Yerawada
Salary: ₹700,000 - ₹2,100,000/annum
A bit about the role: We're looking for passionate Data Consultants to join our Delivery team; a thriving and fast-growing community of some of the industry's best cloud data engineers at all levels, ranging from interns and graduates up to seasoned experts. In this role, you'll combine technical expertise with strong commercial and client-facing skills. You'll get the unique opportunity to work with advanced tools and methodologies, develop innovative solutions, and play an integral part in delivering value to our clients. In a culture that values growth, mentorship, and technical excellence, this is the perfect opportunity for a data engineer looking to make a real impact within an international, industry-leading consultancy.
What you'll be doing: Delivering high-quality data solutions by successfully managing development tasks with minimal guidance. Working with industry-leading technologies such as Snowflake, Matillion, Power BI, and Databricks, with a focus on mastering at least one toolset while expanding your expertise in others. Building trusted relationships with clients, managing expectations, and finding opportunities to add value beyond project scope. Contributing to internal knowledge-sharing, delivering presentations, training sessions, and thought leadership content. Driving business impact by engaging with stakeholders, understanding business challenges, and translating them into data-driven solutions. Leading investigations, client workshops, and demonstrations to showcase technical expertise and problem-solving skills. Balancing multiple priorities effectively, knowing when to escalate issues and when to push forward with solutions independently. Helping shape the Snap Analytics team by mentoring junior consultants and sharing your expertise with others.
What you'll need to succeed: Technical Expertise: strong experience in SQL, data modelling, and ETL processes; exposure to tools like Snowflake, Matillion, Databricks, or Power BI is highly desirable. A Problem-Solving Mindset: the ability to identify multiple solutions, analyse trade-offs, and confidently propose the best approach. Client Engagement Skills: strong communication and stakeholder management abilities, ensuring seamless collaboration with clients at all levels. Analytical Thinking: the capability to evaluate data solutions critically and proactively identify opportunities for optimisation. Ownership & Initiative: be self-motivated and accountable, with a proactive approach to learning and personal development. A 'Team Player' Mentality: willingness to contribute to internal initiatives, support colleagues, and help grow Snap Analytics as a company.
So, what's in it for you? A chance to work with the latest cloud data platforms, shaping enterprise-scale data solutions. We'll support your journey towards technical certifications and leadership roles. A collaborative and supportive culture in which we believe in knowledge-sharing, teamwork, and helping each other succeed. The opportunity to write blogs, contribute to industry discussions, and become a recognised expert in your field. A rewarding compensation package with opportunities for progression.
About Snap Analytics: We're a high-growth data analytics consultancy on a mission to help enterprise businesses unlock the full potential of their data. With offices in the UK, India, and South Africa, we specialise in cutting-edge cloud analytics solutions, transforming complex data challenges into actionable business insights. We partner with some of the biggest brands worldwide to modernise their data platforms, enabling smarter decision-making through Snowflake, Matillion, Databricks, and other cloud technologies. Our approach is customer-first, innovation-driven, and results-focused, delivering impactful solutions with speed and precision. At Snap, we're not just consultants, we're problem-solvers, engineers, and strategists who thrive on tackling complex data challenges. Our culture is built on collaboration, continuous learning, and pushing boundaries, ensuring our people grow just as fast as our business. Join us and be part of a team that's shaping the future of data analytics!
Posted 1 day ago
2.0 - 7.0 years
3 - 7 Lacs
Hyderabad, Pune, Mumbai (All Areas)
Hybrid
Work Experience: 2+ years
Job Title: Snowflake Developer

Responsibilities
A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to ensure effective Design, Development, Validation and Support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate the same into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements
Primary skills: Technology->Data on Cloud-DataStore->Snowflake
Preferred skills: Technology->Data on Cloud-DataStore->Snowflake

Additional Responsibilities
- Knowledge of design principles and fundamentals of architecture
- Understanding of performance engineering
- Knowledge of quality processes and estimation techniques
- Basic understanding of project domain
- Ability to translate functional/non-functional requirements to systems requirements
- Ability to design and code complex programs
- Ability to write test cases and scenarios based on the specifications
- Good understanding of SDLC and agile methodologies
- Awareness of latest technologies and trends
- Logical thinking and problem-solving skills along with an ability to collaborate

Educational Requirements: MCA, MSc, MTech, Bachelor of Engineering, BCA, BSc, BTech
Service Line: Data & Analytics Unit
Location: PAN INDIA
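For candidates wondering what day-to-day Snowflake development in a role like this looks like, here is a minimal, purely illustrative load pattern; the stage, table, and column names are hypothetical and not taken from the posting.

```sql
-- Illustrative Snowflake bulk load; stage, table, and file layout are hypothetical.
-- COPY INTO reads staged files (here, CSVs in a named stage) into a target table.
copy into raw.orders
from @orders_stage/2025/
file_format = (type = 'csv' skip_header = 1 field_optionally_enclosed_by = '"')
on_error = 'continue';  -- load rows that parse cleanly; review rejected records afterwards
```

Support and validation work of the kind the posting describes usually starts from the load results, for example by querying Snowflake's COPY_HISTORY table function to spot files that failed or partially loaded.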
Posted 1 day ago
6.0 - 11.0 years
35 - 50 Lacs
Pune, Gurugram, Delhi / NCR
Hybrid
Role: Snowflake Data Engineer
Mandatory Skills: #Snowflake, #AZURE, #Datafactory, SQL, Python, #DBT / #Databricks
Location (Hybrid): Bangalore, Hyderabad, Chennai, Pune, Gurugram & Noida
Budget: Up to 50 LPA
Notice: Immediate to 30 Days Serving Notice
Experience: 6-11 years

Key Responsibilities:
- Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT (see the sketch after this posting for an illustrative DBT model).
- Build and maintain data integration workflows from various data sources to Snowflake.
- Write efficient and optimized SQL queries for data extraction and transformation.
- Work with stakeholders to understand business requirements and translate them into technical solutions.
- Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
- Maintain and enforce data quality, governance, and documentation standards.
- Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment.

Must-Have Skills:
- Strong experience with Azure Cloud Platform services.
- Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.
- Proficiency in SQL for data analysis and transformation.
- Hands-on experience with Snowflake and SnowSQL for data warehousing.
- Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse.
- Experience working in cloud-based data environments with large-scale datasets.

Good-to-Have Skills:
- Experience with Azure Data Lake, Azure Synapse, or Azure Functions.
- Familiarity with Python or PySpark for custom data transformations.
- Understanding of CI/CD pipelines and DevOps for data workflows.
- Exposure to data governance, metadata management, or data catalog tools.
- Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- 5+ years of experience in data engineering roles using Azure and Snowflake.
- Strong problem-solving, communication, and collaboration skills.
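To make the DBT-on-Snowflake part of this stack concrete, here is a minimal sketch of an incremental DBT model of the kind such pipelines run; the source, model, and column names are all hypothetical.

```sql
-- models/stg_orders.sql: minimal incremental dbt model targeting Snowflake.
-- Source, model, and column names are hypothetical.
{{ config(materialized='incremental', unique_key='order_id') }}

select
    order_id,
    customer_id,
    cast(order_ts as timestamp_ntz) as order_ts,  -- normalise to Snowflake's TIMESTAMP_NTZ
    amount_usd
from {{ source('raw', 'orders') }}

{% if is_incremental() %}
-- on incremental runs, only process rows newer than what is already loaded
where order_ts > (select max(order_ts) from {{ this }})
{% endif %}
```

In a setup like the one described, Azure Data Factory would typically orchestrate ingestion into the raw layer and then trigger the DBT run that materialises models like this one in Snowflake.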
Posted 1 day ago
5.0 - 7.0 years
7 - 9 Lacs
Hyderabad
Work from Office
We are looking for a skilled Snowflake Developer with 5-7 years of experience to join our team at IDESLABS PRIVATE LIMITED. The ideal candidate will have expertise in designing, developing, and implementing data warehousing solutions using Snowflake.

Roles and Responsibilities
- Design and develop scalable data warehousing solutions using Snowflake.
- Collaborate with cross-functional teams to identify business requirements and design data models.
- Develop and maintain complex SQL queries for data extraction and manipulation.
- Implement data validation and quality checks to ensure accuracy and integrity (see the sketch below).
- Optimize database performance and troubleshoot issues.
- Work closely with stakeholders to understand business needs and provide technical guidance.

Job Requirements
- Strong understanding of data modeling and data warehousing concepts.
- Proficiency in writing complex SQL queries and stored procedures.
- Experience with Snowflake development tools and technologies.
- Excellent problem-solving skills and attention to detail.
- Ability to work collaboratively in a team environment.
- Strong communication and interpersonal skills.
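As a sketch of the data-validation responsibility above (with hypothetical table and column names), a single Snowflake query can surface the usual suspects in one pass:

```sql
-- Illustrative data-quality probe; table and column names are hypothetical.
-- One scan reports row counts, duplicate business keys, and missing required values.
select
    count(*)                            as total_rows,
    count(*) - count(distinct order_id) as duplicate_order_ids,
    sum(iff(customer_id is null, 1, 0)) as null_customer_ids,
    min(order_ts)                       as earliest_order_ts,
    max(order_ts)                       as latest_order_ts
from staging.orders;
```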
Posted 1 day ago
Snowflake has become one of the most sought-after skills in the tech industry, with a growing demand for professionals who are proficient in handling data warehousing and analytics using this cloud-based platform. In India, the job market for Snowflake roles is flourishing, offering numerous opportunities for job seekers with the right skill set.
Major hubs such as Bengaluru, Hyderabad, Pune, and Chennai are known for their thriving tech industries and have a high demand for Snowflake professionals.
The average salary range for Snowflake professionals in India varies based on experience level:
- Entry-level: INR 6-8 lakhs per annum
- Mid-level: INR 10-15 lakhs per annum
- Experienced: INR 18-25 lakhs per annum

A typical career path in Snowflake may include roles such as:
- Junior Snowflake Developer
- Snowflake Developer
- Senior Snowflake Developer
- Snowflake Architect
- Snowflake Consultant
- Snowflake Administrator

In addition to expertise in Snowflake, professionals in this field are often expected to have knowledge in:
- SQL (illustrated in the query sketched below)
- Data warehousing concepts
- ETL tools
- Cloud platforms (AWS, Azure, GCP)
- Database management
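As an illustration of the SQL fluency these roles expect, the sketch below shows the kind of window-function query that often comes up in Snowflake screenings; the schema is hypothetical.

```sql
-- Hypothetical schema: per-customer running totals, keeping only each customer's
-- five most recent orders via Snowflake's QUALIFY clause.
select
    customer_id,
    order_ts,
    amount_usd,
    sum(amount_usd) over (
        partition by customer_id
        order by order_ts
    ) as running_total_usd
from marts.orders
qualify row_number() over (partition by customer_id order by order_ts desc) <= 5;
```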
As you explore opportunities in the Snowflake job market in India, remember to showcase your expertise in handling data analytics and warehousing using this powerful platform. Prepare thoroughly for interviews, demonstrate your skills confidently, and keep abreast of the latest developments in Snowflake to stay competitive in the tech industry. Good luck with your job search!