
1601 Snowflake Jobs - Page 47

Set up a Job Alert
JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

5.0 - 8.0 years

30 - 40 Lacs

Pune

Work from Office

We, at Jet2 (the UK's third largest airline and largest tour operator), have set up a state-of-the-art Technology and Innovation Centre in Pune, India. We are recruiting an experienced and passionate Lead Test Engineer to join our growing Data Engineering programme, which is focused on making data assets from across our business available in the cloud for advanced analytics, data science, and automated decision making. These data assets drive mission-critical processes across our commercial and operational teams, contributing to profitable growth, competitive advantage, and exceptional customer satisfaction. This is an exciting opportunity to help us become a leading data-driven organisation through the effective use of data. The successful candidate will play a pivotal role in the success of our Data Engineering & Analytics programme, working closely with colleagues across our business and technical teams. You will test pipelines encompassing a wide variety of data sources and will ensure data is cleansed, well structured, and appropriately transformed to support use cases ranging from self-service analytics through to data warehousing and data science. We are looking for a thought leader in Cloud Data Engineering who will not only enable us to deliver solutions efficiently but also evolve our Data Engineering processes and practices, so that we continue to align with industry best practice and take advantage of new developments in the Data Engineering landscape.

Key Responsibilities: Our Test Lead's priority is to plan, monitor, and direct testing activities and tasks to deliver high-quality, well-modelled, clean, and trustworthy data assets for use across the business, applying our data team's best practices and industry standards. You will:
- Lead test delivery across several multi-disciplinary data delivery teams (on-prem, cloud, data platforms, warehouse, Tableau dashboards, etc.)
- Assess the appropriate level of testing for each piece of work, and support the planning, execution, and documentation of testing carried out
- Measure and improve our data test practices and processes
- Help develop Agile data testing methodologies and automated data testing processes
- Provide direct line management for Test Engineers within the teams you support, ensuring all your direct reports have learning and development plans in place that are continually reviewed
- Lead new initiatives in test automation and test strategy/frameworks as required, focusing on maximising reusability for regression
- Build test plans, test scenarios, and test data to support development projects, project requirements, and design documents
- Develop and oversee onboarding QA training for new hires and advanced training programmes for experienced team members

Technical Skills & Knowledge: The Test Lead will ideally have experience working in a fast-paced Agile delivery environment with a focus on rapid delivery of business value through iterative development and testing approaches. You will also have the following skills:
- Advanced SQL knowledge, with experience using a wide variety of source systems including Microsoft SQL Server
- Prior test leadership and line management experience, or a senior test role with significant mentoring experience
- A testing certification such as ISTQB
- Experience working in an Agile delivery team using automated build, deployment, and testing (CI/CD, DevOps, DataOps)
- Ability to automate tests for data validation, schema verification, data transformation, and data quality checks across different stages of the pipeline
- Ability to write scripts and programs in languages such as Python, Java, or Scala to automate test execution
- Use of advanced dbt features to write generic and singular test cases (see the sketch below)
- Experience developing and implementing best practices across a QA team for data validation, functional testing, and regression testing
- Experience in data testing on one or more of the following technology platforms (cloud data platform experience preferred): Microsoft SQL Server (T-SQL, SSIS, SSRS, SSAS); Google Cloud Platform (GCS, Google Cloud Composer, GKE, Cloud Functions, BigQuery); Snowflake Cloud Data Platform; dbt Data Transformation Framework; Tableau Data Visualisation Platform
- Experience in automated data testing or the creation of automated data testing frameworks would be highly beneficial on any of the above platforms

Relevant Experience: 9+ years' experience in a data-related role, with recent experience in cloud data engineering. Experience of cloud data analytics and data warehousing would be an advantage.

Soft Skills:
- Excellent communication skills, verbal and written
- Strong team and stakeholder management skills; you should be able to build strong relationships with people across a wide variety of teams and backgrounds
- Experience working with people across different geographical regions, particularly the UK and US
- Excellent planning and time management skills to ensure projects are delivered on time and to requirements, with issues escalated and addressed effectively
- Exceptional presentation skills; you will document solution designs for presentation and sign-off and deliver internal presentations to support the continual development of your team

Leadership & Organisation Skills:
- Lead a team of test engineers working across multiple projects (Senior, Mid, Junior, and Graduate level Test Engineers)
- Set individual development goals, monitor and manage performance, provide timely feedback, and take responsibility for the professional development of team members
- Collaborate with the wider data team, including Solution Architects, Specialists, Enterprise Architects, Data Scientists, Data Visualisation Specialists, Analytics Engineers, and Test Engineers

Qualification & Certification: B.E./B.Tech/M.Tech in IT or Computer Science from a reputed institute (preferred), or a master's degree in quantitative subjects, e.g. Mathematics, Statistics, or Economics.
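
As context for the dbt testing skill above, here is a minimal sketch of the two dbt test styles the posting names; the table, column, and test names are illustrative, not from the posting. A singular test is a SELECT that fails when it returns rows, while a generic test is a reusable, parameterised macro:

```sql
-- tests/assert_no_future_bookings.sql
-- Singular test: dbt fails this test if the query returns any rows.
-- stg_bookings and booking_date are hypothetical names.
select booking_id, booking_date
from {{ ref('stg_bookings') }}
where booking_date > current_date()

-- macros/test_positive_values.sql
-- Generic test: reusable against any model/column via a schema.yml entry.
{% test positive_values(model, column_name) %}
select *
from {{ model }}
where {{ column_name }} <= 0
{% endtest %}
```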

Posted 1 month ago

Apply

3.0 - 5.0 years

8 - 15 Lacs

Hyderabad

Work from Office

We are looking for an experienced and results-driven Data Engineer to join our growing Data Engineering team. The ideal candidate will be proficient in building scalable, high-performance data transformation pipelines using Snowflake and dbt, and able to work effectively in a consulting setup. In this role, you will be instrumental in ingesting, transforming, and delivering high-quality data to enable data-driven decision-making across the client's organization.

Key Responsibilities:
1. Design and implement scalable ELT pipelines using dbt on Snowflake, following industry-accepted best practices.
2. Build ingestion pipelines from various sources, including relational databases, APIs, cloud storage, and flat files, into Snowflake.
3. Implement data modelling and transformation logic to support a layered architecture (e.g., staging, intermediate, and mart layers, or medallion architecture) that enables reliable and reusable data assets (see the sketch after this listing).
4. Leverage orchestration tools (e.g., Airflow, dbt Cloud, or Azure Data Factory) to schedule and monitor data workflows.
5. Apply dbt best practices: modular SQL development, testing, documentation, and version control.
6. Perform performance optimizations in dbt/Snowflake through clustering, query profiling, materialization, partitioning, and efficient SQL design.
7. Apply CI/CD and Git-based workflows for version-controlled deployments.
8. Contribute to a growing internal knowledge base of dbt macros, conventions, and testing frameworks.
9. Collaborate with stakeholders such as data analysts, data scientists, and data architects to understand requirements and deliver clean, validated datasets.
10. Write well-documented, maintainable code using Git for version control and CI/CD processes.
11. Participate in Agile ceremonies including sprint planning, stand-ups, and retrospectives.
12. Support consulting engagements through clear documentation, demos, and delivery of client-ready solutions.

Required Qualifications:
- 3 to 5 years of experience in data engineering roles, with 2+ years of hands-on experience in Snowflake and dbt
- Experience building and deploying dbt models in a production environment
- Expert-level SQL and a strong understanding of ELT principles
- Strong understanding of ELT patterns and data modelling (Kimball/dimensional preferred)
- Familiarity with data quality and validation techniques: dbt tests, dbt docs, etc.
- Experience with Git, CI/CD, and deployment workflows in a team setting
- Familiarity with orchestrating workflows using tools like dbt Cloud, Airflow, or Azure Data Factory

Core Competencies:
- Data Engineering and ELT Development: building robust and modular data pipelines using dbt; writing efficient SQL for data transformation and performance tuning in Snowflake; managing environments, sources, and deployment pipelines in dbt.
- Cloud Data Platform Expertise: strong proficiency with Snowflake (warehouse sizing, query profiling, data loading, and performance optimization); experience working with cloud storage (Azure Data Lake, AWS S3, or GCS) for ingestion and external stages.

Technical Toolset:
- Languages & Frameworks: Python for data transformation, notebook development, and automation; strong grasp of SQL for querying and performance tuning.
- Best Practices and Standards: knowledge of modern data architecture concepts, including layered architecture (e.g., staging → intermediate → marts, or medallion architecture); familiarity with data quality, unit testing (dbt tests), and documentation (dbt docs).
Security & Governance:
- Access and Permissions: understanding of access control within Snowflake (RBAC), role hierarchies, and secure data handling; familiarity with data privacy policies (GDPR basics) and encryption at rest/in transit.

Deployment & Monitoring:
- DevOps and Automation: version control using Git and experience with CI/CD practices in a data context; monitoring and logging of pipeline executions, with alerting on failures.

Soft Skills:
- Communication & Collaboration: ability to present solutions and handle client demos/discussions; work closely with onshore and offshore teams of analysts, data scientists, and architects; ability to document pipelines and transformations clearly; basic Agile/Scrum familiarity (working in sprints and logging tasks); comfort with ambiguity, competing priorities, and fast-changing client environments.

Education:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- Certifications such as Snowflake SnowPro or dbt Certified Developer are a plus.
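
As a minimal sketch of the layered (staging → mart) dbt architecture described in point 3 above, with illustrative model, source, and column names: the staging model does light cleanup over the raw source, and the mart model builds a business-ready aggregate on top of it.

```sql
-- models/staging/stg_orders.sql
-- Staging layer: light cleanup and renaming over the raw source
-- (assumes a 'raw' source defined in a sources .yml file).
select
    order_id,
    customer_id,
    cast(order_ts as timestamp) as ordered_at,
    amount
from {{ source('raw', 'orders') }}

-- models/marts/fct_daily_revenue.sql
-- Mart layer: business-ready aggregate, materialized as a table.
{{ config(materialized='table') }}
select
    date_trunc('day', ordered_at) as order_date,
    sum(amount)                   as daily_revenue
from {{ ref('stg_orders') }}
group by 1
```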

Posted 1 month ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Hyderabad

Work from Office

Key Responsibilities:
- Architect and implement modular, test-driven ELT pipelines using dbt on Snowflake.
- Design layered data models (e.g., staging, intermediate, and mart layers, or medallion architecture) aligned with dbt best practices.
- Lead ingestion of structured and semi-structured data from APIs, flat files, cloud storage (Azure Data Lake, AWS S3), and databases into Snowflake.
- Optimize Snowflake for performance and cost: warehouse sizing, clustering, materializations, query profiling, and credit monitoring (see the sketch below).
- Apply advanced dbt capabilities, including macros, packages, custom tests, sources, exposures, and documentation using dbt docs.
- Orchestrate workflows using dbt Cloud, Airflow, or Azure Data Factory, integrated with CI/CD pipelines.
- Define and enforce data governance and compliance practices using Snowflake RBAC, secure data sharing, and encryption strategies.
- Collaborate with analysts, data scientists, architects, and business stakeholders to deliver validated, business-ready data assets.
- Mentor junior engineers, lead architectural/code reviews, and help establish reusable frameworks and standards.
- Engage with clients to gather requirements, present solutions, and manage end-to-end project delivery in a consulting setup.

Required Qualifications: 5 to 8 years of experience in data engineering roles, with 3+ years of hands-on experience working with Snowflake and dbt in production environments.

Technical Skills:
- Cloud Data Warehouse & Transformation Stack: expert-level knowledge of SQL and Snowflake, including performance optimization, storage layers, query profiling, clustering, and cost management; experience in dbt development, covering modular model design, macros, tests, documentation, and version control using Git.
- Orchestration and Integration: proficiency in orchestrating workflows using dbt Cloud, Airflow, or Azure Data Factory; comfortable working with data ingestion from cloud storage (e.g., Azure Data Lake, AWS S3) and APIs.
- Data Modelling and Architecture: dimensional modelling (star/snowflake schemas) and slowly changing dimensions; knowledge of modern data warehousing principles; experience implementing Medallion Architecture (Bronze/Silver/Gold layers); experience working with Parquet, JSON, CSV, or other data formats.
- Programming Languages: Python for data transformation, notebook development, and automation; strong grasp of SQL for querying and performance tuning; Jinja (nice to have) for advanced dbt development.
- Data Engineering & Analytical Skills: ETL/ELT pipeline design and optimization; exposure to AI/ML data pipelines, feature stores, or MLflow for model tracking (good to have); exposure to data quality and validation frameworks.
- Security & Governance: experience implementing data quality checks using dbt tests; data encryption, secure key management, and security best practices for Snowflake and dbt.

Soft Skills & Leadership:
- Ability to thrive in client-facing roles with competing and changing priorities and fast-paced delivery cycles.
- Stakeholder Communication: collaborate with business stakeholders to understand objectives and convert them into actionable data engineering designs.
- Project Ownership: end-to-end delivery including design, implementation, and monitoring.
- Mentorship: guide junior engineers, establish best practices, and build new skills in the team.
- Agile Practices: work in sprints and participate in scrum ceremonies and story estimation.

Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
Certifications such as Snowflake SnowPro Advanced or dbt Certified Developer are a plus.
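
For the performance and cost work named above, here is a minimal Snowflake SQL sketch (table, warehouse, and column names are illustrative): define a clustering key, check clustering health, and profile expensive queries from the account usage views.

```sql
-- Cluster a large fact table and check how well partitions are pruned.
alter table analytics.fct_bookings cluster by (booking_date, region);

select system$clustering_information(
    'analytics.fct_bookings', '(booking_date, region)');

-- Profile the most expensive recent queries on a given warehouse.
select query_id,
       total_elapsed_time / 1000 as elapsed_s,
       bytes_scanned,
       query_text
from snowflake.account_usage.query_history
where warehouse_name = 'TRANSFORM_WH'
order by total_elapsed_time desc
limit 10;
```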

Posted 1 month ago

Apply

6.0 - 11.0 years

20 - 25 Lacs

Noida, Mumbai

Work from Office

Responsibilities:
- Act as the data domain expert for Snowflake in a collaborative environment, demonstrating a deep understanding of data management best practices and patterns.
- Design and implement robust data architectures that meet and support business requirements, leveraging Snowflake platform capabilities.
- Develop and enforce data modeling standards and best practices for Snowflake environments.
- Develop, optimize, and maintain Snowflake data warehouses.
- Leverage Snowflake features such as clustering, materialized views, and semi-structured data processing to enhance data solutions.
- Ensure data architecture solutions meet performance, security, and scalability requirements.
- Stay current with the latest developments, trends, and features in data architecture and Snowflake technologies, continually enhancing our data capabilities.
- Collaborate with cross-functional teams to gather business requirements, translate them into effective data solutions in Snowflake, and provide data-driven insights.
- Provide mentorship and guidance to junior data engineers and architects.
- Troubleshoot and resolve data architecture-related issues effectively.

Skills Required:
- 5+ years of proven experience as a Data Engineer, with 3+ years as a Data Architect.
- Proficiency in Snowflake, with hands-on experience in features such as clustering, materialized views, and semi-structured data processing.
- Experience designing and building manual or auto-ingestion data pipelines using Snowpipe.
- Experience designing and developing automated monitoring processes on Snowflake using a combination of Python, PySpark, and Bash with SnowSQL.
- SnowSQL experience, including developing stored procedures and writing queries to analyze and transform data.
- Working experience with ETL tools like Fivetran, dbt Labs, and MuleSoft.
- Expertise in Snowflake concepts such as setting up resource monitors, RBAC controls, scalable virtual warehouses, SQL performance tuning, zero-copy clone, and Time Travel, and automating them (see the sketch below).
- Must-have expertise in the AWS, Azure, or Salesforce Platform as a Service (PaaS) models and their integration with Snowflake to load/unload data.
- Relevant certifications (e.g., SnowPro Core/Advanced) are a must-have.
- Excellent problem-solving skills and attention to detail; effective communication and collaboration abilities; an exceptional team player.

Educational Qualification Required: Master's degree in Business Management (MBA/PGDM) or Bachelor's degree in Computer Science, Information Technology, or a related field.
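
A minimal sketch of the resource monitor and RBAC setup referenced in the skills list; warehouse, role, database, and user names are illustrative.

```sql
-- Requires ACCOUNTADMIN: warn at 80% of the monthly credit quota
-- and suspend the warehouse when the quota is exhausted.
create resource monitor monthly_quota with
    credit_quota = 100
    frequency = monthly
    start_timestamp = immediately
    triggers on 80 percent do notify
             on 100 percent do suspend;

alter warehouse transform_wh set resource_monitor = monthly_quota;

-- Minimal RBAC: a read-only role scoped to one schema.
create role if not exists analyst_ro;
grant usage on database analytics to role analyst_ro;
grant usage on schema analytics.marts to role analyst_ro;
grant select on all tables in schema analytics.marts to role analyst_ro;
grant role analyst_ro to user example_user;
```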

Posted 1 month ago

Apply

4.0 - 8.0 years

12 - 20 Lacs

Noida, Bengaluru

Hybrid

Job Overview: We are looking for a highly capable and motivated Data Engineer to join our growing data team. The ideal candidate will be responsible for designing and implementing scalable data pipelines, enabling efficient migration of large data workloads into Snowflake, and integrating with various AWS services. This role requires deep knowledge of SQL, cloud data platforms, and a strong understanding of modern data engineering practices.

Key Responsibilities:
- Design and implement robust, scalable, and secure data pipelines for ingesting, transforming, and storing data in Snowflake
- Execute data migration strategies from on-prem or legacy systems (e.g., SQL Server, Oracle, Teradata) to Snowflake
- Integrate Snowflake with AWS components such as S3, Glue, Lambda, and Step Functions
- Automate data ingestion using Snowpipe, Streams, and Tasks (see the sketch below)
- Write clean, efficient, and reusable SQL for transformations and data quality validations
- Monitor and tune Snowflake performance, including warehouse usage and query optimization
- Implement and enforce data governance, access control, and security best practices
- Collaborate with data analysts, architects, and business stakeholders to define data requirements
- Support development of data models (star and snowflake schemas) and metadata documentation

Required Skills & Experience:
- 3+ years of experience in a Data Engineering role
- Strong hands-on experience with Snowflake in production environments
- Proficiency in SQL (complex joins, CTEs, window functions, performance tuning)
- Solid experience with AWS services: S3, Glue, Lambda, IAM, Step Functions
- Proven experience in data migration projects
- Familiarity with ETL/ELT processes and data orchestration tools (e.g., Airflow, dbt, Informatica, Matillion)
- Strong understanding of data warehousing, data modeling, and big data concepts
- Knowledge of version control (Git) and CI/CD pipelines for data workflows

Preferred Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field
- Snowflake SnowPro certification
- AWS Certified Data Analytics or Solutions Architect certification
- Experience with scripting languages (e.g., Python, Shell)
- Exposure to BI/visualization tools (e.g., Tableau, Power BI)
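
A minimal sketch of Snowpipe, Streams, and Tasks working together, as named in the responsibilities. The bucket, storage integration, table, and warehouse names are illustrative, and an existing S3 storage integration is assumed.

```sql
-- Land raw JSON files from S3 into a VARIANT column via Snowpipe.
create table raw.orders_json (payload variant);

create stage raw_stage
    url = 's3://example-bucket/orders/'   -- illustrative bucket
    storage_integration = s3_int;         -- assumes an existing integration

create pipe orders_pipe auto_ingest = true as
    copy into raw.orders_json
    from @raw_stage
    file_format = (type = 'JSON');

-- Track newly loaded rows with a stream; process them on a schedule.
create stream orders_stream on table raw.orders_json;

create task load_orders
    warehouse = transform_wh
    schedule  = '5 minute'
when system$stream_has_data('ORDERS_STREAM')
as
    insert into analytics.orders (order_id, amount)
    select payload:order_id::number,
           payload:amount::number(12,2)
    from orders_stream;

alter task load_orders resume;
```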

Posted 1 month ago

Apply

4.0 - 9.0 years

18 - 27 Lacs

Kolkata, Hyderabad, Bengaluru

Hybrid

Skill set: Snowflake plus AWS or Azure, with Cortex AI and/or Horizon Catalog.

Preferred Qualifications:
- Bachelor's degree in Computer Science, Data Engineering, or a related field.
- Experience in data engineering, with at least 3 years of experience working with Snowflake.
- Proven experience in Snowflake and Cortex AI/Horizon Catalog, focusing on data extraction, chatbot development, and conversational AI.
- Strong proficiency in SQL, Python, and data modeling.
- Experience with data integration tools (e.g., Matillion, Talend, Informatica).
- Knowledge of cloud platforms such as AWS, Azure, or GCP.
- Excellent problem-solving skills, with a focus on data quality and performance optimization.
- Strong communication skills and the ability to work effectively in a cross-functional team.
- Proficiency in using dbt's testing and documentation features to ensure the accuracy and reliability of data transformations.
- Understanding of data lineage and metadata management concepts, and the ability to track and document data transformations using dbt's lineage capabilities.
- Understanding of software engineering best practices and the ability to apply them to dbt development, including version control, code reviews, and automated testing.
- Experience building data ingestion pipelines.
- Experience with Snowflake utilities such as SnowSQL, Snowpipe, bulk copy, Snowpark, tasks, streams, Time Travel, cloning, the query optimizer, Metadata Manager, data sharing, stored procedures and UDFs, and Snowsight.
- Good experience implementing CDC or SCD Type 2 (see the sketch below).
- Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
- Good to have: experience with repository tools like GitHub, GitLab, or Azure Repos.
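
A minimal sketch of SCD Type 2 maintenance plus the Time Travel and zero-copy cloning utilities listed above. Table and column names are illustrative; a production implementation would also handle deletes and late-arriving data.

```sql
-- Close out changed rows in the dimension...
merge into dim_customer d
using stg_customer s
    on d.customer_id = s.customer_id
   and d.is_current = true
when matched and d.email <> s.email then update
    set is_current = false,
        valid_to   = current_timestamp();

-- ...then insert the new current versions (changed or brand-new customers).
insert into dim_customer (customer_id, email, valid_from, valid_to, is_current)
select customer_id, email, current_timestamp(), null, true
from stg_customer s
where not exists (
    select 1 from dim_customer d
    where d.customer_id = s.customer_id and d.is_current = true
);

-- Time Travel and zero-copy cloning for recovery and safe experimentation.
select * from dim_customer at (offset => -3600);   -- state one hour ago
create table dim_customer_dev clone dim_customer;  -- instant, storage-free copy
```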

Posted 1 month ago

Apply

7.0 - 12.0 years

18 - 20 Lacs

Hyderabad

Work from Office

We are hiring a Senior Data Management Specialist (Level 3) for a US-based IT company in Hyderabad. Candidates with experience in Data Management and Snowflake can apply.

Job Title: Senior Data Management Specialist - Level 3
Location: Hyderabad
Experience: 7+ Years
CTC: 18 LPA - 20 LPA
Working shift: Day shift

Description: We are looking for an experienced and highly skilled Data Management Specialist (Level 3) to contribute to enterprise-level data solutions, with an emphasis on cloud data platforms and modern data engineering tools. The ideal candidate will possess hands-on expertise with Snowflake, combined with a solid foundation in data integration, modeling, and cloud-based database technologies. This role is a key part of a high-impact data team dedicated to ensuring the quality, availability, and governance of enterprise data assets. As a Level 3 specialist, the individual will be expected to lead and execute complex data management tasks while collaborating closely with data architects, analysts, and business stakeholders.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and integrations using Snowflake and other cloud data technologies
- Handle structured and unstructured data to support analytics, reporting, and operational workloads
- Develop and optimize complex SQL queries and data transformation logic
- Collaborate with data stewards and governance teams to uphold data quality, consistency, and compliance
- Perform data profiling, cleansing, and validation across multiple source systems
- Support ETL/ELT development and data migration initiatives using tools like Informatica, Talend, or dbt
- Design and maintain data models, including star and snowflake schemas (see the sketch below)
- Ensure performance tuning, monitoring, and troubleshooting of Snowflake environments
- Document data processes, data lineage, and metadata within the data governance framework
- Act as a technical SME, offering guidance and support to junior team members

Required Skills & Qualifications:
- Minimum 5 years of experience in data engineering, data management, or similar roles
- Strong hands-on experience with Snowflake (development, administration, performance optimization)
- Proficiency in SQL, data modeling, and cloud-native data architectures
- Experience working on cloud platforms such as AWS, Azure, or Google Cloud (with Snowflake)
- Familiarity with ETL tools like Informatica, Talend, or dbt
- Solid understanding of data governance, metadata management, and data quality best practices
- Experience with Python or shell scripting for automation and data operations
- Strong analytical and problem-solving abilities
- Excellent communication and documentation skills

For further assistance, contact/WhatsApp 9354909512 or write to pankhuri@gist.org.in
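
A minimal star schema sketch of the kind referenced in the responsibilities; table and column names are illustrative, and note that Snowflake records but does not enforce primary and foreign key constraints.

```sql
-- One fact table keyed to two dimensions.
create table dim_date (
    date_key   integer primary key,   -- e.g. 20240115
    full_date  date,
    month_name varchar,
    year_num   integer
);

create table dim_product (
    product_key  integer primary key,
    product_name varchar,
    category     varchar
);

create table fct_sales (
    sale_id     integer,
    date_key    integer references dim_date (date_key),
    product_key integer references dim_product (product_key),
    quantity    integer,
    net_amount  number(12, 2)
);
```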

Posted 1 month ago

Apply

5.0 - 8.0 years

17 - 20 Lacs

Pune

Remote

At Codvo, software and people transformations go together. We are a global empathy-led technology services company with a core DNA of product innovation and mature software engineering. We uphold the values of Respect, Fairness, Growth, Agility, and Inclusiveness in everything we do.

About the Role: We are looking for a Data & BI Solution Architect to lead data analytics initiatives in the retail domain. The candidate should be skilled in data modeling, ETL, visualization, and big data technologies.

Responsibilities:
- Architect end-to-end data and BI solutions for retail analytics
- Define data governance, security, and compliance frameworks
- Work with stakeholders to design dashboards and reports for business insights
- Implement data pipelines and integrate with cloud platforms

Skills Required:
- Proficiency in SQL, Python, and Spark
- Experience with ETL tools (Informatica, Talend, AWS Glue)
- Knowledge of Power BI, Tableau, and Looker
- Hands-on experience with cloud data platforms (Snowflake, Redshift, BigQuery)

Posted 1 month ago

Apply

3.0 - 5.0 years

15 - 22 Lacs

Gurugram, Bengaluru

Work from Office

Exciting opportunity for a Senior Data Engineer to join a leading analytics-driven environment. You will be working on data warehousing, visualizations, and collaborative requirement gathering to deliver impactful business insights.

Location: Gurgaon/Bangalore
Shift Timing: 12:00 PM to 9:30 PM

Your Future Employer: A high-growth organization known for delivering cutting-edge analytics and data engineering solutions. A people-first environment focused on innovation, collaboration, and continuous learning.

Responsibilities:
- Building and refining data pipelines, transformations, and curated views
- Cleansing data to enable full analytics and reporting capabilities
- Collaborating with cross-functional teams to gather and document data requirements
- Developing dashboards and reports using Tableau or Sigma
- Supporting sprint-based delivery with strong stakeholder interaction
- Working with ERP data analytics and financial data sets

Requirements:
- Bachelor's degree in Computer Science, Information Systems, or a related field
- 2-5 years of experience as a Data Engineer (SQL, Oracle)
- Hands-on experience with Snowflake, dbt, SQL, and stored procedures
- Experience with visualization tools like Tableau or Sigma
- Proficiency in Agile methodology and tools like JIRA and Confluence
- Excellent communication, documentation, and client interaction skills

What's in it for you:
- Competitive compensation with performance-based rewards
- Opportunity to work on advanced data platforms and visualization tools
- Exposure to global stakeholders and cutting-edge analytics use cases
- Supportive, inclusive, and growth-focused work culture

Posted 1 month ago

Apply

5.0 - 9.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Python (Programming Language), AWS Glue, AWS Lambda Administration
A minimum of 5 years of experience is required.
Educational Qualification: Graduate

Summary: As a Snowflake Data Warehouse Architect, you will be responsible for leading the implementation of Infrastructure Services projects, leveraging our global delivery capability. Your typical day will involve working with Snowflake Data Warehouse, AWS Glue, AWS Lambda Administration, and the Python programming language.

Roles & Responsibilities:
- Lead the design and implementation of Snowflake Data Warehouse solutions for Infrastructure Services projects.
- Collaborate with cross-functional teams to ensure successful delivery of projects, leveraging AWS Glue and AWS Lambda Administration.
- Provide technical guidance and mentorship to junior team members.
- Stay updated with the latest advancements in Snowflake Data Warehouse and related technologies, integrating innovative approaches for sustained competitive advantage.

Professional & Technical Skills:
- Must-have: strong experience in Snowflake Data Warehouse.
- Good-to-have: proficiency in the Python programming language, AWS Glue, and AWS Lambda Administration.
- Experience leading the design and implementation of Snowflake Data Warehouse solutions.
- Strong understanding of data architecture principles and best practices.
- Experience in data modeling, data integration, and data warehousing.
- Experience in performance tuning and optimization of Snowflake Data Warehouse solutions.

Additional Information: The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse, a strong educational background in computer science, information technology, or a related field, and a proven track record of delivering impactful data-driven solutions. This position is based at our Bengaluru office.

Qualifications: Graduate

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Pune

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: NA
A minimum of 5 years of experience is required.
Educational Qualification: 15 years or more of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Google BigQuery. Your typical day will involve collaborating with cross-functional teams, analyzing business requirements, and developing scalable solutions to meet the needs of our clients.

Roles & Responsibilities:
- Design, build, and configure applications to meet business process and application requirements using Google BigQuery.
- Collaborate with cross-functional teams to analyze business requirements and develop scalable solutions to meet the needs of our clients.
- Develop and maintain technical documentation, including design documents, test plans, and user manuals.
- Ensure the quality of deliverables by conducting thorough testing and debugging of applications.

Professional & Technical Skills:
- Must-have: proficiency in Google BigQuery.
- Good-to-have: experience with other cloud-based data warehousing solutions such as Amazon Redshift or Snowflake.
- Strong understanding of SQL and database design principles.
- Experience with ETL tools and processes.
- Experience with programming languages such as Python or Java.

Additional Information: The candidate should have a minimum of 5 years of experience in Google BigQuery, a strong educational background in computer science or a related field, and a proven track record of delivering impactful data-driven solutions. This position is based at our Bengaluru office.

Qualifications: 15 years or more of full-time education

Posted 1 month ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Karnataka

Work from Office

Develops and manages Oracle data solutions, integrating warehouse management, OBIEE, and ODI for business intelligence.

Posted 1 month ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Karnataka

Work from Office

Focus on designing, developing, and maintaining Snowflake data environments. Responsible for data modeling, ETL pipelines, and query optimization to ensure efficient and secure data processing.

Posted 1 month ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Pune

Work from Office

The Snowflake role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Snowflake domain.

Posted 1 month ago

Apply

6.0 - 11.0 years

15 - 30 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

5+ years of experience in IICS, ETL methodology, Informatica Cloud, and PowerCenter. Hands-on with Informatica Intelligent Cloud Services. Strong DB skills (Oracle, SQL). Business analysis, data modeling, and complex SQL. Nice to have: Snowflake, AI/ML.

Posted 1 month ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Bengaluru

Work from Office

The Snowflake role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Snowflake domain.

Posted 1 month ago

Apply

2.0 - 7.0 years

5 - 15 Lacs

Bengaluru

Work from Office

We are looking for a highly skilled and forward-thinking AI Engineer with deep expertise in Snowflake's Cortex AI, Maya from Mendix, and strong working knowledge of large language models (LLMs). The ideal candidate has experience developing intelligent AI agents, integrating low-code AI capabilities, and building traditional machine learning solutions. This role offers the opportunity to work at the cutting edge of AI product development and enterprise AI integration.

Key Responsibilities:
- Design, develop, and deploy AI agents using both LLMs and traditional ML approaches
- Leverage Cortex AI in Snowflake to build and integrate scalable AI/ML models into data pipelines and applications (see the sketch below)
- Utilize Maya (Mendix AI) to embed AI features into low-code enterprise applications
- Evaluate and integrate major LLMs (e.g., OpenAI GPT, Claude, Google Gemini, Mistral) for use cases including text summarization, classification, code generation, and chat interfaces
- Fine-tune and customize foundation models as needed for enterprise tasks
- Work closely with data engineering and software teams to operationalize AI models in production
- Monitor model performance and implement improvements and retraining strategies
- Ensure AI solutions meet security, compliance, and ethical standards
- Stay current with the latest advancements in the AI, ML, and LLM ecosystems

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, Artificial Intelligence, or a related field
- 2+ years of experience in AI/ML development, with hands-on exposure to deploying production-grade models
- Hands-on experience with Cortex AI in Snowflake, including model training and inference
- Proficiency in Maya from Mendix or similar AI-assisted low-code platforms
- Strong experience with Python, SQL, and AI/ML frameworks (e.g., TensorFlow, PyTorch, scikit-learn)
- Experience working with and integrating APIs of LLM providers like OpenAI, Hugging Face, or Cohere
- Proven ability to design and implement AI agents or copilots using tools like LangChain, Semantic Kernel, or RAG pipelines
- Familiarity with MLOps principles, CI/CD pipelines, and cloud-based ML (AWS, Azure, or GCP)

Preferred Qualifications:
- Experience in data-centric AI and vector search (e.g., Pinecone, FAISS, Weaviate)
- Experience in prompt engineering and LLM evaluation
- Knowledge of security and governance best practices for enterprise AI
- Contributions to AI open-source projects or publications in the field
- Certifications in Snowflake, Mendix, or cloud-based AI platforms
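
For orientation on Cortex AI in Snowflake: the LLM functions are invoked directly from SQL. A minimal sketch follows; the table and column names are illustrative, and the model name is just one of several Cortex supports, so treat it as an assumption rather than a fixed choice.

```sql
-- Free-form completion over a hypothetical support-ticket table.
select snowflake.cortex.complete(
           'mistral-large',   -- illustrative model choice
           'Summarize this support ticket in one sentence: ' || ticket_text
       ) as summary
from support.tickets
limit 10;

-- Task-specific function: sentiment scoring between -1 and 1.
select snowflake.cortex.sentiment(review_text) as sentiment_score
from product.reviews
limit 10;
```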

Posted 1 month ago

Apply

5.0 - 10.0 years

12 - 22 Lacs

Hyderabad, Delhi / NCR

Hybrid

- 8+ years of experience in data engineering or a related field
- Strong expertise in Snowflake, including schema design, performance tuning, and security
- Proficiency in Python for data manipulation and automation
- Solid understanding of data modeling concepts (star/snowflake schema, normalization, etc.)
- Experience with dbt for data transformation and documentation
- Hands-on experience with ETL/ELT tools and orchestration frameworks (e.g., Airflow, Prefect)
- Strong SQL skills and experience with large-scale data sets
- Familiarity with cloud platforms (AWS, Azure, or GCP) and data services

Posted 1 month ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Mumbai

Work from Office

The Snowflake role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Snowflake domain.

Posted 1 month ago

Apply

4.0 - 9.0 years

20 - 25 Lacs

Hyderabad

Remote

Data Analyst
Experience: 4 - 10 years
Salary: up to USD 3,000/month
Preferred Notice Period: within 30 days
Shift: 4:30 PM to 1:30 AM IST
Opportunity Type: Remote
Placement Type: Contractual
Contract Duration: Full-time, 3 months
(Note: This is a requirement for one of Uplers' clients.)

Must-have skills: SQL, data analysis, Snowflake, a BI tool (e.g., Looker), and experience with product analytics tools (e.g., Google Analytics)
Good-to-have skills: AI/LLMs, Zendesk

Oyster (one of Uplers' clients) is looking for a Data Analyst who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, then we want to hear from you.

Role Overview
Data Analyst - Contract Hire
Location: Remote
Duration: 3-month contract, month to month, 4-5 days per week

About Us: We're a data-driven organisation committed to turning raw information into actionable insights. Our analytics team partners with stakeholders across the business to inform strategy, optimise operations, and unlock new growth opportunities.

The Role:
- Analysis & Reporting: perform exploratory and ad-hoc analyses to uncover trends, outliers, and opportunities; design, build, and maintain dashboards and scheduled reports in our BI platform.
- Stakeholder Engagement: gather requirements, present findings, and translate data insights into clear, actionable recommendations; collaborate with product, revenue, and operations teams to prioritise analytics work; upskill and support stakeholders on analytical tools and data literacy; work closely with Data Engineering teams for project support.
- Presentation: deliver clear, compelling presentations of your analyses to both technical and non-technical audiences.

Experience:
- 2+ years of experience in a data analysis role (or similar), ideally working with product teams
- Strong SQL skills for querying and transforming large datasets
- Hands-on experience with a BI tool (Looker, Power BI, Tableau, Qlik, etc.)
- Experience with product analytics (Google Analytics, Pendo, Amplitude, etc.)
- Excellent presentation skills: able to prepare and deliver concise, effective reports and slide decks

Education & Certifications:
- Degree or diploma in Data Science, Statistics, Computer Science, or a related field (preferred)
- Looker LookML certification (nice to have)
- Snowflake certifications (nice to have)

Nice-to-Have / Advantages:
- Experience supporting Snowflake Cortex or similar AI-driven data transformations
- Working with APIs to ingest or expose data
- Hands-on Python scripting to automate data-prep steps
- Familiarity with AI/LLMs and embedding-oriented data pipelines
- Experience working with Zendesk data

Why You'll Love Working Here:
- Impact: your dashboards and analyses will directly influence strategic decisions
- Collaboration: work alongside data engineers, data scientists, and cross-functional teams
- Opportunity to develop advanced analytics and ML/AI skills

How to apply for this opportunity (easy 3-step process):
1. Click on Apply and register or log in on our portal
2. Upload an updated resume and complete the screening form
3. Increase your chances of getting shortlisted and meet the client for the interview!

About Our Client: Our mission is to create a more equal world by making it possible for companies everywhere to hire people anywhere. We believe it should be easy for any company to hire any person, no matter where they are located in the world.
About Uplers: Our goal is to make hiring and getting hired reliable, simple, and fast. Our role is to help all our talents find and apply for relevant product and engineering job opportunities and progress in their careers. (Note: there are many more opportunities on the portal apart from this one.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 1 month ago

Apply

7.0 - 12.0 years

18 - 20 Lacs

Hyderabad

Work from Office

We are hiring a Senior Business Intelligence Analyst (Level 3) for a US-based IT company in Hyderabad. Candidates with 7+ years of experience as a Business Intelligence Analyst can apply.

Job Title: Senior Business Intelligence Analyst - Level 3
Location: Hyderabad
Experience: 7+ Years
CTC: up to 20 LPA
Working shift: Day shift

Job Description: We are looking for a seasoned and analytical Senior Business Intelligence (BI) Analyst to join our data and analytics team. In this Level 3 role, you will play a vital part in turning complex data into meaningful insights that support strategic business decisions. We are seeking a candidate with deep knowledge of WebFocus, solid BI development expertise, and a strong background in the financial services industry. The selected candidate will be responsible for designing, developing, and delivering BI solutions using tools like WebFocus, Tableau, and Alteryx, while ensuring data quality, performance, and usability.

Key Responsibilities:
- Lead the design, development, and enhancement of reports, dashboards, and ad hoc analyses using WebFocus
- Serve as a subject matter expert on WebFocus and enterprise reporting practices to ensure performance and standardization
- Partner with business stakeholders to gather reporting requirements and translate them into effective BI solutions
- Build and maintain interactive dashboards using Tableau and Power BI, and streamline workflows using Alteryx
- Develop complex SQL queries to extract, validate, and transform data from multiple sources (see the sketch below)
- Conduct data analysis to identify trends, insights, and opportunities for business growth
- Uphold data governance practices, ensure data accuracy, and maintain proper documentation across BI platforms
- Mentor junior team members and contribute to continuous improvement of BI processes and standards
- Support regulatory and compliance reporting within the banking and financial domain

Required Skills & Qualifications:
- 5+ years of experience in Business Intelligence, Data Analytics, or related roles
- Proven hands-on expertise with WebFocus (InfoAssist, App Studio, BI Portal)
- Strong understanding of financial services data, KPIs, and reporting methodologies
- Proficiency in Tableau, Alteryx, and SQL
- Experience with Power BI is a plus
- Excellent communication skills, with the ability to collaborate effectively with both technical and non-technical teams
- Strong analytical mindset, attention to detail, and the ability to present data-driven narratives
- Experience working in Agile or Scrum environments

Preferred Qualifications:
- Familiarity with data governance, metadata management, and data cataloging tools
- Experience with cloud-based data platforms (AWS, Azure, Snowflake)
- Knowledge of compliance and regulatory reporting in the financial sector

For further assistance, contact/WhatsApp 9354909521 or write to priyanshi@gist.org.in
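
As an illustration of the complex SQL expected here, a minimal sketch of a month-over-month revenue extract using a window function (table and column names are illustrative): the kind of validated dataset that would feed a WebFocus or Tableau dashboard.

```sql
-- Month-over-month revenue per branch; lag() compares each month
-- with the previous one within the same branch.
select
    branch_id,
    date_trunc('month', txn_date) as month_start,
    sum(amount)                   as revenue,
    lag(sum(amount)) over (
        partition by branch_id
        order by date_trunc('month', txn_date)
    )                             as prev_month_revenue
from transactions
group by branch_id, date_trunc('month', txn_date)
order by branch_id, month_start;
```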

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Chennai

Work from Office

The Digital: Python / Digital: Snowflake role involves working with the relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Digital: Python / Digital: Snowflake domain.

Posted 1 month ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Bengaluru

Work from Office

The IBM InfoSphere DataStage / Digital: Snowflake role involves working with the relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the IBM InfoSphere DataStage / Digital: Snowflake domain.

Posted 1 month ago

Apply

3.0 - 7.0 years

14 - 20 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Work from Office

Role: ETL Testing
Notice Period: Immediate joiners
Work Mode: Remote
Interested candidates can share their CV with Devika_P@trigent.com

Role & Responsibilities:
- Develop and execute comprehensive test plans and test cases for data solutions, including data pipelines, ETL processes, and data warehouses.
- Perform data validation and verification to ensure data accuracy, completeness, and consistency (see the sketch below).
- Identify, document, and track defects and issues, and work with development teams to resolve them.
- Collaborate with data engineers, data scientists, and other stakeholders to understand data requirements and ensure that testing covers all necessary scenarios.
- Automate data testing processes using appropriate tools and frameworks.
- Conduct performance testing to ensure data solutions can handle expected workloads.
- Participate in code reviews and provide feedback on data quality and testing practices.
- Continuously improve testing processes and methodologies to enhance the efficiency and effectiveness of data testing.

Requirements and Experience:
- Proven experience in data testing and quality engineering
- Strong understanding of data engineering practices, including ETL processes, data pipelines, and data warehousing
- Knowledge of SSIS and SSAS
- Proficiency in SQL and experience with database management systems (e.g., MS SQL Server)
- Experience with data testing tools and frameworks (e.g., pytest, dbt)
- Familiarity with cloud data platforms (e.g., Snowflake, Azure Data Factory)
- Strong analytical and problem-solving skills
- Excellent communication and collaboration skills
- Ability to work independently and as part of a team
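
A minimal sketch of the data validation queries such testing typically automates (database, schema, and column names are illustrative): row-count reconciliation between source and target, null checks on mandatory fields, and duplicate-key detection.

```sql
-- Row-count reconciliation between source and target.
select
    (select count(*) from src_db.sales.orders) as source_rows,
    (select count(*) from dwh.staging.orders)  as target_rows;

-- Nulls in a mandatory field.
select count(*) as null_customer_ids
from dwh.staging.orders
where customer_id is null;

-- Duplicate business keys.
select order_id, count(*) as dup_count
from dwh.staging.orders
group by order_id
having count(*) > 1;
```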

Posted 1 month ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Mumbai

Work from Office

The Digital: Snowflake role involves working with the relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Digital: Snowflake domain.

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
