
2 Performance BI Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You are a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools. With over 5 years of experience, you will be responsible for designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures.

Your responsibilities will include:
- Building efficient ELT processes using Snowflake, Fivetran, and DBT for data integration and pipeline development.
- Writing complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis.
- Implementing advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using DBT, and designing high-performance data architectures.
- Collaborating with business stakeholders to understand data needs, troubleshooting data-related issues, ensuring high data quality standards, and documenting data processes.

Your qualifications include:
- Expertise in Snowflake for data warehousing and ELT processes.
- Strong proficiency in SQL for relational databases.
- Experience with Informatica PowerCenter for data integration and ETL development.
- Familiarity with Power BI for data visualization, Fivetran for automated ELT pipelines, and Sigma Computing, Tableau, Oracle, and DBT.
- Strong data analysis, requirement gathering, and mapping skills.
- Familiarity with cloud services such as Azure, AWS, or GCP, and workflow management tools like Airflow, Azkaban, or Luigi.
- Proficiency in Python for data processing; knowledge of other languages like Java and Scala is a plus.
- A graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field.

Your skills include data modeling, business intelligence, Python, DBT, performance BI, ETL, DWH, Fivetran, data quality, Snowflake, SQL, and more.

This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and contribute to building scalable, high-performance data solutions.
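The listing calls for implementing SCD Type-2 using DBT, where this is typically handled by snapshots. The core bookkeeping a snapshot automates can be sketched in plain Python (an illustrative sketch, not the employer's implementation; the function name and the `valid_from`/`valid_to` field names are assumptions):

```python
from datetime import date

def scd2_merge(dimension, incoming, key, tracked, today):
    """Slowly Changing Dimension Type-2: when a tracked attribute changes,
    close the current version of the row and open a new one."""
    result = [dict(r) for r in dimension]
    # Index the currently open version of each business key.
    current = {r[key]: r for r in result if r["valid_to"] is None}
    for row in incoming:
        existing = current.get(row[key])
        if existing is None:
            # New key: open its first version.
            result.append({**row, "valid_from": today, "valid_to": None})
        elif any(existing[c] != row[c] for c in tracked):
            # Tracked attribute changed: expire the old version, add a new one.
            existing["valid_to"] = today
            result.append({**row, "valid_from": today, "valid_to": None})
    return result

dim = [{"customer_id": 1, "city": "Chennai",
        "valid_from": date(2024, 1, 1), "valid_to": None}]
new = [{"customer_id": 1, "city": "Bangalore"},
       {"customer_id": 2, "city": "Pune"}]
out = scd2_merge(dim, new, "customer_id", ["city"], date(2024, 6, 1))
# The Chennai row is closed (valid_to set); two open versions remain.
```

In DBT the same effect comes from a snapshot with a `check` or `timestamp` strategy, with the history table maintained for you.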

Posted 2 days ago


5.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You are a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, ETL, and related tools. With a minimum of 5 years of experience in Data Engineering, you have expertise in designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This role offers an exciting opportunity to work with cutting-edge technologies in a collaborative environment and contribute to building scalable, high-performance data solutions.

Your responsibilities will include:
- Developing and maintaining data pipelines using Snowflake, Fivetran, and DBT for efficient ELT processes across various data sources.
- Writing complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis.
- Implementing advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using DBT, and designing high-performance data architectures.
- Collaborating with business stakeholders to understand data needs and translating business requirements into technical solutions.
- Performing root cause analysis on data-related issues, ensuring effective resolution, and maintaining high data quality standards.
- Working closely with cross-functional teams to integrate data solutions and creating clear documentation for data processes and models.

Your qualifications should include:
- Expertise in Snowflake for data warehousing and ELT processes.
- Strong proficiency in SQL for relational databases and writing complex queries.
- Experience with Informatica PowerCenter for data integration and ETL development.
- Proficiency in using Power BI for data visualization and business intelligence reporting.
- Familiarity with Sigma Computing, Tableau, Oracle, DBT, and cloud services like Azure, AWS, or GCP.
- Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
- Proficiency in Python for data processing (knowledge of other languages like Java and Scala is a plus).

Education required for this role is a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field. This position is based in Bangalore, Chennai, Kolkata, or Pune. If you meet the above requirements and are passionate about data engineering and analytics, this is an excellent opportunity to leverage your skills and contribute to impactful data solutions.
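The responsibilities above stress maintaining high data quality standards before data reaches the warehouse. A minimal, hypothetical sketch of such pre-load checks (not this employer's tooling; function and column names are illustrative):

```python
def quality_report(rows, required, unique_key):
    """Basic pre-load data-quality checks: count nulls in required
    columns and detect duplicate values of the business key."""
    # Null/empty counts per required column.
    nulls = {c: sum(1 for r in rows if r.get(c) in (None, "")) for c in required}
    # Duplicate detection on the business key.
    seen, dupes = set(), 0
    for r in rows:
        k = r.get(unique_key)
        if k in seen:
            dupes += 1
        seen.add(k)
    return {"row_count": len(rows), "null_counts": nulls, "duplicate_keys": dupes}

rows = [{"id": 1, "name": "a"}, {"id": 1, "name": None}]
report = quality_report(rows, required=["name"], unique_key="id")
# report: {"row_count": 2, "null_counts": {"name": 1}, "duplicate_keys": 1}
```

In practice the same checks would be expressed as DBT tests (`not_null`, `unique`) or SQL assertions run inside the pipeline rather than ad-hoc Python.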

Posted 4 days ago
