
Posted: 2 days ago | Platform: LinkedIn


Work Mode: On-site
Job Type: Full Time

Job Description

Basic Responsibilities (Must-Haves):

  • 5+ years of experience in dashboard story development, dashboard creation, and data engineering pipelines.
  • Hands-on experience with log analytics, user engagement metrics, and product performance metrics.
  • Ability to identify patterns, trends, and anomalies in log data to generate actionable insights for product enhancements and feature optimization.
  • Collaborate with cross-functional teams to gather business requirements and translate them into functional and technical specifications.
  • Manage and organize large volumes of application log data using Google BigQuery.
  • Design and develop interactive dashboards to visualize key metrics and insights using tools such as Tableau, Power BI, or ThoughtSpot AI.
  • Create intuitive, impactful visualizations to communicate findings to teams, including customer success and leadership.
  • Ensure data integrity, consistency, and accessibility for analytical purposes.
  • Analyse application logs to extract metrics and statistics related to product performance, customer behaviour, and user sentiment.
  • Work closely with product teams to understand log data generated by Python-based applications.
  • Collaborate with stakeholders to define key performance indicators (KPIs) and success metrics.
  • Optimize data pipelines and storage in BigQuery.
  • Strong communication and teamwork skills.
  • Ability to learn quickly and adapt to new technologies.
  • Excellent problem-solving skills.



Preferred Responsibilities (Nice-to-Haves):

  • Knowledge of Generative AI (GenAI) and LLM-based solutions.
  • Experience in designing and developing dashboards using ThoughtSpot AI.
  • Good exposure to Google Cloud Platform (GCP).
  • Data engineering experience with modern data warehouse architectures.



Additional Responsibilities:

  • Participate in the development of proof-of-concepts (POCs) and pilot projects.
  • Ability to articulate ideas and points of view clearly to the team.
  • Take ownership of data analytics and data engineering solutions.



Additional Nice-to-Haves:

  • Experience working with large datasets and distributed data processing tools such as Apache Spark or Hadoop.
  • Familiarity with Agile development methodologies and version control systems like Git.
  • Familiarity with ETL tools such as Informatica or Azure Data Factory.
