How to develop an analytics framework

Frameworks help structure analytics efforts and organise metrics and measurements

Shobha Bhagwat
Bootcamp

--

As we grapple with today’s data explosion, we need a way to structure data analysis for finding insights. An analytics framework is crucial, particularly when we want to automate the tracking of a product’s performance. A framework gives metrics context: it helps identify the key metrics for a business, as well as the various factors that drive those key metrics.

So what are the key metrics for a product? That depends on where the product is in its life cycle. The product team for a new product may want to focus aggressively on user acquisition and app downloads, whereas for an established product, retention and increasing profitability may be the focus. The idea of having a standardised framework for key metrics is useful in all these cases.

To understand how to develop a framework for measuring product performance, consider a product like YouTube: essentially an app where users come to watch videos. For this product, the key metrics tracked are:

  1. Daily Active Users (number of users watching videos on the app daily)
  2. Time spent per day by daily active users

The focus metrics, or key metrics, are the most important, top-level metrics. More granular metrics complement them: any change in the lower-level metrics drives the focus metrics. Individual teams/features can focus on specific granular metrics, thereby driving the overall key metrics through consolidated effort.

In this case, the framework can be built as follows —

Workflow for the Analytics Framework

1. FOCUS METRICS -

Daily Active Users (DAU), in this example, is the total number of users watching a video on the app on a given day. DAU shows the trend of active usage over time. However, relying on DAU alone may be misleading: new users may be onboarding and trying to watch a video, driving DAU up, since watching a single video is a low-friction action. The real usage, or “stickiness”, of the users is measured by how engaged they are on the platform. Hence, we pair DAU with the average time spent on the app per daily active user as our focus metrics. While DAU may be volatile day to day, tracking its rolling 7-day average gives a more general trend. The overall DAU base is segmented into cohorts by user demographics, new/resurrected/existing users, and app/OS versions.
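The rolling 7-day average mentioned above is straightforward to compute. A minimal sketch in plain Python, using made-up daily counts (the function name and data are illustrative, not from any particular analytics library):

```python
# Sketch: smoothing a volatile DAU series with a trailing 7-day average.
# The daily counts below are invented illustration data.

def rolling_avg(series, window=7):
    """Trailing moving average; windows are shorter at the start of the series."""
    out = []
    for i in range(len(series)):
        chunk = series[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

dau = [100, 120, 90, 150, 110, 130, 105, 160, 95]
smoothed = rolling_avg(dau)
print([round(x, 1) for x in smoothed])
```

In practice a time-series library (e.g. a dataframe rolling-window function) would replace this loop, but the logic is the same.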

2. LEVEL 1 METRICS -

Level 1 metrics either directly contribute to the focus metrics or act as a check to ensure that over-enthusiasm in improving the focus metrics doesn’t degrade the overall health of the product. In our case, the Level 1 metrics that contribute to active usage can be organised under Google’s HEART framework as follows –

a) Happiness — Percentage of DAU who like/share a video

This metric indicates how many of the daily users actually like the content and find value in the platform (likes), and how many are ready to refer the content to their friends (shares).

b) Engagement — Time spent per day by active users

This metric is a strong indicator of how the active users are interacting with the app. The time they spend on the app on any given day gives insight into the user experience of the app.

c) Adoption — Percentage of newly onboarded users per day

New users who join the app on a given day, divided by the DAU. Adoption trends over time indicate product growth opportunities. Adoption can also be tracked for new features in the app, as the count of users who use the new feature on any given day.
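The adoption ratio above is a simple set operation over user ids. A sketch with invented user-id sets:

```python
# Sketch: daily adoption as the share of DAU who onboarded that day.
# User ids are illustrative.

def adoption_rate(new_users_today, dau_today):
    """Fraction of today's active users who are brand new."""
    return len(new_users_today & dau_today) / len(dau_today)

dau = {"u1", "u2", "u3", "u4", "u5"}
new = {"u4", "u5"}
print(adoption_rate(new, dau))  # 0.4
```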

d) Retention –

Retention rate indicates how successful the app is in pulling users back to the platform. We can track 7-day retention as the percentage of users who returned to the app, day by day, over the past week. Churned users are the percentage of users who never returned to the platform after, say, 7 days. For churn rate, we need to set an inactivity interval (e.g. 7 days), after which the user is marked as churned.

Stickiness can also be measured as DAU/MAU (where MAU is the count of active users aggregated over a month).
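Both DAU/MAU stickiness and the churn definition above can be computed from a raw activity log of (user, day) events. A minimal sketch on an invented toy log (user ids and day numbers are made up):

```python
# Sketch: DAU/MAU stickiness and inactivity-based churn from a toy
# activity log of (user_id, day) events. All data is illustrative.

from collections import defaultdict

events = [
    ("u1", 1), ("u1", 2), ("u1", 8), ("u2", 1), ("u2", 3),
    ("u3", 1), ("u4", 5), ("u4", 6), ("u4", 7),
]

def stickiness(events, day, month_days):
    """DAU on `day` divided by MAU over `month_days`."""
    dau = {u for u, d in events if d == day}
    mau = {u for u, d in events if d in month_days}
    return len(dau) / len(mau) if mau else 0.0

def churned_users(events, as_of_day, inactivity_window=7):
    """Users whose last activity is more than `inactivity_window` days old."""
    last_seen = defaultdict(int)
    for u, d in events:
        if d <= as_of_day:
            last_seen[u] = max(last_seen[u], d)
    return {u for u, d in last_seen.items() if as_of_day - d > inactivity_window}
```

Here, as of day 10 with a 7-day window, only "u3" (last seen on day 1) counts as churned.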

e) Task Success -

Task success would indicate that users have been able to perform key actions on the app without any issue.

3. LEVEL 2 METRICS –

The Level 2 metrics are the most granular; they drive the L1 metrics and thus the overall key metrics. L2 metrics are L1 metrics further segmented by user age, gender, state, mobile OS, or a specific user action/product feature. In our case, the L2 metrics under each L1 metric are as follows –

a) Percentage of DAU who like/share a video

This can be further segmented into -

  • percentage users who share content
  • percentage users who like content

across genres, user demographics, and new/existing users. Thus we can identify which genres are most popular with which age group and gender. We can also track content virality based on the aggregated count of likes/shares for different content creators. This way we can gauge the impact of content quality on user satisfaction: whether users are finding relevant content on the platform.
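The segmentation above is a group-by over view records. A sketch with an invented schema (field names like "genre" and "liked" are illustrative, not an actual product schema):

```python
# Sketch: like/share rates segmented by genre, from illustrative
# per-view records. Schema and data are invented for this example.

from collections import defaultdict

views = [
    {"user": "u1", "genre": "comedy", "liked": True,  "shared": False},
    {"user": "u2", "genre": "comedy", "liked": True,  "shared": True},
    {"user": "u3", "genre": "tech",   "liked": False, "shared": False},
    {"user": "u4", "genre": "tech",   "liked": True,  "shared": False},
]

def rate_by_genre(views, action):
    """Fraction of views in each genre where `action` ('liked'/'shared') occurred."""
    totals, hits = defaultdict(int), defaultdict(int)
    for v in views:
        totals[v["genre"]] += 1
        hits[v["genre"]] += int(v[action])
    return {g: hits[g] / totals[g] for g in totals}

print(rate_by_genre(views, "liked"))
```

The same function works for any segmentation key (age group, creator, new/existing) by swapping the group-by field.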

b) Time spent per day by active users

The following metrics contribute to the time spent per user per day on the app –

  • Total videos started by the user
  • Total videos which were finished watching by the user/ total videos started (completion rate)
  • Total videos replayed by the user/ total videos started (replay rate)
  • Time spent/session
  • Session length (time spent watching videos plus time spent browsing for content, etc.)
  • Total sessions/day

These can be further grouped by cohorts of gender, age, state, content genre, and content creator, so as to identify the target audience for different genres/creators based on their completion and replay rates. Segmenting by video length gives insight into the target audience for longer vs. shorter videos (e.g. a younger audience may prefer shorter videos). Segmenting by how long users have been on the platform is useful in certain scenarios (e.g. new users may spend less time on the app since they are still exploring for their niche content).
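The per-user decomposition listed above (time/day = sessions/day × time/session, plus completion and replay rates) can be sketched as a simple summary over invented per-user counters:

```python
# Sketch: decomposing a user's daily time spent into sessions/day and
# time/session, plus completion and replay rates. Counters are invented.

user_day = {
    "sessions": 3,
    "session_seconds": [300, 420, 180],  # time spent in each session
    "videos_started": 10,
    "videos_completed": 6,
    "videos_replayed": 2,
}

def engagement_summary(u):
    total = sum(u["session_seconds"])
    return {
        "time_spent_per_day": total,                          # seconds
        "avg_session_length": total / u["sessions"],          # seconds
        "completion_rate": u["videos_completed"] / u["videos_started"],
        "replay_rate": u["videos_replayed"] / u["videos_started"],
    }

print(engagement_summary(user_day))
```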

c) Percentage of newly onboarded users per day

We can categorise new users into -

  • brand new users (first time onboarding)
  • resurrected users (users who returned to the platform after having churned).

We can segment new users by age, gender, and state, which helps identify product growth opportunities in underperforming segments. By segmenting onboarded users across device, OS, and app version, we can spot glitches in the onboarding flow: a drop in the trend for a particular segment may point to technical factors.

d) Retention –

We can track the churn rate across user demographics to identify which user groups find the most value in the platform, and which genres/content creators have the highest retention rate. Tracking retention rate across device, OS, and app version helps identify long-standing technical glitches or bad UX in those segments.

e) Task Success -

Through metrics like crash rate, and total videos started per session vs. total videos completed per session, we can identify whether users are successfully performing the main action (watching videos) and finding value in the app. If the crash rate suddenly goes up for a particular app version/OS across consecutive sessions, it may indicate a release-related technical issue.
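Spotting a release-related spike means computing crash rate per (app version, OS) segment. A sketch over invented session records (the schema is illustrative):

```python
# Sketch: crash rate per (app_version, OS) segment, from illustrative
# session records, to spot a release-related spike.

from collections import defaultdict

sessions = [
    {"version": "2.1", "os": "android", "crashed": False},
    {"version": "2.1", "os": "android", "crashed": False},
    {"version": "2.2", "os": "android", "crashed": True},
    {"version": "2.2", "os": "android", "crashed": True},
    {"version": "2.2", "os": "ios",     "crashed": False},
]

def crash_rate_by_segment(sessions):
    """Fraction of sessions that crashed, per (version, os) pair."""
    totals, crashes = defaultdict(int), defaultdict(int)
    for s in sessions:
        key = (s["version"], s["os"])
        totals[key] += 1
        crashes[key] += int(s["crashed"])
    return {k: crashes[k] / totals[k] for k in totals}

print(crash_rate_by_segment(sessions))
```

In this toy data, version 2.2 on Android stands out immediately against version 2.1, which is exactly the comparison the framework calls for.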

Example: Consider a case where, over two weeks, daily active users have gone up by 20% while time spent per user has gone down by 5%.

1. One scenario could be that a tech influencer wrote about the app, leading to large-scale downloads. These new users drive up the DAU, but since they are new to the platform, they are still finding their relevant content and may not watch for long. Time spent per user is a ratio with DAU in the denominator, so even though DAU increases, time spent per user decreases: the new-user cohort doesn’t show the same stickiness as the “veterans”. In the framework, this can be identified by analysing the trends of the new/existing user cohorts of DAU over the two weeks, and the time spent per session by each cohort.

2. Another scenario: a piece of viral content (e.g. a new web series) may be pulling new users to the platform, driving up DAU, but the new users spend time primarily on the viral content alone. The ratio of time spent to DAU drops because the denominator (DAU) rises faster than the numerator (total time spent). This can be verified by analysing the likes/shares/replays of new users and their time spent per session (and number of sessions) across content genre and content creator cohorts. Sudden spikes help zero in on the source of the virality.

3. One more case may be that the app has launched in a new state/region, driving DAU up. However, due to limited regional content on the platform (regional creators, too, would be new to the app), the time spent by these new users may be low. This can be verified through the new users/region cohort, analysing the number of sessions and time spent per session for users in the newly launched state.

4. In the case of multiple simultaneous events, such as a ban/sanction on competitor apps driving new users to this app, the release of a new app version, and an improvement in the app’s content search functionality, we can separate the effect of each event through the framework. To investigate the effect of the new app version on user experience, we can compare trends in session lengths, number of sessions, time spent, and video completion rate across the new and old versions of the app. In the case of technical glitches (say, video loading issues), the video completion rate, time spent, and session length will go down, whereas the number of sessions may spike, since users may refresh or start new sessions when videos buffer.

In the case of issues in the search ranking functionality (or with content quality), time spent and video completion rate may decrease while browsing time increases, as users keep searching for the content they want.
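Scenario 1 above is pure ratio arithmetic: the blended time-per-user can fall even when existing users are unchanged, because the new-user cohort dilutes the average. A sketch with invented cohort numbers:

```python
# Sketch: how a new-user cohort dilutes the blended "time spent per DAU"
# ratio while existing-user engagement is unchanged. Numbers are invented.

def blended_time_per_user(cohorts):
    """cohorts: {name: (user_count, avg_minutes_per_user)} -> blended average."""
    users = sum(n for n, _ in cohorts.values())
    minutes = sum(n * avg for n, avg in cohorts.values())
    return minutes / users

week1 = {"existing": (1000, 40.0)}
week2 = {"existing": (1000, 40.0), "new": (200, 10.0)}

# DAU rose 20% (1000 -> 1200), yet the blended average fell from 40 to 35
# minutes, even though existing users behave exactly as before.
print(blended_time_per_user(week1), blended_time_per_user(week2))
```

This is why the framework segments the ratio by new/existing cohorts before drawing conclusions: the drop lives entirely in the cohort mix, not in any individual cohort.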

Takeaways

  • One way of structuring the key product metrics is to have a few high-level focus metrics, which are further segmented and driven by more granular metrics.
  • The framework should be simple, and should clearly explain the impact of product features, technical factors, and user-behaviour factors on overall product health.
