Thanks to the Analytics team, we now have a bunch of places to track our game’s engagement. To be precise, I know of 3:
- Developer Stats (Average Visit Length Dashboard)
- Engagement Tab under Analytics (Average Session Time Dashboard)
- Performance Tab under Real-Time (Session Time Dashboard)
These dashboards all show completely different values. Dashboards 1 and 2 are fairly consistent with each other: they differ by about 2-3 minutes, but the offset is constant, so at least we can attribute it to a systematic reporting difference.
The newest one, Performance, is by far the wackiest, with values that don’t make any sense to me (and, I assume, to most users).
There are times where “Total” falls between “Computer”, “Tablet” and “Phone”. That would make sense, since “Total” is presumably the average over all sessions. But then there are times where it’s much higher than everything else:
Make it make sense.
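For illustration, assuming “Total” is the session-count-weighted average of the per-device averages (my guess at how it’s computed, not confirmed by any docs), it mathematically cannot exceed the largest device value. All numbers below are made up:

```python
# Hypothetical per-device average session times (minutes) and session counts.
# These figures are invented for illustration only.
devices = {
    "Computer": (12.0, 500),   # (avg minutes, session count)
    "Tablet":   (9.0, 200),
    "Phone":    (7.5, 1300),
}

total_sessions = sum(n for _, n in devices.values())
total_avg = sum(avg * n for avg, n in devices.values()) / total_sessions

# A weighted average is always bounded by the min and max of its inputs,
# so under this interpretation "Total" can never be higher than every device.
assert min(a for a, _ in devices.values()) <= total_avg <= max(a for a, _ in devices.values())
```

If “Total” is computed some other way (e.g. it includes sessions with no device tag), that would explain the discrepancy, but then the dashboard should say so.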
The values are also much higher than in dashboards 1 and 2, even if I pick the 1st percentile. The percentiles in general do not seem to make much sense. I would expect the 99th percentile to be the cutoff for the longest 1% of sessions. So for the 1st percentile, am I getting the value that’s higher than only the shortest 1% of sessions? That shows a very high number (16 mins). It’s impossible that 99% of our sessions are longer than 16 mins when our average (as measured by us) is around 15.
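To spell out my reading of the percentiles on made-up data (the session lengths are invented; the percentile definition is the standard nearest-rank one):

```python
import random

random.seed(0)
# Made-up session lengths in minutes, averaging around 15.
sessions = [random.expovariate(1 / 15) for _ in range(10_000)]

def percentile(data, p):
    """Nearest-rank percentile: the value below which roughly p% of data falls."""
    s = sorted(data)
    k = max(0, min(len(s) - 1, round(p / 100 * len(s)) - 1))
    return s[k]

p1 = percentile(sessions, 1)    # a *short* session: only ~1% are shorter
p99 = percentile(sessions, 99)  # a *long* session: only ~1% are longer
avg = sum(sessions) / len(sessions)

# Under the standard definition, the 1st percentile sits well below the
# average, never above it.
assert p1 < avg < p99
```

If the dashboard’s “1st percentile” really shows 16 minutes against a ~15 minute average, it is either using a non-standard definition or mislabeled.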
The performance dashboard also has no bucketing. If I pick a date range of 30 days, it renders literally every hour of those 30 days. I would expect some aggregation of the data points there. As it stands, my laptop starts heating up extremely fast when I pick a big date range.
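As a sketch of the kind of aggregation I mean (the hourly values below are invented): 30 days of hourly points is 720 data points, which could be bucketed into 30 daily averages before rendering:

```python
from statistics import mean

# Made-up hourly session-time readings for a 30-day range: 720 points.
hourly = [15 + (i % 24) * 0.1 for i in range(30 * 24)]

# Bucket into daily averages: 24 hourly points per bucket.
bucket_size = 24
daily = [mean(hourly[i:i + bucket_size]) for i in range(0, len(hourly), bucket_size)]

assert len(hourly) == 720
assert len(daily) == 30  # 30 points to plot instead of 720
```

Even a simple mean per bucket like this would cut the rendered points by 24x and smooth out the hourly noise.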
So, to conclude this post with useful action items:
- Why are the metrics different in 3 places? Can we make them consistent?
- How does the “Total” in the performance dashboard work?
- Am I interpreting the percentiles in the performance dashboard correctly?
- Can we add some aggregation of data points when big ranges are selected? The data is noisy and doesn’t really make sense without it.
Expected behavior
I would expect all dashboards to show similar values. If not, we should at least get some explanation of how these metrics are measured and why they are so different from each other.