Misunderstood Google Analytics metrics – Stop killing useful and engaging pieces of content – Watch this podcast by Andrew Hood, Data & Analytics Specialist
This video podcast gives you an insight into the most misunderstood Google Analytics metrics and explains why you should stop killing useful and engaging pieces of content.
Why not learn about the top analytics misinterpretations and the top questions to ask when analysing your own data today?
Click below to download the Mind Map for this Video Podcast
Speaker 1: (00:07)
I think none of these metrics are wrong, but some of these metrics are very prone to being misinterpreted. So for example, take a metric like the number of visitors to the site: that's quite a good assessment of the volume of different devices that have been on that site. However, when you look at something like time on page, a lot of people aren't aware that time on page is never counted on the last page of the user's journey and will always show zero there. If they're not aware of that technical limitation, that can result in people drawing some very poor conclusions about what content they kill on their site, for example. Or we might look at things like bounce rate. Often there's a sense that a bounce means that people have immediately left a page having arrived; actually, it just means that after 30 minutes they had not interacted further with that site.
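The talk doesn't include code, but the mechanism behind the zero time-on-page quirk can be sketched. This is a simplified illustration, not Google Analytics' actual implementation: analytics tools of this kind derive time on page from the gap between consecutive pageview hits, so the final page of a journey has no following hit to measure against and reports zero.

```python
from datetime import datetime

def time_on_page(pageview_timestamps):
    """Illustrative sketch: time on page derived as the gap between
    consecutive pageview hits. The last page has no following hit,
    so its time on page is reported as zero regardless of how long
    the user actually spent there."""
    times = []
    for i, ts in enumerate(pageview_timestamps):
        if i + 1 < len(pageview_timestamps):
            times.append((pageview_timestamps[i + 1] - ts).total_seconds())
        else:
            times.append(0.0)  # no next hit -> reported as zero
    return times

# A three-page journey: the user may have read the last page for an hour,
# but it still records 0 seconds.
hits = [datetime(2023, 1, 1, 9, 0, 0),
        datetime(2023, 1, 1, 9, 2, 30),
        datetime(2023, 1, 1, 9, 10, 0)]
print(time_on_page(hits))  # → [150.0, 450.0, 0.0]
```

This is why a page that is frequently the end of a journey, such as a long article people actually finish, can look like nobody spends any time on it.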
Speaker 1: (00:59)
So it’s sometimes less an issue of the metrics being bad or good, and more one of making sure that they’re being interpreted correctly, so that people aren’t making bad decisions on the back of a misinterpretation. That’s particularly so when these metrics are taken and then circulated in reports that might go around the business and might be picked up by someone who doesn’t have the background to understand what they do or don’t mean. Take visits: the definition of that metric is not widely understood. And the key thing to understand is that if somebody is inactive on your site for 30 minutes, their visit will end. So it’s a very short-term measurement of response. And when you start to look at things like conversion rate, the proportion of visits that did something, a visit that didn’t convert doesn’t mean that person never did it. It might mean that they stopped to have a cup of tea, or they may have been on another tab doing something else.
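The 30-minute rule described here can also be sketched in code. Again this is a simplified model, not the analytics tool's real sessionization logic: a single user's hits are split into visits whenever there is a 30-minute gap of inactivity, even if they never closed the tab.

```python
from datetime import datetime, timedelta

SESSION_TIMEOUT = timedelta(minutes=30)

def count_visits(hit_timestamps):
    """Simplified sessionization sketch: a gap of 30+ minutes of
    inactivity ends the current visit and starts a new one,
    even though the user may still have the page open."""
    visits = 0
    last_hit = None
    for ts in sorted(hit_timestamps):
        if last_hit is None or ts - last_hit >= SESSION_TIMEOUT:
            visits += 1
        last_hit = ts
    return visits

# One person, one tab: browses at 9:00 and 9:05, has a long tea break,
# comes back at 9:50. That single sitting is counted as two visits,
# and the first "visit" did not convert even if the second did.
hits = [datetime(2023, 1, 1, 9, 0),
        datetime(2023, 1, 1, 9, 5),
        datetime(2023, 1, 1, 9, 50)]
print(count_visits(hits))  # → 2
```

The consequence for conversion rate is exactly the one described above: the timed-out first visit drags the rate down even though the same person completed the task minutes later.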
Speaker 1: (01:53)
So in this modern world, where people look at websites whilst they’re on their mobile phones on trains, we can’t really assume that every task must be completed without stopping for 30 minutes. The reality is we can’t actually measure the act of exiting a website. What we actually do when we count exits is assume somebody has exited: if they view a page and 30 minutes later have done nothing further on our site, we count that as an exit, even though they could still have it open in a tab. I think time on page is, firstly, a very inaccurate measure, because time on page is not counted if that page is the final page of the user’s journey. And secondly, even if we could count time on page accurately, you’d have to ask why that is a measure of success. Do we really want people to be very slow or fast in how they interact with us, or do we actually just want them to achieve key tasks?
Speaker 1: (02:51)
So I think it’s quite a false measure of success, unless you are getting people to watch a video. Top three questions. First question: is that the right metric to be focusing on in the first place? Does it have any relationship to the business at all? Just because something can be measured doesn’t necessarily mean that it’s of any value. Second: how is that metric defined? Are there any limitations in how it’s captured that could lead to a misinterpretation? And then, if it looks like a number has changed, was that change actually statistically significant? So pause: was the change in the number actually meaningful, or are you just measuring randomness?
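The third question, whether a change is meaningful or just randomness, has a standard statistical answer. The podcast doesn't prescribe a method, but one common choice is a two-proportion z-test comparing, say, conversion rates across two periods. The figures below are made up for illustration.

```python
from math import sqrt, erf

def two_proportion_z_test(conversions_a, visits_a, conversions_b, visits_b):
    """Two-proportion z-test: did the conversion rate really change
    between period A and period B, or is the difference within
    the range expected from random variation?
    Returns (z statistic, two-sided p-value)."""
    p_a = conversions_a / visits_a
    p_b = conversions_b / visits_b
    p_pool = (conversions_a + conversions_b) / (visits_a + visits_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visits_a + 1 / visits_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical report: conversion "rose" from 2.0% to 2.3%
# on 1,000 visits in each period. Is that meaningful?
z, p = two_proportion_z_test(20, 1000, 23, 1000)
print(round(z, 2), round(p, 3))  # p well above 0.05: could easily be noise
```

With samples this small, the apparent 0.3-point lift is not statistically significant, which is exactly the "measuring randomness" trap described above; the same lift on far more visits might well be.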