Mar 06, 2017
by Rakhi Chowdhary
Web analytics has been a key component of my role here at envigo. While we work on generating greater return for the dollar, accurate and reliable web analytics remains the key tool for control and optimisation of any campaign.
Accurate web analytics is the result of many things working correctly: a suitable web analytics package (usually Google Analytics), correct configuration, and the use of the right reports with the relevant metrics and dimensions. Each of these components needs to be correct before reliable insights can be generated.
I will use the analogy of a crime novel here.
Also, as in a crime novel, sometimes the data is misleading without anyone even realising it. That can be one of the toughest scenarios to identify, isolate and solve to get back to correct data.
Here are two recent issues that I had to solve:
Just after a site migration for a digital magazine website, we noticed a spike in direct traffic along with a drop in bounce rates.
There was some guarded optimism at the client’s office: the new website seemed to have reduced bounce rates to close to zero (!) and also increased the site’s direct traffic.
I was unimpressed.
Was there something wrong?
While the web design was clearly an improvement, I did not think that the drop in bounce rate was genuine.
I also noted that the homepage bounce rate had stayed flat (even though the homepage had also been redesigned). It was clear that something was up.
Let me walk you through the approach and findings.
I rounded up the usual suspects.
Self-referral exclusion: Self-referral traffic was checked and eliminated as a probable contributor.
HTTP to HTTPS traffic: I found a few pages that existed as both HTTP and HTTPS. Traffic from an HTTPS page to an HTTP page loses its referrer and gets counted as direct. While this was a contributing issue, its impact was minimal.
Spam bots: I rechecked the spam bot list and added some new bots to it. Again, it was a contributing element with minimal impact.
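The HTTPS-to-HTTP case is easy to reproduce. Below is a minimal sketch of the referrer-loss mechanism; `classify_hit` is a toy stand-in for channel attribution, not a Google Analytics API, and all the URLs are made up.

```python
from urllib.parse import urlparse

def referrer_sent(referrer_url, destination_url):
    """Browsers drop the Referer header when navigating from an
    HTTPS page to an HTTP page, so the destination never sees it."""
    if referrer_url is None:
        return None
    ref_scheme = urlparse(referrer_url).scheme
    dest_scheme = urlparse(destination_url).scheme
    if ref_scheme == "https" and dest_scheme == "http":
        return None  # referrer stripped -> the hit looks direct
    return referrer_url

def classify_hit(referrer_url, destination_url):
    """Toy classifier: no referrer means 'direct', else 'referral'."""
    if referrer_sent(referrer_url, destination_url) is None:
        return "direct"
    return "referral"

# An HTTPS referral landing on a leftover HTTP page is misattributed:
print(classify_hit("https://news.example.com/story",
                   "http://magazine.example.com/article"))   # direct
print(classify_hit("https://news.example.com/story",
                   "https://magazine.example.com/article"))  # referral
```

This is why leftover HTTP pages after a migration quietly inflate direct traffic: the referral was real, but the evidence never arrived.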
I took a look at the top 10 landing pages for direct traffic for the period immediately after the site launch.
I considered it unlikely that users were typing individual story page URLs to enter the site. It was clear that the problem lay in the story pages.
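Pulling that list out of raw hit data is straightforward. The sketch below assumes a flattened export of (landing page, channel) rows; the paths and counts are invented for illustration.

```python
from collections import Counter

# Hypothetical rows from an analytics export: (landing_page, channel)
hits = [
    ("/", "direct"),
    ("/story/election-results", "direct"),
    ("/story/election-results", "direct"),
    ("/story/budget-2017", "direct"),
    ("/", "organic"),
    ("/story/budget-2017", "referral"),
]

# Count direct-traffic landings and rank the top 10
direct_landings = Counter(page for page, channel in hits if channel == "direct")
for page, count in direct_landings.most_common(10):
    print(page, count)
```

If deep story URLs dominate this list, as they did here, something other than users typing addresses is generating those "direct" hits.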
I also noticed that the data for another, similar section - slideshows - looked strange: the average time on page had dropped, yet the bounce rate had improved.
Review of data sources
This website had Universal Analytics and Classic Analytics tags running in parallel. When moving from one tag to another, it is advisable to keep a short period where both tags exist so you can study the difference in counting. This was a fortunate coincidence: had the two tags not been there, it might have taken longer to find the problem.
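With both tags reporting, the discrepancy can be quantified page by page. This is a sketch under assumed data: the counts below are invented, and `divergence` is a hypothetical helper, not part of any analytics SDK.

```python
# Hypothetical daily pageview counts from the two parallel properties
classic = {"/story/a": 1200, "/story/b": 950, "/": 3000}
universal = {"/story/a": 2400, "/story/b": 1900, "/": 3050}

def divergence(a, b, threshold=0.10):
    """Flag pages whose counts differ between the two tags by more
    than `threshold` (as a fraction of the first tag's count)."""
    flagged = {}
    for page in a.keys() & b.keys():
        base = max(a[page], 1)  # avoid division by zero
        diff = abs(a[page] - b[page]) / base
        if diff > threshold:
            flagged[page] = round(diff, 2)
    return flagged

# In this made-up data, story pages report roughly double the hits
# under one tag, while the homepage counts agree:
print(sorted(divergence(classic, universal).items()))
# → [('/story/a', 1.0), ('/story/b', 1.0)]
```

Comparing the two counts section by section, rather than site-wide, is what points the finger at specific templates instead of the whole migration.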