A hundred robots walk into a store. Each one looks at two bananas and walks out. Would you start stocking more bananas just because they visit every day?
The cookie prohibition introduced by the Information Commissioner two years ago had a predictable result: internet users are no safer, and the statistical insights of socially responsible companies have been harmed. Traffic indicators became much less reliable, making it difficult to determine with any certainty which online ad produced the better response.
And now, robots
A few months later, web bots from scammers began to affect the statistics. We became aware of this when several websites we manage experienced an unusual increase in traffic, ranging from 15% to 100%, depending on the type of site and its previous traffic.
Due to the sudden increase, which had no apparent cause (no special advertising campaigns or other activities), we conducted a detailed review of the analytics. Among the referrals, we found websites like:
- 100dollars-seo.com
- best-seo-offer.com
- buttons-for-website.com
- ...
Visits from these websites appear legitimate, but they are actually caused by web bots - small programs that simulate clicks on website links.
The goal? To trick uninformed website owners into visiting these portals and paying for more traffic.
Simple solution? Exclude these sources?
There are some concrete suggestions for limiting this problem. The challenge, however, is that these sources are not constant.
An interesting option would be a filter based on the visitor's country, since most of these bots come from Russia. However, this wouldn't be advisable for a successful international company, as it would also cut off valuable genuine traffic from that country.
The scammers' model will likely prove successful and evolve into new variants. We can expect the list of fraudulent domains to grow endlessly in the coming years, sometimes with only a single letter separating a legitimate domain from a fraudulent one. Maintaining lists of invalid domains will therefore be challenging.
If you want, you can install a solution on your web server that keeps these lists up to date and filters out the offending traffic.
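To make the idea concrete, here is a minimal sketch of what such a server-side solution could look like: a small Python script that strips hits with blocklisted referrers out of a combined-format access log before the log is handed to analytics. This is an illustration under our own assumptions, not a description of any particular product; the file names blocklist.txt and access.log are placeholders.

```python
# Minimal sketch: remove hits whose referrer is on a blocklist you maintain.
# Assumes the Apache/nginx "combined" log format; file names are placeholders.
import re
from urllib.parse import urlparse

def load_blocklist(path="blocklist.txt"):
    """One spam domain per line, e.g. 100dollars-seo.com."""
    with open(path) as f:
        return {line.strip().lower() for line in f if line.strip()}

# In the combined format the referrer is the second quoted field,
# right after the status code and the response size.
REFERRER_RE = re.compile(r'"[^"]*" \d+ \S+ "([^"]*)"')

def is_spam(line, blocklist):
    match = REFERRER_RE.search(line)
    if not match:
        return False
    host = urlparse(match.group(1)).hostname or ""
    # Match the listed domain and any of its subdomains.
    return any(host == d or host.endswith("." + d) for d in blocklist)

if __name__ == "__main__":
    blocklist = load_blocklist()
    with open("access.log") as src, open("access.clean.log", "w") as dst:
        for line in src:
            if not is_spam(line, blocklist):
                dst.write(line)
```

The point is not this exact script but the approach: the blocklist lives in its own file, so keeping it up to date does not require touching the code.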
This phenomenon is growing in complexity; you can read more about it in this analytical article.
You can contact us and we can implement a basic filter in your web analytics. :)
A small (or big) error in collected data
If you ask visitors for cookie consent and your website is also targeted by fraudulent bots, your web analytics is probably quite confused. The breakdown of all visits might look something like this:
- 35% are real visitors who are not in the data (cookies disabled)
- 46% are real visitors who are in the data
- 19% are bots
The number of page views or clicks on a certain link is therefore only an approximation of the actual picture.
Even if we manage to filter out bot traffic, a big problem remains: the unknown share of users without cookies.
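To get a feel for how far off the raw numbers can be, here is a back-of-the-envelope correction in Python. It assumes the shares above describe all visits (real and bot together) and uses a made-up figure of 10,000 reported visits; the numbers are illustrative, not measured.

```python
# Rough correction of a reported visit count, using the shares from the text.
reported_visits = 10_000   # hypothetical number shown by the analytics tool

share_missing  = 0.35  # real visitors without cookies (never recorded)
share_recorded = 0.46  # real visitors with cookies (recorded)
share_bots     = 0.19  # bot visits (recorded)

# Only the recorded shares end up in the report.
recorded_share = share_recorded + share_bots                        # 0.65
bot_visits     = reported_visits * share_bots / recorded_share      # ~2,900
real_recorded  = reported_visits * share_recorded / recorded_share  # ~7,100

# Scale the recorded real visitors back up to include the cookieless ones.
estimated_real = real_recorded * (share_recorded + share_missing) / share_recorded

print(f"bots in the report:      {bot_visits:,.0f}")
print(f"real visitors recorded:  {real_recorded:,.0f}")
print(f"estimated real visitors: {estimated_real:,.0f}")
```

In this scenario the tool reports 10,000 visits, of which roughly 2,900 are bots, while the best estimate of real visitors is closer to 12,500.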
Usefulness of web analytics?
But this does not mean that analytics is useless. With proper settings and monitoring, a good web administrator can:
- reduce the amount of bot traffic in the sample to a level that does not significantly affect the results;
- prepare data reports both with and without bots and compare the results (see the sketch after this list);
- understand the activities of registered users and, above all, correctly interpret the findings.
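As an illustration of the second point, comparing reports with and without bots can be as simple as the following pandas sketch. It assumes the visits have been exported to a CSV with referrer and landing_page columns; the file name, the column names and the short blocklist are placeholders, not the export format of any particular analytics tool.

```python
# Sketch: page-view report with and without known bot referrers, side by side.
import pandas as pd

SPAM_DOMAINS = {"100dollars-seo.com", "best-seo-offer.com", "buttons-for-website.com"}

visits = pd.read_csv("visits.csv")  # one row per recorded visit
is_bot = visits["referrer"].fillna("").str.lower().apply(
    lambda ref: any(domain in ref for domain in SPAM_DOMAINS)
)

report = pd.DataFrame({
    "with bots": visits.groupby("landing_page").size(),
    "without bots": visits[~is_bot].groupby("landing_page").size(),
}).fillna(0).astype(int)

print(report.sort_values("with bots", ascending=False).head(10))
```

If the two columns tell very different stories for a page, that page's traffic deserves a closer look before any decisions are based on it.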
Such results will be relevant and useful. We can still trust web analytics, but precisely because of the importance of accuracy, it is wise to involve experts. So you don't end up buying too many bananas. :)