Google Analytics is an amazing tool. Roughly half of all websites in the world use Google Analytics.
Among its many uses, Google Analytics is designed to track every person who visits your website and to tell its owner which content is popular and viewed on a regular basis. Beyond the bare-bones basics, Google Analytics tracks data about individual visitors, such as their geographic location, as well as behavioral data such as page views and exit pages.
However, even the most seasoned Google Analytics user sees weird or abnormal data in their reports from time to time. For example, a blogger might see a huge spike in visitors from Russia, but each visitor only hits the homepage and spends very little (if any) time on the page. Unfortunately, that blogger's popularity in Russia probably isn't growing at a rapid rate. They are experiencing the unfortunate effects of bots and spiders.
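The pattern described above (lots of sessions, near-zero time on page, almost every visit a bounce) can be flagged programmatically in exported report data. Here is a minimal sketch; the column names ("sessions", "avg_session_seconds", "bounce_rate") and the thresholds are hypothetical, so adapt them to whatever your actual export looks like.

```python
# Hedged sketch: flag likely bot traffic in exported analytics rows.
# Column names and thresholds below are illustrative assumptions,
# not part of any official Google Analytics export format.

def looks_like_bot_traffic(row):
    """Heuristic: many sessions, near-zero time on site, ~100% bounce."""
    return (
        row["sessions"] >= 100
        and row["avg_session_seconds"] < 2
        and row["bounce_rate"] > 0.95
    )

rows = [
    {"source": "Russia / direct", "sessions": 500,
     "avg_session_seconds": 1, "bounce_rate": 0.99},
    {"source": "organic search", "sessions": 300,
     "avg_session_seconds": 90, "bounce_rate": 0.40},
]

suspicious = [r["source"] for r in rows if looks_like_bot_traffic(r)]
print(suspicious)  # -> ['Russia / direct']
```

A heuristic like this only helps you spot suspicious segments after the fact; the built-in exclusion setting discussed below stops the data from being recorded in the first place.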
So, what exactly are bots and spiders?
Google loves to be in the know. In fact, to make sure it is constantly up to date and has every website ever created categorized and listed, the search engine perpetually crawls the Internet and indexes sites according to its secret algorithm.
Google implements this strategy through the use of bots and spiders. (Think: a computer program that literally surfs the web at a never-ending pace!) I mean really, why hire thousands of people to do data entry when you can write a simple application that never stops working and never asks for any sick days?
Beyond Google's bots, there are thousands of other apps made to crawl right through your website. Many webmasters use tools that validate their site's cross-browser compatibility to make sure it works with all major browsers, and lo and behold, those tools use bots and spiders to get the job done. Other bots and spiders may serve purposes external to your own and to Google's: many competitors use bots for competitive SEO analysis. While it is nice to use Google for search engine purposes and personal bots for compatibility or competitive reasons, the pervasive use of bots and spiders can create a lot of fake data in your Google Analytics account.
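Many of these crawlers identify themselves in the User-Agent header they send with each request, which is how they can be recognized at all. A minimal sketch of that idea is below; the token list is a tiny illustrative sample (it is not the IAB list, which is not public), and real-world matching is considerably more involved.

```python
# Minimal sketch: spot self-identifying crawlers by User-Agent substring.
# The token list is a small illustrative assumption, NOT the official
# IAB "International Spiders & Bots List".

KNOWN_BOT_TOKENS = (
    "googlebot", "bingbot", "ahrefsbot", "semrushbot", "crawler", "spider",
)

def is_known_bot(user_agent: str) -> bool:
    """Return True if the User-Agent contains a known crawler token."""
    ua = user_agent.lower()
    return any(token in ua for token in KNOWN_BOT_TOKENS)

print(is_known_bot(
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
))  # -> True
print(is_known_bot(
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/120.0"
))  # -> False
```

Google Analytics' built-in exclusion feature does essentially this kind of matching for you, against a far more complete and regularly updated list.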
However, the goal should not be to block all bots and spiders from your website entirely. They are necessary tools! But you can stop collecting data about their visits, which otherwise inflates your Google Analytics reports with fake and unusable data.
If you want to stop collecting data from bots and spiders in your Google Analytics account, you will have to activate a built-in feature made especially for this purpose. Unfortunately, this feature is not enabled by default. You will need to set it manually for each of your websites.
What does this process entail?
Google Analytics has made it easy by including an option inside your view settings that you can activate in a single click.
Go to the report view settings, located under Admin > View Settings, and make sure the option “Exclude all hits from known bots and spiders” is checked. Keep in mind that this option is not retroactive, which means it will only exclude data from future reports.
The official list of bots and spiders is the IAB's “International Spiders & Bots List,” which is updated monthly. Unfortunately, this list has not been made available to the general public; it can only be accessed through third-party programs.
You may see a small drop in traffic after activating the “exclude all hits from known bots and spiders” feature, but you will have far more accurate data. Good data will move your site forward; bad data will not help you improve or enhance it. Use the improved data to get a true picture of which content is popular, where your readers come from, and which areas of your Internet presence you can improve.