As a web design and user experience company, it is important to us to present our clients with accurate web traffic data. Unfortunately, both spam bots and monitoring services access your site and skew your analytics results. Today we will focus on setting up filters so your results reflect only real human visitors, not computer-generated spam or monitoring services. This gives a more accurate picture of how real people interact with your website. Computer-generated traffic generally means shorter visits, fewer pages viewed, and a higher bounce rate. Removing the spam data will help you properly analyze your results and make changes to your website that advance your conversion and marketing goals.
We’ll use our own website as an example. Below is a screenshot of the referral traffic for the past month.
We enabled a spam filter halfway through the month, around June 3rd, and you can see that overall daily sessions decreased fairly dramatically. This graph shows how much inaccurate data your site could be receiving.
Looking at the data in more detail, you can see the various referrers that drive traffic to your site. The first two on the list don’t make much sense for a web and software development company. When I visit the URLs listed, it appears they are just trying to sell their services. They use Google Analytics to drive traffic to their sites by creating false referral records, and in turn false analytics data.
You can also see that the time on site for two of them is 0:00, so they don’t even register as being on the site, as opposed to a visitor from a Yelp referral who spends over 4:00 on the site per visit.
To remove the inaccurate data from your results, there are a couple of steps to take. The first is to create a new view for your property. This is Google’s recommended method: it leaves you with one view of all data and an additional view containing only the accurate data. Within a short time you will see the difference between the two and realize how inaccurate your results have been.
On the Admin page, in the far right column, open the View dropdown and select Create new view.
Give the view a name. I typically just use Main View.
Once the view is created, select All Filters on the left side, then select Add Filter.
Give the filter a name. I use Exclude Spam, as that is the purpose of the filter. Then select the Custom and Exclude options. For the Filter Field, select Campaign Source. In the Filter Pattern field, enter the spam URLs, and be sure to include a “\” before any period. You can add as many spam URLs as necessary; just separate them with a “|”. Select the views you would like to add this spam filter to and click Save.
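The Filter Pattern field is a regular expression, which is why the backslash before each period matters: an unescaped “.” matches any character. A minimal Python sketch (using made-up spam domains, not real ones) shows how the pattern behaves:

```python
import re

# Hypothetical spam domains written the same way the Filter Pattern
# field expects: "\" before each period, domains joined with "|".
SPAM_PATTERN = r"spam-seo\.example|free-traffic\.example"

def is_spam_referrer(source: str) -> bool:
    """Return True if the campaign source matches the spam pattern."""
    return re.search(SPAM_PATTERN, source) is not None

print(is_spam_referrer("spam-seo.example"))   # a listed domain is excluded
print(is_spam_referrer("yelp.com"))           # a legitimate referral passes
# The escaped dot requires a literal period, so a lookalike string
# such as "spam-seoXexample" does not accidentally match.
print(is_spam_referrer("spam-seoXexample"))
```

This is only an illustration of the pattern syntax; in Google Analytics itself you paste just the pattern string into the Filter Pattern field.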
During testing, I have found that the validation link doesn’t always work properly when you have more than three URLs in your filter.
However, the filter itself does work properly on the view.
Now you have a view that improves the accuracy of your analytics data moving forward. This approach won’t retroactively change your data, but after a couple of weeks you will see differences between your primary view and the new view. You can see in our data below that overall sessions have almost been cut in half, but the length of each session is longer and the bounce rate is significantly decreased. You can use this more accurate data to assess your website and make informed decisions.