How to ensure your traffic is accurate in Google Analytics

If you’ve delved into the world of web analytics, you have most likely heard of Google Analytics (if you haven’t, something’s not quite right). Launched in 2005, it quickly became the most widely used web analytics service on the web, with GA4 taking over from Universal Analytics on 1st July 2023.

If you’re a beginner, you may look at your data and feel a little overwhelmed, to say the least. With so many figures in front of you, it’s hard to know what’s important and what’s not. You also need to ask yourself: is this data accurate? Can I trust it? What could go wrong?

Take a look at our top tips to prepare your analytics, prevent and reduce spam traffic, and keep your data as accurate as possible.


Preparation

We all know the old classic: fail to prepare, prepare to fail. There are some things you should do before you begin analysing data.

  • Create a clean view of your analytics property where you can apply filters to ensure you have accurate data. Remove referral spam, internal traffic and other flawed dimensions.
  • However, keep an unfiltered view too. That way, if one of your filters turns out to be too aggressive or incorrect, you have the raw data to fall back on.
  • If your business only trades within a specific country, apply a country filter within your filtered view, or give it its own view, so that you can focus on the traffic that matters.
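In Google Analytics these filters are configured in the interface rather than in code, but the logic behind a filtered view is easy to picture. The sketch below is purely illustrative: the IP address, spam domain and country are hypothetical examples, and the `record` function stands in for GA's hit processing. The key point from the bullets above is that every hit lands in the unfiltered view, while only hits surviving the filters reach the filtered one.

```python
# Illustrative sketch only -- not the GA API. Values below are made-up examples.
INTERNAL_IPS = {"203.0.113.10"}            # hypothetical office IP to exclude
SPAM_REFERRERS = {"free-traffic.example"}  # hypothetical referral-spam domain
TRADING_COUNTRY = "GB"                     # hypothetical single-country business

def passes_filters(hit):
    """Return True if a hit survives all exclusion filters."""
    if hit.get("ip") in INTERNAL_IPS:
        return False  # internal traffic
    if hit.get("referrer") in SPAM_REFERRERS:
        return False  # referral spam
    if hit.get("country") != TRADING_COUNTRY:
        return False  # traffic from outside the trading country
    return True

def record(hit, views):
    """Every hit reaches the unfiltered view; only clean hits reach the filtered one."""
    views["unfiltered"].append(hit)
    if passes_filters(hit):
        views["filtered"].append(hit)
```

This is why the unfiltered view matters: if a filter is wrong, the filtered list shrinks, but the unfiltered list still holds everything.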


What could go wrong?

Distracted Dan applied some filters to his Google Analytics traffic. He got rid of every possible flawed dimension and was sure that no spam traffic could get through. His client had a quick look at how their site was doing. They could see they had fewer than 10 views that month, which they weren’t happy about, and they were looking at Dan as the culprit.

He had added too many filters and now even relevant traffic was being filtered out. He forgot to add an unfiltered view. He had a lot of fixing to do.

If he had created an unfiltered view, the problem would have been fixed in seconds.



Prevention

Spam traffic happens, as we’re all aware, but there are things you can do to prevent it.

    • Enable ‘Exclude all hits from known bots and spiders’ within your view’s settings to turn on bot filtering.
    • Create filters to exclude your own internal traffic by IP address. If your sales team processes orders through the site, give them a separate view to prevent diluting conversion rates and behavioural metrics.
    • If you have many views that need bot/spam traffic excluded, rather than setting up filters over and over again, create a segment that only includes your own site’s hostname(s), or a set list of approved referral sources or relevant countries. Segments can be shared across views, improving the accuracy of your data without a large time investment.
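The reusable segment described in the last bullet is essentially an allowlist rather than a blocklist. As a hedged sketch (the hostnames are invented examples, and this is not how GA stores segments internally), the idea looks like this:

```python
# Hypothetical sketch of a shareable "include" segment: one allowlist of your
# own hostname(s), applied wherever needed instead of per-view filters.
APPROVED_HOSTNAMES = {"www.example.com", "shop.example.com"}  # made-up examples

def in_segment(hit, hostnames=APPROVED_HOSTNAMES):
    """Keep only hits whose hostname belongs to your own site(s)."""
    return hit.get("hostname") in hostnames

def apply_segment(hits):
    """Apply the same allowlist to any batch of hits, from any view."""
    return [h for h in hits if in_segment(h)]
```

An allowlist like this catches spam you haven’t seen yet, because ghost traffic rarely carries your real hostname, whereas a blocklist only catches spam you’ve already identified.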


What could go wrong?

Distracted Dan saw that his client’s conversion rate had gone up 50% in January. Amazing, he thought. They’ll love that.

However, when he looked into the data, he found that all the sales had come from his own office. Oh. They weren’t actually orders from customers; they were from when his colleague completed those user tests earlier in the month.

If he had created the correct filters to exclude any internal traffic, his data would have been more accurate.



Remedy

Don’t worry, there are things you can do to remedy spam – it’s never too late!

    • If you find bot/spam traffic within your data, you can filter it out. Use segments to review your analytics accurately and, ideally, find a single identifier for the unwanted traffic – a referrer, country or hostname, or a combination of these – that can be excluded within the segment.
    • For days with spikes of traffic, add annotations to your analytics as a reminder to yourself, or to make other users of your Google Analytics aware of the cause. This helps whether you’re comparing against the previous year or reviewing the data long after it was tracked.
    • Once you have identified bot/spam traffic, add it as a new filter to your view to prevent any further data coming from these bad sources.
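The first and third bullets together describe a simple loop: find the single identifier the spam shares, then turn it into an exclusion rule. The sketch below assumes the identifier is a referrer; the domain names are invented, and real analysis would happen in the GA interface rather than in code like this.

```python
# Illustrative sketch: find the identifier shared by suspect traffic and
# turn it into a reusable exclusion rule. Domains are hypothetical.
from collections import Counter

def top_referrer(hits):
    """The most common referrer among a batch of suspect hits."""
    counts = Counter(h.get("referrer", "") for h in hits)
    return counts.most_common(1)[0][0]

def exclude_by_referrer(hits, bad_referrer):
    """Drop every hit carrying the identified spam referrer."""
    return [h for h in hits if h.get("referrer") != bad_referrer]
```

Applied as a segment, this cleans up historical reports; applied as a view filter, it blocks the same source going forward.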


What could go wrong?

Systematic Sophie was looking through some data at the end of the month, ready for her monthly meeting with her client. She saw 240 views on Tuesday, 2,450 on Wednesday, then back down to 210 on Thursday. Hm. She asked Distracted Dan if he had any idea what had happened.

Now that he thought about it, he did remember something causing that a few weeks ago but had forgotten to write a note about it.

Sophie had to analyse the data again and figured out that the client had been mentioned on the local radio that day, causing a spike in views.

If Distracted Dan had added a note to remind all users of this fact, it would have saved a lot of time.


These are just a few tips – luckily, we are experts in analysing web traffic and ensuring it is accurate, comes from real users and isn’t warped by spam. Get in touch with us today.
