Lessons Learned from SEO Tests that Didn’t “Win” – Whiteboard Friday

We love to talk about winning SEO tests, like those wonderful instances where you run an A/B test and you see positive impact. In today’s episode, though, Will is going to discuss the losing tests: those with negative results, or no results, where you couldn’t prove an impact.

These test results are, in fact, where you can likely find the most valuable insights.


Video Transcription

Hi, Moz fans. My name is Will Critchlow. I’m the founder and CEO at SearchPilot. We run tons of SEO tests, and if you’ve ever seen me speak on one of these before or on a bigger stage, you have probably heard me talk about a lot of winning tests, those nice situations where you run an A/B test and you get an uplift and you get to celebrate. Today, we’re going to be talking about losing tests. So these can be the negative ones or the ineffective changes, the ones where you just couldn’t prove an impact in either direction.

So this is fundamentally that situation where you find an insight. It might be keyword research. It might be from technical auditing of the site, whatever it might be. You have a theory. You have a hypothesis or something that is going to benefit your website. You implement the change as a result, and you fall flat on your face. You fail spectacularly, and your test result data looks a little bit like this.

Now, this is actually quite an exaggerated case. A lot of the failures that we see are -2%, -3%, or just flat line, and those -2% and -3% type ones can be really hard to pick up without scientifically controlled testing, which is what we focus a lot of our time on, on really big websites. They can really add up. If you are continuously rolling out those little negative changes through the course of the year, it can really be a drag on your SEO program as a whole. But they can get lost. You roll out that change, and it can get lost in the noise, the seasonality, other sitewide changes, Google algorithm updates, things your competitors get up to. That’s what we’re trying to spot and avoid.
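To make the point about small effects concrete, here is a minimal sketch of the kind of controlled comparison that can surface a -3% change hidden in day-to-day noise. This is a generic two-sample Welch's t-statistic in Python, not SearchPilot's actual methodology, and all the traffic numbers are made up:

```python
from statistics import mean, stdev
from math import sqrt

def welch_t(control, variant):
    """Welch's t-statistic for two independent samples of daily
    organic sessions (control vs. variant page groups)."""
    na, nb = len(control), len(variant)
    ma, mb = mean(control), mean(variant)
    va, vb = stdev(control) ** 2, stdev(variant) ** 2
    se = sqrt(va / na + vb / nb)
    return (mb - ma) / se

# Hypothetical daily sessions: a ~3% drop is invisible to the naked
# eye amid normal day-to-day noise, but comparing against a control
# group of similar pages over enough days can surface it.
control = [1000, 1020, 980, 1010, 995, 1005, 990, 1015]
variant = [970, 990, 955, 980, 965, 975, 960, 985]  # roughly 3% lower

print(round(welch_t(control, variant), 2))  # strongly negative
```

A t-statistic well beyond about ±2 suggests the difference is unlikely to be noise; without the control group, the same -3% would be indistinguishable from seasonality or an algorithm update.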

What can you learn?

So what can we learn from losing tests, and when can they benefit us as a business? Well, one of the perhaps counterintuitive benefits is the drop in effort you’re asking of your engineering team. Previously, if you had all these ideas, you’d ask your team to build all of them. But if you run tests and find that some of your ideas were negative, and some were ineffective and weren’t going to benefit you, you’re now only asking your product and engineering team to maintain the ones that turn out to have a positive SEO impact. We’ve seen that be up to an 80% drop in SEO tickets for engineering. So that’s one business case right there.

But, of course, sometimes your tests look like this, and so actually the business case is about avoiding those negative impacts on your website.

Tactical examples

So I’ve got a couple of tactical examples that I thought would be good to run through that might be useful in your situations as well.

The first one is a case of removing SEO text. We’ve seen many cases like this: think of, say, a category page on an e-commerce website. You’ve got a bunch of product listings, and then somewhere down at the bottom of the page there’s a bit of copy. Maybe it’s in a div with a class like seo_text. Maybe it’s in a really small, gray font, not exactly white text on a white background, but clearly not designed for human eyes. We have run experiments in situations like that, with pretty poor-quality text on category pages. We tested removing it and actually saw a statistically significant drop in organic visibility, which is a shame, because we know this isn’t high-quality text, we know it’s not where Google wants us to be, and yet removing it was a bad idea.
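Before you can test removing blocks like that, you need to find them across a large site. As a rough sketch using only the Python standard library (the class names here are hypothetical conventions from the example above, not any standard):

```python
from html.parser import HTMLParser

class SEOTextFinder(HTMLParser):
    """Flag <div> elements whose class name suggests boilerplate
    SEO copy. The class names below are hypothetical examples of
    the pattern, not an established convention."""
    SUSPECT_CLASSES = {"seo_text", "seo-copy", "seo_content"}

    def __init__(self):
        super().__init__()
        self.hits = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs from html.parser
        classes = (dict(attrs).get("class") or "").split()
        if tag == "div" and self.SUSPECT_CLASSES & set(classes):
            self.hits.append(classes)

page = ('<main><ul><li>Product A</li></ul>'
        '<div class="seo_text">keyword keyword keyword...</div></main>')
finder = SEOTextFinder()
finder.feed(page)
print(len(finder.hits))  # → 1
```

The point of the test result above is that you would feed pages like these into an A/B test rather than stripping the blocks sitewide on principle.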

One of the things we can learn from that is, firstly, don’t throw the baby out with the bathwater. You can’t just knee-jerk react to Google’s PR and all these kinds of things and say, “Well, best practices say this. Let’s just do it straightaway.” You can’t do that without testing, because you might be hurting your website. But it does point to a direction of potential future improvement, because if having terrible text is better than no text, having good text might be even better. So one of the benefits of a losing test is that you get to learn, and it can point you toward insights that might be positive in the future.

The other example I’ve got for you here, you might be wondering what on earth this is (art is not my strong point): this is an Easter egg. Trust me, this is an Easter egg. We saw an example of a website that operated across the whole of Europe, with multiple different country/territory sites, testing adding seasonal offers. In this case, it was about Easter travel, Easter breaks, Easter flights, those kinds of things. The keyword research had suggested that there was demand for this, that the audience was searching in this kind of way, and yet adding those offers to the page was negative, and that was very surprising. What turned out to be going on was that the offers were diluting the quality of that page for the things that were the bread and butter of those landing pages. So, yes, the page was ranking better for some Easter travel-related searches, but it was doing worse for the bulk of its traffic, just “trips to city name” or whatever it might be, and the net impact was negative. That’s the kind of thing you can only pick up by testing.
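The arithmetic behind that dilution effect is worth spelling out: a large percentage gain on a small query segment can easily be outweighed by a small percentage loss on the core segment. All figures below are hypothetical illustrations, not numbers from the actual test:

```python
# Hypothetical weekly organic sessions for one landing page
core_sessions = 10_000      # bread-and-butter "trips to <city>" queries
seasonal_sessions = 1_500   # "Easter breaks" / "Easter flights" queries

core_change = -0.04         # 4% dilution on the core rankings
seasonal_change = 0.20      # 20% uplift on the seasonal rankings

net = core_sessions * core_change + seasonal_sessions * seasonal_change
print(net)  # → -100.0 (a -400 loss outweighs a +300 gain)
```

The seasonal uplift looks like a win in isolation; only measuring the page's total organic traffic reveals the net loss.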

So I hope you’ve enjoyed this little journey into losing SEO tests and what we can learn from them. My name is Will Critchlow. I’m at SearchPilot. You can find me on Twitter, @willcritchlow. Look forward to chatting to you soon. Take care.

Video transcription by Speechpad.com