SEO testing: Shifting from reactive to proactive strategies

A/B testing for SEO is different from typical UX/CRO testing. Because search evolves quickly and best practices keep changing, some changes should be tested before rolling them out.

Testing can help justify further investment or prevent a potentially negative impact.

Why we need to test SEO strategies 

Testing and proving a strategy is more important than ever as companies scrutinize the ROI of marketing in general and SEO in particular.

Decision-makers are looking for ways to justify their spend. When it comes to SEO, ROI is fairly ambiguous, whether we’re trying to forecast the impact of our recommendations or report on the impact made after the fact.

As SEOs, we rely on KPIs like organic traffic, organic share of voice and rankings. But looking at these metrics in a vacuum misses the mark. A single data point doesn’t reflect the big picture of how our efforts impacted things like expertise, authority or trustworthiness.

What’s worse, with the rollout of GA4 and a brand-new attribution model, our numbers and historical data are more convoluted than ever.

Testing is your ticket to certainty and confidence in the new age of search.

How to get buy-in

The uncomfortable truth is that, even as SEO experts, we don’t always know what is best. We throw around phrases like “best practices” rather than concrete terms or rules for a reason: algorithms are secret and constantly evolving.

Testing SEO strategies and tactics can help prevent failure or, at a minimum, mitigate the risk of potential harm. It can mean fewer requests for engineering support and fewer deployments and subsequent rollbacks. This creates more time to optimize growth opportunities. 

It also provides what most stakeholders want: hard data on our impact, which can improve communication around SEO efforts.


Methodology and a growth framework

A typical (admittedly simplified) SEO process may look something like this:

Recommend: Make a recommendation based on best practices or past experience.

Implement: Implement changes to the live site.

Analyze: Report on the impact, if it is detectable. 

But testing for SEO is all about iteration. Ideate, optimize, test, refine. Repeat.

With SEO split testing, the process evolves to look something like this:

Ideate: Create a hypothesis to test.

Group: Categorize and define your control and variant page sets.

Implement: Implement changes in the variant group.

Monitor: Monitor changes and progress over time.

Analyze: Evaluate the efficacy of the test.

Refine: Based on the data and your analysis, refine and analyze further as needed. 

Implement: If results are favorable, roll out the variant to the entire page set. 

It’s important to highlight the second step: Group. 

SEO A/B testing differs from CRO/UX testing because it requires a group of pages. A CRO test divides users between two versions of a single page. You cannot implement an A/B test for SEO on a single page. SEO split tests require splitting a whole group of templatized, or extremely similar, pages into A and B groups.
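To make that grouping requirement concrete, here is a minimal Python sketch of splitting a set of templated pages into control and variant groups. The URLs and traffic numbers are hypothetical, and dedicated SEO testing tools handle this bucketing for you; the sketch only illustrates the idea of balancing traffic across both groups.

```python
import random

def split_pages(page_traffic, seed=42):
    """Split similar, templated pages into control and variant groups.

    Pages are ranked by traffic, then assigned in pairs (one page to
    each group, order randomized) so both groups end up with a
    comparable mix of high- and low-traffic pages.
    `page_traffic` maps URL -> average daily organic visits.
    """
    rng = random.Random(seed)
    ranked = sorted(page_traffic, key=page_traffic.get, reverse=True)
    control, variant = [], []
    for i in range(0, len(ranked), 2):
        pair = ranked[i:i + 2]
        rng.shuffle(pair)
        control.append(pair[0])
        if len(pair) > 1:
            variant.append(pair[1])
    return control, variant

# Hypothetical example: 100 product pages with varying traffic.
pages = {f"/products/item-{i}": 500 - 4 * i for i in range(100)}
control, variant = split_pages(pages)
print(len(control), len(variant))  # 50 pages in each group
```

Pairing by traffic rank is one simple way to keep the groups comparable; a purely random split can, by chance, put most high-traffic pages on one side.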

Almost any page element can be tested. It all depends on your specific business and website goals. Testing may change if you are working to improve click-through rates vs. grow organic traffic as a whole vs. positively impact user experience on the site.

A simple SEO A/B test may involve testing title tags and meta descriptions that drive higher click-through rates. Or maybe H1s or CTAs can be tested to improve engagement and/or conversions. More advanced tests may involve changes to things like page layout or site structure and internal linking. Even things like breadcrumbs and product filters or naming conventions can be great things to test for SEO efficacy.  

That said, not all websites are conducive to A/B testing. Your site needs substantial traffic and a significant number of templated pages. A number of tools require a threshold of over 100k organic visits per month or 500k total visits per month. For example, an ecommerce website is prime for testing with a large number of category pages and product-specific pages. A multi-location website is another great example for testing, assuming there are a large number of similar location-specific pages. 

A framework for running manual tests

To get started with whatever you decide to test, follow this scalable workflow:

Ideate: Formulate your hypothesis as a testable statement, but keep it simple. Think of this in terms of three key elements: IF + THEN + BECAUSE. For example: if we shorten title tags to under 60 characters, then organic click-through rate will improve, because complete titles are more likely to display in search results.

Group: Define groups of pages that have the same template, same traffic, same user behavior. The more similar the pages are in terms of format and purpose, the better. And the more historical data you have to work from, the more accurate your hypothesis and the more successful your results.

It’s imperative that the pages being tested have sufficient traffic evenly distributed across the group. For example, for an ecommerce site, all product pages included in a test should have a minimum of 1,000 combined visits per day, evenly distributed across the set. 
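As a rough illustration of that evenness check, the sketch below flags a page set whose combined traffic is too low or is dominated by a few pages. The thresholds and visit counts are hypothetical and illustrative, not an official rule.

```python
from statistics import mean, stdev

def traffic_ready(daily_visits, min_total=1000, max_cv=1.0):
    """Check whether a page set has enough, evenly distributed traffic
    to support a split test.

    `daily_visits` maps URL -> average daily organic visits.
    The coefficient of variation (stdev / mean) rises when a handful
    of pages dominate the traffic, which would skew the test.
    """
    values = list(daily_visits.values())
    total = sum(values)
    cv = stdev(values) / mean(values) if len(values) > 1 else 0.0
    return total >= min_total and cv <= max_cv

# Hypothetical example: 50 product pages averaging 25 visits/day each.
print(traffic_ready({f"/products/item-{i}": 25 for i in range(50)}))  # True
```

A set that clears the total-visits bar but fails the evenness check (say, one page driving nearly all the clicks) is a poor test candidate, because one page's fluctuations would swamp the signal.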

Identify key areas for testing based on business goals and user behavior. Prioritize tests to maximize impact and minimize risks.

Define methodology: Set clear expectations for all aspects of the test such as implementation method, duration, definition of success, etc. Split your subset of pages into a control group and a variant group. 

Monitor: Set up a tracking dashboard using a tool like Looker Studio (formerly Google Data Studio). This is your testing secret weapon: easy access to your data makes monitoring and analyzing your test frictionless. Looker Studio also has the added bonus of easy cloning and customization for each dataset and test.

Implement: Give the pages that SEO makeover and watch them sparkle!

Analyze: It can take a while for the full impact of any change to be realized. Monitoring consistently might feel tedious, but it is worth it. Make sure you leave your test running for at least a few weeks to see a statistically significant improvement. It is unlikely you will see an immediate impact with SEO testing — it takes time. 

Determine next steps: Weigh your results against the level of effort and resources required to roll out an update and identify the best course of action.

You can determine a winner by demonstrating the impact of the testing on SEO performance and business outcomes. Was the variant more successful than forecasted? Was the variant more successful than the actuals of the control? 
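As a rough sketch of that analysis step, the function below compares daily organic clicks between the two groups with a simple two-sample z-test. This is a deliberate simplification for illustration: purpose-built tools model trends and seasonality (for example, via Google's Causal Impact model), and all numbers here are hypothetical.

```python
from math import erf, sqrt
from statistics import mean, stdev

def uplift_significance(control_daily, variant_daily):
    """Compare daily organic clicks for control vs. variant groups.

    Returns (uplift, p_value) from a two-sample z-test using a normal
    approximation. Assumes each list holds one daily total per day and
    that the totals vary (identical values would give a zero standard
    error and divide by zero).
    """
    uplift = mean(variant_daily) - mean(control_daily)
    se = sqrt(stdev(control_daily) ** 2 / len(control_daily)
              + stdev(variant_daily) ** 2 / len(variant_daily))
    z = uplift / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return uplift, p_value
```

A small p-value (commonly below 0.05) suggests the difference is unlikely to be noise, which is also why tests need to run for weeks: more days of data shrink the standard error.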

Don’t be afraid to keep iterating to improve performance. 

Dig deeper: Framework for running manual SEO tests

Advanced SEO testing using Google Ads

With a little extra budget, SEOs can utilize Google Ads to test tactics quickly and efficiently. 

Metadata testing

Title tags and meta descriptions are essentially organic ad copy. Testing that organic ad copy (metadata) has historically been a long-term process. Whether approached as implementing best practices site-wide or through A/B testing, it just takes time. Using paid search ads, we can expedite the process.

Follow this step-by-step process for testing metadata with Google Ads:

Identify what pages you want to test. A great place to start for this test is identifying poorly performing pages in terms of organic click-through rate or rankings. 

Once you have identified what page(s) you want to focus on, you can start testing. Using responsive search ads you can test different title tag/headline variations and different description/ad copy variations. The landing page for every ad should be the page you’re testing optimization on.

Create an organic sandbox campaign for all SEO tests so they’re in one place and easy to manage. This metadata test would be a single ad group within the campaign. 

You’ll need a minimum of three headlines, but you can enter up to 15. The right number depends on how broad your keyword theme is; however, at least five variations are recommended. The more unique each title tag/headline, the better.

Because the goal is to test specific copy, avoid using Dynamic Keyword Insertion in the headlines and avoid pinning positions. 

You’ll need a minimum of two descriptions, but you can enter up to four. For a more comprehensive test, use all four.

Similar to A/B testing, this method tends to work better with higher-volume queries (so that the ad actually shows), which often means priority pages on the site where you’re targeting high-volume keywords.

According to Google, “Over time, Google Ads will test the most promising ad combinations and learn which combinations are the most relevant for different queries.”

Once you have a winner you can use the key elements of the ad copy to influence your title tag and meta description. Which keywords performed better in the headline? Which descriptor terms and messaging performed better in ad copy?

Tools to make testing easier

There are a ton of popular tools for CRO testing, but not all of them are equally effective for SEO testing. Below are some great options focused on SEO considerations.

SearchPilot is a personal favorite as far as SEO testing tools go. Unlike some other options, SearchPilot was designed specifically for SEO testing. They boast server-side testing that is easy to implement, with no engineering or development required. They also have integrations for almost every platform, CMS or CDN.

SplitSignal is now part of Semrush Enterprise, but this is another great option that was created specifically for SEO. It makes SEO A/B testing easy and doesn’t require development or engineering. It’s easy to set up and uses Google’s Causal Impact model to help analyze results and determine a winner.

Although Optimizely was not designed for SEO specifically, it is another solid option for A/B testing for SEO. With Optimizely Experiment, you can run tests focused on optimization or personalization. The interface is easy to use and this tool also requires little to no development or engineering resources.

Build confidence in your SEO strategy

SEO split testing doesn’t have to be confusing and difficult.

Embracing a culture of experimentation and iteration means SEOs can better adapt to shifting trends and validate strategies. We can take the fundamental approach outlined here and scale it to more advanced testing and techniques, with or without fancy tools. Finally, we can give people what they want: sound ROI data.

Gone are the days of the stereotypical SEO answer, “It depends…” 

Today, we embrace a new era of “Our tests showed…”