CRO

Split Testing

A practical UX and optimisation method for comparing fundamentally different experience directions with controlled live traffic.

How to use split testing to compare distinct versions of an experience, measure performance reliably, and choose the stronger direction with confidence.

01 August 2014 · 4 min read

Quick take

If you want to compare completely different approaches, split your traffic and measure what wins.

What it is

Split testing is a UX and optimisation method where users are divided into groups and shown entirely different versions of a page or experience.

Unlike A/B testing, which usually tests small changes, split testing often compares significantly different designs or structures.

Each version is hosted separately, and traffic is distributed between them.

Performance is measured using defined metrics such as conversion rate, engagement, or task completion.

The focus is on identifying which overall approach performs better.

The goal is to make informed decisions when choosing between fundamentally different directions.

Split testing is most useful when the decision is between distinct strategic directions, not minor interface tweaks.

When to use it

Use this method when you are comparing big changes.

It is most useful when:

You are testing completely different designs or layouts
You are exploring different product directions
You want to validate a major redesign
You have enough traffic to support meaningful comparison
You need clear performance data

It is less useful when:

You are testing small or minor changes
Traffic is limited
You are still in early exploration

Split testing is often used in optimisation and redesign scenarios.

Key takeaway

Use split testing when you need behaviour evidence to choose between materially different experience approaches.

How to run it

Set up properly.

Before you start, be clear on the variations you are testing, the success metrics, and how traffic will be split.

Ensure each version is stable and functional.

Run the method.

Split testing is controlled and comparative.

Create separate versions of the experience. Divide traffic between them. Run the test over a defined period. Collect data. Keep conditions consistent.

Avoid introducing additional changes during the test.
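As an illustration only (not any specific tool's API), here is a minimal Python sketch of one common way to divide traffic: hash the user ID so each visitor is assigned to the same version for the whole test, which helps keep conditions consistent. The variant names and user ID format are hypothetical.

```python
import hashlib

# Hypothetical variant names: two fundamentally different versions of the page.
VARIANTS = ["version_a", "version_b"]

def assign_variant(user_id: str) -> str:
    """Deterministically assign a user to one version.

    Hashing the user ID (rather than choosing at random on every visit)
    keeps the split stable: the same person always sees the same version
    for the duration of the test.
    """
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

# Example: route a visitor to the separately hosted version.
print(assign_variant("user-12345"))
```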

Capture the data and make sense of it.

The value comes from clear comparison.

After the test: compare performance across versions, assess the differences, identify the stronger approach, and apply learnings to future design decisions.

Use this to guide major direction.
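If your success metric is a conversion rate, a two-proportion z-test is one common way to check whether the gap between versions is real rather than noise. A minimal sketch, using made-up numbers purely for illustration:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Compare the conversion rates of two versions.

    conv_* = number of conversions, n_* = number of visitors.
    Returns the z statistic and a two-sided p-value.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Illustrative numbers only: version A converted 120/2400 visitors, version B 168/2400.
z, p = two_proportion_z_test(120, 2400, 168, 2400)
print(f"z = {z:.2f}, p = {p:.4f}")  # a small p-value suggests a genuine difference
```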

What to look for

Focus on:

Performance: which version performs better
Behaviour: how users interact with each approach
Metrics: conversion, engagement, or completion rates
Differences: what drives the performance gap
Impact: scale of improvement or decline

Where it goes wrong

If the test isn’t controlled, the outcome is unreliable. Most issues come from:

insufficient traffic (see the sample-size sketch after this list)
testing too many differences at once
unclear success metrics
ending tests too early
misinterpreting results
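To put "insufficient traffic" and "ending tests too early" in concrete terms, here is a rough sketch of the standard two-proportion sample-size estimate. The baseline rate, minimum lift, and other figures are illustrative assumptions, not a recommendation:

```python
from math import ceil, sqrt
from statistics import NormalDist

def sample_size_per_variant(baseline_rate, min_lift, alpha=0.05, power=0.8):
    """Rough visitors-per-version needed to detect a given absolute lift.

    baseline_rate: expected conversion rate of the current version (e.g. 0.05).
    min_lift:      smallest absolute improvement worth detecting (e.g. 0.01).
    Standard two-proportion approximation; treat the result as a rough guide.
    """
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_power = NormalDist().inv_cdf(power)
    p1, p2 = baseline_rate, baseline_rate + min_lift
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_power * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / min_lift ** 2)

# Illustrative assumption: 5% baseline, looking for at least a 1 point lift.
print(sample_size_per_variant(0.05, 0.01))  # roughly 8,000 visitors per version
```

If your traffic cannot reach numbers like these within a reasonable test period, the result will not be reliable enough to guide a major direction.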

What you get from it

Done properly, this method gives you:

clear direction between different approaches
data-driven decisions for major changes
reduced risk in redesigns
insight into what drives performance

Key takeaway

It helps you choose the right path with confidence.

Get in touch

If this sounds like something you need, we can help you run split tests that give you clear direction and confidence in your decisions.

No guesswork. No assumptions. Just evidence that shows what works.

FAQ

Common questions

A few practical answers to the questions that usually come up around this method.

What is split testing in UX?

It is a method for comparing completely different versions of an experience.

When should you use split testing?

Use it when testing major design or structural changes.

How is it different from A/B testing?

Split testing compares larger, more distinct variations.

What can you test?

Layouts, full pages, structural changes, or product directions.

Does split testing improve UX?

Yes. It helps validate big decisions with real user behaviour.
