A/B Testing Could Make A Massive Difference To Your Site’s Effectiveness

I’m a fan of opinionated design. The most innovative designs tend to be the result of smart designers solving problems in novel ways, often coming at a problem from an angle that hasn’t been tried before. Sometimes it works, sometimes it doesn’t, but that’s true of everything innovative.

I tend to look down on design entirely driven by the numbers. Design dictated by metrics and analytics may be effective, but it can lack daring, flair, innovation, and soul.

Many designers feel the same way. Their taste and problem-solving abilities are honed over years of training, and they’re loath to turn to anything as blunt as analytics to shape their creative work. It’s an understandable bias, but followed to extremes it can damage a site’s effectiveness.

It’s damaging because no web designer, however smart, can predict the outcome of the interaction of human users with a site. Design principles and designer taste aren’t written in stone, and they don’t always — or even often — survive encounters with real users. Designers, developers, and marketers must work together to create web experiences that optimize for real-world use.

A/B testing, also called split testing (or multivariate testing when several page elements are varied at once), is a key part of molding web experiences to user behavior.

The principle is simple: users are shown web pages with small variations from the existing page or from a reference page. Each page has a purpose: to gather contact information or sell a product, for example. The variations are tested to see which is most effective at achieving the page’s job.
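
To make the mechanics concrete, here is a minimal client-side sketch of that idea in TypeScript. The bucketing logic, the storage key, and the /api/events endpoint are illustrative assumptions for this example, not part of any particular testing tool.

```typescript
// Minimal A/B assignment sketch: bucket each visitor into a variant,
// remember the choice, and report which variant converted.
// The storage key and /api/events endpoint are hypothetical.

type Variant = "control" | "variation";

const STORAGE_KEY = "ab-homepage-signup"; // hypothetical experiment name

// Assign the visitor once and keep the assignment sticky across visits,
// so the same person always sees the same version of the page.
function getVariant(): Variant {
  const saved = localStorage.getItem(STORAGE_KEY) as Variant | null;
  if (saved === "control" || saved === "variation") return saved;

  const assigned: Variant = Math.random() < 0.5 ? "control" : "variation";
  localStorage.setItem(STORAGE_KEY, assigned);
  return assigned;
}

// Report an event (e.g. "signup") tagged with the visitor's variant,
// so conversion rates can be compared per variant on the back end.
function recordEvent(event: string, variant: Variant): void {
  void fetch("/api/events", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ event, variant, ts: Date.now() }),
  });
}

// Usage: render the matching page, then log the conversion when it happens.
const variant = getVariant();
if (variant === "variation") {
  document.body.classList.add("show-signup-form"); // hypothetical hook for the variant layout
}
document.querySelector("#signup")?.addEventListener("submit", () => {
  recordEvent("signup", variant);
});
```

Keeping the assignment sticky matters: if a visitor flipped between versions on each visit, you couldn’t cleanly attribute their conversion to either one.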

A recent article from Basecamp illustrates the point nicely. When Basecamp rebranded from 37signals, it embarked on a major overhaul of its homepage, replacing the sign-up form and call-to-action with a text-heavy page focused on the benefits of its main product, a project management application. Over the next year the site saw a decline in sign-ups. Nothing catastrophic for the business, but a decline sustained long enough to add up to a multimillion-dollar revenue difference. To rectify the problem, the company ran an A/B test: some visitors saw the existing site, and some saw a new version with the sign-up form restored. Sign-ups on the variation immediately increased by 16%.

The lesson learned?

“We didn’t A/B test this change, which meant it took a long time to notice what happened. An A/B test of the new marketing site vs. old, conducted back in February 2014, would likely have caught the lower performance of the redesign within a couple of weeks.”

Basecamp is a thoughtful company and had good reason for the changes they made in 2014, but had they A/B tested, they’d have seen that those ideas didn’t hold up well in real-world use.

Google Content Experiments

There are lots of ways to implement A/B testing on your site, but one of the easiest to use is Google Content Experiments, part of Google Analytics. In brief, you create one or more variations of a page, embed the experiment’s JavaScript snippet, set a goal for the test, and add the variation URLs in the Content Experiments interface. Google takes care of the rest. It’s a bit more involved than that in practice, but Content Experiments has good documentation to walk you through the process.
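
Google generates the snippet for you, so you never write it by hand, but to give a feel for what that page-level code conceptually does, here is a simplified TypeScript sketch of the redirect pattern: pick a variation for the visitor, remember it, and send them to the matching URL. The URLs, weights, and storage key are made up for this illustration; it is not Google’s actual snippet.

```typescript
// Illustrative sketch of what an experiment snippet conceptually does:
// choose a variation for the visitor (weighted), remember the choice,
// and redirect if they were assigned a variation URL instead of the original.
// The URLs, weights, and storage key below are hypothetical.

interface Variation {
  url: string;    // where this variation lives
  weight: number; // share of traffic it should receive
}

const variations: Variation[] = [
  { url: "/pricing", weight: 0.5 },             // original page
  { url: "/pricing-with-signup", weight: 0.5 }, // hypothetical variation URL
];

const KEY = "experiment-pricing"; // hypothetical experiment id

function chooseVariation(): number {
  const saved = sessionStorage.getItem(KEY);
  if (saved !== null) return Number(saved); // keep returning the same choice

  // Weighted random pick across the variation list.
  let r = Math.random();
  let index = 0;
  for (let i = 0; i < variations.length; i++) {
    r -= variations[i].weight;
    if (r <= 0) {
      index = i;
      break;
    }
  }
  sessionStorage.setItem(KEY, String(index));
  return index;
}

// In practice code like this runs only on the original page; if the visitor
// was assigned a variation, send them there.
const chosen = chooseVariation();
if (chosen !== 0 && window.location.pathname !== variations[chosen].url) {
  window.location.replace(variations[chosen].url);
}
```

Content Experiments also handles the part this sketch leaves out: recording which variation each visitor saw against the goal you defined, so the reporting interface can tell you which page is winning.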

I’m all in favor of bold innovative design, but design that fails to do its job is bad design. With A/B testing, designers and site owners can be innovative and right.

Matthew Davis is a technical writer and Linux geek for Future Hosting.
