Last year Audigy debuted our conversion rate optimization services, which aimed to increase digital conversions. We’ve seen some fairly dramatic improvements as a result, but even with the positive numbers we’re seeing, there is often still confusion about processes. Much of that confusion surrounds A/B testing, a process that can seem fairly simple but requires several layers of technique to do properly.
How Do You Run an A/B Test?
First things first: Let’s take a step back and identify what A/B testing is and how it can help you drive conversions. As the name implies, A/B testing (sometimes called split testing) is the process of making two slightly different versions of the same thing and seeing which one is more likely to do what you want it to do. You make a guess about an improvement and run a test to see if it pans out the way you expected.
Does this process sound familiar? Perhaps it reminds you a bit of high school science classes. That’s because the same scientific method applies: You start with an observation, make a hypothesis, run a test, and come to a conclusion.
There are several ways to gather the information you need to make an educated guess. We use a number of web tools, analytics dashboards, and sometimes just old-fashioned guess-and-check work to inform many of our decisions. Some of the tools (like Google Analytics) gather incredible quantitative data that tell us something definitively. Other tools require a bit of intuition; for example, heat-mapping tools help by visualizing where people are interacting on a given page (if everyone clicks the same button, or if people don’t tend to scroll past a certain point on the page). You’d be surprised by the insights we’ve gotten just by asking a stranger to look at a site for five seconds and tell us what they remember about the page.
Here’s a hypothetical example of what this process might look like: Using Google Analytics, we observe that there are two pages making most of the conversions on our website, but one of them converts twice as much as the other. We also observe that the high-converting page has a pretty strong call to action while the other does not. So we hypothesize that the call to action is causing people to convert. Next we run a test: We make a copy of the page we want to improve conversions on. Half the people who go to that page will see the original version, and the other half will see a version with a strong call to action. After several weeks, we look at the data — conversions have improved! We can therefore conclude that the stronger call to action bumped up conversions. Importantly, we then decide to go back to using only one version of the page — the one that converted better — for 100% of the people who visit.
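Under the hood, the "half see one version, half see the other" step is usually done by bucketing each visitor deterministically, so the same person always sees the same version on repeat visits. Here’s a minimal sketch of that idea in Python; the function name and the visitor-ID format are illustrative, not part of any particular testing tool:

```python
import hashlib

def assign_variant(visitor_id: str) -> str:
    """Deterministically bucket a visitor into version A or B (50/50 split).

    Hashing the visitor ID (e.g. a cookie value) means the assignment is
    stable: the same visitor gets the same version every time they return.
    """
    digest = hashlib.md5(visitor_id.encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# Each visitor lands in one bucket and stays there for the whole test.
print(assign_variant("visitor-1234"))
```

Real testing platforms handle this bucketing for you, but it’s useful to know that the split is deterministic per visitor rather than a coin flip on every page load.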
That’s the heart and soul of the process, and the beauty is that it’s endlessly replicable for so many different kinds of goals. If you can think of it, you can probably test it.
Problems With A/B Testing
Of course, just like in any kind of marketing, there’s no silver bullet or guarantee. And there are problems that can occur. One of the trickiest is knowing when to stop testing: Run it too long and you might miss out on other opportunities; cut the test too early and you may not have enough data to make an informed decision. This is especially true with sites that get very low traffic (which is the case for a lot of audiology practices). It can be easy to jump the gun and make a decision when there simply isn’t enough data to back up your claim. If your site has too little traffic, then it might not make sense to test at all; it could take ages to gather enough data to form an accurate conclusion. With that in mind, it’s worthwhile to invest your attention where you know you can learn the most and do the most good.
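One common way to gauge whether you have "enough data to back up your claim" is a two-proportion significance check: given conversions and visitors for each version, how likely is the observed difference to be just noise? Here’s a rough sketch using the standard normal approximation (the function name and the example numbers are ours, and this is a back-of-the-envelope guide, not a substitute for a proper stats tool):

```python
from math import sqrt, erf

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Approximate two-sided p-value for a difference in conversion rates.

    Uses the pooled two-proportion z-test with a normal approximation;
    a small p-value (conventionally < 0.05) suggests the difference is
    unlikely to be random noise.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# 40 conversions out of 1,000 visits vs. 55 out of 1,000:
p = two_proportion_p_value(40, 1000, 55, 1000)
print(round(p, 3))  # above 0.05 -> not yet enough evidence to call a winner
```

This is exactly why low-traffic sites struggle: with a thousand visits per version, even a 4% vs. 5.5% split isn’t conclusive, and gathering ten times that traffic can take months.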
It’s also possible to start with the wrong questions or misinterpret data — in the example above, was it really the call to action that improved conversions on our page? Or was it the 20%-off coupon and the picture of an adorable kitten that we put right next to the form field? For the record, we’d buy anything with a kitten offering a good deal, and we’ll bet we’re not the only ones.
Because of this, testing ought to be an ongoing process. When you’re confident with a result, implement it, but don’t presume that you’re done making your page perfect. People are fickle (especially on the web), and what makes for great conversions today might be old news tomorrow, as well as a turn-off for potential patients. Plus, who’s to say that even if you made some improvement, you can’t make additional improvements later on?
What to Expect
The process makes sense, but savvy business owners will always be curious about what kind of results they’ll get before they invest. Unfortunately, the whole nature of testing is that you don’t quite know what the results will be. If we knew the future, we wouldn’t be testing, right?
But the prospects are promising. Recently, a test that we ran for just under two months came back with a 330% increase in clicks to a conversion page — more than three times as many people clicked through to a place where they could make an appointment. More often, you can expect more modest changes that might not seem like much at first but will accrue over time. Other times, you’ll find that a hypothesis simply wasn’t right; your actions didn’t do anything or, worse, decreased your results. That’s never a loss, though. Knowing that something doesn’t work will keep you from making that mistake in the future — which brings us to our final point.
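For reference, a "percent increase" like the one above is measured against the baseline rate. The numbers in this sketch are purely illustrative (not the actual test data), but they show how the arithmetic works:

```python
def lift_percent(baseline, variant):
    """Percentage increase of the variant's rate over the baseline's."""
    return (variant - baseline) / baseline * 100

# A click-through rate rising from 2% to 8.6% is a 330% increase:
print(round(lift_percent(0.02, 0.086)))  # 330
```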
Not Everything Is Important to Your Marketing Strategy
With so much data available, it can be easy (and overwhelming) to assume that you have to figure out everything all at once. When you first start using Google Analytics, it often seems like a cruel riddle too difficult to solve. The real secret is that not everything is worth deciphering. You probably don’t need to devise a test to find new ways to funnel visitors to a page about your HIPAA compliance. And some people (let’s face it, the majority) who visit your site are simply not going to convert, so there’s no sense in testing to find out how to change that.
So what should you be testing? That all depends on what you see already, and what you would like to see. Does your analytics dashboard show you that you’re getting a ton of traffic from Facebook but it isn’t converting? You might run a test to see if a new content approach can help. Or do you have a feeling that there’s a gold mine of patients out there interested in your new, state-of-the-art hearing testing equipment? Try updating the copy on your conversion page to reflect that, and see if it does the trick. Does a heat map show you that visitors are only scrolling about halfway down the homepage? Experiment with moving your call to action to the top to make sure they see it.
Interested in A/B Testing for Your Site?
These processes take a lot of patience, technical savvy, and good luck in order to work well. If you have the time and ability to do the testing, however, you’ll be very pleased with the results. (It’s fun and exciting to see your hypotheses turn out to be right!) Fortunately, if you need a bit of help, Audigy is here to set you on the right path. Get in touch and ask about our digital marketing services, including our conversion rate optimization program. We’d love to chat with you about how we can help!