A/B and multivariate testing is a bit of a hot topic in online marketing circles. It’s been around for a long time (since the early days of direct mail marketing) but seems to have picked up steam in recent years, especially as robust yet free options like Google Website Optimizer have made these techniques more accessible to marketing managers. Heck, there are even some tremendous web portals – like www.whichtestwon.com – that are devoted to sharing test results and other valuable content for folks with a keen interest in this particular marketing niche.
But what most of these tools and sites fail to touch on is the impact that A/B testing can have on the success of your link building outreach efforts.
And how do you go about implementing an A/B test for link building? It’s really quite simple. Hopefully, this short story will drive the idea home:
Back in the day, I had the pleasure of working with a guy named Bob Generale (I call him “little Bobby” but that’s neither here nor there). In my humble estimation, he is one of the greatest pure link builders I have ever met. Anyhow, we both used to agree, and preach to the rest of our SEO team, that one of the keys to link building outreach is brevity. In other words, keep your initial outreach messaging as short and sweet as possible. The fewer words, the better (this mainly applied to email outreach, but it can also apply to outreach done via phone or via social channels).
The thing is, we had some other really solid link builders who had plenty of success sending fairly verbose outreach emails, so they often chose to stick with wordier messaging since it seemed to work fine for them. After all, if it ain’t broke, don’t fix it, right?
Over the years, that interplay made me realize that testing different versions of outreach could be valuable, since we all know that no two link building niches are the same. Mind you, in this particular example, the variable to be tested was the word count (short vs long) but there are many other variables worth testing, such as the subject line of your email (Note: A/B testing for email outreach is similar to A/B testing for traditional email marketing in a lot of ways).
What I quickly realized, and put into action, was the idea of performing organized and regimented tests (e.g., making sure both versions are sent out equally and randomly) to see which variation of an outreach email worked best, both in terms of eliciting an initial response and in terms of leading to the ultimate goal: a secured link. And the results were impressive. By simply testing two slightly different versions of an initial outreach email over the course of a few dozen (or hundred) emails, I was able to uncover specific nuances that improved my chances of eliciting a response (and corresponding engagement that led to inbound links).
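The process above — randomly assigning each outgoing email to one of two variants, then tallying replies and secured links per variant — can be sketched in a few lines of Python. This is a hypothetical illustration, not a tool from the article: the variant names, counts, and simulated response rates are all made up for demonstration.

```python
import random

# Hypothetical tracker for a two-variant outreach test.
# "A_short" and "B_long" are illustrative labels, not real templates.
variants = {"A_short": {"sent": 0, "replies": 0, "links": 0},
            "B_long":  {"sent": 0, "replies": 0, "links": 0}}

def assign_variant():
    """Randomly pick a variant so both versions go out roughly equally over time."""
    return random.choice(list(variants))

def record(variant, replied, linked):
    """Log the outcome of one outreach email."""
    v = variants[variant]
    v["sent"] += 1
    v["replies"] += replied   # bools count as 0/1
    v["links"] += linked

# Simulated outreach log; in practice you'd record real email outcomes.
random.seed(42)
for _ in range(200):
    v = assign_variant()
    replied = random.random() < (0.30 if v == "A_short" else 0.20)
    linked = replied and random.random() < 0.25
    record(v, replied, linked)

for name, v in variants.items():
    print(name, "reply rate:", round(v["replies"] / v["sent"], 2),
          "link rate:", round(v["links"] / v["sent"], 2))
```

In a real campaign the random assignment step is the important part: picking which version to send based on gut feel (or on who the recipient is) quietly biases the comparison.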
Here are a few things to keep in mind if you decide to try this out for yourself:
- Because the number of actual outreach instances is relatively low, the sample size is small, which puts these tests a notch below true scientific/statistical validity
- This should be an iterative process. In other words, keep testing different variables. That will lead to the best results. Don’t make the mistake of thinking that the “winner” of your first test is the best variation possible.
- Since one of the hallmarks of successful email outreach is originality and personalization, refrain from creating variations that are completely cookie-cutter in nature. Always reserve the right to stray slightly from your templates in order to personally connect with your intended recipient.
- Make sure the site and/or page you’re trying to build links for is worthy of being linked to; otherwise, no amount of A/B testing will save your outreach from failure.
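On the small-sample caveat: a quick two-proportion z-test can tell you whether the gap between two variants’ reply rates is likely real or just noise. The sketch below uses only the standard library; the reply counts in the example are invented for illustration, not results from any actual campaign.

```python
import math

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Return the z statistic comparing two response rates."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

def is_significant(z, alpha=0.05):
    """Two-tailed test using the normal approximation."""
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_value < alpha

# Made-up example: 20 replies from 60 short emails vs. 8 from 60 long ones.
z = two_proportion_z(20, 60, 8, 60)
print(round(z, 2), is_significant(z))
```

With only a few dozen emails per variant, even a gap that looks dramatic often fails this check, which is exactly why it pays to keep iterating rather than crowning a winner after one round.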