Landing Page A/B Testing in Eloqua? Yes, please.
“Test, test, test!” More than just a bizarre marketing mantra that will stay with you forever once you’ve read this article, this is a genuine way of getting to know your audiences better.
Let’s be honest for a moment: we marketeers sometimes think we already know everything about our audiences simply because we think about them so persistently. But the truth is, we often project our own beliefs onto the content we create instead of mirroring our customers’ preferences and reality.
A/B testing, or split testing, gives us the chance to see which components of our marketing efforts perform well with our target audience, which perform even better, and which underperform and should be reconsidered.
From emails and landing pages to forms and calls-to-action, from content length to email send time or frequency, anything can be tested and improved.
Constantly questioning your ideas and putting them to the test may seem like a burden, but it’s quite the opposite once you shift perspective. While building something based on what you assume is “known” may or may not bring the expected results, investing that little extra time and effort into the testing phase gives you essential insight. And this can help you better tailor your future strategies and assets for increased conversion rates. An informed marketeer is a wiser marketeer, for sure.
When you think about A/B testing in Eloqua, emails are usually the first assets that come to mind. Whether it’s the out-of-the-box A/B testing option in the Simple Email campaign builder or a more advanced test built on the Campaign Canvas with the help of shared filters, Eloqua gives you the tools to choose which email elements to test and to be intentional about your experiment.
But what about running A/B landing page tests through Eloqua?
The methods for achieving this type of split testing may not be as obvious as they are for emails. Read on to learn about an innovative solution we found to help a client identify the ideal landing page layout and maximize registration rates for a virtual event aimed at multinational audiences.
The Challenge
The online event we’re discussing holds an undeniable top-three spot for importance and exposure on our client’s agenda. So its success had to match its budget, its high-profile speakers, and the quality of the content delivered.
An extensive online campaign was developed around the event, but our main focus was to run the A/B landing page test, to build all the assets needed to conduct this marketing automation experiment in Eloqua, and, finally, to apply the results for the benefit of the entire event.
The Method
We built two landing page versions, a form, three pre-show emails, one piece of dynamic content, and all the automation behind these assets. Next, we connected the test to the first email, while the others were scheduled to launch after the verdict.
To make sure our test results were relevant, we adopted a rather scientific approach, driven by a few principles we recommend for any kind of A/B testing:
1. One variable at a time
Many landing page elements are eligible for testing, but for this project, the general landing page layout was chosen as the testing variable.
On the one hand, version A was drafted in line with what the brand had used in the past: rather illustrative, opening with a featured image that would tie all the other content elements together and make the message more inviting.
Version B, on the other hand, displayed a simpler design, replacing the featured image with a punchier header font and alignment.
All other page components remained the same across the two versions: the copy, the speaker details, the form placement, and the number of form fields. Both versions had great responsive designs, but only one was going to be the winner.
2. Divide and conquer
Out of the entire contact database, we selected a segment of over 500,000 contacts that best fit the ideal event participant profile.
Because the client wanted to quite literally “speak the preferred language of each audience member”, we took the segmentation process up a notch and created sub-segments based on region and language. This way, we could conduct the A/B test for each sub-segment and uncover some rather interesting results.
Using the power of automation, we split the traffic between the two registration page versions. To do so, we placed the CTAs leading to the test pages inside a dynamic content block in the first email invite, so that roughly 50% of each sub-segment would see version A while the rest would only have access to version B.
To ensure a random division of contacts, we added wildcard pattern rules inside the dynamic content piece, as depicted in the figure below:
Note: In a “*1?” wildcard pattern, “*” can be any succession of characters, and “?” can be replaced by any character. Contacts whose Eloqua Contact ID doesn’t match the pattern can only see the default dynamic content.
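If you’d like to sanity-check how such rules divide an audience before sending, here’s a minimal sketch in plain Python, run outside Eloqua. The numeric ID format, the five stacked “*N?” rules, and the 50/50 target are illustrative assumptions, not the exact configuration from our project:

```python
# A minimal sketch of how stacked wildcard rules on the Contact ID
# can split an audience roughly in half. Assumptions for illustration:
# numeric IDs and five "*N?" rules (not our exact Eloqua setup).
import fnmatch
import random

random.seed(7)
contact_ids = [str(random.randint(10, 10_000_000)) for _ in range(100_000)]

# Each "*N?" rule matches IDs whose second-to-last character is N:
# "*" is any run of characters, "?" exactly one character. Rules for
# digits 0-4 together cover ~50% of uniformly distributed IDs.
patterns = [f"*{d}?" for d in range(5)]

def sees_version_a(contact_id: str) -> bool:
    # Contacts matching any rule see page A; all others fall through
    # to the default dynamic content, i.e. page B.
    return any(fnmatch.fnmatch(contact_id, p) for p in patterns)

share_a = sum(sees_version_a(cid) for cid in contact_ids) / len(contact_ids)
print(f"Share seeing version A: {share_a:.1%}")  # expect roughly 50%
```

A quick simulation like this, run before launch, helps confirm that the rules really produce the split you intend.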
3. Tests take time
You can’t expect to get valid A/B test results overnight, but as the old saying goes, “Good things come to those who wait”. For our pages, we waited a week before drawing any conclusions and taking further steps, although we continuously monitored the process to make sure it went as smoothly as expected.
4. Metrics do the trick
To be able to proclaim a winner in A/B testing, you first need to decide on the metrics you’ll take into consideration at the end of the experiment.
Our experience has taught us that it’s always best to begin with the end in mind, because even if it’s fun to be spontaneous and creative, when it comes to data and metrics, you really don’t want to improvise. Since we used emails to drive traffic towards a page hosting a form, we concluded that the ratio of unique form submissions to unique visits was the ideal way to measure success.
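In other words, the winning version is simply the one with the higher conversion rate. As a quick illustration (the figures below are placeholders, not our campaign’s actual results), the comparison boils down to:

```python
# Comparing landing page versions by conversion rate:
# unique form submissions / unique page visits.
# Placeholder figures for illustration only.
results = {
    "A": {"unique_visits": 4200, "unique_submissions": 610},
    "B": {"unique_visits": 4150, "unique_submissions": 540},
}

for version, stats in results.items():
    rate = stats["unique_submissions"] / stats["unique_visits"]
    print(f"Version {version}: {rate:.1%} conversion rate")

# The winner is the version with the higher ratio.
winner = max(results, key=lambda v: results[v]["unique_submissions"]
                                    / results[v]["unique_visits"])
print(f"Winner: version {winner}")
```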
5. Use what you’ve learned
You can run an A/B test out of pure curiosity, but when you’re working on a time-sensitive event involving various resource types, the true gain comes from being able to use this new knowledge to your advantage.
When you balance the investment required against the invaluable benefits offered, A/B testing turns out to be quite a cost-effective marketing solution. Once we determined the winning page version in our initial test, we used it in the additional event-related emails. Moreover, we extended its use, making the winning version available to audiences outside Eloqua through other channels: social media, online ads, and newsletters.
Besides, what’s rather striking is that when we replicated the test across twelve language-based sub-segments, the same version didn’t win everywhere. This is solid proof that although people might be inclined to consume the same content, they certainly prefer to consume it differently.
Conclusions
Whether you’re planning a similar landing page A/B test or focusing on a different type of Eloqua asset, the principles that guided our initiative can be of great help.
Although a testing project can seem daunting at first, start small and let yourself be surprised by the results and by how they either contradict your initial thoughts or confirm your assumptions. The truth is, there aren’t many marketing solutions as cost-effective and revealing as A/B testing, so don’t underestimate this great power; implement it in your own strategy.