Digital Marketing: Why are you ignoring the best part of your MVT data?

Can you believe it’s already the middle of February?  Hopefully you’re well on your way to marketing success in 2014. 

 

As we approach Valentine’s Day, discussions around what to do with holiday data have been cropping up again.  In the world of content optimization, holidays can bring a flurry of crazy web traffic and inconsistent test results.  It’s not unusual for some of our HP Optimost customers to see traffic explode or come from unexpected demographic slices.  We won’t debate the merits of running tests during changes in traffic patterns here (another time, perhaps), but suffice it to say that some of our retail customers (particularly those selling romantic items like chocolates, roses, or pizza) will be seeing some, well, interesting data around Friday. 

 

But even multivariate tests run in completely mundane conditions bring back interesting data, and not always for the reasons you might expect. I talked about some surprising test results in my previous blog post (read why AT&T’s 82% lift was so intriguing here) but wanted to dig a little deeper into the topic of hidden value.

 


 

Uncovering your data’s golden insights

So what does MVT data tell you?  It starts by identifying the winning creative in each test: the version of your website (or email copy, or call center scripts, or any manner of digital content) that drives the most conversions, or it might tell you what to avoid—what actually drops conversion rates.  Sometimes, after running a test, the winning creative[1] is incredibly obvious. Knowing this, marketers can make the change to their site, sit back, and enjoy the sweet result of higher conversion rates.

 

But that’s not the most interesting part of your data.  If you’re just looking for an obvious winner and then moving on to the next test, you’re missing out on the best stuff. The real golden insights are hidden in your data. They take a little extra time and effort to uncover, but they’re worth it.

 

Digging into your data

HP Optimost’s A/B and multivariate testing data tells you more than just which creative won or lost. It helps you evaluate why a creative performed as it did and—most importantly—in what context. 

 

One way to do this in Optimost is by applying filters to your reports: demographic, locational, behavioral (like page clicks), and other filters all provide a deeper understanding of your results.  The filtered data helps you (or your Optimost Managed Services team) understand in what other contexts similar results might occur. It tells our customers how to continue driving successful, optimized content to even more specific, targeted audiences.
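To make the idea of a demographic filter concrete, here’s a minimal sketch in Python. The record format and the numbers are entirely made up for illustration (Optimost’s actual data export looks nothing like this); the point is simply that the same raw results can be re-sliced by any attribute you captured, such as visitor country.

```python
from collections import defaultdict

# Hypothetical raw test records: (creative_id, country, converted).
# All values are invented for illustration only.
records = [
    ("28", "US", True), ("28", "US", False), ("28", "UK", False),
    ("24", "US", True), ("24", "UK", False), ("24", "UK", False),
    ("control", "US", True), ("control", "US", False),
    ("control", "UK", True), ("control", "UK", False),
]

def conversion_rates(records, country=None):
    """Conversion rate per creative, optionally filtered to one country."""
    hits = defaultdict(int)
    total = defaultdict(int)
    for creative, ctry, converted in records:
        if country is not None and ctry != country:
            continue  # skip records outside the filtered segment
        total[creative] += 1
        hits[creative] += converted
    return {c: hits[c] / total[c] for c in total}

print(conversion_rates(records))        # unfiltered results
print(conversion_rates(records, "UK"))  # UK-only slice of the same data
```

The unfiltered and UK-only dictionaries can rank the creatives very differently, which is exactly why the filtered views matter.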

 

Demonstrating data filters

Let’s say your test reveals one creative that, when filtered just to data from US visitors, appears to be an obvious winner. It outperforms the other creatives by a reasonable margin. This graph shows creative #28 driving more lift than any other option tested, with most of the creatives doing at least a little better than the control. (The vertical line at 0% lift represents the control.)

 

[DM1.png: lift by creative, filtered to US visitors]

 

But let’s say you filter your data down to UK visitors only.

 

[DM2.png: lift by creative, filtered to UK visitors]

 

All of a sudden, the conversion rates look way different.  Creative #28 is still up there, but it’s no longer the best.  And even more concerning, most of the creatives did worse than the control. Notice how most of the bars are to the left of the middle line at 0%?  If deployed as the winner, they could hurt business by decreasing conversion rates. Take a look at creative #24, for example: it was a decent choice for the US crowd, but is one of the worst for the UK.

 

To summarize, a winning creative might perform amazingly well with one audience but totally bomb with another.  And unless you’re digging in and evaluating your test data for more than just pure, simple conversion rates, you’re going to miss golden insights like this.
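The US/UK reversal described above can be sketched in a few lines of Python. The conversion rates here are invented to mirror the graphs (the real numbers aren’t published in this post); lift is simply each creative’s conversion rate relative to the control’s, so the control sits at 0%.

```python
def lift_vs_control(rates):
    """Percent lift of each creative over the control's conversion rate."""
    control = rates["control"]
    return {c: (r - control) / control * 100
            for c, r in rates.items() if c != "control"}

# Hypothetical segment-level conversion rates (fraction of visitors who
# converted), made up to mirror the US/UK reversal in the graphs above.
us_rates = {"control": 0.040, "24": 0.046, "28": 0.052}
uk_rates = {"control": 0.040, "28": 0.042, "24": 0.031}

print(lift_vs_control(us_rates))  # both creatives show positive lift
print(lift_vs_control(uk_rates))  # creative 24's lift turns negative
```

With these made-up numbers, creative #24 lifts conversions in the US slice but drops them in the UK slice, the same pattern that makes deploying a single “winner” everywhere risky.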

 

When, why, what, and how
Understanding when (and why) something works (or doesn’t) helps marketers make informed content targeting decisions going forward. The graphs above tell us, for example, exactly what content really resonates with (or irks) two significant audiences.  Next time the company wants to run a targeted campaign to the UK, having this insight on hand will make that campaign more successful from the start. 

 

So it’s true: MVT optimization with solutions like HP Optimost helps marketers identify the most effective, targeted content. But don’t stop there.  Dig in and let your data reveal all the valuable insights that come from filtering it from different angles. Understanding why your data is what it is can be the key to driving more conversions going forward.

 

#HPDMB

 
Edited by Robin Hardy


[1] In case it’s not clear, a “creative” is each variation of your content that’s being tested.  So if you’re trying to figure out whether button color affects sign-ups, you might have one creative with a red button, one with a blue button, and so on. The “control” creative is just the version of your page that you start with.

 

About the Author
Jenny Ryan is a Director of Product Marketing at HP Autonomy. She’s devoted to helping brands learn to love their data, understand their au...

