THE RKGBLOG

Test Idea for Display Ads

A recent iProspect study demonstrated the interrelationship between display ads and paid search. According to the study, 31% of users respond to display ads by clicking on the ad itself, and another 27% respond by searching for the retailer’s brand name. One-third of those who respond to display ads through either mechanism end up making a purchase from that retailer if they were already familiar with the brand, whereas “only” 14% bought if they didn’t know the company before seeing the display ad.

Wow, those are amazing numbers! They don’t jibe at all with the data we’ve seen. A 31% click-through rate for display ads? A 33% conversion rate?!? What the heck?

So, I started digging. Turns out this study was a survey of self-reported behavior from invited participants. Odd, doesn’t iProspect have actual data to study?

Measuring the incremental value of any marketing vehicle is challenging and display ads are doubly so. With search, you know the person searched for your products and visited your site through your ad. The connection is pretty clear.

The connection between seeing, for example, a CDW display ad on CNET and then buying from their site is much more tenuous. Some fraction of those folks would buy from CDW anyway, and the fact that an ad was waved in front of them had no influence whatsoever on their behavior.

Behavioral targeting exacerbates this problem. Show ads to folks who’ve already been to your site, give the impressions credit for whatever those folks buy next, and the display ads end up taking credit for all sales from frequent buyers, which is obviously absurd! I’d wager that the vast majority of those sales would happen without the display ads.

We have an idea for a test that might actually get at the question of incremental value.

{Ryan Gibson, our Director of Marketing, thinks this wasn’t his idea, but can’t remember where he heard it and can’t find anyone in the blog-o-sphere writing about it. Someone deserves credit for this, maybe Ryan, maybe not.}

A Test Idea for Evaluating the Incremental Lift of Display Ads*

The question is: how much do the impressions by themselves influence behavior?

Try running the following A/B test:

  • Version A is the advertiser’s display ad.
  • Version B is an ad for the advertiser’s favorite non-profit charity.

Track the post-interaction behavior of those folks who saw version A vs the folks who saw version B. Any improvement in subsequent behavior for version A could be reasonably credited to the display ad impression.

However, the display impressions only get credit for the difference between the two versions: it’s awfully hard to claim that an ad for the Red Cross motivated anyone to buy from Acme, so whatever the version B group buys is the baseline that would have happened anyway.
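
For concreteness, here is a minimal sketch of how the results of such a test might be scored. The group sizes and conversion counts are purely hypothetical, and the significance check is a simple two-proportion z-test, not a claim about how any particular vendor would analyze the data.

    # Hypothetical readout of the A/B test described above.
    # Group A saw the advertiser's display ad; group B saw the charity PSA.
    from math import sqrt

    impressions_a, buyers_a = 1_000_000, 1_200   # version A: advertiser's ad
    impressions_b, buyers_b = 1_000_000, 1_000   # version B: charity ad (baseline)

    rate_a = buyers_a / impressions_a
    rate_b = buyers_b / impressions_b
    incremental = rate_a - rate_b                # only the difference is credited to display

    # Two-proportion z-test: is the lift bigger than random noise?
    pooled = (buyers_a + buyers_b) / (impressions_a + impressions_b)
    se = sqrt(pooled * (1 - pooled) * (1 / impressions_a + 1 / impressions_b))
    z = incremental / se

    print(f"Incremental buyers per million impressions: {incremental * 1e6:.0f}")
    print(f"z-score: {z:.2f} (roughly, z > 2 suggests the lift is real)")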

Would any advertiser willingly bankroll this test knowing that half of the money is effectively a charity donation? Maybe, maybe not.

Perhaps Google would be willing to cover the costs of the charity ads to prove the value of Display?

Perhaps a display ad agency would be willing to cover the costs to prove the value?

Perhaps Forrester, ComScore, or Shop.org would sponsor the test as a service to advertisers?

RKG hopes to find a willing partner or three to try this test. If it turns out that display ads are far more valuable than the click-through data suggests, we will gladly add display ad management to our offerings. If not…

*Note: this test is geared toward online resellers more than manufacturers. Coke will never be able to measure the direct impact of an ad campaign this way; clearly the impact of Sharp Aquos commercials won’t show up on Sharp’s website. This test is for the folks using display ads for direct marketing purposes.

Comments
6 Responses to “Test Idea for Display Ads”
  1. Ken Treske says:

    George – love the article. The test methodology you describe is rare – but it is being done by some. Comscore and a few others (I think Yahoo recently published a one-time study) conducted a few one-off tests.

    We think it is the best way to really understand true incremental effect. We actually run this exact test for every one of our clients on a continuous basis. There’s really no other way to get the kind of insight all the parties need to make smart business decisions. If you want more info drop me a line and I can forward you a few more details.

    Thanks for raising the issue.

  2. Great to hear, Ken. I’m very interested to learn more about your research.

  3. Dylan says:

    Hi George,

    That is a very interesting test you suggested.

    I have been trying to figure out the value of view-through conversions on both Display and Remarketing campaigns.
    The question of what percentage of those view-through conversions would have happened regardless of the ad is very puzzling.

    For display campaigns, I also thought about running a test where ads are served in only a few states. Then, after enough data is collected, I can compare the pre- and post-campaign revenues for the test states and the control states (after deducting the click-through revenue for the test states) to see whether there is any meaningful incremental revenue from people who either went directly to the site or performed brand searches after “viewing” the banner.
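
    For illustration, the mechanics of that geo comparison might look something like the sketch below; every figure is hypothetical, and the control states are used only to estimate how the test states would have grown on their own.

        # Hypothetical geo-test readout: test states get display ads, control states do not.
        pre_test, post_test = 500_000.0, 540_000.0        # test-state revenue, before/after launch
        pre_control, post_control = 480_000.0, 495_000.0  # control-state revenue, same periods
        clickthrough_rev_test = 8_000.0                    # revenue already explained by ad clicks

        # Growth the test states would likely have seen anyway, estimated from control states
        expected_post_test = pre_test * (post_control / pre_control)

        # Lift beyond that baseline, net of what the clicks already account for
        view_driven_lift = (post_test - expected_post_test) - clickthrough_rev_test
        print(f"Estimated view-driven incremental revenue: ${view_driven_lift:,.0f}")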

    For Google remarketing campaigns, I saw a fairly strong correlation between view-through conversions per impression and click-through revenue per click dollar spent. The data set I used comes from the performance data for the top manual and automatic placements. Depending on how many placements I included (top 10, top 20, or top 30), the correlation ranges from 0.26 to 0.89. In other words, there is a positive correlation between the quality of clicks and the number of view-throughs per impression.
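
    In case it helps to see the calculation, this is roughly how that correlation is computed; the per-placement numbers below are made up.

        # Hypothetical per-placement data: view-through conversions per 1,000 impressions
        # and click-through revenue per dollar of click spend.
        import numpy as np

        vtc_per_k_impressions = np.array([0.8, 1.2, 0.5, 2.1, 1.7, 0.9, 1.4, 0.6, 1.9, 1.1])
        revenue_per_click_dollar = np.array([2.1, 2.6, 1.9, 4.8, 3.2, 2.9, 3.0, 2.2, 3.8, 2.4])

        r = np.corrcoef(vtc_per_k_impressions, revenue_per_click_dollar)[0, 1]
        print(f"Pearson correlation across placements: {r:.2f}")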

    SO THE QUESTION IS: for placements that have high view-throughs per impression, did the conversions happen because people who would have bought anyway tend to visit those placements, OR because those placements attract “buyer” demographic visitors who were “influenced” by the banners and thus performed a brand search (or went direct) to convert on the website?

    George, what is your company’s view on view-through conversions for display and remarketing campaigns respectively (since display and remarketing have very different view-through/impression ratios)?
    And what percentage of the display and remarketing view-through conversions get attributed to the banners? 5%? 30%?
    I know different verticals will have different numbers depending on the product nature, days to purchase, etc., but I’d love to see you elaborate on this topic or point me to an article that has addressed my questions.

    Sincerely

    Dylan

  4. Dylan says:

    Thanks for approving my comment. I am really curious and interested in hearing your thoughts! Thanks for being the thought leader you are in our industry.

  5. Dylan, thanks so much for your fabulous comments and really tough questions.

    First: I like the geo-targeting approach to the testing conceptually, but the challenge is often this: is the display ad spend in that geography big enough, relative to the size of that market, to be able to “see” the lift with confidence above the level of statistical noise? If you normally get $1,000,000 per day in sales from NYC, plus or minus 10%, would a $10,000 display buy show you anything? Doubtful.
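
    To put very rough numbers on that intuition (treating the $10,000 as a daily buy and, generously, assuming every dollar of it comes back as incremental sales):

        # Back-of-the-envelope: how long before a small lift stands out from daily noise?
        import math

        baseline = 1_000_000.0        # daily sales in the test geography
        noise_sd = 0.10 * baseline    # ~10% day-to-day swing
        daily_lift = 10_000.0         # optimistic incremental sales from the display buy

        # Days needed for the average lift to sit ~2 standard errors above the noise
        days_needed = (2 * noise_sd / daily_lift) ** 2
        print(f"~{math.ceil(days_needed)} days of data needed")   # roughly 400 days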

    Second: The correlation between CTR and view-through conversions is fascinating, and not surprising. If an ad catches someone’s eye and they’re interested, what do they do next? Some will click, some will type in the URL or search for the brand. Hence a higher CTR shows the ad is engaging, and would likely lead to more VTC as well.

    Last piece: the answer will vary, of course. One data point we saw recently in our data was this: 90% of VTC had some non-navigational interaction with another marketing channel between the view and the conversion, but for 10% of the VTC the user ended up on the website within 1 minute of the impression (with no click on the display ad). Pretty compelling evidence of the reality of VTC. That’s just one data point, but I think the order of magnitude is in that neighborhood.

    George

  6. Dylan says:

    Thanks for the quick feedback.

    First: I agree with you on the first point. So I am launching a split test serving regular remarketing/display ads vs public service ads (a charity of our choice :)…)

    Second: Makes sense. But the question becomes:
    Did the high-CTR, high-VTC placements/sites do well because they attract the right demographic and thus influenced their purchase process? In other words, if it weren’t for retargeting, these converted customers wouldn’t have come back to our site through direct, search, or other channels.

    OR

    These high-CTR, high-VTC sites did well because our “would have bought anyway” customers tend to visit these types of sites, and the remarketing ads only served as a reminder/branding and didn’t really influence the customers’ purchase decisions.

    Last piece: The data is great, 10% 1 minute VTC sounds awesome. Is it display or remarketing? And what percentage came from 1 hour VTC?