Reporting

We Put Last-Click Attribution to the Test, and the Results Will Surprise You

Written by Tim Edmundson

Apr 18, 2018 · 7 Min Read

There’s a hard-to-kill myth in digital advertising — the last click is king. The last-click attribution model is seen by advertisers as the ultimate arbiter when it comes to giving credit where it’s due. There’s just one problem with that: it’s wrong. Yes, last clicks are important and obviously deserve recognition, but to give them all the credit is a narrow view of attribution that can actually hamper your ability to further connect with your audience.

The truth is, the ad that gets the last click doesn’t deserve all the credit for a conversion. It deserves some, of course, but to give it all is akin to working on a group project and only giving the person who hands it in an A.

In this article, we’ll cover how:

> Google’s view of attribution has evolved, and the rest of the industry should follow suit.

> SteelHouse has promoted going beyond the click for some time, and our proprietary technology, Verified Visits, does just that.

> In testing Verified Visits, we saw sizable increases in performance compared to last-click attribution, with triple-digit percentage increases in ROAS and click ROAS, and large drops in CPA and eCPA.

> Advertisers that insist on last-click attribution end up overpaying for performance.

Let’s explore why advertisers can expect these kinds of results when they move away from last-click attribution.

Google’s Data Attribution Model

We’ve been beating the drum about going beyond the click for some time, and thankfully, so has Google. As the de facto industry leader, Google has the power to bring change to the ad world. And we say that change is well overdue.

Google’s data-driven attribution, or DDA, takes a different approach to assigning credit for conversions driven by its ads. According to Google’s AdWords blog, DDA uses conversion data to calculate the actual contribution each marketing touchpoint made along the conversion path. Then, by comparing the paths of customers who convert to those who don’t, DDA determines which touchpoints are actually making an impact.
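
To make that idea concrete, here’s a deliberately simplified sketch of the principle behind data-driven attribution. It is not Google’s actual algorithm (which is considerably more sophisticated), and the sample paths are made up; it only shows the core comparison between converting and non-converting paths.

```python
# A deliberately simplified illustration of data-driven attribution.
# Google's DDA is more sophisticated; this only shows the core idea:
# compare paths that include a touchpoint with paths that don't, and
# credit the touchpoint in proportion to the conversion lift it adds.

paths = [
    # (touchpoints on the path, did the customer convert?) -- made-up data
    (["display", "email", "search"], True),
    (["display", "search"], True),
    (["search"], True),
    (["email"], False),
    (["display", "email"], False),
    (["search", "email"], False),
]

def conversion_rate(subset):
    return sum(converted for _, converted in subset) / len(subset) if subset else 0.0

channels = {ch for touchpoints, _ in paths for ch in touchpoints}
lift = {}
for channel in channels:
    with_ch = [p for p in paths if channel in p[0]]
    without_ch = [p for p in paths if channel not in p[0]]
    # How much better do paths containing this channel convert?
    lift[channel] = max(conversion_rate(with_ch) - conversion_rate(without_ch), 0.0)

# Normalize the lifts into credit shares that sum to 1, so conversion
# value can be split across touchpoints instead of going to the last click.
total = sum(lift.values()) or 1.0
credit = {ch: round(value / total, 2) for ch, value in lift.items()}
print(credit)
```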

Google has been tracking DDA’s performance since it first launched in May of 2016, and has since discovered that, compared to last-click attribution, DDA delivers more conversions at a similar cost. This is huge, because it suggests last-click attribution is leaving performance on the table. By tracking a customer’s true purchasing journey, brands gain a better understanding of what is truly driving conversions, and credit is more accurately assigned.

SteelHouse Verified Visits

We firmly believe in attribution that tells the full story, which is why we created Verified Visits. Verified Visits is SteelHouse’s proprietary technology that measures any user visits to your site following the guaranteed in-view display of your ads, in a window of time defined by you. It’s an attribution model that goes beyond the click.
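
SteelHouse doesn’t publish the internals of Verified Visits, but conceptually the model is straightforward: a site visit counts when it follows an in-view ad impression within the advertiser-defined window. The sketch below illustrates that matching logic with hypothetical data and field names; it is an illustration of the concept, not our production implementation.

```python
# Conceptual sketch of matching site visits to prior in-view impressions
# within an advertiser-defined window. Data and field names are hypothetical.
from datetime import datetime, timedelta

ATTRIBUTION_WINDOW = timedelta(hours=24)  # window of time defined by the advertiser

# In-view impressions: (user_id, time the ad was measured as viewable)
impressions = [
    ("user_1", datetime(2018, 4, 1, 9, 0)),
    ("user_2", datetime(2018, 4, 1, 10, 30)),
]

# Site visits: (user_id, time of the visit)
visits = [
    ("user_1", datetime(2018, 4, 1, 20, 15)),  # ~11 hours after an in-view ad
    ("user_2", datetime(2018, 4, 3, 8, 0)),    # outside the 24-hour window
    ("user_3", datetime(2018, 4, 1, 12, 0)),   # never saw an in-view ad
]

def verified_visits(impressions, visits, window):
    """Keep visits that occur within `window` after an in-view impression."""
    verified = []
    for user, visit_time in visits:
        saw_ad_recently = any(
            imp_user == user and timedelta(0) <= visit_time - imp_time <= window
            for imp_user, imp_time in impressions
        )
        if saw_ad_recently:
            verified.append((user, visit_time))
    return verified

print(verified_visits(impressions, visits, ATTRIBUTION_WINDOW))
# Only user_1's visit qualifies in this sample data.
```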

We’ve been tracking the performance of Verified Visits compared to click-based attribution models, and the results have been impressive. We tested a simple hypothesis: would focusing on ad clickers yield better results, or would an emphasis on Verified Visits prove the better approach? The results were compelling: performance was stronger with Verified Visits.

Here’s how we did it:

> We started by splitting all available publisher sites into two lists: one that favored Verified Visits (VV) sites, and another that favored click-heavy sites.

> For the VV list, we primarily focused on sites with a low CPA, a low click CPA, and a large number of impressions; CTR was also given a small amount of weight. We then weighted each metric based on what we perceived to be most effective, and ranked the sites on the resulting score (a simplified sketch of this scoring appears after the list).

> For our click-based list, we made CTR the primary focus, with a low CPA and a high number of impressions also bearing some weight (though not as much). We weighted and ranked this list the same way.

> For the VV list, we served ads to the top 20% of the ranked sites and saw a huge spike in total conversions, but NOT in click conversions. The campaigns saw significant jumps in ROAS (as high as 128%) and big reductions in both click CPA and eCPA (a significant 69% drop). Digging into our data, we found the strong-performing sites on this list were high-quality inventory, including the BBC, NY Times, Forbes, and more.

> For the click list, we saw increases in click conversions and revenue, but overall performance was not as strong as with the VV list. The quality of inventory was not stellar either: sites with high click performance tended to have poor user experiences (marginally non-brand-safe, ad-heavy, promoting conspiracy theories, etc.) and were not household names.
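
Here’s the simplified sketch of the weighting and ranking described above. The actual metrics, weights, and thresholds we used aren’t spelled out in this article, so the values and site names below are illustrative only.

```python
# Hypothetical sketch of weighted scoring and ranking of publisher sites.
# Metrics are assumed to be pre-normalized to 0-1, where higher is better
# (so "low_cpa" close to 1 means the site's CPA is very low).

# Verified Visits (VV) list: low CPA and click CPA dominate, impressions
# matter, and CTR gets a small consideration.
VV_WEIGHTS = {"low_cpa": 0.35, "low_click_cpa": 0.30, "impressions": 0.25, "ctr": 0.10}

# Click-based list: CTR is the primary focus, CPA and impressions bear less weight.
CLICK_WEIGHTS = {"ctr": 0.50, "low_cpa": 0.25, "impressions": 0.25, "low_click_cpa": 0.0}

def score(site_metrics, weights):
    """Combine normalized metrics into a single weighted score."""
    return sum(weights.get(metric, 0.0) * value for metric, value in site_metrics.items())

def top_sites(sites, weights, fraction=0.20):
    """Rank sites by weighted score and keep the top fraction (e.g. 20%)."""
    ranked = sorted(sites, key=lambda s: score(s["metrics"], weights), reverse=True)
    cutoff = max(1, int(len(ranked) * fraction))
    return ranked[:cutoff]

sites = [
    {"name": "publisher_a", "metrics": {"low_cpa": 0.9, "low_click_cpa": 0.8, "impressions": 0.7, "ctr": 0.2}},
    {"name": "publisher_b", "metrics": {"low_cpa": 0.3, "low_click_cpa": 0.2, "impressions": 0.6, "ctr": 0.9}},
    {"name": "publisher_c", "metrics": {"low_cpa": 0.6, "low_click_cpa": 0.5, "impressions": 0.9, "ctr": 0.4}},
]

print([s["name"] for s in top_sites(sites, VV_WEIGHTS)])     # VV-weighted ranking
print([s["name"] for s in top_sites(sites, CLICK_WEIGHTS)])  # click-weighted ranking
```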

With this test, we discovered two effective ways of driving performance. Verified Visits, however, outperformed the click-based model and still hit clients’ goals while pairing their ads with premium publishers and content.

Our testing led us to include two options for brands in the SteelHouse Advertising Suite: one that focuses on clicks, and another that focuses on Verified Visits. Depending on a brand’s preference, they can select either and still see excellent performance. That said, based on our data, we can’t recommend the Verified Visits approach enough. And best of all, Verified Visits performance can be validated in third-party analytics platforms like Google Analytics.

Better Attribution Gives a Bigger Picture

Last-click attribution creates tunnel vision, and can actually limit your overall marketing performance because it ignores integral parts of your marketing mix. Your campaign’s performance doesn’t boil down to the last action taken, and you need an attribution model that takes that into account.

So whether it’s SteelHouse Verified Visits or something like Google’s DDA, take your understanding of your campaigns a step further, and go beyond the click.