June 2011

Averages Lie: Part 27

Okay, it probably isn't actually part 27, but it is a recurring theme at RKG Blog to point out that good data analysis is the foundation of good marketing and bad data analysis is a waste of time. Thanks to Tim Peter for inspiring this latest edition.

One source of the problem is fixating on the wrong "KPIs".

For example, many folks seem to spend time worrying about CPCs and whether they're spending more or less per click on average day to day. This is wrong-headed on several levels, but let's just dive into one.

For simplicity's sake, let's reduce the world to two keywords: wagawaga and flimflam. The data from Day 1 is below:

Notice that wagawaga drives much more traffic than flimflam, but flimflam has a much higher CPC. Now let's assume bid management is rational and that the reason Acme is willing to pay $5.00 per click on "flimflam" is that the traffic is actually worth that much to them.

If we look at this simple two-keyword scenario in aggregate, we see that Acme's paid search program as a whole drove 1,100 clicks for $650, for an avg CPC of $0.59.
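The original Day 1 table is an image that isn't reproduced here, so the per-keyword split below is an assumption; the figures are chosen only to be consistent with the totals stated in the post (1,100 clicks, $650 spend, flimflam at $5.00 per click). A minimal sketch of the blended-CPC arithmetic:

```python
# Illustrative Day 1 data (assumed split; totals match the post:
# 1,100 clicks, $650 spend, flimflam at $5.00 per click).
day1 = {
    "wagawaga": {"clicks": 1000, "cpc": 0.15},
    "flimflam": {"clicks": 100,  "cpc": 5.00},
}

clicks = sum(kw["clicks"] for kw in day1.values())
cost = sum(kw["clicks"] * kw["cpc"] for kw in day1.values())
print(f"Total clicks: {clicks}, cost: ${cost:.2f}, avg CPC: ${cost / clicks:.2f}")
# Total clicks: 1100, cost: $650.00, avg CPC: $0.59
```

Note that the blended $0.59 sits nowhere near either keyword's actual CPC; it's a click-weighted average dominated by the high-volume term.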

Now let's say the smart paid search marketer knows that on Day 2 the value of traffic on the term wagawaga is going to be higher than normal because of a promotion, or because it's the traditional kickoff day of wagawaga season, or whatever. Anticipating that the value of traffic will be ~25% higher than normal, she bids the keyword wagawaga up by 25% to capture a larger share of traffic and maintain advertising efficiency while doing so.

The results from Day 2 are below:

The Director of Marketing storms into the paid search manager's office:

"Why, on a day when we were supposed to be pushing harder, did our average CPC actually fall?!? Heads must roll!"
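The Director's paradox is pure mix shift. The Day 2 figures below are hypothetical: wagawaga's CPC is up roughly 25% and its click volume up 50%, while flimflam is unchanged; yet the blended CPC falls, because cheap wagawaga clicks now make up a larger share of the total.

```python
# Hypothetical Day 1 and Day 2 data as (clicks, cpc); all figures assumed.
# On Day 2, wagawaga's CPC rises ~25% and its volume rises 50%;
# flimflam is untouched.
day1 = {"wagawaga": (1000, 0.15), "flimflam": (100, 5.00)}
day2 = {"wagawaga": (1500, 0.19), "flimflam": (100, 5.00)}

def avg_cpc(day):
    cost = sum(n * cpc for n, cpc in day.values())
    clicks = sum(n for n, _ in day.values())
    return cost / clicks

print(f"Day 1 avg CPC: ${avg_cpc(day1):.2f}")  # $0.59
print(f"Day 2 avg CPC: ${avg_cpc(day2):.2f}")  # $0.49 -- lower, despite higher bids
```

Every keyword's CPC was flat or up, yet the average dropped. That is the sense in which the aggregate number lies.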

To add to the picture, let's extend this to some other KPIs that aren't:

This adds fuel to the Director's fire: "CPCs are down, our conversion rate and average order size dropped...you've got some serious explaining to do!!!"
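The same mix-shift arithmetic drives the conversion rate complaint. With assumed per-keyword conversion rates held perfectly constant across both days, the blended rate still drops on Day 2, simply because the lower-converting (but still profitable) keyword gained share of traffic:

```python
# Assumed, constant per-keyword conversion rates; only the click mix changes.
conv_rate = {"wagawaga": 0.02, "flimflam": 0.10}
clicks_day1 = {"wagawaga": 1000, "flimflam": 100}
clicks_day2 = {"wagawaga": 1500, "flimflam": 100}

def blended_cr(clicks):
    orders = sum(n * conv_rate[kw] for kw, n in clicks.items())
    return orders / sum(clicks.values())

print(f"Day 1 blended CR: {blended_cr(clicks_day1):.2%}")  # 2.73%
print(f"Day 2 blended CR: {blended_cr(clicks_day2):.2%}")  # 2.50%
```

Nothing converted worse; the mix just got heavier in the high-volume, low-rate term. An analogous calculation produces the "falling" AOV.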

Thankfully, in this simple two-keyword campaign, the explanation is easy to see. We did indeed push harder on wagawaga, sales and costs increased appropriately, and everything went great on Day 2. And not only is it easy to see, it takes no time to pull the information.

Fast forward to a program with hundreds of thousands of keywords, and the answers to questions about fluctuations in conversion rates, CPC, AOV, CTR, average position, etc. become much more difficult to see and take hours upon hours to isolate. Moreover, we don't really care what happens to any of these "KPIs" because, in fact, they are not key performance indicators.

To my thinking, a Key Performance Indicator is something I need to care about for its own sake. Sales volume, the number of quality leads, expenses...these matter because they impact the P&L statement directly. Conversion rates, CTR, AOV, and many, many other metrics like page views, time on site, unique visitors, etc. are useful diagnostic measures that can help us identify problems and opportunities, but we don't care about them for their own sake, or at least we shouldn't.

Success in almost any venture demands making the best use of finite resources, and time is one of those finite resources. Responding to false alarms is one of the great sources of inefficiency and distraction in an organization. Some of this can be eliminated by keeping focused on the true KPIs that impact the CFO's world. If those numbers are in line with expectations and look reasonable, fluctuations in second-order statistics can be investigated or ignored depending on other priority tasks.

RKG's platform includes all kinds of warning flags to alert our analysts to anomalous behavior, but these warnings are calibrated to recognize some level of normal statistical variance and only throw flags when the variance is statistically meaningful. Too many 'false alarms' means the system isn't tuned properly.

The more time spent engaged in activities that drive the numbers, and the less time spent explaining variances in secondary indicators, the better the program will be over the long term.


12 Responses to "Averages Lie: Part 27"
Jim Novo says:
KPI's that are not really "Key" are way too common in web analytics. Unfortunately, not enough managers are insisting web analytics folks create a linkage to profit - the ultimate KPI. Why this is allowed to happen is a great mystery, but as we discussed at your conference, the root cause is probably compensation that in one way or the other is tied to the very same KPI's that are not Key. I can't find any other logical explanation for this strong defiance of KPI reality; it must be behavioral economics, people do what they are incented to do, even if they know it's irrational. CRM suffered from a very similar problem. The $64,000 (Million? Billion?) dollar question is this: What can we do about fixing these perverse incentive programs? Thanks for inviting me to speak at your great conference! Very smart and highly capable crowd.
Jim, thanks for your commentary on the blog, and superb talk at our Summit. Part of the problem is a desire for simplicity: "your job is to make conversion rate go up." Turning off all your online marketing aimed at new customers will achieve that, to your own detriment. By trying to simplify the irreducibly complex we end up with absurdities. I read a terrific article on education titled "Keep It Complicated" which pointed out that dumbing down the complexities of history or government to get students to regurgitate information on tests makes the subject matter dull and the kids uninterested. I do think individualized performance bonuses tend to create silos rather than promoting teamwork, and that is a big piece of the problem as you point out.
Jordan says:
What do you guys use to weed out "false alarms"? We're currently looking at using the control chart method (a point beyond 3σ flags an alarm, nine consecutive points above or below 0σ flags an alarm, at least 2 consecutive points beyond 2σ flags an alarm) mixed with checking the probability of an event to try to weed out false alarms. Do you have any suggestions? Thanks!
Hi Jordan, You folks are following the right path...or at least the path we went down. Tuning the alarm sensitivity levels is about striking a balance: too sensitive and you get tons of false positives; not sensitive enough and problems fester for longer than you'd like. There is no right answer, I don't think, just a matter of finding what feels like a good balance. Good luck!
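A minimal sketch of the three control-chart rules Jordan describes, written here in Python with an assumed fixed baseline mean and sigma (the helper name and the fallback of estimating the baseline from the data itself are illustrative choices, not anyone's production system):

```python
import statistics

def alarms(values, mean=None, sigma=None):
    """Flag indices per the rules in Jordan's comment: one point beyond
    3 sigma, nine consecutive points on the same side of the mean, or
    two consecutive points beyond 2 sigma on the same side. In practice
    the baseline mean/sigma should come from historical data, not from
    the window being tested."""
    mean = statistics.fmean(values) if mean is None else mean
    sigma = statistics.stdev(values) if sigma is None else sigma
    flagged = set()
    for i, v in enumerate(values):
        z = (v - mean) / sigma
        # Rule 1: a single point beyond 3 sigma
        if abs(z) > 3:
            flagged.add(i)
        # Rule 2: nine consecutive points on the same side of the mean
        if i >= 8:
            diffs = [x - mean for x in values[i - 8:i + 1]]
            if all(d > 0 for d in diffs) or all(d < 0 for d in diffs):
                flagged.add(i)
        # Rule 3: two consecutive points beyond 2 sigma, same side
        if i >= 1:
            z_prev = (values[i - 1] - mean) / sigma
            if min(abs(z), abs(z_prev)) > 2 and z * z_prev > 0:
                flagged.add(i)
    return sorted(flagged)
```

Tuning then amounts to adjusting the sigma thresholds and run lengths until the false-positive rate is tolerable for the team triaging the flags.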
Jim Novo says:
Perhaps what could be tried is a "tiered" work and reward system, which I have seen in operation at several analytically-intense companies. Essentially, it acknowledges the failure to develop this ideal system: a universal, company-wide set of self-reinforcing KPI's, none of which conflict with each other - e.g. success in a service KPI does not hurt but helps drive success in a Marketing KPI. Rather, the tiered system gives up on KPI conflict *but at the same time* measures it and rewards progress towards the self-reinforcing KPI set. A simple example would be having separate front end / sales / conversion and back-end / profit / expense analytics and reward schemes. Nothing really changes for management of the front end; conversion / sales are king and rewards allocated appropriately. However, trailing the front end by about 3 - 6 months is an analytics effort aimed at getting to the profit KPI's, which takes into account returns, service costs, etc. Additional carrots (or sticks?) are then allocated based on the true financial performance of the efforts. This approach prevents people from "gaming the system" and damaging the company, as often happens when only front-end KPI's are used, because the back end analytics are fed back to the front end in a continuous improvement loop. It's tough to get away with ratcheting up campaigns that drive high return rates or poor quality customers when profits are being tracked on the back end - and you can score extra bonus by optimizing the entire system, not just part of it.
Jim, this sounds like your next book!
David Burdon says:
I started business life as a marketing statistician 34 years ago. Working very closely with the production and logistics functions, I was part of a team that built the most profitable business in its market category. In terms of numbers of people we were swamped by our larger competitors. But the competitors kept insisting on measuring the wrong things and taking the wrong remedies. We measured quality and aimed to eliminate wastage. We measured customer satisfaction. We paid our workforce 20% more than the going rate and had lower levels of staff turnover and absence. We expected a price premium on our products. Our competitors were always looking to improve their performance but couldn't understand how our output efficiency and our profit margins were significantly higher. We took our cost advantage and applied it to improved quality and higher unit levels of marketing expenditure. In the end most of our competitors exited what was a huge market because their accountants and financial advisers insisted that better returns could be made elsewhere. Most measures in corporate financial packs are meaningless, but the executives involved in producing them carry on regardless.
David, thanks for your commentary. You've provided a great example of the power of tracking meaningful metrics and the danger of tracking the wrong ones. Thanks for stopping by!


Check out what others are saying...
[...] Averages Lie: Part 27, Rimm Kaufman [...]
[...] Michie helps dispel some important KPI myths in his PPC KPI post “Averages Lie: Part 27″. In it, he demonstrates a scenario where PPC results [...]
[...] Averages and aggregates lie, as I’ve argued many times, yet they are also an essential tool of managing a paid search program. Even the manager in the weeds cannot look at the most granular performance data and make any sense of it. We have to look at data aggregated over time, by category, by subcategory, by match type, by geography, manufacturer, the list goes on and on. To really understand what’s going on we have to study the data six ways to Sunday to get the complete picture, but that can create problems for reporting out results to senior management. [...]
