Sergio Garcia won The Masters. He didn’t get to take a mulligan. But just because we can, doesn’t mean we should. This is how a conversation with me might go:
“Where were you today?” ….“Playing golf.”
“What did you hit?” … “75”
“Very good. Did you take any mulligans?”…“25 of them”
All of a sudden, your golf game doesn’t look so good. Your measurement is not an accurate reflection of reality. And that deception hurts our options going forward.
What is a mulligan? It’s an extra stroke allowed after a poor golf shot, not counted on the scorecard.
We do this to our own performance metrics all the time, taking out the things we don’t want to count. I was walking through a factory in Malaysia several years ago, and they talked about their lead times, which sounded very impressive for their industry at only a few days. As we walked through the factory, we came to a room where half-finished product was stored and re-prioritized before going through the rest of the factory. That room alone held several days’ worth of inventory. I asked them how this fit with their factory’s lead-time performance, and their response was “oh, this inventory doesn’t count against our lead time.” In other words, they were kidding themselves about their performance. It wasn’t nearly as good as they told people, nor as good as they told themselves.
Perhaps the most dangerous form of this is the financial “one-time charge.” We roll up our collection of managerial mistakes, whether they are expansions into the wrong market or mismanaged growth of resources, package them all up as a one-time charge to our financials, and report it, but we also report how things look without it. So we say “earnings were $0.53 per share, but minus the $154M one-time charge, our earnings were actually $2.38 per share.” Is this reality, or manipulation? Well, it’s legal manipulation. A study of AT&T back in the 1990s demonstrated that over a long time period, AT&T looked like it made money, but if you truly paid attention to the one-time charges, it lost money year after year.
When you measure on-time performance, is it to the customer-request date, or did you change the date three times along the way after letting your customer know you couldn’t ship when they wanted? What are you really measuring: your actual shipping performance, or your ability to convince the customer that it’s OK that you’re late? So the number is lower. So what? It is what it is, and there are valid reasons for it. Fix what you can, and don’t spend too much time on what you can’t.
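To make the difference concrete, here is a minimal sketch (with made-up shipment data and field names invented for illustration) of how the same shipments score very differently depending on whether you measure against the customer’s original request date or the last renegotiated promise date:

```python
from datetime import date

# Hypothetical shipment records: the customer's original request date,
# the final (possibly renegotiated) promise date, and the actual ship date.
shipments = [
    {"requested": date(2024, 3, 1),  "promised": date(2024, 3, 20), "shipped": date(2024, 3, 18)},
    {"requested": date(2024, 3, 5),  "promised": date(2024, 3, 5),  "shipped": date(2024, 3, 4)},
    {"requested": date(2024, 3, 10), "promised": date(2024, 4, 2),  "shipped": date(2024, 4, 1)},
    {"requested": date(2024, 3, 12), "promised": date(2024, 3, 12), "shipped": date(2024, 3, 15)},
]

def on_time_pct(records, reference):
    """Percent of shipments that went out on or before the chosen reference date."""
    hits = sum(1 for r in records if r["shipped"] <= r[reference])
    return 100.0 * hits / len(records)

print(on_time_pct(shipments, "promised"))   # → 75.0, measured against renegotiated dates
print(on_time_pct(shipments, "requested"))  # → 25.0, measured against what the customer asked for
```

Same shipments, same code, two very different numbers. The second one is the one that tells you where you really stand.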
This happens in many small ways. We focus on the mulligans, the reasons our results don’t really count. But this doesn’t help us improve. We need to know where we really stand. Reality is reality, and abstraction is not reality. The more we manipulate our metrics away from reflecting reality, the more we take away our ability to improve.