Make Your Point > Archived Issues > GOODHART'S LAW
In 1975, the economist Charles Goodhart wrote: "Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes." People often paraphrase his idea like this: "When a metric becomes a target, it ceases to be a good metric."
Part of speech: noun (the proper kind: "Goodhart's law").
Since it's rare, you'll probably want to gloss it for your readers (define it as you use it), like in both examples below.
"Goodhart's law states that when a measure becomes a target, it ceases to be a good measure. In other words, if you pick a measure to assess performance, people find a way to game it. To illustrate, I like the (probably apocryphal) story of a nail factory that sets 'Number of nails produced' as their measure of productivity and the workers figure out they can make tons of tiny nails to hit the target easily."
Explain the meaning of "Goodhart's law" without saying "metrics fail because people game them" or "statistics quit being useful because people manipulate them."
(Source)
Try to spend 20 seconds or more on the game below. Don't skip straight to the review; first, let your working memory empty out.
1. The near opposite of Goodhart's law would be the idea that you can ____________.