If you accept that you will make some errors, you’ll probably make fewer errors overall.

In this post, I wrote down some ideas about why people don’t trust algorithms (by which I mean sets of decision-making rules). I speculated that people don’t trust algorithms in part because of a desire to maintain control over their lives; we want our decision-making to matter.

But the research pointed to the idea that people don’t trust algorithms because they hope for perfection in their decision-making. If you accept a set of rules, they’ll be wrong in at least some cases, and the whole point of accepting rules is that you don’t keep changing them to compensate for their defects. So almost any algorithm will inevitably be wrong, at least sometimes.

Here’s a paper by Hillel Einhorn, “Accepting Error to Make Less Error”, that talks more about this. Einhorn breaks decision-making into two approaches: the clinical and the statistical.

  • The clinical approach aims for perfect prediction: it tries to develop a causal model of what is going on, so that every case can be predicted correctly. Imagine using data about a car to tune its engine, based on a detailed understanding of exactly how the engine works.
  • The statistical approach doesn’t aim for perfect prediction, and doesn’t try to develop a model of why things happen the way they do. But in many cases it will predict better, because a reasonable causal model may not exist. Imagine trading stocks: it’s impossible to explain (or to predict) many moves in the market, but a simple algorithm, such as investing in an index fund, will work well over the long term (see the sketch after this list).
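
Here’s a minimal sketch of the contrast, in Python. It’s my own toy example, not something from Einhorn’s paper, and everything in it (the simulated data, the cubic fit, the mean rule) is an assumption made for illustration: outcomes depend partly on a cause we never observe, the “clinical” strategy fits a flexible model to a small sample and predicts each case from that model, and the “statistical” strategy accepts error up front and always predicts the same fixed value.

    # Toy simulation (not from Einhorn): outcomes depend on observable cues,
    # an unobservable cause, and noise.
    import numpy as np

    rng = np.random.default_rng(0)
    n_train, n_test = 30, 10_000

    x = rng.normal(size=(n_train + n_test, 3))      # observable cues
    hidden = rng.normal(size=n_train + n_test)      # cause we never get to see
    y = 0.5 * x[:, 0] + hidden + rng.normal(scale=2.0, size=n_train + n_test)

    x_tr, y_tr = x[:n_train], y[:n_train]
    x_te, y_te = x[n_train:], y[n_train:]

    def poly_features(m):
        # cues, their squares and cubes, plus an intercept column
        return np.hstack([m, m**2, m**3, np.ones((len(m), 1))])

    # "Clinical" strategy: fit a flexible model to the small sample, trying to
    # account for every training case, then predict each new case from it.
    coef, *_ = np.linalg.lstsq(poly_features(x_tr), y_tr, rcond=None)
    clinical_pred = poly_features(x_te) @ coef

    # "Statistical" strategy: accept error up front and always predict the
    # training mean, never adjusting case by case.
    statistical_pred = np.full(n_test, y_tr.mean())

    print("clinical    MSE:", float(np.mean((clinical_pred - y_te) ** 2)))
    print("statistical MSE:", float(np.mean((statistical_pred - y_te) ** 2)))

With only thirty training cases and an unobservable cause, the flexible model tends to chase noise, so the crude fixed rule usually ends up with the lower out-of-sample error. That’s the sense in which accepting error can make less error.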

Einhorn says that both approaches have their merits; which one is better depends on your model of reality.

  • If nature is a system, and we can come to know that system, it’s better to make predictions based on a systematic understanding that we develop and refine (clinical).
  • If nature is random, or unknowable, it’s better to pick an algorithm in advance and stick with it (statistical).

Gerd Gigerenzer puts this a different way:

If risks are known, good decisions require logic and statistical thinking. If some risks are unknown, good decisions also require intuition and smart rules of thumb.