Sometimes a lot of data can be meaningless; at other times a single piece of information can be very meaningful. It is true that a thousand days cannot prove you right, but one day can prove you wrong.

Consider a turkey that is fed every day. Every single feeding will firm up the bird’s belief that it is the general rule of life to be fed every day by friendly members of the human race. On the afternoon of the Wednesday before Thanksgiving, something unexpected will happen to the turkey. It will incur a revision of belief. The turkey is fed by the butcher for a thousand days, and every day the turkey pronounces with increased statistical confidence that the butcher “will never hurt it”—until Thanksgiving, which brings a Black Swan revision of belief for the turkey.

Its confidence increased as the number of friendly feedings grew, and it felt increasingly safe even though the slaughter was more and more imminent. Consider that the feeling of safety reached its maximum when the risk was at its highest! But the problem is even more general than that; it strikes at the nature of empirical knowledge itself. Something has worked in the past, until—well, it unexpectedly no longer does, and what we have learned from the past turns out to be at best irrelevant or false, at worst viciously misleading.
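The turkey's rising confidence can be made concrete. As a sketch (my assumption, not a formula from the book), use Laplace's rule of succession: after n uninterrupted friendly days, the estimated probability that the next day is also friendly is (n + 1) / (n + 2) — which climbs toward certainty precisely as Thanksgiving approaches.

```python
def confidence_after(n_friendly_days: int) -> float:
    """Estimated probability that the next day is friendly, given
    n_friendly_days of uninterrupted feeding (Laplace's rule of
    succession: (n + 1) / (n + 2))."""
    return (n_friendly_days + 1) / (n_friendly_days + 2)

# The turkey's confidence grows with every benign observation,
# even though none of those observations bear on day 1,001.
for day in (1, 10, 100, 1000):
    print(f"day {day:>4}: confidence = {confidence_after(day):.4f}")
```

By day 1,000 the estimate is about 0.999 — the statistical confidence is maximal at exactly the moment the inference is about to fail.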

[Figure: a hypothetical variable (the turkey's well-being) observed over 1,000 days, ending in an abrupt drop on day 1,001]

The figure above provides the prototypical case of the problem of induction as encountered in real life. You observe a hypothetical variable for one thousand days. It could be anything (with a few mild transformations): book sales, blood pressure, crimes, your personal income, a given stock, the interest on a loan, or Sunday attendance at a specific Greek Orthodox church. You subsequently derive solely from past data a few conclusions concerning the properties of the pattern with projections for the next thousand, even five thousand, days. On the one thousand and first day—boom! A big change takes place that is completely unprepared for by the past.

The biases we can derive from the turkey paradox are:

  • Mistaking absence of evidence for evidence of absence
  • Overconfidence in using the past to predict the future
  • The illusion of understanding (or how everyone thinks he knows what is going on in a world that is more complicated, or random, than they realize)


  1. Nassim Nicholas Taleb, The Black Swan.