Sunday, January 15, 2012

Kahneman’s Thinking, Fast and Slow

Everyone is reading Daniel Kahneman's "Thinking, Fast and Slow" (2011). A passage in a recent New Zealand Court of Appeal decision, the details of which are currently suppressed, raises questions about the right way to think about propensity evidence.

The case citation is [2011] NZCA 645 and the date of the decision is 14 December 2011. I will call it X v R. I will also adapt the quotation from para [34] of the judgment to comply with the order suppressing identifying particulars of the appellant:

"... it is an unlikely coincidence that Mr X, twice within a year, would be the hapless and innocent victim of being apprehended driving a car with [other people in it and also with evidence of criminal offending in it]. The evidence goes directly and cogently to the key issue: did Mr X know of the [items] found in the car he was driving on [the second occasion]?"

While the conclusion that the evidence had sufficient probative value to be admissible is intuitively correct, this form of reasoning invites several of the thinking errors that Kahneman discusses.

There is a tendency to draw strong conclusions from incomplete information (the "what-you-see-is-all-there-is" error). We are not told anything about the frequencies that matter in this case: how often do people who are guilty of the present sort of offending have a recent previous apprehension of this kind, and how often do people who are innocent of such offending have one?
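As an illustration only, the two frequencies combine into a likelihood ratio. The figures in the following sketch are invented; the judgment supplies no real-world numbers of this kind:

```python
# Hypothetical frequencies, invented purely for illustration; the judgment
# supplies no real-world figures of this kind.
p_prior_if_guilty = 0.30    # P(recent prior apprehension | guilty of the present offending)
p_prior_if_innocent = 0.02  # P(recent prior apprehension | innocent of the present offending)

# The likelihood ratio measures how much more probable the earlier apprehension
# is on the hypothesis of guilt than on the hypothesis of innocence.
likelihood_ratio = p_prior_if_guilty / p_prior_if_innocent
print(f"Likelihood ratio: {likelihood_ratio:.0f}")  # 15 on these invented figures
```

Until both frequencies are estimated, the intuition that the coincidence is "unlikely" has nothing to rest on.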

There is a substitution error: we tend to answer a difficult question by answering a much simpler related question instead. Here it is easy to answer the question about the recent apprehension and to apply that answer to the more difficult question of guilt on the present occasion. This is closely related to another error.

Base-rate neglect is the error of disregarding statistical likelihoods in favour of what seems causally plausible. The other evidence in the case, relating directly to guilt on the present occasion, may significantly affect the strength of our tendency to see a causal connection between the first apprehension and the second.

Another error is the halo effect, or in the present context what might be called the devil's horns effect. Having learnt something bad about the defendant's behaviour on an earlier occasion, we are tempted to overemphasise this when we consider his present guilt.

Further, there is the narrative fallacy: we are tempted to accept whatever we can build into a story that makes sense, although the events may in reality be unconnected. The defendant may have been innocent on the earlier occasion, not knowing that the things were in the car. The coincidence may be genuine, but a genuine coincidence does not suit the story we are tempted to build, in which we cast the defendant as a recidivist.

This is similar to another error, the representativeness bias: where only partial information is available we lean heavily on stereotypes, judging how likely something is by how well it fits a familiar type rather than by the relevant frequencies.

For a review of Kahneman's book summarising these and other thinking errors, see the article in the New Zealand Listener, January 21-27, 2012, by David Hall.

In the above case, where the issue was the defendant's knowledge of the presence of the things in the car, it is easy to build a narrative in which the defendant, being ignorant on both occasions, was simply associating with people who were both his friends and offenders. That too would be a combination of thinking errors.

Courts too often make assumptions about likelihoods without inquiring into occurrences in the real world. The correct approach is Bayesian, as Kahneman – a leading psychologist and Nobel laureate – recognises. But that requires the effort of careful analytical thought rather than our preferred instinctive assessment of circumstances.
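A minimal sketch of what that Bayesian effort involves follows, again using invented figures: the likelihood ratio from the illustration above, and an assumed prior based on the other evidence in the case.

```python
# Invented figures for illustration only; a real analysis would require
# evidence-based estimates of both quantities.
prior_odds_of_guilt = 1.0   # odds of guilt on the other evidence alone (1:1 assumed here)
likelihood_ratio = 15.0     # from the earlier apprehension, as sketched above

# Bayes' theorem in odds form: posterior odds = prior odds x likelihood ratio.
posterior_odds = prior_odds_of_guilt * likelihood_ratio
posterior_probability = posterior_odds / (1 + posterior_odds)
print(f"Posterior probability of guilt: {posterior_probability:.2f}")  # about 0.94
```

On invented numbers the arithmetic is trivial; the effort lies in finding defensible real-world estimates, which is precisely the inquiry that instinctive assessment skips.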