Intuitive Probabilities – Conjunction Malfunction

In a recent post I wrote about Vic, who might not look like a Christian, but probably is one. The Vic example reminded me of a famous study of unintuitive probabilities done in 1983. Amos Tversky and Daniel Kahneman surveyed students at the University of British Columbia using something similar to my Vic puzzle:

Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.
Which is more probable?

A.    Linda is a bank teller.
B.    Linda is a bank teller and is active in the feminist movement.

About 90% of students said (B) was more probable. Mathematicians point out that, without needing to know anything about Linda, (A) has to be at least as probable as (B). Thinking otherwise is the conjunction fallacy. It’s simple arithmetic. The probability of a conjunction, P(A&B), cannot exceed the probability of either of its constituents, P(A) or P(B), because the extension (possibility set) of the conjunction is included in the extension of each constituent. In a coin toss, the probability of heads has to be at least as large as the probability of heads AND rain later today.

Putting numbers to Linda, one might guess there’s a 1% probability that Linda, based on the description given, is a bank teller, but a 99% probability that she’s a feminist. Even so, 1% is still a bigger probability than that of the conjunction. Treating the two as roughly independent, “1% AND 99%” means 1% times 99%, or 0.99% – a tad less than 1% – and no amount of dependence between the two can push the conjunction above 1%.
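For the arithmetic-minded, here’s a minimal sketch of that calculation in Python; the 1% and 99% figures are just the guesses above, not data:

```python
p_teller = 0.01      # guessed P(Linda is a bank teller)
p_feminist = 0.99    # guessed P(Linda is active in the feminist movement)

# If the two were independent, the conjunction would be the product:
p_both_if_independent = p_teller * p_feminist
print(f"{p_both_if_independent:.4f}")   # 0.0099 -- a tad less than 1%

# Whatever the dependence between them, the conjunction can never
# exceed either constituent:
p_both_upper_bound = min(p_teller, p_feminist)
print(f"{p_both_upper_bound:.4f}")      # 0.0100 -- the best the conjunction can do
```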

So why does it seem like (B) is more likely? Lots of psychological and semantic reasons have been proposed. For example, in normal communication we usually obey some unspoken principle of relevance; a sane person would not mention Linda’s marital status, political views and values if they were irrelevant to the question at hand – which somehow seems to have something to do with Linda’s profession. Further, humans learn pattern recognition and apply heuristics. It may be fair inductive reasoning, based on past evidence, to conclude that women active in the feminist movement are more likely than those who are not to major in philosophy, be single, and be concerned with discrimination. This may be a reasonable inference, or it may just prove you’re a sexist pig for even thinking such a thing. I attended a lecture at UC Berkeley where I was told that any statement by men that connects attributes (physical, ideological or otherwise) to any group (except white men) constituted sexism, racism or some otherism. This made me wonder how feminists are able to recognize other feminists.

In any case, there are reasons students would not give the mathematically correct answer about Linda beyond the possibility that they are mathematically illiterate. Tversky and Kahneman tried various wordings of the problem, getting pretty much the same results. At some point they came up with a statement of the problem that seems to drive home the point that they were after a mathematical interpretation:

Argument 1: Linda is more likely to be a bank teller than she is to be a feminist bank teller, because every feminist bank teller is a bank teller, but some bank tellers are not feminists, and Linda could be one of them.

Argument 2: Linda is more likely to be a feminist bank teller than she is likely to be a bank teller, because she resembles an active feminist more than she resembles a bank teller.

In this case 65% of students chose the resemblance argument (2), despite its internal logical flaw. Note that argument 1 spells out exactly why the conjunction fallacy is invalid, while argument 2 doesn’t really make much sense.

Whatever the reason we tend to botch such probability challenges, there are cases in engineering that are surprisingly analogous to the Linda problem. For example, when building a fault tree (see fig. 1), your heuristics can make you miss event dependencies and common causes between related failures. If an aircraft hydraulic brake system accumulator fails by exploding instead of by leaking, and in doing so severs a hydraulic line, an “AND” relationship disappears: what appeared to be P(A&B) – two independent failures that must both occur – becomes simply P(A), a single event. Such logic errors can make calculated probabilities of catastrophe off by factors of thousands or millions. This is bad when lives are at stake. Fortunately, engineers apply great skill and discipline to modeling this sort of thing. We who fly owe our lives to good engineers. Linda probably does too.
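To make that concrete, here’s a toy sketch in Python. The failure rates are made-up numbers for illustration, not from any real aircraft analysis:

```python
# Hypothetical fault-tree fragment: loss of braking pressure requires the
# accumulator to fail AND the hydraulic line to fail. Rates are per flight
# hour and entirely invented for illustration.
p_accumulator_fails = 1e-5
p_line_fails = 1e-5

# Naive AND gate, treating the two basic events as independent:
p_top_naive = p_accumulator_fails * p_line_fails
print(f"naive:        {p_top_naive:.1e}")         # 1.0e-10 -- looks comfortably remote

# But suppose 10% of accumulator failures are explosive, and an exploding
# accumulator severs the line. The "AND" collapses: one event produces
# both failures, and it dominates the top-event probability.
p_explosive_accumulator = 0.1 * p_accumulator_fails
p_top_common_cause = p_explosive_accumulator + p_top_naive
print(f"common cause: {p_top_common_cause:.1e}")  # 1.0e-06 -- about 10,000x worse
```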

– – –

Fig. 1. Segment of a fault tree for loss of braking in a hypothetical 8-wheeled aircraft, built with FTA software I authored in 1997. This fault tree addresses only a single Class IV hazard in aircraft braking – uncontrolled departure from the end of the runway due to loss of braking during a rejected takeoff. It calculates the probability of this “top event” as being more remote than the one-per-billion-flight-hours probability limit specified by the guidelines of FAA Advisory Circular 25.1309-1A, 14CFR/CS 25.1309, and SAE ARP4754. This fault tree, when simplified by standard techniques, yields about 200,000 unique cut sets – combinations of basic events leading to the catastrophic condition (a rough sketch of cut-set generation follows the figure).


– – –
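For the curious, here’s a rough sketch in Python of what turning a fault tree into cut sets looks like. The little AND/OR tree below is hypothetical and nothing like the real tree in the figure, and real FTA tools do this far more efficiently before computing probabilities:

```python
def cut_sets(node):
    """Expand a nested AND/OR tree into a list of cut sets (sets of basic events)."""
    if isinstance(node, str):                  # a basic event
        return [frozenset([node])]
    gate, *children = node
    child_sets = [cut_sets(c) for c in children]
    if gate == "OR":                           # any child's cut sets cause the gate
        return [cs for sets in child_sets for cs in sets]
    if gate == "AND":                          # combine one cut set from each child
        result = [frozenset()]
        for sets in child_sets:
            result = [a | b for a in result for b in sets]
        return result
    raise ValueError(f"unknown gate {gate!r}")

def minimize(sets):
    """Keep only minimal cut sets (drop any set that contains another as a subset)."""
    return [s for s in sets if not any(t < s for t in sets)]

# Hypothetical example: braking is lost if the normal AND alternate systems
# both fail, and each can fail on its own or via a shared loss of hydraulic
# pressure (a common cause).
tree = ("AND",
        ("OR", "normal_brake_valve_fails", "hydraulic_pressure_lost"),
        ("OR", "alternate_brake_valve_fails", "hydraulic_pressure_lost"))

for cs in minimize(cut_sets(tree)):
    print(sorted(cs))
# ['alternate_brake_valve_fails', 'normal_brake_valve_fails']
# ['hydraulic_pressure_lost']
```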

“Uncertainty is an unavoidable aspect of the human condition.” – opening sentence of “Extensional Versus Intuitive Reasoning” by Tversky and Kahneman, Psychological Review, October 1983.
