Monday, December 21, 2015

Thinking Is Hard



Thinking is hard. One of the books I’ve recently read, Daniel Kahneman’s Thinking, Fast and Slow (2011), has convinced me that it’s harder than I thought. Aristotle and other ancient writers tried to help by making rules for logical argument, but the thinking we do most of the time doesn’t fit into the deductive logic scheme of those books. In 1620 Francis Bacon came up with a helpful list of the ways our thinking goes wrong. His book is called Novum Organum (Aristotle’s book on logic had been called the Organon, the instrument; Bacon thought he’d come up with a new instrument). It details a lot of mental habits that get in the way of clear thinking. We imagine more order in the world than is actually there, for example. We cling to ideas and see only the evidence that confirms them. Our observations tend to be skewed toward what we want to be true. The sorts of problems Bacon warned against are the reason drug testing has to be double-blind; that is, neither patient nor doctor knows whether the patient’s pills are actually the medicine being tested or blanks.
            Kahneman discovered that narratives often override our sense of the way chance and probability work. The way a story is told can trump our knowledge of the numbers. A simple example: we are more likely to go under the knife if the surgeon emphasizes the 90% of his patients who survive rather than the 10% mortality rate. We are also likely to ignore the phenomenon known as regression to the mean in any series of events or numbers. Kahneman quotes an Israeli flight instructor who says “when I praise someone for an exceptionally good performance, he almost always does worse the next time, but when I chew out someone for a bad job, he always does better the next time.” The instructor concludes that it works better to criticize than to reward, but the results in each case are merely regression toward the mean from the student pilots’ extremes of good or bad performance.
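            To see regression to the mean at work without any praise or criticism in the picture, here is a minimal Python sketch of my own (not Kahneman’s): each landing score is modeled as a pilot’s fixed skill plus random luck, and we look at what happens on the attempt that follows an unusually good or unusually bad one.

```python
import random

random.seed(1)

# Each landing score is a pilot's fixed skill plus independent luck.
def landing_score(skill):
    return skill + random.gauss(0, 10)

pilots = [random.gauss(70, 5) for _ in range(1000)]   # fixed skill levels
after_great, after_poor = [], []

for skill in pilots:
    first, second = landing_score(skill), landing_score(skill)
    if first > 85:            # an exceptionally good first landing
        after_great.append(second - first)
    elif first < 55:          # an exceptionally bad first landing
        after_poor.append(second - first)

# With no feedback at all, extreme performances drift back toward
# the average on the next attempt.
print("average change after a great landing: %.1f" % (sum(after_great) / len(after_great)))
print("average change after a poor landing:  %.1f" % (sum(after_poor) / len(after_poor)))
```

            Run it and the change after a great landing comes out negative while the change after a poor one comes out positive, the very pattern the instructor credited to his chewing-out.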
            We tend to classify according to a narrative rather than according to our knowledge of the real probability that object or person A belongs in category B. An outstanding case is the Linda Problem, in which respondents go for plausibility when asked about probability (the tendency to answer an easier question than the one asked is one of the drawbacks of thinking fast). Linda is 31, single, outspoken, and bright. She majored in philosophy and as a student was deeply concerned with issues of discrimination and social justice. Subjects are given this information and told to rank the following scenarios in order of probability:
A  Linda is a teacher in elementary school
B  Linda is a bank teller
C  Linda is an insurance salesman
D  Linda is a bank teller who is active in the feminist movement.
Most respondents rank D as more likely than B, defying logic: every bank teller who is active in the feminist movement is also a bank teller, so a conjunction of two conditions can never be more probable than either condition on its own.
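            If the logic seems slippery, a toy calculation makes it concrete. The numbers below are invented purely for illustration; whatever figures you substitute, the feminist bank tellers remain a subset of the bank tellers, so their probability can never be higher.

```python
# Toy illustration of the conjunction rule (all numbers invented).
population = 100_000
bank_tellers = 500                # suppose 0.5% of people are bank tellers
feminists_among_tellers = 0.20    # suppose 1 in 5 of those tellers is a feminist

feminist_bank_tellers = bank_tellers * feminists_among_tellers

p_b = bank_tellers / population           # P(Linda is a bank teller)
p_d = feminist_bank_tellers / population  # P(bank teller AND feminist)

print(p_b, p_d, p_d <= p_b)   # the conjunction can never beat its parts
```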
            We also misconceive the way chance works in particular ways. Given a six-sided die with 4 green and 2 red faces, rank these sequences in order of how likely they are to come up:
A  RGRRR
B  GRGRRR
C  GRRRRR
As in the Linda Problem, most respondents will say B is more likely than A, even though B is simply A with one extra green throw tacked onto the front; any run of throws that produces B has necessarily produced A as well, so A must be the more probable sequence.
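            The arithmetic is quick to check. Here is a short sketch of my own (not Kahneman’s) that computes the probability of each exact sequence for a die with four green faces and two red, so that green comes up with probability 2/3 and red with probability 1/3.

```python
from functools import reduce

# Die with 4 green faces and 2 red faces.
P = {"G": 4 / 6, "R": 2 / 6}

def prob(sequence):
    """Probability that this exact sequence comes up on consecutive throws."""
    return reduce(lambda acc, face: acc * P[face], sequence, 1.0)

for name, seq in [("A", "RGRRR"), ("B", "GRGRRR"), ("C", "GRRRRR")]:
    print(name, seq, round(prob(seq), 5))

# B is just A preceded by an extra G, so prob(B) = (2/3) * prob(A) < prob(A).
```

            A comes out at about 0.008, B at about 0.005, and C at about 0.003; B only looks likelier because, with its extra green, it better fits our picture of a mostly-green die.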
            Kahneman lists a number of other pitfalls to our thinking. He believes that, armed with this information, we can be wary of quick decision-making, slow down our thinking, and eliminate some of what he calls “cognitive biases.” But after reading through four hundred pages of his examples, I fear that we can’t easily avoid such problems, whether we’re thinking fast or slow.
            A number of Kahneman’s thinking errors come about through our failure to think carefully about the way numbers work. A general ignorance of simple mathematical rules and probabilities is what the mathematician John Allen Paulos has complained about for many years in books such as Innumeracy (1988) and A Mathematician Reads the Newspaper (1995). Paulos believes that such ignorance is dangerous both for the welfare of individuals and for public policymaking. But he wasn’t the first to try to show people how looking more carefully at numbers can protect us against exaggerated or misleading advertising claims, political demagoguery, or overt attempts to defraud. That kind of attention is also the message of Darrell Huff’s little 1954 classic, How to Lie with Statistics. Huff is especially good on the way graphs and pictorial representations of statistics can distort the real meaning of the numbers, but he also gives general advice about how to confront and question a statistic: ask who it’s coming from, how that person or organization knows, what might be missing, whether it makes sense, and so on. Huff’s tone is light and his book is illustrated throughout.
            Some books can help with the rigors of thinking, and I don’t mean yesterday’s throwaway self-help scribbles but books that have stood the test of time and usefulness. One of my favorites is Henning Nelms’s Thinking with a Pencil (1957). Nelms argues that drawing can help one think through a problem as well as communicate information easily to others. We can use a drawing to collect information (as we’re listening to directions, for example), and Nelms shows that even a mathematical formula can be depicted through a simple geometric drawing. Though it is not one of the 700 illustrations in his book, the familiar drawing of the Pythagorean theorem, with squares built on the three sides of a right triangle, comes to mind here. In a chapter called “Visualizing Numerical Data,” Nelms points out that any quantity can be expressed as a number and any number can be graphically represented by a length, an angle, or a set of images. He provides much technical detail about handling graphs, in the process giving us the same warnings about their mishandling as Huff does. Nelms says that even if you don’t think you can draw, you are capable of using a pencil to help you think through many problems with doodles or simple shapes.
            Probably the most thorough book on thinking through a problem is George Pólya’s How to Solve It (1945), which I think has been continuously in print since he wrote it. Pólya’s book presents a heuristic, a method for solving problems that is first laid out in schematic form at the end of his table of contents and then shown in operation with several examples. The method is illustrated primarily though not exclusively with examples from mathematics and geometry; Pólya clearly believes it has a wider application. He invites us to divide the task of solution into understanding the problem, devising a plan, carrying out the plan, and checking the result. Most of the attention goes to the first and second of these steps. In understanding the problem, we need to ask what is the unknown, what are the data, and what is the condition. Can we draw a figure to illustrate the problem? (Nelms would say “yes!”) If the problem is abstract, try looking at a concrete example. In devising a plan, Pólya suggests we ask ourselves whether we have seen this problem or an analogous problem before, or whether we have seen a problem with this unknown before. He suggests we try to restate the problem. Can we perhaps solve a related problem, a more general problem, or part of the problem? Have we used all the data? Can we imagine a solution and work backward? In carrying out the plan, he advises us to check each step and attempt to prove its correctness. In checking the whole solution, he asks whether we could derive the result in a different way, and whether our solution or its method could be used to solve another problem.
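            To make the four steps a little less abstract, here is a small sketch of my own, not a passage from Pólya, walking a toy problem through them: finding the length of the diagonal of a rectangular box from its length, width, and height.

```python
import math

# Toy problem in the spirit of Polya's geometric examples (this framing
# is my own sketch, not a quotation from the book): given a rectangular
# box, find the length of its space diagonal.

def box_diagonal(length, width, height):
    # 1. Understand the problem: the unknown is the diagonal d; the data
    #    are the three edge lengths; the condition is that d joins two
    #    opposite corners of the box.
    # 2. Devise a plan: we have seen a problem with a related unknown,
    #    the hypotenuse of a right triangle, so apply Pythagoras twice:
    #    first across the floor of the box, then up to the far corner.
    floor_diagonal = math.hypot(length, width)
    # 3. Carry out the plan, checking each step: the floor diagonal and
    #    the height meet at a right angle, so Pythagoras applies again.
    return math.hypot(floor_diagonal, height)

# 4. Look back: check the result against cases whose answers we already
#    know. A unit cube's diagonal should be sqrt(3), and a flat "box"
#    should reduce to the ordinary plane case.
assert abs(box_diagonal(1, 1, 1) - math.sqrt(3)) < 1e-12
assert abs(box_diagonal(3, 4, 0) - 5) < 1e-12
print(box_diagonal(3, 4, 12))   # 13.0
```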
            In his first chapter Pólya imagines a teacher prompting a student through these steps in the solution of several problems. Another chapter is arranged as a dictionary explaining and exemplifying key terms of the heuristic such as analogy and condition. This glossary is somewhat idiosyncratic and intended to be read through rather than merely consulted; Pólya introduces into it figures such as Descartes and Leibniz, who were important in the history of heuristics, and some of the entries are phrases or questions, such as “Did you use all the data?” Though some of the examples may be beyond the reach of non-technical readers, Pólya’s style is clear and precise and his book very readable. Where Kahneman’s book may stagger your confidence in human thought processes, Pólya will reassure you that a patient, organized approach will get you to clear thinking.