Thinking, Fast and Slow for Lawyers and Clients
© Neil Hamilton
(Published Feb. 20, 2012 in Minnesota Lawyer)
Can we improve our abilities both to make good judgments for ourselves and to help clients make better judgments? Psychologist Daniel Kahneman, a Nobel Laureate in economics, recently published Thinking, Fast and Slow (2011), a book that synthesizes many decades of empirical work on how the mind works. The book provides a wealth of insight both on how each lawyer can improve his or her own decision-making and on how, in the counseling role, a lawyer can help clients improve theirs.
Kahneman and his late coauthor Amos Tversky spent their careers exploring how the human mind works in predictable ways to make errors of judgment. The good news is that they conclude humans are fundamentally reasonable and rational. “Most of our judgments and actions are appropriate most of the time.” The bad news is that if most of our judgments and actions are appropriate only most of the time, there is still a lot of error left in our judgments. Returning to the good news, that remaining error means abundant opportunities to make better decisions for ourselves and to help clients make better decisions.
A general theme of Kahneman’s book is that we are overconfident in how well we think and make judgments. We have an exaggerated sense of how well we understand the world. Empirical research consistently shows that we make many systematic errors of judgment stemming from cognitive biases, fallacies, and illusions.
So what does Kahneman’s empirical work have to do with our work as lawyers? A good lawyer wants both to minimize his or her errors of personal and professional judgment and, as a counselor, to help clients to minimize their errors of judgment. If a lawyer understands the systematic errors of judgment that the human mind makes, he or she can take steps to minimize them.
System 1 and System 2 Thinking
From his empirical research, Kahneman argues that human reasoning is distorted by systematic biases, and one major source of such errors lies in the distinction between what he calls System 1 and System 2 thinking. System 1 and System 2 are metaphors or contrivances, not anatomical places or pathways, that Kahneman has created to help us understand how the mind works. The intuitive, largely unconscious and automatic System 1 does the fast thinking; the effortful System 2 does the slow evaluative and reasoned thinking, monitors System 1, and exercises control over System 1 as best it can with its limited resources. System 1 develops over time as a product of learned patterns of association and retained memory, enabling a person to create quick drafts of reality and act in real time. It is especially sensitive to threats where immediate action may be necessary, so System 1, for example, can immediately detect fear in others’ eyes and anger in others’ voices. Essentially, System 1 is the ability, developed over a lifetime, to recognize patterns and causal interpretations of events in a fraction of a second so that the person can produce an adequate solution to a challenge in real time.
System 2 thinks slowly; it evaluates and it reasons. It is essential for tasks that require comparisons, ordered reasoning, and choice. Kahneman’s empirical data indicate that while we believe our System 2 is principally in control, making reasoned judgments, in reality System 1 thinking is more common. The problem is that System 2 has limited resources for concentrated cognitive work and self-control, and those resources get depleted. As System 2 gets depleted, its ability to monitor and control thoughts and actions suggested by System 1 weakens. A depleted System 2 regularly provides overly quick rationalizations for System 1 intuitions and biases.
System 1 Errors of Intuitive Judgment
While System 1 as “a machine for jumping to conclusions” gets it right most of the time, its quick and automatic search for causal interpretations of events is sometimes quite wrong. It often creates causal stories out of very dubious raw material.
Kahneman’s empirical research reveals a great number of systematic errors of System 1 judgments from cognitive biases, fallacies and illusions. Examples include anchoring effects, the optimistic bias and the planning fallacy, framing effects, the halo effect, the “Florida” effect, the focusing illusion, outcome bias, and availability bias. I discuss only the first two of these here.
Anchors or Arbitrary Reference Points
One of the most robust and reliable results of experimental psychology is that when people consider a particular value (called an anchor) for an unknown quantity before estimating that quantity, their estimates tend to be close to the anchor they heard before estimating. The experimental data are clear that even when people are aware of an anchor’s effects, the anchor still influences them more than they know or want. In one experiment, real estate agents were asked to assess the value of a house that was coming on the market, but the agents did not know the actual listed price. They visited the house and studied a comprehensive booklet of information that included an asking price. Half of the agents saw an asking price substantially higher than the eventual listed price, and half saw one substantially lower. Each agent was then asked to give an opinion about a reasonable buying price and the lowest price at which the agent would sell the house if he or she owned it. When asked afterward about the factors that affected their judgment, the agents took pride that the asking price was not among them; they said they had ignored it. Yet the anchoring effect index was 41%, only slightly lower than the index for the asking price in the same study of business students with no real estate experience. Moving first to create an anchor in a single-issue negotiation over price thus has a powerful effect. A lawyer needs to know that any initial number the opposing side puts on the table has had a strong System 1 effect on the client, the lawyer, and a decision maker such as a judge. If the stakes are high, the lawyer has to mobilize System 2 to combat the anchoring effect.
The Optimistic Bias, the Planning Fallacy, and Inside and Outside Views
Kahneman finds that “Most of us view the world as more benign than it really is, our own attributes as more favorable than they really are, and the goals we adopt as more achievable than they are likely to be. We also tend to exaggerate our ability to forecast the future, which fosters optimistic overconfidence. In terms of its consequences for decisions, the optimistic bias may be the most significant of the cognitive biases.” The planning fallacy, one manifestation of the optimistic bias, describes the phenomenon that, in making plans and forecasts for future projects, we tend to overestimate benefits and underestimate costs. We tend to make forecasts and plans that are “unrealistically close to best-case scenarios.” For example, a 2002 survey of Americans remodeling their homes reported that, on average, they had expected the job to cost $18,618 and ended up paying an average of $38,769. The solution is again to enlist System 2 to consult the statistics of similar situations and projects, taking the “outside view,” rather than assuming the “inside view” that extrapolates optimistically about the future from meager initial evidence about a project.
A Lawyer’s Role in Thinking, Fast and Slow
Kahneman provides an empirical map and language for understanding distinctive patterns of System 1 errors of judgment for both lawyers and clients. Although humans are not irrational in most cases, “they often need help to make more accurate judgments and better decisions.” Specifically, each lawyer should have a personal board of directors (my December 2011 column) to provide this help: to make an accurate diagnosis of possible System 1 judgment failures and to suggest System 2 interventions that limit the damage of bad judgment. This, of course, is the independent, candid, and honest counsel that lawyers give to clients. Lawyers can give particularly valuable counsel in situations where the client’s System 2 resources are depleted, because a depleted System 2 too quickly endorses the client’s System 1 response without reasoned checking.
Kahneman also suggests that experts like lawyers can improve their System 1 skills and minimize judgment errors through (1) careful attention to minefields such as the anchoring effect, where System 1 judgments are most prone to error, and (2) practicing System 1 decisions and then seeking immediate feedback on each decision to learn from mistakes. For example, a trial requires many System 1 decisions from a litigator, and whenever possible a newer lawyer should ask for immediate feedback on those decisions. A lawyer can also counsel his or her client about both of these strategies.