Litigate or settle? Lessons from Daniel Kahneman for litigation practice
Deciding whether or not to litigate a claim is an investment decision. This is obvious for a litigation funder, but it is equally true for corporations and individuals. If I had to recommend one book on this kind of decision making, it would without any doubt be Daniel Kahneman’s Thinking, Fast and Slow.
By: Rein Philips
Based on decades of research, Kahneman, a psychologist and winner of the Nobel Prize in economics, demonstrates that the way we make decisions is often at odds with the complexity of the problems facing us in the modern world. Deciding whether or not to litigate is clearly a complex problem, and making the wrong decision tends to be costly. Indeed, Thinking, Fast and Slow made me aware of mistakes I had been making as a lawyer for years that may well have cost my clients money.
Talking about this book with lawyers and in-house counsel, I noticed that many have it on their bookshelf at home but are still waiting for the right holiday to read it. Looking at my own bookshelf, I can sympathize, which is why I wrote this article. Of course, it can in no way replace reading the book itself; I still recommend scheduling that holiday soon.
Here follows a non-exhaustive list of examples of common errors dealt with in Kahneman’s book applied to litigation practice:
- Paying too much to settle a weak claim for fear of the very small risk of incurring a much larger loss (loss aversion).
- Settling a strong claim for too little out of a desire for certainty (certainty effect).
- Unwittingly allowing a ridiculously high or low initial offer on the part of the counterparty to determine the basis for settlement negotiations (anchoring).
- Underestimating the time and costs associated with litigation (planning fallacy).
- Rejecting a reasonable settlement because there is a very small possibility of the claim being rejected completely, or, on the part of the claimant, there is a very small possibility of a much higher sum being awarded (possibility effect).
- Overestimating the chances and/or proceeds associated with a case (overconfidence).
- Continuing with litigation because so much has already been invested in it (sunk-cost fallacy).
- Reducing the chance of the court accepting your arguments because the written submissions are poorly presented (cognitive ease).
These and other common pitfalls in decision making discussed in Thinking, Fast and Slow that are particularly relevant to litigation practice are examined below. Avoiding them begins with acknowledging their existence and taking certain simple precautions.
The fast and the lazy
The two key roles in Kahneman’s book are played by two modes of thinking: System 1 and System 2. System 1 represents our automatic, intuitive judgement, which operates effortlessly and largely unconsciously. System 1 sees at a glance whether a person is angry or happy (or pretends to be), automatically produces the answer to 2 + 2 =?, and enables us to drive almost unthinkingly on an empty road. System 1 allows us to respond appropriately to situations without having to consider them consciously. When someone greets us, we reciprocate; when someone throws a snowball in our direction, we avert our face.
System 1 operates automatically. Kahneman calls it the “associative machine” or “a machine for jumping to conclusions”. Working associatively, it draws almost instant conclusions about all kinds of situations and all sorts of individuals and problems by fitting them into the ideas active in the individual at that moment. This is what I will call in this article “feelings”, “intuition”, or “gut feeling”.
System 2 is the type of thinking that takes effort. Deploying System 2 literally costs us energy. For example, solving the multiplication question “23 x 37 = ?” in your head is a job for System 2. Try it. System 2 operates less effectively when your energy levels are low. A disturbing example is an experiment involving Israeli judges ruling on requests for parole. Reviewing a request is a routine job that takes an average of six minutes. The default decision is a rejection; only 35% of requests are approved. It turned out that immediately after lunch, when the judges’ energy reserves had been replenished, the percentage of successful applications rose to 65%; in the hours that followed, until just before the next meal, it fell to almost 0%.
An important task of System 2 is monitoring and controlling thoughts and actions suggested by System 1. Or that is what it should do at least. Often, however, System 2 adopts the conclusions of System 1 uncritically, and incorrectly ascribes an aura of rationality to them. If Kahneman teaches us one thing, it is that it is quite difficult to overestimate the laziness of System 2.
Beware of a good story
Kahneman asks his readers to consider the following question: “Will Mindik be a good leader? She is intelligent and strong.” Intuitively, your first answer is likely to be “yes”. Yet someone intelligent and strong can also be cruel and corrupt. If those two characteristics were added, that would probably alter our opinion. The problem is that System 1 is associative rather than analytical. It does not start by asking what characteristics make a good leader, so that it can then determine whether there is enough information to make a judgement. Instead it fills the void automatically, using associations that seem to fit the idea invoked by the limited information provided. Through this process, characteristics like “intelligent” and “strong” are intuitively associated with other positive characteristics, thereby creating the image of a suitable leader. This happens automatically and unconsciously. Our intuitive conclusion is positive, so if we are to suspend our final judgement – because of the lack of sufficient information – a certain amount of effort is required.
Kahneman calls this phenomenon “what you see is all there is”. It is at the root of so many errors of judgement that Kahneman uses an abbreviation for it in his book: WYSIATI. One expression of WYSIATI is the narrative fallacy, the irrepressible tendency of people to fit random pieces of information into a narrative, or idea that can be understood. The story might not follow logically from the limited information available, but is created, again largely automatically and unconsciously, by System 1. That narrative is based on associations with past experience, on how an individual views the world, and on the emotions associated with those views. If the story is coherent and feels convincing, then we believe it. Kahneman puts it as follows:
“You cannot help dealing with the limited information you have as if it were all there is to know. You build the best possible story from the information available to you, and if it is a good story, you believe it.”
The danger of creating a credible story based on incomplete information is that it makes us resistant to new information that would undermine that story. In a legal dispute, once convinced of the veracity of a particular interpretation of the facts or characters involved, if we are faced with new facts we will be tempted to look for the elements in those new facts that confirm our initial interpretation and ignore elements that contradict it (confirmation bias).
Though lawyers know very well how to use these mechanisms to their advantage in pleadings, they easily fall victim to the same mechanisms in other contexts and their clients even more so. WYSIATI and the confirmation bias are particularly dangerous when assessing the chances of winning a case. That is because they operate unconsciously and ruin your judgment even when you think you are being smart.
The best way to test the robustness of an argument is to identify and actively set out to disprove each and every assumption underlying it. This includes imagining possible scenarios that would undermine those assumptions. The chief advantage of a mock trial is that it ensures this process is taken seriously. However, conducting an appropriate thought experiment, or asking the opinion of a colleague not directly involved in the case, can also be hugely effective and is obviously much cheaper. To ensure that a colleague to whom you submit a case is not influenced by your position, present the case as neutrally as possible, without sharing your own interpretation of the facts or arguments.
A criticism sometimes levelled at lawyers is their tendency to nitpick; but that is also their added value. A good lawyer will warn a client against the dangers of the client’s own narrative fallacy and, consequently, against bad investments. It should be recognized that this may sometimes require a lawyer to act contrary to his own short-term interests because a client’s investment is the lawyer’s revenue. In the long term however this will enable the lawyer to distinguish himself from the competition, and build up a loyal client base. The firm whose goal is to protect clients from investing unwisely will build a stronger track record of successful cases. There is no better form of marketing.
Confusing what is convenient with what is true and the importance of layout
The layout and readability of court documents matter. For instance, it has been shown that people (judges too, we may assume) find a text more credible if it is printed in a clear font. If a text is easy to process, we are more inclined to regard it as true and credible. In this context, Kahneman uses the term truth illusions.
Kahneman’s explanation is that we evaluate situations positively when it takes little effort to deal with them (cognitive ease), a finding supported by many studies. Things that require little effort to process are instinctively and automatically evaluated as “true”, “good”, and “familiar”. That instinctive judgement reassures System 2 that it need not go to any further trouble. Circumstances requiring increased mobilization of System 2 attract a correspondingly negative connotation; they suggest a problem or danger demanding increased vigilance.
Clearly then, a lawyer wanting to increase the likelihood that his arguments and his interpretation of the facts will prove persuasive must ensure that minimal effort is required from the court to allow it to follow the essence of his argument, and the point of it. This implies that clear and simple structure, syntax, and choice of words all make a significant contribution, as does the layout of the document itself.
Big losses and big wins corrupt judgment
Once the facts have been assembled and analysed objectively, the risk of losing should not be exaggerated. Studies show that people facing a small chance of a large loss tend to exaggerate that risk. A lawyer who fears that making a mistake will damage his reputation may be excessively cautious: his aversion to a loss is greater than his desire to win. Kahneman calls this loss aversion. The following examples are paraphrased from Kahneman’s book.
Imagine a claim awaiting the court’s judgment. The result will be either an award of EUR 100,000 or dismissal of the claim. All the signs are that the ruling will be favourable, but because one can never be absolutely sure, the lawyer emphasizes that there is still a 5% chance that the court will make no award. What would you do if the counterparty then offered to settle for EUR 90,000? Would you accept? The answer is probably “yes”, even though the strictly rational settlement value would be EUR 95,000.
Apparently, we attach relatively more weight to the low probability of losing EUR 100,000 (not winning) than to the high probability of winning that amount. This can be explained by loss aversion: an excessive fear of losing. As a result, we are willing to pay a premium to insure against the small probability of a loss.
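The arithmetic behind this settlement value can be made explicit. A minimal sketch, using only the figures from the example above:

```python
# Expected value of the claim from the example above:
# a 95% chance of EUR 100,000 and a 5% chance of nothing.
p_win, award = 0.95, 100_000
expected_value = p_win * award   # EUR 95,000

# Accepting the counterparty's EUR 90,000 offer therefore pays
# an implicit insurance premium against the 5% risk.
offer = 90_000
premium = expected_value - offer  # EUR 5,000

print(f"rational settlement value: EUR {expected_value:,.0f}")
print(f"implicit premium paid:     EUR {premium:,.0f}")
```

The premium is exactly what loss aversion extracts from the claimant: EUR 5,000 paid to make the 5% risk go away.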
The converse is also true. If there were a 5% chance of an award of EUR 100,000 being made against you, you would probably be willing to pay more than EUR 5,000 to buy off that risk. It is this fear of the small risk of losing a lot that means nuisance value actually has a price. It is the premium you are willing to pay to buy off a small risk of losing a great deal. It is the fear that forms the basis for the business model used by insurance companies.
At the same time, research shows that a person with a 5% chance of winning EUR 100,000 will actually take an aggressive approach and not be willing to settle for less than EUR 10,000. People exaggerate not only the low probability of a large loss, but also the low probability of a major win. Which explains the business model behind lotteries.
One final category is that in which a person has only a small chance of avoiding a loss and a high risk of losing a lot. People then tend to cling to the low probability that the loss might be avoided, and consequently underestimate the chance of losing. People are more averse to the certain loss of EUR 90,000 than they are to a 90% risk of losing EUR 100,000.
These are four scenarios in which the weight we attach to an outcome deviates systematically from its actual probability. Kahneman calls this the fourfold pattern, illustrated schematically in Table 1:
Table 1: Kahneman’s fourfold pattern
| | Gains | Losses |
| --- | --- | --- |
| High probability (certainty effect) | 95% chance to win EUR 100,000: fear of disappointment; accept unfavourable settlement | 95% chance to lose EUR 100,000: hope to avoid loss; reject favourable settlement |
| Low probability (possibility effect) | 5% chance to win EUR 100,000: hope of large gain; reject favourable settlement | 5% chance to lose EUR 100,000: fear of large loss; accept unfavourable settlement |
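The pattern in Table 1 can be condensed into a small sketch: a hypothetical helper (the name `fourfold_quadrant` is my own) that maps a probability and a gain/loss frame to the behaviour the pattern predicts.

```python
def fourfold_quadrant(p: float, gain: bool) -> str:
    """Predicted settlement behaviour for a single high-stakes outcome
    with probability p, per Kahneman's fourfold pattern (Table 1)."""
    if gain:
        # High chance of winning: fear of disappointment -> risk-averse.
        # Low chance of winning: hope of a large gain -> risk-seeking.
        return ("accept unfavourable settlement" if p >= 0.5
                else "reject favourable settlement")
    # High chance of losing: hope to avoid the loss -> risk-seeking.
    # Low chance of losing: fear of a large loss -> risk-averse.
    return ("reject favourable settlement" if p >= 0.5
            else "accept unfavourable settlement")

print(fourfold_quadrant(0.95, gain=True))   # accept unfavourable settlement
print(fourfold_quadrant(0.05, gain=False))  # accept unfavourable settlement
```

Note the symmetry: the two risk-averse quadrants sit on opposite corners of the table, which is why both a near-certain winner and a long-shot defendant are willing to settle on poor terms.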
This fourfold pattern helps to explain some of the dynamics of settlement proceedings. Interestingly, although the classic Harvard negotiation model advises parties involved in negotiations to seek a win-win situation, Kahneman’s experiments suggest that in some cases it might be more effective to let the counterparty realize that they have a great deal to lose.
In some cases it may still be rational to settle for EUR 10,000 rather than accept the 5% risk of losing EUR 100,000 – if, for example, it really is a once-only event, or if an order to pay EUR 100,000 would lead to bankruptcy. In most cases, however, this will be an expression of “what you see is all there is”: an exaggeration of the specific case at the expense of long-term interests.
If a company were to pay a premium of EUR 5,000 twenty times in order to obtain certainty in a particular case, that would cost the company EUR 100,000. If, except in those cases where the worst-case scenario would jeopardize your very survival, you can muster the discipline never to pay a premium to buy off a risk, then in the long run you would be better off not paying that premium. To muster such discipline, it would help to consistently regard each single decision to accept a certain risk as part of a series of choices that recur throughout your life or the life of your company.
Sunk-cost fallacy: forget about the past
Someone acting rationally who faces having to take a decision is interested only in the future consequences of that decision. Nevertheless, people allow themselves to be much influenced by earlier decisions they have made in relation to the same subject. We all find it difficult to reverse a decision once taken, especially if money and reputations have already been invested. As a result we tend to continue on our ill-fated way, in the hope that we can both recover our loss and save our ego.
Sometimes facts or circumstances come to light in a defence statement or during witness examination that completely undermine a case, facts or circumstances which, if they had been known at the start, would have prevented proceedings from ever being instituted. Nevertheless, the case will be continued. Significant costs have already been incurred and robust positions adopted, and instead of conducting a new cost-benefit analysis based on the new information that has come to light we tend to focus on the small chance that we might be able to compensate for the mistake already made.
In an interview with the Dutch financial newspaper Het Financieele Dagblad on 17 September 2016, another winner of the Nobel Prize for Economics Joseph Stiglitz gave a textbook example of the sunk-cost fallacy. Asked why, despite his intense scepticism about whether the euro would survive, his recent book offers a blueprint to save the euro, he replied:
“If you see how much has already been invested in the project, it makes sense to complete it.”
Intuitively, that sounds plausible, but when looked at more closely it is nonsense. What has been invested in the past is gone and cannot be recovered. The only relevant question is whether future investments in the project are expected to yield more than the alternatives.
Stiglitz’s remark shows that errors of judgement are not the preserve of those less intellectually able. Kahneman’s experiments demonstrate that time and again. We know, for example, that people tend to ignore relevant statistical data on human behaviour when they are asked to judge an individual. In that case, the individual characteristics of the person concerned are overrated, and in that regard trained statisticians are not much better than “ordinary” people.
Planning fallacy: learn from the past
“Over thirteen years in the making, the North-South Line breaks all construction records for the past century. (…) Setback upon misfortune, and by now both the budget and the construction time are four times their original estimate.”
I remember many embarrassing instances when the time and costs required for a case were grossly underestimated. A takeover for which we had estimated costs at EUR 20,000 eventually far exceeded EUR 100,000. Of course there were all sorts of complicating factors that we could not have anticipated when initially estimating the costs. But, actually, that is a poor excuse. You do not have to know exactly what you can expect to come up against in order to realize that there are always unforeseen circumstances to complicate matters. Failure to include unforeseen circumstances in your planning is another case of “what you see is all there is”.
Serious underestimates in the planning stage are by no means the preserve of the legal profession, with plenty of dramatic examples from business and government. We are inherently too optimistic about how much time projects will take and how much they will cost. This is not necessarily just innocent optimism. Underestimating costs can also be a strategy to eliminate competition during the pitch or tender.
As an attorney, I often refused to give an estimate of the overall costs, for the very reasons just outlined. I would give, at most, an estimate of the costs that could reasonably be predicted up to and including the first hearing. In doing so, I could claim to still meet the needs of the client, who will have wanted at least some idea of the costs involved.
I realize now that this was actually a sophisticated way of luring a client into the sunk-cost trap. Because of this uncertainty about costs, the client will carelessly use as a reference point for his investment decision the only figure he does know: the investment necessary up to the hearing. He does not realize, or is insufficiently aware, that five or even ten times that sum might be necessary to conclude the case, unless his lawyer explicitly points this out to him. But pointing it out would immediately expose the futility of providing a cost estimate covering just the initial stages.
It is even possible that such so-called honesty might lead the client to overestimate the likelihood of a settlement being reached at the hearing, the more so if funding post-hearing costs would be a problem. A party who engages a lawyer to enforce his rights apparently believes that he has a case, and that a lawyer can help to enforce those rights. Confirmation bias then ensures that the party will look to the lawyer for confirmation of this belief. If the lawyer says that the budget is just sufficient to bring the case to a hearing but that there is a small chance the case will be settled out of court, the party will pin his hopes on that. He will tend to ignore the possibility, equally or even more likely, that the first hearing is only the starting point of a protracted and expensive legal battle that could take years to resolve, if it is resolved at all.
There is clearly a better way to estimate the costs and time required, namely by looking at how much time and money was necessary for similar cases in the past, and then examining the extent to which the specific circumstances of the present case differ from the average case.
It is wise to be cautious when – compared with the average – costs and time estimates are revised downwards. When you produce your estimate you do not yet know what unforeseen circumstances will arise during the course of the case. For the same reason, you should always consider, too, the possibility of an above-average outlier. Ask yourself whether the investment is still worthwhile, and financially feasible in the worst-case scenario.
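The reference-class approach described above can be sketched as follows; the past-case figures are purely hypothetical:

```python
import statistics

# Reference-class forecasting: start from what comparable past cases
# actually cost (figures below are purely hypothetical, in EUR), then
# adjust for the specifics of the new case.
past_case_costs = [45_000, 60_000, 80_000, 55_000, 150_000,
                   70_000, 95_000, 65_000, 210_000, 75_000]

# The average of the reference class is the baseline estimate.
baseline = statistics.mean(past_case_costs)

# A high quantile serves as the worst-case check: is the case still
# worth pursuing, and fundable, if costs reach this level?
worst_case = statistics.quantiles(past_case_costs, n=10)[-1]  # ~90th percentile

print(f"baseline estimate: EUR {baseline:,.0f}")
print(f"worst-case check:  EUR {worst_case:,.0f}")
```

The point of the exercise is that the baseline comes from the outside view (the reference class), not from the lawyer's inside view of this particular case, which is exactly where the planning fallacy operates.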
Legal cases often turn on the credibility of certain individuals over others: the parties themselves, the witnesses, and the experts. This is especially so when there is uncertainty about specific facts and objective verification is impossible, at least at that stage. Think of witnesses who give conflicting statements, or a client who swears that X or Y was what the parties intended but can produce no proof of it. Instead of accepting uncertainty about the facts, we tend to base our judgement on the impression made on us by those taking the positions concerned.
Our focus will then, consciously or unconsciously, shift from the question of what actually happened to the question of whether the person telling the story comes across as credible. At that point we should be on guard against our intuitive emotional reactions and preferences, as these can be misleading. For example, research shows that we are more inclined to trust others who resemble us in speech, gestures, or appearance, or who imitate us without our noticing it. Similarly, System 1 automatically ascribes to good-looking people other positive qualities, such as honesty and competence.
The value of experience?
What about the intuition of the experienced attorney? Is the gut feeling of a seasoned and tenacious litigator worth nothing? Kahneman does not refer to lawyers in particular, but he is not optimistic about the predictive abilities of experts in general.
In psychological terms, intuition is not a form of magic but of recognition. An intuitive judgement is the result of System 1, faced with a problem, unconsciously drawing on memories of experiences associated with that problem and applying them to it. Given that this happens largely unconsciously, it is not always clear whether such associations are relevant.
The intuition of an experiential expert can be accurate. A chess master who plays twelve opponents simultaneously is able to make a good move after a mere glance at the board. He need not think consciously about what to do; the next move appears to him automatically, as it were. This is not magic, but the result of playing a lot of chess. A chess master has so much chess experience stored in his memory that System 1 can easily select an appropriate move. Intuition is thus a form of recognition. Kahneman quotes the American psychologist and sociologist Herbert Simon:
“The situation has provided a cue; this cue has given the expert access to information stored in memory, and the information provides the answer. Intuition is nothing more and nothing less than recognition.”
A prerequisite for the reliability of this kind of intuitive judgement is that there are sufficient relevant experiences available in the individual’s memory. This requires not only memories of similar situations in the past; it is also important that the right lessons are drawn from those experiences, which depends on whether, and how, the consequences of past correct or incorrect assessments are linked to the relevant problem. The chance of the right lessons being drawn from experience is enhanced if there is rapid feedback concerning the effects of the action taken. This is the case with chess players: a player who makes a poor move is quickly punished. The same applies to a firefighter who makes a wrong decision in a burning house.
By analogy with the chess player, it seems reasonable to assume that a lawyer can develop similar intuition in negotiating situations. In negotiations, action and reaction follow one another in quick succession. The short feedback loop means the negotiator is immediately confronted with the consequences of his strategy, and so he learns what works and what does not work with different types of people. The analogy with the chess player is not entirely appropriate, however, since in negotiations the feedback is less clear. Unlike in a game of chess, with negotiations it is not always clear in retrospect whether you won or lost – that is, whether you yielded more than was necessary or, by taking an excessively aggressive stance, caused more damage than you intended.
Estimating the probability of winning a legal case is much more difficult. The number of cases litigated by even the most experienced attorney will be fewer than the number of games played by a chess master. The board on which the attorney operates is less well-defined, and, unlike in chess, unforeseen external factors can sometimes play a decisive role.
Moreover, the feedback loop can take years. It is probable that a lawyer will not even notice the difference between his original estimate and the result. Faced with new facts, people tend to adjust their original assessment retroactively. This is what Nassim Taleb calls the narrative fallacy. We are apparently quite capable of convincing ourselves retrospectively that we were correct right from the start: face-saving at the expense of learning. In this respect, we can learn something from the exhaustive way in which air crashes and military operations are investigated and evaluated in order to avoid the same mistakes being made again.
The feedback which a lawyer connects in his memory to a particular situation can be misleading for countless other reasons: when, for example, a case is settled for less than expected but to the satisfaction of the client, or when a satisfactory result is obtained despite flawed legal advice. Finally, an intuitive assessment of a case is coloured by subjective factors, such as the mood one happens to be in at the time. A lawyer who has slept badly and sits at his desk feeling irritable will be more dubious about the chances of a case succeeding than one who has had a good night’s sleep after a sun-filled holiday.
Individuals are generally less rational in their assessments than they think, and so are lawyers. Anyone who, after reading this article, is still not convinced should read Kahneman’s Thinking, Fast and Slow.
His book distils the results of decades of empirical and validated psychological research into the actions and judgements of people in uncertain circumstances. In the present article, I have tried to show the relevance of that research to cases of litigation and I have offered suggestions about how to avoid some common pitfalls. That practical approach has been at the expense of a more comprehensive and complete description of the many persuasive studies that Kahneman describes in his book.
To those who, having read Kahneman, want a more structured method for improving the quality of their judgement, I can highly recommend Winning at Litigation through Decision Analysis by John Celona. Decision analysis is a method for assessing problems more objectively and for avoiding the errors of judgement of which Kahneman warns us. The method is not new; it is a form of decision-tree analysis that multinationals, especially in the oil-and-gas industry, have used for many years to evaluate billion-dollar investments. Celona’s book, however, is the first to apply the complete theory of decision analysis to the practice of law.
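As a flavour of what decision analysis involves (a minimal sketch of my own, not Celona’s method, and all figures are hypothetical), a single settle-versus-litigate node of a decision tree looks like this:

```python
# A minimal decision-tree comparison of "settle now" versus "litigate",
# in the spirit of decision analysis. All figures are hypothetical.
p_win = 0.60          # assessed chance of winning at trial
award = 500_000       # amount awarded if the claim succeeds
legal_costs = 80_000  # further costs of litigating, win or lose
settlement = 220_000  # amount on the table today

# Expected value of each branch of the tree.
ev_litigate = p_win * award - legal_costs  # EUR 220,000 (uncertain)
ev_settle = settlement                     # EUR 220,000 (certain)

print(f"litigate: EUR {ev_litigate:,.0f} (uncertain)")
print(f"settle:   EUR {ev_settle:,.0f} (certain)")
```

In this deliberately balanced example the two branches have equal expected value, so the certain settlement is usually preferable; a real decision-tree analysis would add further chance nodes (appeal, enforcement, counterclaims) and test how sensitive the answer is to the assessed probabilities.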
 Daniel Kahneman, Thinking, Fast and Slow (London: Penguin Books, 2012). It was originally published in the United States in 2011 (New York: Farrar, Straus and Giroux).
 Kahneman (2012), pp. 19-39.
 Ibid., pp. 20-24.
 Ibid., p. 50.
 Ibid., p. 79.
 Ibid., pp. 41-46.
 Ibid., p. 20.
 Ibid., pp. 43-44.
 Ibid., pp. 85-86.
 Ibid. See, too, the halo effect, pp. 82-85.
 What You See Is All There Is. Ibid., pp. 85 ff.
 Ibid., pp. 199-202.
 Ibid., p. 201. See, too, Nassim Nicholas Taleb, The Black Swan: The Impact of the Highly Improbable (London: Penguin Books, 2010, 2nd edition).
 It took decades before the economic community realized that Kahneman’s experiments proved that the assumption underlying most economic theory, namely rational agent theory, was wrong. It is the contribution of Kahneman’s experiments to economic science that eventually led to his being awarded a Nobel Prize.
 Kahneman (2012), pp. 80-81.
 A form of anchoring. Kahneman (2012), pp. 119 ff.
 Kahneman (2012), p. 63.
 Ibid., p. 62.
 Ibid., pp. 59-70.
 Ibid., pp. 59-60.
 Ibid., pp. 300 ff.
 Ibid., pp. 314-316.
 Ibid., pp. 319-321.
 Ibid., pp. 314-319.
 Ibid., pp. 317-319.
 Ibid., p. 317.
 Roger Fisher and William Ury (ed. Bruce Patton), Getting to Yes: Negotiating Agreement Without Giving In (London: Random House Business Books, 2012).
 For a more comprehensive critique of the Harvard negotiation model and practical alternative negotiating strategies, see, inter alia, the book by Chris Voss, an experienced former FBI hostage negotiator, Never Split the Difference: Negotiating As If Your Life Depended On It (New York: Harper Business, 2016).
 More specifically, this is a case of narrow framing; the case is viewed in a narrower context than is rational to do. See Kahneman (2012), pp. 363 ff.
 Good frames. Ibid., pp. 371 ff.
 Ibid., pp. 343-346.
 Joseph Stiglitz, The Euro: How a Common Currency Threatens the Future of Europe (New York: W.W. Norton & Company, 2016).
 Kahneman (2012), pp. 170-174 and 112 ff.
 Owing in part to the bias of confidence over doubt. Kahneman (2012), pp. 113-114.
 Ibid., p. 252.
 Lisa M. DeBruine, “Trustworthy but not Lust-worthy: Context-specific Effects of Facial Resemblance”, Proceedings of the Royal Society B, 272:1566 (2005), pp. 919-922.
 Kahneman (2012), pp. 82-85. This is the halo effect. See, too, the example of Mindik, the leader, in section 2.2.
 Ibid., pp. 234-243.
 Kahneman (2012), p. 237.
 Nassim Nicholas Taleb, Antifragile: Things that Gain from Disorder (London: Penguin Books, 2013), pp. 87-88.
 Ibid., pp. 72-73.
 Apparently, Kahneman’s book is required reading in the economics and psychology faculties of Dutch universities. I believe it should be required reading at Dutch law faculties too.
 John Celona, Winning at Litigation through Decision Analysis (Cham: Springer International Publishing, 2016).