Vol. 56 No. 11

Trial Magazine

Theme Article


How We Make Decisions

Humans are not naturally rational decision-makers, so it’s important to understand what underpins the reasoning and mental shortcuts that jurors—and trial lawyers—engage in to make a case fit their worldview.

Gregory S. Cusimano November 2020

Many of us believe decision-making is a process grounded in the information we have and that this information governs our decision. As trial lawyers, we think that if we give someone accurate and relevant facts, that person will come to the right decision. We generally think that if somebody makes the wrong decision, it’s because they didn’t have enough correct information or they misunderstood the facts. We are surprised and wonder what additional pictures, videos, or evidence would have made a difference.1

But, in fact, none of that may have made a difference because people don’t really make decisions the way we think they do. When we try cases, we work to get everything in the record that supports our position because we’re convinced that if jurors have all the evidence and facts, they will make the correct decision. These efforts often result in information overload, which weakens persuasion.

There’s a good chance that jurors have been exposed to accurate information but, because they hold deep preexisting beliefs, did not change their opinions. Instead of reevaluating their positions, they pay no attention to the new facts or information. Whatever you think, feel, and believe to be true long enough eventually becomes a core belief and your reality—you just “know” it.2 Beliefs are built on a blend of facts, fiction, experiences, and attitudes. People filter reality and reject the truth if it is inconsistent with their core beliefs.

You need to discover the preexisting attitudes and beliefs of potential jurors to present your case as consistently as possible with those attitudes and beliefs. The underlying decision-making processes and mental shortcuts are critical for trial lawyers to understand before stepping into the courtroom or starting jury selection.

What Underlies Our Decisions

It is helpful to understand the pleasure-pain principle and motivated reasoning.3 A number of social scientists believe the pleasure-pain principle developed by Sigmund Freud is the basis and explanation for a significant amount of human behavior.4 Humans are motivated to move away from pain or toward pleasure. We use motivated reasoning5 based on emotions to rationalize and make desired decisions rather than basing decisions on what is most reasonable or logical. Making serious decisions is anxiety-provoking and therefore painful in many ways. We are unconsciously motivated to decide as simply and quickly as possible, which results in reduced stress.

Counterfactual thinking. This is a concept in psychology that involves our tendency to create or imagine alternatives to an event that already occurred.6 By definition, these thoughts are “counter to the facts.” We produce thoughts of “what if” or “if only” to speculate as to how things could or would have turned out differently. Often, jurors think about a tragic event or injury and engage in a thought process going backward in time to change someone’s conduct that would have avoided the harmful consequences. If they change the plaintiff’s conduct, they likely will lay blame there. It is important to provide alternatives that the defendant could have engaged in that would have avoided the tragedy. Keep the focus generally on the defendant’s conduct, not the plaintiff’s.

For example, in a rollover case, focus the attention on the careless decisions the manufacturer made about the unstable design rather than the driver who swerved to miss the dog that ran into the street. You want the jury to see the rollover as a fait accompli, instead of concluding “if only she had not swerved, this wouldn’t have happened.” Frame the case so the jury can instead conclude “if only the manufacturer had heeded the advice of its engineers to widen the wheelbase, this wouldn’t have happened” or “if only the manufacturer had made the change to lower the center of gravity of the vehicle, we wouldn’t be here today.”

Heuristics. These are mental shortcuts that help us make decisions rapidly and solve problems with little mental effort. The study of heuristics was developed by Amos Tversky and Daniel Kahneman in the 1970s.7 We make decisions quickly with as little processing or reasoning as possible—even if we think our decisions are reasoned and rational. Some social scientists call this the “theory of low information rationality”: we use heuristics8 to save energy and time when making decisions.9 A common example with jurors is anchoring. Because of the tendency toward risk aversion,10 jurors “anchor” on a damages number, which can artificially inflate or deflate what they consider a reasonable award.11

Naïve realism. This refers to the belief that we see a fact pattern or the world as it is, and when others fail to see it in a comparable way, they just don’t see it as it really is.12 If people disagree with us and have the same information that we have, we think they are biased, irrational, or misinformed. Naïve realism is “a dangerous but unavoidable conviction about perception and reality”13 and hard to overcome.

This concept is certainly true in our present political climate. We have witnessed the “backfire effect”: when people are shown factual information demonstrating that their beliefs are false, they dig in and hold the false beliefs more strongly than they would have otherwise.14

We will always justify our decisions by what we think is rational or by what we believe are the salient facts. Often our beliefs arise from unconscious processes or past experience, and strongly held beliefs are frequently rooted in values or social norms.15 Unfortunately, we hold our misperceptions (which are not mere lack of awareness) and beliefs with a high degree of certainty. We manage to believe we are well informed even if we aren’t and even if we’ve spent little time processing the information we have.

What This Means for Cases

There’s no question that social science, heuristics, and biases operate in the courtroom just as they do everywhere else. Many believe that jurors make their decisions early in the case and tend not to change them. Yet when asked, jurors will rationalize and explain their decisions as if they had used a reasoned process. The truth is that we’re all subject to these human tendencies, not just jurors. Lawyers, judges, adjusters, mediators—we all fall victim to these unconscious processes. That doesn’t mean we can’t become savvier at understanding heuristics and biases by learning to recognize them. Here are a few concepts to keep in mind.

Confirmation bias. Jurors tend to look for and remember information supporting what they already believe. If the facts do not fit a belief, the belief stays, and the facts bounce off. If the facts do fit the belief, they are absorbed like a sponge. We shield ourselves from facts that don’t fit our beliefs and embrace those that do. We use emotion-motivated, value-based reasoning and come up with arguments for a position consistent with our beliefs. We decide, reason, conclude, and ultimately rationalize our decisions. It isn’t a matter of just providing facts, but of fitting the facts to beliefs and values. And we “change the channel” so we don’t have to hear information that is inconsistent with or that undermines our beliefs.

Active forgetting. Also known as “motivated forgetting,” this is not just what we normally think of as forgetting—it is unconscious or possibly conscious, purposeful deleting of information that doesn’t fit with our view of the world.16 Potential jurors tend to undermine, undervalue, and disbelieve evidence that is not consistent with their beliefs. We must discover what the potential juror believes so we can present our case as consistently with those beliefs as possible.

Even as trial lawyers, when we test our theories or ideas, we seek out information consistent with our beliefs. We look for evidence that proves our theories and supports our beliefs, which increases our confidence that we’re right. If weak or confusing evidence supports our beliefs, we view those results as more positive than they really are.

The backfire effect. Once a decision is made, it is very difficult to shake the decision in the short term through persuasion. There is evidence from “computerized scans of the brain done while people are processing information counter to their beliefs [that] show that parts of the brain responsible for defense against physical attacks light up.”17

Essentially, contradictory information or verbal attacks against a prior belief are processed in the brain the same way as if we are defending ourselves against someone physically attacking us. The degree to which the defense mechanism engages might depend on several things, such as the personal significance or the strength of a core belief and whether the belief relates to one’s social identity.18 Social scientists say that under those circumstances, the beliefs are more difficult to change.19

There is no question that heuristics and biases help predict jurors’ conduct, behavior, and judgment. We often view our job as persuading or convincing jurors of the correctness and fairness of our client’s position. Instead, we should try to understand their beliefs, attitudes, and views to determine the likelihood of them finding for the plaintiff. We should remember that biases aren’t necessarily bad for our clients—they can be good also. Learning about biases and heuristics and viewing them as tools, not rules,20 can help us try our cases better.


Trying Cases in the Era of COVID-19

  • AAJ’s Presidential Task Force on Civil Jury Trials in the Era of COVID-19, led by Past President Ken Suggs, was formed to provide guidance to AAJ members and state and federal courts on best practices for trying cases and holding in-person jury trials during the pandemic. Visit www.justice.org/pandemicproject for AAJ Education webinars based on the task force findings.
  • AAJ’s COVID-19 Jury Research Project, funded by the Robert L. Habush Endowment and led by AAJ members Gregory Cusimano and David Wenner, will assess the pandemic’s effect on juror beliefs, perceptions, and decision-making through nationwide focus groups and surveys. Visit www.justice.org/pandemicproject for AAJ Education webinars based on the research project results.
  • The Pound Civil Justice Institute’s 2021 Forum for State Appellate Court Judges, held in July 2021 in Chicago, will focus on fundamental juror issues including juror selection, civil Batson challenges, rulemaking, and how the pandemic has affected the right to a jury trial (www.poundinstitute.org).

Gregory S. Cusimano is a founder of Cusimano, Roberts, Mills and Knowlton LLC in Gadsden, Ala., a principal in the national consulting firm Winning Works, and can be reached at greg@alalawyers.net. Copyright © 2020 Gregory S. Cusimano.


Notes

  1. Some social scientists call this the “information deficit model,” which credits public doubt about the truth of science and technology to a lack of understanding, resulting from a lack of information. It presumes that if accurate information is supplied, opinions will change. The Case for a ‘Deficit Model’ of Science Communication, Sci Dev Net, June 27, 2005, https://tinyurl.com/yy963x8d.
  2. Daniel Kahneman, Thinking, Fast and Slow (2011). Kahneman explains that for many of our beliefs, we have no factual evidence to provide support other than people we care about share the beliefs. Ralph Lewis M.D., What Actually Is a Belief? And Why Is It So Hard to Change?, Psychol. Today, Oct. 7, 2018, https://tinyurl.com/y6lhq9l5; Raymond S. Nickerson, Confirmation Bias: A Ubiquitous Phenomenon in Many Guises, 2 Rev. Gen. Psychol. 175 (1998).
  3. Ziva Kunda, The Case for Motivated Reasoning, 108 Psychol. Bulletin 480 (1990).
  4. See generally C.R. Snyder & Shane J. Lopez, Positive Psychology: The Scientific and Practical Explorations of Human Strength (2014); see also Tara Carr, Motivating by the Pain Pleasure Principle, Small Bus. Dev. Ctr. Univ. of Wisc., Green Bay, Nov. 2016, https://tinyurl.com/y4owk62g.
  5. Dan M. Kahan, Ideology, Motivated Reasoning, and Cognitive Reflection: An Experimental Study, 8 Judgment & Decision Making 407 (2013).
  6. Peter Menzies & Helen Beebee, Counterfactual Theories of Causation, The Stanford Encyclopedia of Philosophy, Oct. 29, 2019, https://plato.stanford.edu/archives/win2019/entries/causation-counterfactual/.
  7. Amos Tversky & Daniel Kahneman, Judgment Under Uncertainty: Heuristics and Biases, 185 Sci. 1124 (1974). Kahneman discusses “System 1” in his book Thinking, Fast and Slow (supra note 2); System 1 is the quick, instinctive mode of decision-making.
  8. Id.
  9. Dietram A. Scheufele, Messages and Heuristics: How Audiences Form Attitudes About Emerging Technologies, Consortium for Sci. Pol’y & Outcomes, Jan. 2006, https://cspo.org/legacy/library/090423F3NZ_lib_ScheufeleDA2006M.pdf.
  10. Risk aversion is how we behave when we are uncertain about something. We attempt to reduce uncertainty. We are more motivated to avoid a risk than to make a gain. See Risk Aversion, APA Dictionary of Psychol., https://dictionary.apa.org/risk-aversion. Another common bias is the status quo bias, the tendency of people to leave things as they are. It takes more motivation to change. Therefore, in a trial, the jury will have a tendency to leave the parties where they found them (a defense verdict). See Status Quo Bias, behavioraleconomics.com, https://tinyurl.com/y3m5tzbg. Another is defensive attribution, when a person (juror) unconsciously attributes the cause of a tragedy to the victim/plaintiff to reduce anxiety/fear that the same bad fate could happen to them or their family. See Defensive Attribution, APA Dictionary of Psychol., https://dictionary.apa.org/defensive-attribution.
  11. Kelly Canavan, Thinking About Thinking: Implicit Bias and Litigation, The Bencher, Mar./Apr. 2019, https://tinyurl.com/yyxyeyhe.
  12. Lee Ross & Andrew Ward, Naive Realism: Implications for Social Conflict and Misunderstanding, in Values and Knowledge 103–35 (Edward S. Reed, Elliot Turiel, Terrance Brown eds., 1996). 
  13. The Situationist Staff, Lee Ross on Naive Realism and Conflict Resolution, Apr. 14, 2008, https://tinyurl.com/y43j5rb3.
  14. Brendan Nyhan & Jason Reifler, When Corrections Fail: The Persistence of Political Misperceptions, 32 Pol. Behav. 303–30 (2010). Some studies indicate this may be rare and tend to minimize the theory. Thomas Wood & Ethan Porter, The Elusive Backfire Effect: Mass Attitudes’ Steadfast Factual Adherence, 41 Pol. Behav. 135–63 (2019).
  15. Cristina Bicchieri, Ryan Muldoon, & Alessandro Sontuoso, Social Norms, The Stanford Encyclopedia of Philosophy, 2018, https://plato.stanford.edu/archives/win2018/entries/social-norms/.
  16. Canavan, supra note 11.
  17. Id.
  18. Elizabeth Svoboda, Why Is It So Hard to Change People’s Minds?, Greater Good Magazine, June 27, 2017, https://tinyurl.com/yyogj2b7.
  19. Id.
  20. This is a term often used by trial consultant Rodney Jew.