HOW CONFIRMATORY BIAS TAINTS THE LAW

By: Dr. Linda Shelton, Chicago, Illinois, Pro Se Chicago


There is a concept in psychology and psychiatry termed confirmatory bias, which often adversely influences the judgment of officers, attorneys, judges, and juries, resulting in biased, unfair, or unlawful arrests, decisions, and convictions. It is the unfortunate human tendency to commit to a position, to hold that position regardless of new evidence, and to hear only that which supports it. Dr. Richard Rappaport, a nationally renowned member of the American Academy of Psychiatry and the Law, concluded in a 2006 editorial in the Journal of the American Academy of Psychiatry and the Law that AZ, a civil rights activist, had been abused by the courts and police because of this principle. In the 2013 murder trial of David Camm, the defense argued that Camm was charged with the murders of his wife and two children solely because of confirmation bias within the investigation. Confirmatory bias is pervasive in the law, and the field is ripe for increased efforts to recognize it, as well as for legislation and rules that incorporate methods to reduce it.
The following has been adapted from the Wikipedia article on “Confirmation Bias”.
Confirmation bias, also called myside bias, is the tendency to search for, interpret, or prioritize information in a way that confirms one’s beliefs or hypotheses. It is a type of cognitive bias and a systematic error of inductive reasoning. People display this bias when they gather or remember information selectively, or when they interpret it in a biased way. The effect is stronger for emotionally charged issues and for deeply entrenched beliefs. People also tend to interpret ambiguous evidence as supporting their existing position. Biased search, interpretation and memory have been invoked to explain attitude polarization (when a disagreement becomes more extreme even though the different parties are exposed to the same evidence), belief perseverance (when beliefs persist after the evidence for them is shown to be false), the irrational primacy effect (a greater reliance on information encountered early in a series) and illusory correlation (when people falsely perceive an association between two events or situations).
BIASED INTERPRETATION
“Smart people believe weird things because they are skilled at defending beliefs they arrived at for non-smart reasons.”
—Michael Shermer
Confirmation biases are not limited to the collection of evidence. Even if two individuals have the same information, the way they interpret it can be biased.
A team at Stanford University conducted an experiment involving participants who felt strongly about capital punishment, with half in favor and half against it. Each participant read descriptions of two studies: a comparison of U.S. states with and without the death penalty, and a comparison of murder rates in a state before and after the introduction of the death penalty. After reading a quick description of each study, the participants were asked whether their opinions had changed. Then, they read a more detailed account of each study’s procedure and had to rate whether the research was well-conducted and convincing.[FN5] In fact, the studies were fictional. Half the participants were told that one kind of study supported the deterrent effect and the other undermined it, while for other participants the conclusions were swapped.[FN5][FN6]
The participants, whether supporters or opponents, reported shifting their attitudes slightly in the direction of the first study they read. Once they read the more detailed descriptions of the two studies, they almost all returned to their original belief regardless of the evidence provided, pointing to details that supported their viewpoint and disregarding anything contrary. Participants described studies supporting their pre-existing view as superior to those that contradicted it, in detailed and specific ways.[FN5] Writing about a study that seemed to undermine the deterrence effect, a death penalty proponent wrote, “The research didn’t cover a long enough period of time”, while an opponent’s comment on the same study said, “No strong evidence to contradict the researchers has been presented”.[FN5] The results illustrated that people set higher standards of evidence for hypotheses that go against their current expectations. This effect, known as “disconfirmation bias”, has been supported by other experiments.
Biased interpretation is not restricted to emotionally significant topics. In another experiment, participants were told a story about a theft. They had to rate the evidential importance of statements arguing either for or against a particular character being responsible. When they hypothesized that character’s guilt, they rated statements supporting that hypothesis as more important than conflicting statements.
BIASED MEMORY
Even if people gather and interpret evidence in a neutral manner, they may still remember it selectively to reinforce their expectations. This effect is called “selective recall”, “confirmatory memory” or “access-biased memory”. Psychological theories differ in their predictions about selective recall. Schema theory predicts that information matching prior expectations will be more easily stored and recalled than information that does not match. Some alternative approaches say that surprising information stands out and so is memorable.[FN12] Predictions from both these theories have been confirmed in different experimental contexts, with no theory winning outright.
PERSISTENCE OF DISCREDITED BELIEFS
“[B]eliefs can survive potent logical or empirical challenges. They can survive and even be bolstered by evidence that most uncommitted observers would agree logically demands some weakening of such beliefs. They can even survive the total destruction of their original evidential bases.”
—Lee Ross and Craig Anderson
PREFERENCE FOR EARLY INFORMATION
Experiments have shown that information is weighted more strongly when it appears early in a series, even when the order is unimportant. For example, people form a more positive impression of someone described as “intelligent, industrious, impulsive, critical, stubborn, envious” than when they are given the same words in reverse order. This irrational primacy effect is independent of the primacy effect in memory in which the earlier items in a series leave a stronger memory trace.[FN15] Biased interpretation offers an explanation for this effect: seeing the initial evidence, people form a working hypothesis that affects how they interpret the rest of the information.
INFORMAL OBSERVATION
Before psychological research on confirmation bias, the phenomenon had been observed anecdotally by writers, including the Greek historian Thucydides (c. 460 BC – c. 395 BC), Italian poet Dante Alighieri (1265–1321), English philosopher and scientist Francis Bacon (1561–1626), and Russian author Leo Tolstoy (1828–1910). Thucydides, in The Peloponnesian War, wrote: “… for it is a habit of mankind to entrust to careless hope what they long for, and to use sovereign reason to thrust aside what they do not fancy.” In the Divine Comedy, St. Thomas Aquinas cautions Dante when they meet in Paradise, “opinion—hasty—often can incline to the wrong side, and then affection for one’s own opinion binds, confines the mind.” Bacon, in the Novum Organum, wrote,
The human understanding when it has once adopted an opinion … draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects or despises, or else by some distinction sets aside or rejects[.]
Bacon said that biased assessment of evidence drove “all superstitions, whether in astrology, dreams, omens, divine judgments or the like”.[FN20] In his essay “What Is Art?”, Tolstoy wrote,
I know that most men—not only those considered clever, but even those who are very clever, and capable of understanding most difficult scientific, mathematical, or philosophic problems—can very seldom discern even the simplest and most obvious truth if it be such as to oblige them to admit the falsity of conclusions they have formed, perhaps with much difficulty—conclusions of which they are proud, which they have taught to others, and on which they have built their lives.
WASON’S RESEARCH ON HYPOTHESIS-TESTING
The term “confirmation bias” was coined by English psychologist Peter Wason. For an experiment published in 1960, he challenged participants to identify a rule applying to triples of numbers. At the outset, they were told that (2,4,6) fits the rule. Participants could generate their own triples and the experimenter told them whether or not each triple conformed to the rule.
While the actual rule was simply “any ascending sequence”, the participants had a great deal of difficulty in finding it, often announcing rules that were far more specific, such as “the middle number is the average of the first and last”. The participants seemed to test only positive examples—triples that obeyed their hypothesized rule. For example, if they thought the rule was, “Each number is two greater than its predecessor”, they would offer a triple that fit this rule, such as (11,13,15) rather than a triple that violates it, such as (11,12,19).
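As an aside, the logic of the positive test strategy can be made concrete with a short simulation. The sketch below (in Python, purely illustrative; the function names and the triples beyond (2,4,6) are our own, not Wason’s materials) shows why testing only confirming triples never exposes the error: every triple generated from the narrow hypothesis “each number is two greater than its predecessor” also satisfies the true rule “any ascending sequence,” so the experimenter’s feedback always appears to confirm the guess, while a rule-violating triple such as (11,12,19) would reveal the mismatch immediately.

# Illustrative sketch of Wason's 2-4-6 task (hypothetical code, not from the original study).

def true_rule(triple):
    # The hidden rule the experimenter actually uses: any ascending sequence.
    a, b, c = triple
    return a < b < c

def hypothesis(triple):
    # The participant's overly specific guess: each number is two greater than its predecessor.
    a, b, c = triple
    return b == a + 2 and c == b + 2

# Positive test strategy: propose only triples the hypothesis predicts will fit.
for triple in [(2, 4, 6), (11, 13, 15), (100, 102, 104)]:
    # Feedback is always "fits the rule", so the narrow guess is never challenged.
    print(triple, "guess says:", hypothesis(triple), "experimenter says:", true_rule(triple))

# A falsifying test: triples the hypothesis predicts will NOT fit.
for triple in [(11, 12, 19), (1, 2, 3)]:
    # The experimenter still says "fits the rule", disconfirming the narrow guess --
    # exactly the evidence a pure positive test strategy never uncovers.
    print(triple, "guess says:", hypothesis(triple), "experimenter says:", true_rule(triple))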
Wason accepted falsificationism, according to which a scientific test of a hypothesis is a serious attempt to falsify it. He interpreted his results as showing a preference for confirmation over falsification, hence the term “confirmation bias”. Wason also used confirmation bias to explain the results of his selection task experiment. In this task, participants are given partial information about a set of objects, and have to specify what further information they would need to tell whether or not a conditional rule (“If A, then B”) applies. It has been found repeatedly that people perform badly on various forms of this test, in most cases ignoring information that could potentially refute the rule.
EXPLANATIONS
Confirmation bias is often described as a result of automatic, unintentional strategies rather than deliberate deception. According to Robert MacCoun, most biased evidence processing occurs through a combination of both “cold” (cognitive) and “hot” (motivated) mechanisms.
Cognitive explanations for confirmation bias are based on limitations in people’s ability to handle complex tasks, and the shortcuts, called heuristics, that they use. For example, people may judge the reliability of evidence by using the availability heuristic—i.e., how readily a particular idea comes to mind. It is also possible that people can only focus on one thought at a time, so they find it difficult to test alternative hypotheses in parallel. Another heuristic is the positive test strategy identified by Klayman and Ha, in which people test a hypothesis by examining cases where they expect a property or event to occur. This heuristic avoids the difficult or impossible task of working out how diagnostic each possible question will be. However, it is not universally reliable, so people can overlook challenges to their existing beliefs.
Motivational explanations involve an effect of desire on belief, sometimes called “wishful thinking”. It is known that people prefer pleasant thoughts over unpleasant ones in a number of ways: this is called the “Pollyanna principle”. Applied to arguments or sources of evidence, this could explain why desired conclusions are more likely to be believed true.[FN39] According to experiments that manipulate the desirability of the conclusion, people demand a high standard of evidence for unpalatable ideas and a low standard for preferred ideas. In other words, they ask, “Can I believe this?” for some suggestions and, “Must I believe this?” for others. Although consistency is a desirable feature of attitudes, an excessive drive for consistency is another potential source of bias because it may prevent people from neutrally evaluating new, surprising information.[FN39] Social psychologist Ziva Kunda combines the cognitive and motivational theories, arguing that motivation creates the bias, but cognitive factors determine the size of the effect.
Explanations in terms of cost-benefit analysis assume that people do not just test hypotheses in a disinterested way, but assess the costs of different errors. Using ideas from evolutionary psychology, James Friedrich suggests that people do not primarily aim at truth in testing hypotheses, but try to avoid the most costly errors. For example, employers might ask one-sided questions in job interviews because they are focused on weeding out unsuitable candidates. Yaacov Trope and Akiva Liberman’s refinement of this theory assumes that people compare the two different kinds of error: accepting a false hypothesis or rejecting a true hypothesis. For instance, someone who underestimates a friend’s honesty might treat him or her suspiciously and so undermine the friendship. Overestimating the friend’s honesty may also be costly, but less so. In this case, it would be rational to seek, evaluate or remember evidence of their honesty in a biased way. When someone gives an initial impression of being introverted or extroverted, questions that match that impression come across as more empathic. This suggests that when talking to someone who seems to be an introvert, it is a sign of better social skills to ask, “Do you feel awkward in social situations?” rather than, “Do you like noisy parties?” The connection between confirmation bias and social skills was corroborated by a study of how college students get to know other people. Highly self-monitoring students, who are more sensitive to their environment and to social norms, asked more matching questions when interviewing a high-status staff member than when getting to know fellow students.
Psychologists Jennifer Lerner and Philip Tetlock distinguish two different kinds of thinking process. Exploratory thought neutrally considers multiple points of view and tries to anticipate all possible objections to a particular position, while confirmatory thought seeks to justify a specific point of view. Lerner and Tetlock say that when people expect to justify their position to others whose views they already know, they will tend to adopt a similar position to those people, and then use confirmatory thought to bolster their own credibility. However, if the external parties are overly aggressive or critical, people will disengage from thought altogether, and simply assert their personal opinions without justification. Lerner and Tetlock say that people only push themselves to think critically and logically when they know in advance they will need to explain themselves to others who are well-informed, genuinely interested in the truth, and whose views they don’t already know. Because those conditions rarely exist, they argue, most people are using confirmatory thought most of the time.
CONSEQUENCES
IN POLITICS AND LAW
Mock trials allow researchers to examine confirmation biases in a realistic setting.
Nickerson argues that reasoning in judicial and political contexts is sometimes subconsciously biased, favoring conclusions that judges, juries or governments have already committed to. Since the evidence in a jury trial can be complex, and jurors often reach decisions about the verdict early on, it is reasonable to expect an attitude polarization effect. The prediction that jurors will become more extreme in their views as they see more evidence has been borne out in experiments with mock trials. Both inquisitorial and adversarial criminal justice systems are affected by confirmation bias.
Confirmation bias can be a factor in creating or extending conflicts, from emotionally charged debates to wars: by interpreting the evidence in their favor, each opposing party can become overconfident that it is in the stronger position. On the other hand, confirmation bias can result in people ignoring or misinterpreting the signs of an imminent or incipient conflict. For example, psychologists Stuart Sutherland and Thomas Kida have each argued that US Admiral Husband E. Kimmel showed confirmation bias when playing down the first signs of the Japanese attack on Pearl Harbor.
A two-decade study of political pundits by Philip E. Tetlock found that, on the whole, their predictions were not much better than chance. Tetlock divided experts into “foxes” who maintained multiple hypotheses, and “hedgehogs” who were more dogmatic. In general, the hedgehogs were much less accurate. Tetlock blamed their failure on confirmation bias—specifically, their inability to make use of new information that contradicted their existing theories.
In the 2013 murder trial of David Camm, the defense argued that Camm was charged with the murders of his wife and two children solely because of confirmation bias within the investigation. Camm was arrested three days after the murders on the basis of faulty evidence. Despite the discovery that almost every piece of evidence on the probable cause affidavit was inaccurate or unreliable, the charges against him were not dropped. A sweatshirt found at the crime scene was subsequently discovered to contain the DNA of a convicted felon, his prison nickname, and his department of corrections number. Investigators looked for Camm’s DNA on the sweatshirt but failed to investigate any other evidence found on it, and the foreign DNA was not run through CODIS until five years after the crime. When the second suspect was discovered, prosecutors charged the two men as co-conspirators despite finding no evidence linking them. Camm was acquitted of the murders.
CONCLUSION
Confirmatory bias is an insidious bias that permeates every activity in life. It is particularly dangerous in the law because it can taint the opinions and efforts of police, attorneys, judges, and jurors. Unless all parties make great efforts to avoid this form of bias, by never closing off alternative theories about behavior or evidence, the policing and judicial processes are likely to become unfair and biased.
