

Barrett Says Judges Can Avoid Beliefs Coloring Judicial Rulings; Social Scientists Say It’s Challenging


People on the extremes of either the right wing or left wing of the political spectrum have trouble evaluating evidence with an open mind.

“Orthodox” isn’t an actual stratum of Catholicism like it is of Judaism. Still, “Orthodox Catholic” is one way that United States Supreme Court nominee Amy Coney Barrett is being described. This is because, in 1998, she and a fellow Notre Dame Law School professor used that term in a paper addressing how religious orthodoxy can, but should not, affect a judge’s execution of his or her duties. Barrett and her co-author advised that judges who find their religious beliefs in conflict with the law must let the law take precedence. When they cannot, they must recuse themselves from a case.

That position may be at least a little reassuring to voters concerned about protecting women’s reproductive freedoms and LGBTQ rights. However, implied in that twenty-two-year-old paper is the assumption that careful judges can avoid tainting their rulings with prejudice. This is probably not true. In the late 1960s, social scientists began compiling evidence showing that people are often unaware when deep-seated biases influence their actions and opinions. The question Senators should probably ask Barrett is not whether her religious orthodoxy will get in the way of her judicial performance. It is whether she has any orthodoxy at all. A wealth of studies show that certainty itself is a problem — and that’s true for people on both ends of the political spectrum.

It’s the Confirmation Bias, Stupid

Social scientists call the tendency to seek out, place faith in, and sometimes even embellish ideas that support one's views — and to selectively ignore or discredit those that don’t — “confirmatory” or “confirmation” bias. A wealth of studies have documented this tendency. However, many were small and university-based, and the participants were college students not particularly invested in the experiment’s tasks.

In 1977, the journal Cognitive Therapy and Research published the results of a study by prolific research psychologist Michael J. Mahoney of Pennsylvania State University. It showed that professionals who bring strong opinions to a task that they value also assess new material with bias.

Sixty-seven psychologists participated in the study. Mahoney asked them to use their professional discernment to referee a scholarly article reporting the results of a test of the validity of the precepts of behavior modification. The participants believed that the article had been submitted to a compendium volume called Current Issues in Behavior Modification. Mahoney assumed that his study’s participants would have a pro-behavior modification bias because he had recruited all of them from a list of people who had been guest reviewers of an eminent, pro-behavior modification journal.

Conceivably, the participants brought their sincerity and professional skills to the job at hand.

Mahoney gave all participants copies of the article manuscript. The stated hypothesis of the faux article was that a standard behavioral modification technique (applying external stimuli to elicit a change in internal motivation) would prove tried-and-true in a simple test. Each copy of the manuscript had identical Introduction, Methods, and Bibliography sections. The remaining sections varied in one of five ways.

Some copies reported results substantiating the hypothesis. Some reported results that negated the hypothesis. Some reported mixed results, but included a discussion section that gave those results a positive (pro-hypothesis and pro-behavior modification) spin. Some reported mixed results, and included a discussion section that gave those results a negative spin. Some reported no results.

If the study had shown that behavior modification could not hold up under the experimental conditions, that would have constituted a devastating blow to the very field of behavioral psychology.

The participants’ professional judgments about the faux article proved to have been significantly infected by pro-behavior modification prejudice. For example, even though the Methods sections in all five versions of the article were identical, that section got higher ratings from study participants when the Results section showed data supportive of behavior modification. Ditto for the “Data Presentation” portion of the Results section. While referee ratings for “Topic Relevance” were not significantly linked to article version, for all other measures, referees gave positive ratings when the tenets of behavioral modification seemed to have been supported and negative ratings when the tenets seemed to have been disproved.

The Backfire Effect

Mahoney’s study showed that well-intentioned professionals who bring strong opinions to evaluating fresh material are often unreliable judges of that material. With “orthodox” Barrett nominated to the Supreme Court, one good question to ponder is this: Would rational arguments like those typically heard in court be able to defeat any bias she might have?

As social science marched on from 1977, the research about bias began to suggest an answer to that question: Probably not, at least when the topic under scrutiny is emotionally complex.

The issue of capital punishment, for example, is emotionally both compelling and thorny. A seminal study conducted at Stanford University and published in 1979 in The Journal of Personality and Social Psychology revealed that biases about capital punishment are so strong that rational argument can be a waste of time in dispelling preconceptions about it. Working with a pool of forty-eight college student participants, the researchers exposed the students who supported the practice of capital punishment to rational arguments against it, and they exposed the students who were against the practice of capital punishment to arguments in favor of it. By and large, as students received information that contradicted their original attitudes, they became more rather than less entrenched in their biases. Since 1979 that refusal to budge has been demonstrated in so many studies with so many kinds of participants that researchers call it the “backfire effect.”

Of course, it’s probably not impossible for people with significant confirmation biases to conscientiously re-examine and then dispel their prejudices before making judgments about evidence at hand. Indeed, Barrett has probably had practice doing just that. A legitimate question, however, is whether that is more difficult for people (and judges) who are ideologues — or, as Barrett has described herself, “orthodox” anything.

A dismaying body of research suggests that it is.

Confirmation Bias and Right-Wing Ideologues

In July of 2020 the journal Psychonomic Bulletin & Review published the results of a study of closed-minded cognition among right-wing authoritarian ideologues that was conducted by a team of psychologists and neuroscientists at Duke University.

The study’s hypothesis was that people who scored high on a scale assessing right-wing authoritarianism (“RWA”) would prove motivated to preserve their own deeply-rooted ideas at the expense of learning.

The researchers created a list of one hundred twenty widely-believed statements, eighty of which were untrue. Examples of the untrue statements included:

·     Eating before swimming increases the risk of cramps.

·     Adding salt to a pot of water makes it boil faster.

·     Deoxygenated blood in your veins is blue.

The researchers presented all one hundred twenty statements to nearly four hundred participants whom they had recruited from Amazon’s Mechanical Turk (a crowdsourcing marketplace that has been shown to be more diverse in terms of gender, economics, race, employment, education, and age than the typical group of college students). The researchers presented their statements as a computer trivia quiz. When, while playing the game, participants indicated that they believed a particular statement, they also had to specify on a scale of zero to one hundred how strongly they believed it.

Each time anyone made a mistake, the computer gave feedback.

All participants were also rated on a scale of RWA, which the researchers defined as “characterized by a desire for order, structure, and preservation of social norms.” The RWA scale used in the study was created by other researchers in 2005. Participants were also rated on a scale of Actively Open-Minded Thinking (“AOT”; that scale was created by other researchers in 2013).

Examples of those scales’ questions, with which the participants had to agree or disagree, included:

·     From the RWA scale: “Our country needs a powerful leader, in order to destroy the radical and immoral currents prevailing in society today” and “An ideal society requires some groups to be on the top and others to be on the bottom.”

·     From the AOT scale: “People should take into consideration evidence that goes against conclusions they favor” and “When faced with a puzzling question, we should try to consider more than one possible answer before reaching a conclusion.”

The researchers noted that people who scored high in RWA also typically scored high in confidence, even when the answers they gave were wrong.

All participants took the trivia test a second time. As a rule, those who had scored high in RWA demonstrated less learning. They got many questions wrong both times, and that was especially true for questions on which their confidence scores had been moderate to high.

RWA participants typically scored lower than average on measures of AOT.

Right-Wing or Left-Wing, Cognitive Bias Increases with the Intensity of Political Beliefs

Two years prior to the right-wing authoritarian study, the journal Social Psychological and Personality Science published a paper called “Science Denial Across the Political Divide.” In it, research psychologists from the University of Illinois at Chicago reported on a test of whether both political conservatives and liberals are likely to deny scientific claims that conflict with their preferred conclusions.

They were.

The researchers had recruited almost thirteen hundred participants via their college email system and nearly eight hundred from Amazon’s Mechanical Turk. They asked everyone to read scientific studies with results that were either consistent or inconsistent with their own expressed attitude about one of several issues. After being told what the conclusions were of the study they’d read, participants had to rate how much they agreed with the study’s conclusions.

Both liberals and conservatives tended to judge the studies in ways that could have been predicted by their political beliefs. Both were willing to disregard facts when they conflicted with ideology.

The problem of cognitive bias only gets worse with the intensity of people’s political beliefs.

Then, in 2018 in the journal Current Biology, three cognitive neuroscientists from University College London reported on a study in which they gave participants a computer-based perceptual discrimination test to see whether radical political beliefs correlated with undeserved self-confidence.

The experiment had two phases and three hundred eighty-one general population participants.

In Phase I in repeated trials, participants looking at black squares on the left and right of their computer screens had to judge which square contained more flickering dots. They also had to rate the confidence of their choices.

Phase II also consisted of repeated trials; this time, however, after they’d made a choice participants were given “bonus information” with which they could change their choice. Then they were asked to rate their confidence in their initial decision.

The researchers also gave everyone questionnaires about their political attitudes. Those forms gathered data about political orientation (including voting behavior and party affiliation), dogmatism (including intolerance and belief in personal superiority), and authoritarianism (both left-wing and right-wing).

The team found that people with either right-wing or left-wing radical beliefs had trouble incorporating the “bonus information” — or, in other words, changing their minds when presented with new evidence. Radicalism of any stripe seems to be a cognitive style that makes ideologues unusually certain that their decisions are correct whether or not the certainty is deserved.

Amy Coney Barrett’s Radicalism

When Barrett explained her twenty-two-year-old reference to “orthodox” Catholicism, she elaborated that her religious beliefs merely adhere to those of traditional Church teachings.

“If you’re asking whether I take my faith seriously and I’m a faithful Catholic, I am,” she said to Senator Dick Durbin (D-Ill.).

Barrett’s answer to Durbin bathed her religious orientation in reassuring tones. Even so, her membership in People of Praise, a charismatic Christian group that the Wall Street Journal has likened to Pentecostalism, might reasonably raise concern — if only as an indicator of radicalism. According to the Journal, “Members embrace the highly physical forms of worship that distinguish the charismatic movement, including speaking in tongues and practicing faith healing, and eventually pledge a covenant to live as part of a community and ‘obey the direction of the Holy Spirit.’”

Her Catholicism is not of the garden variety sort.

Neither is her political conservatism anywhere near the middle of the road. In her opening statement in the Supreme Court hearings she proclaimed her admiration for and indebtedness to Supreme Court justice Antonin Scalia, for whom she once clerked. This early in her confirmation process, her positions can be judged partly by his consistently authoritarian, dogmatic, and far-right stances.

“His judicial philosophy is mine,” Barrett said after President Trump announced her nomination during the October 3 Rose Garden ceremony. If seated on the Supreme Court she will need to weigh in without bias on many emotionally and morally complex cases of enormous import. The fact that the ceremony announcing her nomination turned out to be a COVID-19 superspreader event — and that, at it, she refused to incorporate current knowledge about the virus and to socially distance or wear a mask — seems a bit concerning.
