
The closest thing to a consensus judgment in contemporary American public life is probably the belief that we are split up into tribes of the like-minded, sealed off in our respective echo chambers except when sending missiles of toxic verbiage at one another, which goes on pretty much constantly. And that is on a good day. At other times, weaponry of a more literal sort is involved.

If we agree on nothing else -- and from the looks of it, we don't -- this much seems clear to everyone: people now live in bubbles of misinformation and confirmation bias, or at least the people we disagree with do.

Now along comes Chris Bail's Breaking the Social Media Prism: How to Make Our Platforms Less Polarizing (Princeton University Press), which takes aim at this fragile consensus by insisting that things are not nearly as fractured as they usually seem. The echo-chamber effect (created by the confluence of mass media and social media, insofar as that distinction still means very much) is not entirely illusory. But Bail counsels a degree of skepticism, suggesting that the notion of a society of echo chambers concedes far too much to Silicon Valley's sense of its own role as driver of change. It also lends itself to resignation: the self-pulverizing culture becomes a self-fulfilling prophecy.

A professor of sociology and public policy at Duke University, the author also directs the university's Polarization Lab and characterizes the work that he and his colleagues do there as computational social science. They study online political behavior in its relationship to off-line experience. The connection is not altogether straightforward.

That the American population as a whole is not overwhelmingly preoccupied with politics is widely recognized -- and quickly forgotten once the preoccupied begin talking among themselves. Bail cites research on how people "redirect conversations away from politics or avoid discussing current events" altogether. That is not to say they are apolitical, any more than people who avoid talking about religion are necessarily irreligious. They just prefer to keep the yelling to a minimum.

And the firm opinions people do hold are more often a patchwork quilt than a full suit of armor. There's no reason why holding a given position on immigration policy would compel people to agree (or even to care very much, one way or the other) about same-sex marriage rights or how the Second Amendment should be interpreted. The impulse to affix red and blue labels to the respective sides of every disputed issue in public life does not come naturally. Nor does consistency, for that matter, which may account for the finding Bail cites from one study: “only 3 percent of Americans identified themselves as either ‘extremely liberal’ or ‘extremely conservative.’”

Where things become much more fraught is with the phenomenon of perceived polarization -- or, as the author prefers, false polarization. This refers to "the tendency for people to overestimate the amount of ideological difference between themselves and people from other political parties." A 2018 national survey by the Pew Research Center found that 55 percent of Republicans thought of the Democratic Party as "extremely liberal," while a little over a third of Democrats described the GOP as "extremely conservative."

Bail examined the Pew study data and determined "the extent to which people exaggerated the ideological extremism of people from the other party" to be "significantly larger among those who used social media to get their news." Other researchers note that social media users "frequently perceive more political disagreement in their daily lives than those who do not," which will come as no surprise to anyone. Of greater interest is the finding that "the amount of perceived polarization grows as the social distance between people grows."

In other words, a given statement on some hot-button topic is more likely to be perceived as confrontational if the person making it is only the friend of a friend than if it comes from someone more familiar. Lots of people use social media just to keep up with their extended families, or to share their shaky cellphone videos of Kanye West in concert, or what have you. They feel no need for a digital soapbox. But they sense a horde of extremists just a few clicks away, ready to pounce should they venture a political remark.

The worry is not altogether unfounded: Bail acknowledges a mode of online political advocacy -- "leaderless demagoguery," he calls it -- that awards status and recognition to those capable of entertaining displays of hostility. More than anything else, it is the troll armies who make the notion of a political echo-chamber effect seem plausible. But in Bail's analysis, amplifying political conflict is only part of social media's role in public life now. Vast multitudes of politically engaged but ambivalent or conflict-averse citizens withdraw from the fray or keep a low profile. The possibility of public deliberation shrinks; anything less than an ardently held opinion has little chance of being heard.

One oft-mooted suggestion is that people might develop more nuanced or balanced opinions if they were exposed to the other side's perspectives. The author describes such a moderating impact as "highly intuitive" and recounts experiments he and his colleagues have conducted with an expectation of confirming it. The social media feeds of people identifying as supporters of one political party were tweaked so that news and commentary posted by supporters of the other party would appear occasionally. The researchers would conduct interviews before and after the period of admixture, to determine what impact it had, if any.

Exposure to more diverse perspectives did not have the expected, "highly intuitive" moderating effect after all. Rather, things went markedly in the other direction. Someone characterized as an "unenthusiastic" supporter of a party felt no pull to credit the other side's talking points but instead began to identify more strongly with the party and to take partisan positions on issues they hadn't much cared about before. Those with ardent partisan identities before the experiment came out the other side feeling more combative. “In fact,” Bail writes, “most people who followed our bots did not discuss the new ideas they encountered about social policies at all … Stepping outside their echo chambers seemed to sharpen the contrast between ‘us’ and ‘them.’”

While discounting the echo-chamber effect at some points, the book confirms it at others -- and it seems that the echoes also go through a prism, somehow. This is a case in which mixed metaphors are practically obligatory, and it wouldn't hurt to add the image of social media as a carnival (where ordinary behavioral norms are suspended) full of fun-house mirrors (in which little actually is as it appears). The problem is that all of this is now standard equipment with most social media platforms; proposals to change any of it will be evaluated against the viability of an existing business model.

Apart from encouraging people to develop more critical literacy about the sources they rely on -- and to think before they post, maybe tone it down a little and not take the troll bait -- what can be done? Bail describes his research group's creation of DiscussIt, a social media platform advertised as "a place where people anonymously discuss various topics," in which users were assigned androgynous pseudonyms, paired up with a chat partner of the opposite political persuasion, and given a hot topic to discuss. A test run with 1,200 subjects showed "significantly lower levels of polarization after using it for just a short time," with users expressing "fewer negative attitudes toward the other party or subscrib[ing] less strongly to stereotypes about them." Most participants reported liking the app, and some wanted to sign up for it if it became commercially available.

The salient features here were the removal of identifying information about the participants and their "disembedding" from familiar social networks or political milieus in the course of one-to-one dialogues. Bail extrapolates the potential:

“Imagine a platform that gave people status not for clever takedowns of political opponents but for producing content with bipartisan appeal … Instead of boosting content that is controversial or divisive, such a platform could improve the rank of messages that resonate with different audiences simultaneously … ‘Like’ counters could be replaced by meters that show how people from across the ideological spectrum respond to people’s posts in blue, red, and purple. Artificial intelligence could be used to ask people who are about to post uncivil or ad hominem content to reflect upon their goals or to help rephrase their messages using values that appeal to the other side.”

Such a platform would offer little to "the trolls and extremists who gain notoriety on other platforms" and might foster better online behavioral norms, over all. (That's possible, though one suspects it would just breed pests of a more nimble sort.) By its closing pages, Breaking the Social Media Prism shifts from criticizing Silicon Valley's hubris to appealing to its long-term interests:

"If governments don't step in" to establish depolarizing apps, "there is money to be made by entrepreneurs willing to bet that a place where people can build reputations as effective bipartisan communicators will be more attractive to businesses, governments and nonprofit organizations alike."

The curative power of bipartisanship is one of the hardier bromides of American politics and a topic for another day. What makes this version of it distinctive is the idea that it can be jump-started via artificial intelligence. In the 1920s, John Dewey suggested that the challenges of democracy could best be met with more democracy. The 21st-century variant is that the problems created by technology can only be fixed with new technology. Maybe we can dig our way out of this hole and into utopia?
