Speaking in an interview with Tyler Cowen, Neal Stephenson offers an excellent analysis of how social media has hurt civic discourse:

COWEN: You saw some of the downsides of social media earlier than most people did in Seveneves. It’s also in your new book, Fall. What’s the worst-case scenario for how social media evolved? And what’s the institutional failure? Why do many people think they’re screwing things up?

STEPHENSON: I think we’re actually living through the worst-case scenario right now, so look about you, and that’s what we’ve got. Our civil institutions were founded upon an assumption that people would be able to agree on what reality is, agree on facts, and that they would then make rational, good-faith decisions based on that. They might disagree as to how to interpret those facts or what their political philosophy was, but it was all founded on a shared understanding of reality.

And that’s now been dissolved out from under us, and we don’t have a mechanism to address that problem.

Mr. Stephenson’s observation corresponds to my experience of social media (especially Twitter): It’s not that folks are talking past each other, it’s that they’re not even interacting with people who don’t share their mental models. The mere hint of an alternate take can lead to ostracism — or worse. Amplified through continuous validation and a complete lack of pushback, opinions replace facts as the basis for worldviews. To talk of filter bubbles is misleading: these aren’t tenuous membranes; they’re thick, hardened shells.

The interview continues:

COWEN: But what’s the fundamental problem there? Is it that decentralized communications media intrinsically fail because there are too many voices? Is there something about the particular structure of social media now?

STEPHENSON: The problem seems to be the fact that it’s algorithmically driven, and that there are not humans in the loop making decisions, making editorial, sort of curatorial decisions about what is going to be disseminated on those networks.

As such, it’s very easy for people who are acting in bad faith to game that system and produce whatever kind of depiction of reality best suits them. Sometimes that may be something that drives people in a particular direction politically, but there’s also just a completely nihilistic, let-it-all-burn kind of approach that some of these actors are taking, which is just to destroy people’s faith in any kind of information and create a kind of gridlock in which nobody can agree on anything.

In other words, it’s a structural problem. As such, it’s also systemic. Unmentioned in the interview is the driving force behind these algorithmic constructs: business models based on monetizing users’ attention. Incentivizing engagement leads to systems that produce fragmentation and conflict.

Neal Stephenson on Depictions of Reality (Ep. 71)