An eye-opening story in Bloomberg offers a glimpse into the workings of YouTube and how its business model incentivizes the spread of misinformation:

The conundrum isn’t just that videos questioning the moon landing or the efficacy of vaccines are on YouTube. The massive “library,” generated by users with little editorial oversight, is bound to have untrue nonsense. Instead, YouTube’s problem is that it allows the nonsense to flourish. And, in some cases, through its powerful artificial intelligence system, it even provides the fuel that lets it spread.

Why does this happen?

The company spent years chasing one business goal above others: “Engagement,” a measure of the views, time spent and interactions with online videos. Conversations with over twenty people who work at, or recently left, YouTube reveal a corporate leadership unable or unwilling to act on these internal alarms for fear of throttling engagement.

In 2012, YouTube set a new objective: one billion hours of viewing per day. This led to a new recommendation algorithm designed to maximize engagement. The company reached its billion-hours-per-day goal in 2016, not coincidentally the year it became clear just how much such engagement-driven systems were influencing politics (and society as a whole).
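To make the incentive concrete, here is a minimal sketch, in Python, of what a purely engagement-driven ranking amounts to. Everything in it is hypothetical: the Video fields, the weights, and the numbers are invented for illustration, and this is not YouTube’s actual system. Only the shape of the objective — folding views, time spent, and interactions into a single score and ranking by it — reflects the goal described above.

```python
# Hypothetical, deliberately simplified sketch of an engagement-first
# recommender. Names, weights, and numbers are invented; this is not
# YouTube's system, just the incentive structure described above.
from dataclasses import dataclass


@dataclass
class Video:
    title: str
    views: int
    watch_minutes: float  # total time spent watching
    interactions: int     # likes, comments, shares
    accuracy: float       # 0.0 (nonsense) to 1.0 (factual); never used below


def engagement(v: Video) -> float:
    """Engagement as Bloomberg describes it: views, time spent, and
    interactions, here folded into one weighted score (weights invented)."""
    return 0.2 * v.views + 1.0 * v.watch_minutes + 5.0 * v.interactions


def recommend(candidates: list[Video]) -> list[Video]:
    """Rank purely by engagement. Note what is absent: accuracy never
    enters the ranking, so attention-grabbing nonsense outranks sober,
    factual material."""
    return sorted(candidates, key=engagement, reverse=True)


candidates = [
    Video("Calm, factual explainer", views=900,
          watch_minutes=2_700.0, interactions=40, accuracy=0.95),
    Video("Outrage-bait conspiracy", views=1_200,
          watch_minutes=14_000.0, interactions=600, accuracy=0.05),
]

for v in recommend(candidates):
    print(f"{engagement(v):>10.1f}  {v.title}")
# The conspiracy video ranks first: the objective rewards attention,
# not truth.
```

The point is the absence: nothing in the objective penalizes inaccuracy, so any harm the top-ranked video causes sits entirely outside the metric.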

Yesterday I was teaching my students about Donella Meadows’s fantastic essay, Places to Intervene in a System. In it, Ms. Meadows offers a hierarchy of “leverage points”: things you can tweak to make a system work differently. They are, from least to most impactful:

  1. Numbers (e.g., taxes, standards)

  2. Material stocks and flows

  3. Regulating negative feedback loops

  4. Driving positive feedback loops

  5. Information flows

  6. The rules of the system (e.g., incentives, punishment, constraints)

  7. The power of self-organization

  8. The goals of the system

  9. The mindset or paradigm out of which the system arises

Note the prominent position of goals on this list. Few things shape a system as powerfully as setting clear goals and incentivizing people to reach them. By setting engagement as its goal, YouTube’s leadership created the conditions for misinformation to flourish in their system. We’re all paying the price for the damage caused by outrage-mongers on systems like YouTube: the erosion of our ability to hold civic discourse, political polarization, the spread of maladaptive memes, and so on are externalities unaccounted for in these companies’ bottom lines.

As Ms. Meadows points out, the only thing more powerful than goals is the paradigm out of which the goals emerge. YouTube emerged from a worldview that predates the internet, big data, and “smart” algorithms. Together, these add up not to a bigger, faster version of earlier communication systems but to a paradigm shift. We’re undergoing a transformation at least as significant as the one wrought by the movable type printing press, which precipitated profound social and economic changes (especially in the West, but ultimately around the world).

We’re still too close to the beginning of our current transformation to know what socioeconomic structures will ultimately emerge. But one thing is certain: a shift in mindset is required. The scale of the potential social impact of systems like YouTube calls for revisiting things we’ve long taken for granted, such as the role of for-profit companies in society and the meaning of key concepts like freedom of speech and the rights of the individual. The question isn’t whether we’ll have to change our mindset; it’s how much turbulence and suffering we’ll experience along the way. We should do all we can to minimize both.