Excerpt from Chapter 5 of Building Back Truth in an Age of Misinformation – Leslie Stebbins

In 2013, Tristan Harris was working as a product manager at Google when he spoke at a staff meeting about the company’s moral responsibility to its users. He told colleagues that Google was making decisions about what content people consumed, and that those decisions would have a huge impact on culture, relationships, and politics. They were shaping the attention of billions of people, and it was important that they get this right. Harris saw it as an opportunity, but Google had other priorities, and in those early days the company likely underestimated the scale of the impact it was starting to have.[i]

In 2018, Guillaume Chaslot, a Google engineer who had worked with a team on YouTube’s recommender engine, said he had realized that what they were designing was not being optimized for balance, truth, or useful content. Chaslot reported talking to friends at Google, but the general feeling was that if people didn’t complain, they must be happy. He was later fired from the company. Chaslot then designed a program to better understand how YouTube pushed content to users. His research found that engagement was prioritized above all else. In one case he investigated, videos by Alex Jones, the conspiracy theorist who spread the misinformation that the Sandy Hook killings had been staged by “crisis actors,” had been recommended on YouTube fifteen billion times.[ii]
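What “optimized for engagement” rather than for truth means in practice can be sketched in a few lines of code. The Python below is purely illustrative: the video records, the watch-time numbers, and the engagement_score function are invented for this example and do not come from YouTube’s system or from Chaslot’s program. It shows only the structural point that when a ranker scores on predicted watch time alone, accuracy never enters the calculation.

    # Toy illustration of engagement-first ranking.
    # All data and scores here are hypothetical, not YouTube's algorithm.
    videos = [
        {"title": "Measured policy explainer", "watch_minutes": 4.2, "accurate": True},
        {"title": "Shocking conspiracy claim", "watch_minutes": 11.8, "accurate": False},
        {"title": "Calm news summary", "watch_minutes": 3.1, "accurate": True},
    ]

    def engagement_score(video):
        # The objective is predicted watch time alone; accuracy is never consulted.
        return video["watch_minutes"]

    # Rank purely by engagement: the misleading video that holds attention
    # longest comes out on top.
    for v in sorted(videos, key=engagement_score, reverse=True):
        print(f"{v['title']}: {engagement_score(v):.1f} min (accurate={v['accurate']})")

Because accuracy never appears in the objective, the sensational but false item reliably outranks the sober ones; that is the structural defect Chaslot’s research pointed to.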

Newton Minow has had what he calls “a front-row seat” to the communication revolution that began when he was a teenager during World War II and continued through more than seventy years of work on communication policy in Republican and Democratic administrations, including serving as chair of the Federal Communications Commission (FCC). Taking the long view, Minow notes that the rapid, unfettered development of social media platforms has run parallel to a disappearance of public protections. As technological innovations have barreled ahead, advances in policy have lagged far behind: “the basic concept that our communications systems are to serve the public–not private–interest is now missing in action,” he writes. No “scientist, philosopher, or engineer has figured out how to program AI to serve the public interest.”[iii]

This lack of regulation has resulted in an information disorder so severe that it threatens our democratic system of government. Our old communications policies centered on viewing speech as scarce and audiences as plentiful. Today we have the reverse: in an attention economy, a wealth of information creates a poverty of attention. There is so much content online that no single voice can be heard. When platforms incentivize and promote misinformation, healthy democratic discussion cannot take place.

Harris, Chaslot, and Minow have helped us connect the dots: our current information crisis can be tied directly to a handful of technology companies that dominate our communication channels and use a business model that fuels misinformation and creates information disorder. While this was not the companies’ original intention, and they report that they are working to fix the problem through privacy features and labeling, they have little motivation to change a model that provides them with billions in revenue.

Understanding how these platforms are built, and that their core infrastructure is defective, can help us focus on attacking the root causes of misinformation rather than just picking off harmful content from the top down. Fixing the misinformation problem is not an issue of free speech, though free speech is often raised as a red herring. By focusing on renovating the core structure of these platforms, we can make design choices that minimize misinformation without having to censor speech.

[i] Eric Johnson, “Tristan Harris Says Tech Is ‘Downgrading’ Humanity — But We Can Fix It,” Vox, May 6, 2019.

[ii] Maelle Gavet, Trampled by Unicorns: Big Tech’s Empathy Problem and How to Fix It (Hoboken, NJ: Wiley, 2020).

[iii] Newton Minow, “Preface,” in Saving the News, by Martha Minow (New York: Oxford University Press, 2021).