The Story Behind the Book: How to Describe an Elephant…. – Leslie Stebbins

Possibly the biggest lie being told about misinformation is that the misinformation crisis is uncontainable. It is not.

My entire professional life has focused on connecting people to reliable information. Over the last thirty years I have been stunned to watch the rise of digital information, which initially held so much promise of providing people with abundant, diverse, and trustworthy content, instead spiral into vast wastelands of clickbait, scams, advertising, misinformation, and toxic content, while reliable information is often buried or locked behind a paywall.

What the heck happened?

Four years ago, I started looking into the problem of online misinformation. I concentrated on research and policy documents that focused on solutions. A good research librarian is like a dog with a bone: stubborn, tenacious, and relentless. I have to admit I became more than a little obsessed as my family slid meals under my study door.

When I started on my travels through the land of misinformation research, I came across Claire Wardle’s work. Wardle, co-founder and co-director of the Information Futures Lab, developed a useful metaphor for how misinformation was being studied and understood:

“It’s as though we are all home for the holidays and someone pulls out a 50,000 piece jigsaw puzzle. Everyone has gathered round, but our crazy uncle has lost the box cover. So far, we’ve identified a few of the most striking pieces, but we still haven’t even found the four corner pieces, let alone the edges. No individual, no one conference, no one platform, no one research center, think tank, or non-profit, no one government or company can be responsible for ‘solving’ this complex and very wicked problem.”

I was intrigued. Wicked problem is a term that researchers use for hard-to-solve problems of indeterminate scope and scale. Wicked problems can’t always be solved, but actions can be taken to mitigate their harmful effects. With generous support from the Alfred P. Sloan Foundation I was able to spend several years wading into the different streams of research to see if I could help piece them together – to find the corner pieces and connect some edge pieces.

Wardle was right: there are so many studies and reports out there from so many academic fields that one of the biggest challenges is to wrap our brains around all of it and figure out what is important. As I read, I focused on proposed strategies and solutions: Where were people seeing the most hope? Which solutions were showing the strongest evidence of potential impact? Was misinformation a problem of technology that had technological solutions? Was it a matter of education – that we just needed to learn more skills to avoid falling for misinformation? Or perhaps it was a legal problem: Could regulations fix it? Or were there deeper, less obvious solutions we needed to pursue?

The oft-cited parable of the blind men and the elephant illustrates the diversity of approaches among the many researchers working on this wicked problem. Like the blind men, each has a different view of what constitutes an “elephant” because each is touching a different part of it – the “trunk,” the “tail,” and the “foot.” Computer scientists are looking at technical solutions, psychologists and behavioral economists are investigating people’s vulnerabilities, educators are moving beyond skills-based media literacy programs to address student agency, and ethicists and legal scholars are looking into questions of morality and regulatory solutions. The list goes on.

As I read across the disciplines, I also dove into the work of a rising group of activists – often disillusioned former Google, Facebook, and more recently Twitter workers – who are now working on new approaches to technology development more aligned with values-driven goals. In-house programs, such as the Civic Integrity program at Facebook, paid lip service to curbing harmful misinformation and preventing election interference, but when push came to shove, revenue was prioritized over ethical action. Former Facebook engineer turned whistleblower Frances Haugen testified to Congress about a meeting where European political leaders told Facebook staff: “We’ve noticed now with the change in your algorithm … that if we don’t have incendiary content, it gets no distribution. So you’re forcing us to be more incendiary, to prey on people’s fears and anger.”*

The Hewlett Foundation sponsored a study that concluded that multidisciplinarity enriches our understanding of the challenges of misinformation, but that more work needs to be done to overcome siloed approaches and forge stronger connections between them.

Most importantly, we need to translate research findings into practical recommendations for use by policy makers, tech executives, activists, and citizens. An Aspen Institute report stated that possibly the biggest lie being told about misinformation is that the misinformation crisis is uncontainable. It is not. Hundreds of books, articles, reports, and interviews later, the result is my new book, Building Back Truth in an Age of Misinformation.

The focus of the book is on solutions.

Comedian Bo Burnham has said:

“If you want to say a swear on television, you have to go in front of Congress, but if you want to change the neurochemistry of an entire generation, it can be, you know, nine people in Silicon Valley.”

We have left the design of our online worlds in the hands of technical experts. The internet has provided us with many benefits that were unthinkable a few decades ago, but platform companies — Facebook, TikTok, Insta, YouTube, and Muskville, errr… I mean Twitter — have focused too much on growth at all costs. A handful of people in Silicon Valley should not be left in charge of designing how we live our increasingly digital lives.

This book is about making strong links between the academic disciplines working on the problem of misinformation, translating their potential solutions into a practical and realistic plan of attack, and understanding the mechanisms that create and incentivize misinformation so that we can contain and address them.

This book is about hope.

A modified version of the introductory chapter is available on Medium: