Rampant online misinformation looms large in the public imagination: it is destroying democracy, don’t ya know. Maybe.
Research from Ceren Budak and collaborators, including computational social scientist supremo Duncan Watts, published in Nature this week reveals a different reality. It’s an important paper because it unpicks the moral panic around misinformation, allowing us to address it more with our heads than with our guts.
Key findings:
Exposure to false and inflammatory content is remarkably low and heavily concentrated among a small fringe of users who actively seek it out: just 1% of Twitter users accounted for 80% of exposure to dubious websites during the 2016 U.S. election. Similarly, 6.3% of YouTube users were responsible for 79.8% of exposure to extremist channels from July to December 2020, and fewer than 1% of US citizens consumed 85% of vaccine-sceptical content between 2016 and 2019.
Conventional wisdom blames platform algorithms for spreading misinformation. The evidence, however, suggests user preferences play the larger role: a mere 0.04% of YouTube's algorithmic recommendations directed users to extremist content.
It's tempting to draw a straight line between social media usage and societal ills. But studies rigorously designed to untangle cause and effect often come up short.
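The concentration pattern in the first finding — a sliver of users accounting for most exposure — is what a heavy-tailed distribution produces. A minimal sketch with simulated, illustrative data (a Pareto draw with an assumed shape parameter, not the paper's dataset):

```python
import random

random.seed(42)

# Simulate per-user exposure counts from a heavy-tailed (Pareto) distribution.
# The shape parameter 1.1 is an illustrative assumption, not an empirical fit.
users = 100_000
exposures = sorted((random.paretovariate(1.1) for _ in range(users)), reverse=True)

# Share of all exposure attributable to the top 1% of users.
top_share = sum(exposures[: users // 100]) / sum(exposures)
print(f"Top 1% of users account for {top_share:.0%} of total exposure")
```

With a tail this heavy, the top 1% can capture well over half of total exposure — which is why headline averages say little about who actually sees this content.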
The paper argues these misunderstandings arise from a tendency to cite impressive-sounding statistics without appropriate context, to focus on engaging but unrepresentative content, and to confuse correlation and causation.
As a result, public discourse tends to overstate the scale and impact of misinformation on social media relative to the empirical evidence. Facebook's own research into Russian troll farms suggested 126 million Americans were exposed to the material, yet that content represented just 0.004% of what Americans saw in the Facebook newsfeed.
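The Facebook figure shows how reach and share can diverge. A back-of-the-envelope inversion of the two numbers in the text (the one-item-per-person floor is an illustrative assumption):

```python
# Reach vs. share: 126 million people exposed can still be a rounding
# error relative to everything those people saw.
exposed_users = 126_000_000   # Americans exposed (Facebook's estimate)
feed_share = 0.004 / 100      # troll content as a fraction of newsfeed items

# Lower bound: assume each exposed person saw only one troll-farm item.
troll_impressions = exposed_users * 1
implied_total_items = troll_impressions / feed_share

print(f"Implied newsfeed items overall: at least {implied_total_items:,.0f}")
```

That works out to roughly 3.15 trillion newsfeed items — a campaign can touch 126 million people while amounting to about four items in every 100,000 seen.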
How do we move forward?
Shine a light on the fringe: Zoom in on the small minority driving the bulk of misinformation exposure. Map their networks, decode their motivations, and use precision-guided strategies to stem the tide of false content.
Pursue causality with rigour: Ditch the correlations and dig deeper for causal gold. Randomized trials, natural experiments, and longitudinal studies can untangle the web of variables. Partnering with academia matters.
Think globally: Data coverage and research are deeper in the US and Europe. More investment is needed elsewhere to understand country-to-country differences.
Iterate, test, adapt: With data about causality, we should turn to evidence-based interventions. These could be active, like counter-messaging; regulatory, like mandating algorithmic adjustments or investments in safety teams; or policy-based, like media literacy programmes. Because the landscape is shifting, like other institutional adaptations of the exponential age, these need to be dynamic and iterative.
We must adopt a more nuanced, systems-level perspective rather than getting caught up in moral panics or one-size-fits-all solutions. This is how we build our epistemic integrity.
Just as the exponential technologies that underpin our world continue to evolve at breakneck speed, so too must our responses to the challenges they present. A culture of experimentation, learning, and collaboration delivers this agility.
The prize is a healthier, more resilient information ecosystem supporting informed citizenship, democratic deliberation, and social cohesion in the face of that accelerating change.
Cheers,
A
Great piece. Will read Nature article soon. The methodology outlined has many applications.
These findings are shallow and naive; 2016 is ages ago in internet time. The transformer innovation that has accelerated generative AI hadn't even been published yet (2017). Why do you think the US Senate is moving to ban TikTok?
If you like systems thinking, add in the actual numbers on COVID and climate-change denial. Plus the critical reasoning skills, sustained attention and reading comprehension of today's youth.
Zoom out to see more clearly.
Here's a nice analysis, though it won't subdue (grounded) concern:
https://open.substack.com/pub/noahpinion/p/how-liberal-democracy-might-lose?utm_source=share&utm_medium=android&r=a5c61