Monitoring misinformation-related interventions on mainstream social media platforms
The WebClim team will present its recent work during the September 28 seminar.
Event, Research Seminar
There is growing pressure for web platforms, such as Facebook, Twitter or YouTube, to fight misinformation by moderating the content that spreads on their websites.
We will first present the methods we used to investigate platforms’ interventions by collecting social media data via APIs and scraping. These interventions can be classified into three broad categories: (i) temporary or permanent suspension of users, (ii) displaying flags and information panels, and (iii) reducing the visibility of problematic content. We list data sources and provide examples illustrating how researchers and journalists can monitor misinformation-related interventions by platforms within each of the three categories.
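As a rough illustration of the first category (account suspensions), a monitoring script might periodically probe a list of public account pages and record which ones have become unreachable. Everything below is an invented sketch, not the team's actual pipeline: the account names are made up, and treating a 403/404 response as a possible suspension is a heuristic only (pages can also disappear after a voluntary deletion or rename).

```python
from datetime import date

def detect_suspensions(accounts, fetch_status, checked_on):
    """Flag accounts whose public page is no longer reachable.

    `fetch_status` returns an HTTP-like status code for an account's
    public page; 403/404 is treated here as a *possible* suspension.
    This heuristic is illustrative, not a definitive signal.
    """
    flagged = []
    for account in accounts:
        if fetch_status(account) in (403, 404):
            flagged.append({"account": account,
                            "checked_on": checked_on.isoformat()})
    return flagged

# Usage with a stubbed fetcher instead of real network calls:
stub = {"alice": 200, "bob": 404, "carol": 403}.get
result = detect_suspensions(["alice", "bob", "carol"], stub,
                            date(2021, 9, 28))
```

Running the same check at regular intervals and diffing the flagged lists is what turns this into a monitor rather than a one-off snapshot.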
We then investigate in more detail Facebook’s “repeat offenders” policy, which the company said should reduce the distribution of content from accounts that repeatedly post false information. Neither the implementation of this policy nor its impact had been investigated before. We show that this policy results in only a moderate reduction of the engagement metrics of their content.
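One simple way to quantify such a reduction is to compare an account's average engagement before and after the date it was labeled a repeat offender. The sketch below assumes a flat list of posts with `date` and `engagement` fields; the field names and toy numbers are invented for illustration and do not reflect Facebook's data or the team's actual methodology.

```python
def engagement_reduction(posts, offender_date):
    """Relative change in mean engagement before vs. after a date.

    A negative return value means engagement dropped after the
    account was flagged. Dates are ISO strings, so plain string
    comparison orders them correctly.
    """
    before = [p["engagement"] for p in posts if p["date"] < offender_date]
    after = [p["engagement"] for p in posts if p["date"] >= offender_date]
    if not before or not after:
        return None  # not enough data on one side of the cutoff
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(after) - mean(before)) / mean(before)

# Toy example: engagement roughly halves after the cutoff date.
posts = [
    {"date": "2021-05-01", "engagement": 100},
    {"date": "2021-05-15", "engagement": 120},
    {"date": "2021-06-10", "engagement": 55},
    {"date": "2021-06-20", "engagement": 55},
]
change = engagement_reduction(posts, "2021-06-01")
```

A before/after mean is of course a crude estimator; a real analysis would control for posting frequency and platform-wide engagement trends.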
Finally, we will discuss results about misinformation in search and recommendations on YouTube, as the company repeatedly communicates about its policy of promoting content from “authoritative” sources. To that end, we examined the most recommended videos either when searching for keywords related to health and climate change or when watching a number of misinformation videos. We find that YouTube’s most recommended videos are rarely misinformation; however, the most recommended channels are a mix of reliable and hyper-partisan sources, illustrating YouTube’s failure to surface “authoritative” content.
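Once a recommendation crawl has been collected, ranking the most recommended channels is a simple frequency count. The sketch below assumes the crawl is stored as (source video, recommended video, channel) triples; the structure and the channel names are hypothetical, invented only to show the tallying step.

```python
from collections import Counter

def top_recommended_channels(recommendations, n=3):
    """Return the n channels appearing most often among recommendations.

    `recommendations` is a list of (source_video, recommended_video,
    channel) triples gathered from a recommendation crawl.
    """
    counts = Counter(channel for _, _, channel in recommendations)
    return counts.most_common(n)

# Toy crawl: one channel dominates the recommendations.
crawl = [
    ("v1", "r1", "SciShow"),
    ("v1", "r2", "NewsChannel"),
    ("v2", "r3", "SciShow"),
    ("v2", "r4", "PartisanTV"),
    ("v3", "r5", "SciShow"),
]
ranking = top_recommended_channels(crawl, 2)
```

Aggregating at the channel level rather than the video level is what surfaces the reliable/hyper-partisan mix described above: individual misinformation videos may be rare while the channels hosting them still rank highly.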
Shaden Shabayek is a post-doctoral researcher with the WebClim team. Her research focuses on the persistence and spread of "misinformation" on social networks. In particular, she will study the impact of the measures announced by social media platforms to combat misinformation in scientific fields. She will also explore the relevance of these measures (e.g. flagging or deleting content) in relation to different cognitive biases (e.g. reactance).
Héloïse Théro is a research engineer at the médialab. She works on collecting web data related to disinformation and on processing, analyzing, and visualizing it. The objective is to better understand how the major digital platforms (Facebook, YouTube, etc.) are currently fighting the spread of fake news.
Emmanuel Vincent studies online misinformation in scientific fields such as climate change and health. A laureate of the Make Our Planet Great Again grant, he joined the Sciences Po médialab in 2019 to lead research on how the web platforms that are widely used to access online information (Google, Facebook, YouTube, etc.) influence public understanding of scientific topics.