Recommendation algorithms profoundly shape users' attention and information consumption on social media platforms. This study introduces a computational intervention aimed at mitigating two key biases in these algorithms by influencing the recommendation process. We tackle two biases: algorithms creating narrow, non-news and entertainment-centric information diets, and algorithms directing the more strongly partisan users to like-minded content.
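The abstract does not specify the intervention's mechanics; as a minimal sketch of how such a re-ranking intervention could work in principle, the following assumes hypothetical `relevance`, `is_news`, and `slant` fields on each candidate and illustrative, untuned weights:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Candidate:
    # Hypothetical fields for illustration only.
    item_id: str
    relevance: float   # platform's engagement-based score, in [0, 1]
    is_news: bool      # whether the item is verified news content
    slant: float       # ideological slant in [-1, 1]; negative = liberal

def rerank(candidates: List[Candidate], user_slant: float,
           news_boost: float = 0.3, congeniality_penalty: float = 0.3) -> List[Candidate]:
    """Re-score candidates to counter the two biases: too little news in the
    diet, and too much like-minded content for strong partisans."""
    def score(c: Candidate) -> float:
        s = c.relevance
        if c.is_news:
            s += news_boost  # push verified news up the ranking
        # penalize items whose slant matches the user's own leaning
        s -= congeniality_penalty * max(0.0, c.slant * user_slant)
        return s
    return sorted(candidates, key=score, reverse=True)
```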
Polarization, misinformation, declining trust, and wavering support for democratic norms are pressing threats to the US. Exposure to verified and balanced news may make citizens more resilient to these threats. This project examines how to enhance users' exposure to and engagement with verified and ideologically balanced news in an ecologically valid setting. We rely on a 2-week field experiment with 28,457 Twitter users.
Current interventions to combat misinformation, including fact-checking, media literacy tips and media coverage of misinformation, may have unintended consequences for democracy. We propose that these interventions may increase scepticism towards all information, including accurate information. Across three online survey experiments in three diverse countries (the United States, Poland and Hong Kong; total n = 6,127), we tested the negative spillover effects of existing strategies and compared them with three alternative interventions against misinformation.
We study the effect of Facebook and Instagram access on political beliefs, attitudes, and behavior by randomizing a subset of 19,857 Facebook users and 15,585 Instagram users to deactivate their accounts for six weeks before the 2020 U.S. election.
Most scholars focus on the prevalence and democratic effects of (partisan) news exposure. This focus misses much of the online activity of the majority of politically disinterested citizens. Although political content also appears outside news outlets and may profoundly shape public opinion, its prevalence and effects are under-studied at scale.
Algorithms of social media platforms are often criticized for recommending ideologically congenial and radical content to their users. Despite these concerns, evidence on such filter bubbles and rabbit holes of radicalization is inconclusive. We conduct an audit of the platform using 100,000 sock puppets, which allow us to systematically, and at scale, isolate the influence of the algorithm on recommendations.
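For illustration, here is a schematic of the general sock-puppet method (not the paper's actual pipeline): each puppet is a fresh profile conditioned on a controlled watch history before its recommendations are recorded. `platform.watch` and `platform.recommend` are hypothetical stand-ins for browser-automation or API calls.

```python
def run_sock_puppet(platform, training_items, probe_item, n_recs=20):
    """Schematic sock-puppet audit run: condition a fresh profile on a
    fixed watch history, then record the recommendations for a probe item.
    `platform.watch` and `platform.recommend` are hypothetical stand-ins
    for real platform interactions."""
    for item in training_items:   # conditioning phase: build a history
        platform.watch(item)
    return platform.recommend(probe_item)[:n_recs]

# An audit runs many such puppets with systematically varied training
# histories (e.g., by ideology) and compares the resulting recommendation
# lists across conditions to isolate the algorithm's contribution.
```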
Many critics raise concerns about the prevalence of 'echo chambers' on social media and their potential role in increasing political polarization. However, the lack of available data and the challenges of conducting large-scale field experiments have made it difficult to assess the scope of the problem. Here we present data from 2020 for the entire population of active adult Facebook users in the USA showing that content from 'like-minded' sources constitutes the majority of what people see on the platform, although political information and news represent only a small fraction of these exposures.
We studied the effects of exposure to reshared content on Facebook during the 2020 US election by assigning a random set of consenting, US-based users to feeds that did not contain any reshares over a 3-month period. We find that removing reshared content substantially decreases the amount of political news, including content from untrustworthy sources, to which users are exposed; decreases overall clicks and reactions; and reduces partisan news clicks. Further, we observe that removing reshared content produces clear decreases in news knowledge within the sample, although there is some uncertainty about how this would generalize to all users.
Does Facebook enable ideological segregation in political news consumption? We analyzed exposure to news during the US 2020 election using aggregated data for 208 million US Facebook users. We compared the inventory of all political news that users could have seen in their feeds with the information that they saw (after algorithmic curation) and the information with which they engaged. We show that (i) ideological segregation is high and increases as we shift from potential exposure to actual exposure to engagement; (ii) there is an asymmetry between conservative and liberal audiences, with a substantial corner of the news ecosystem consumed exclusively by conservatives; and (iii) most misinformation, as identified by Meta's Third-Party Fact-Checking Program, exists within this homogeneously conservative corner, which has no equivalent on the liberal side.
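One standard way to quantify this kind of segregation, not necessarily the paper's exact estimator, is a Gentzkow–Shapiro-style isolation index computed at each of the three stages. A minimal sketch, assuming each exposure carries an audience-based conservative share in [0, 1] and a hypothetical group encoding:

```python
def isolation_index(exposures):
    """Difference between the average conservative share of content seen by
    conservatives and that seen by liberals. `exposures` is a list of
    (user_group, conservative_share) pairs, where user_group is 'cons' or
    'lib'; the encoding is hypothetical."""
    cons = [share for group, share in exposures if group == "cons"]
    libs = [share for group, share in exposures if group == "lib"]
    return sum(cons) / len(cons) - sum(libs) / len(libs)

# Computing the index separately over the inventory (potential exposure),
# the curated feed (actual exposure), and engagements shows whether
# segregation rises from one stage to the next.
```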
We investigated the effects of Facebook's and Instagram's feed algorithms during the 2020 US election. We assigned a sample of consenting users to reverse-chronologically ordered feeds instead of the default algorithmic feeds. Moving users out of algorithmic feeds substantially decreased the time they spent on the platforms and their activity.
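The experimental contrast reduces to two orderings of the same inventory. A minimal sketch, with hypothetical `created_at` and `predicted_engagement` fields standing in for the platforms' internal signals:

```python
from datetime import datetime

posts = [
    {"id": 1, "created_at": datetime(2020, 10, 1), "predicted_engagement": 0.9},
    {"id": 2, "created_at": datetime(2020, 10, 3), "predicted_engagement": 0.2},
]

def chronological_feed(posts):
    """Treatment condition: newest first, no ranking model involved."""
    return sorted(posts, key=lambda p: p["created_at"], reverse=True)

def algorithmic_feed(posts):
    """Default condition: ordered by a model's predicted engagement score."""
    return sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

print([p["id"] for p in chronological_feed(posts)])  # [2, 1]
print([p["id"] for p in algorithmic_feed(posts)])    # [1, 2]
```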
We offer comprehensive evidence of preferences for ideological congruity when people engage with politicians, pundits, and news organizations on social media. Using 4 years of data (2016-2019) from a random sample of 1.5 million Twitter users, we examine three behaviors studied separately to date: (i) following of in-group versus out-group elites, (ii) sharing in-group versus out-group information (retweeting), and (iii) commenting on the shared information (quote tweeting).
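All three behaviors can be summarized with the same statistic: the in-group share of a user's actions. A minimal sketch, using a hypothetical party encoding for the targets:

```python
from collections import Counter

def ingroup_share(target_parties, user_party):
    """Fraction of a user's follows, retweets, or quote tweets directed at
    in-group elites. `target_parties` lists the party of each action's
    target ('D' or 'R'); the encoding is hypothetical."""
    counts = Counter(target_parties)
    total = counts["D"] + counts["R"]
    return counts[user_party] / total if total else float("nan")

# Example: a Democrat who retweets 8 Democratic and 2 Republican elites
print(ingroup_share(["D"] * 8 + ["R"] * 2, "D"))  # 0.8
```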
Affective polarization is a key concern in America and other democracies. Although past evidence suggests some ways to minimize it, no easily applicable interventions have been found to work in the increasingly polarized climate. This project examines whether irrelevant factors, or incidental happiness more specifically, have the power to reduce affective polarization.
Social media vaccine misinformation can negatively influence vaccine attitudes. It is urgent to develop communication approaches that reduce misinformation's impact. This study aimed to test the effects of fact-checking labels for misinformation on attitudes toward vaccines.