
Related Scientist

Meeyoung Cha
Data Science Group


Analyzing Biases in Perception of Truth in News Stories and Their Implications for Fact Checking

Cited 0 times in Web of Science; cited 0 times in Scopus
137 viewed, 0 downloaded
Title
Analyzing Biases in Perception of Truth in News Stories and Their Implications for Fact Checking
Author(s)
Babaei, Mahmoudreza; Kulshrestha, Juhi; Chakraborty, Abhijnan; Redmiles, Elissa M.; Cha, Meeyoung; Gummadi, Krishna P.
Publication Date
2022-06
Journal
IEEE Transactions on Computational Social Systems, v.9, no.3, pp.839 - 850
Publisher
IEEE Systems, Man, and Cybernetics Society
Abstract
Misinformation on social media has become a critical problem, particularly during a public health pandemic. Most social platforms today rely on users' voluntary reports to determine which news stories to fact-check first. Despite the importance of this process, no prior work has explored its potential biases. This work proposes a novel methodology to assess how users perceive truth or misinformation in online news stories. By conducting a large-scale survey (N = 15,000), we identify the possible biases in news perceptions and explore how partisan leanings influence the news selection algorithm for fact checking. Our survey reveals several perception biases, or inaccuracies in estimating the truth level of stories. The first kind, called the total perception bias (TPB), is the aggregate difference between the ground truth and the perceived truth level. The next two are the false-positive bias (FPB) and false-negative bias (FNB), which measure users' gullibility and cynicism toward a given claim. We also propose the ideological mean perception bias (IMPB), which quantifies a news story's ideological disputability. Collectively, these biases indicate that user perceptions are not correlated with the ground truth of news stories; users believe some stories to be more false than they actually are, and vice versa. This calls for the need to fact-check first those news stories that exhibit the largest perception biases, which the current voluntary reporting does not offer. Based on these observations, we propose a new framework that can best leverage users' truth perceptions to remove false stories, correct users' misperceptions, or decrease ideological disagreements. We discuss how this new prioritizing scheme can help platforms significantly reduce the impact of fake news on user beliefs.
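The four bias measures named in the abstract can be illustrated with a short sketch. This is not the paper's implementation; the function name, the exact formulas, and the use of a 0.5 veracity threshold are assumptions based only on the abstract's verbal definitions (TPB as the aggregate gap between ground truth and perception, FPB/FNB as gullibility/cynicism, IMPB as the gap between partisan groups' mean perceptions).

```python
# Illustrative sketch of the perception-bias measures for a single news story.
# All formulas are assumptions inferred from the abstract, not the paper's code.

def perception_biases(ground_truth, perceived, left_ratings, right_ratings):
    """ground_truth: veracity of the story in [0, 1] (1 = fully true).
    perceived: truth ratings in [0, 1] from all survey respondents.
    left_ratings / right_ratings: the same ratings split by partisan leaning."""
    mean = sum(perceived) / len(perceived)

    # Total perception bias: aggregate difference between perceived
    # truth level and ground truth (positive = over-believed).
    tpb = mean - ground_truth

    # False-positive bias (gullibility): believing a false story to be true.
    fpb = max(0.0, mean - ground_truth) if ground_truth < 0.5 else 0.0
    # False-negative bias (cynicism): believing a true story to be false.
    fnb = max(0.0, ground_truth - mean) if ground_truth >= 0.5 else 0.0

    # Ideological mean perception bias: gap between the partisan groups'
    # mean perceptions, a proxy for how ideologically disputed the story is.
    impb = abs(sum(left_ratings) / len(left_ratings)
               - sum(right_ratings) / len(right_ratings))

    return {"TPB": tpb, "FPB": fpb, "FNB": fnb, "IMPB": impb}
```

For example, a false story (`ground_truth = 0.0`) that respondents rate as mostly true yields a positive FPB, flagging it as a priority for fact checking under the prioritization scheme the abstract proposes.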
URI
https://pr.ibs.re.kr/handle/8788114/12298
DOI
10.1109/TCSS.2021.3096038
ISSN
2329-924X
Appears in Collections:
Pioneer Research Center for Mathematical and Computational Sciences(수리 및 계산과학 연구단) > Data Science Group(데이터 사이언스 그룹) > 1. Journal Papers (저널논문)
Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
