Approaches to Identify Vulnerabilities to Misinformation: A Research
Agenda
NATTAPAT BOONPRAKONG, University of Melbourne, Australia
BENJAMIN TAG, University of Melbourne, Australia
TILMAN DINGLER, University of Melbourne, Australia
Given the prevalence of online misinformation and our scarce cognitive capacity, Internet users have been shown to frequently fall victim to such information. While some studies have investigated the psychological factors that make people susceptible to believing or sharing misinformation, ongoing research puts these findings into practice by objectively identifying when and which users are vulnerable to misinformation. In this position paper, we highlight two ongoing avenues of research to identify vulnerable users: detecting cognitive biases and exploring misinformation spreaders. We also discuss the potential implications of these objective approaches: discovering more cohorts of vulnerable users and prompting interventions that more effectively address the right group of users. Lastly, we point out two understudied contexts for misinformation vulnerability research as opportunities for future work.
CCS Concepts: • Human-centered computing → Human computer interaction (HCI).
Additional Key Words and Phrases: misinformation; vulnerable users; user profiling; cognitive bias
ACM Reference Format:
Nattapat Boonprakong, Benjamin Tag, and Tilman Dingler. 2022. Approaches to Identify Vulnerabilities to Misinformation: A Research Agenda. In CHI ’22: ACM CHI Conference on Human Factors in Computing Systems, April 30–May 6, 2022, New Orleans, LA. ACM, New York, NY, USA, 7 pages. https://doi.org/10.1145/1122445.1122456
1 INTRODUCTION
A vast amount of misinformation circulates on the Internet nowadays. Defined as false or misleading information, misinformation misleads people and bears many harmful effects on individuals and society [19]. More recently, misinformation has undermined public health guidance about COVID-19 [31]. Through Internet users who consume online media, misinformation has been shown to spread faster than factually accurate information [38]. This phenomenon, where users believe and share misinformation without deliberate consideration, can be attributed not only to the presence of misinformation but also to the psychological vulnerabilities of information consumers [9]. Humans are not always rational [35]; therefore, they are prone to making erroneous decisions, such as misinterpreting news or sharing unverified information.
There are two main actions regarding misinformation: believing and sharing. In the literature, research has explored several user-side factors that govern people’s susceptibility to believing online misinformation. Cognitive biases, mental shortcuts we use to reach faster but less deliberate decisions, have been marked as factors that make people gullible to false or unverified information [26]. Martel et al. [21] have shown that reliance on emotion indicates higher belief in false information.
Some research has shown that political partisans [25] or those older than 65 years [3] are prone to believing fake news. Roozenbeek et al. [31] found that higher numeracy skills and greater trust in science indicate lower susceptibility to misinformation.
On the other hand, some research has investigated why people share unverified information online. Motivated by [8, 28], Karami et al. [17] listed five motivational factors in spreading fake news: uncertainty, anxiety, lack of control, relationship enhancement, and social rank. Laato et al. [18] found that trust in online information and information overload are strong indicators of sharing unverified information. Chen et al. [5] suggested that people share misinformation because of social factors, i.e., self-expression and socialization. In addition, Avram et al. [1] found that social engagement metrics (i.e., the number of likes or shares) increase the tendency to share misinformative content.
Based on the above-mentioned psychological findings, some research has put these insights into practice by building computing systems that objectively identify potential misinformation spreaders based on their online and offline behaviours [13, 17, 36]. Unlike previous approaches that rely on self-report measures, these approaches quantify how people behave with information and produce insights about their specific behaviours regarding misinformation. Through behavioural and psycho-physiological measures, these approaches can explicitly identify when and which users are prone to believing or spreading misinformation.
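As a rough illustration of what such a behaviour-based identification pipeline could look like, the sketch below trains a simple classifier on hypothetical per-user behavioural features; the feature names, the synthetic data, and the choice of logistic regression are illustrative assumptions rather than the methods used in [13, 17, 36].

# Illustrative sketch only: features, labels, and model choice are hypothetical,
# not the pipelines described in [13, 17, 36].
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_users = 500

# Hypothetical per-user behavioural features: shares per day, mean reading time
# before sharing (s), fraction of shares from unverified sources, and mean
# emotional arousal (0-1, e.g., from psycho-physiological sensing).
X = np.column_stack([
    rng.poisson(5, n_users).astype(float),
    rng.exponential(30.0, n_users),
    rng.uniform(0.0, 1.0, n_users),
    rng.uniform(0.0, 1.0, n_users),
])

# Synthetic ground truth: 1 = user previously shared content flagged as misinformation.
y = (0.8 * X[:, 2] + 0.5 * X[:, 3] - 0.01 * X[:, 1]
     + rng.normal(0.0, 0.2, n_users) > 0.55).astype(int)

# Train on a split of users and report how well behaviour alone predicts the label.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))

In practice, such a model would be trained on logged platform behaviour and validated labels rather than synthetic data, and its learned coefficients could hint at which behavioural signals mark vulnerable users.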
Dierent individuals have dierent backgrounds and dierent levels of cognitive capacity. Therefore, some
information consumers may be more prone than others to misinformation. Echoing the ndings in Geeng et al
.
[12]
that dierent design interventions may be eective for dierent users, the objective approaches to identify
vulnerable users would also provide a a guide on which misinformation intervention should be deployed to which
group of users. Ultimately, interventions would be made more eective as they address the right group of users.
In this position paper, we discuss empirical approaches to identifying people who may be susceptible to
misinformation: detecting cognitive biases and proling vulnerable users. We also highlight their implications of
identifying more cohorts of vulnerable users and prompting interventions to address the right group of users.
Lastly, we elaborate on challenges and future avenues to investigate vulnerabilities to misinformation. We intend
to generate a fruitful discussion in this workshop about the need to explore potential victims of misinformation,
identify their vulnerabilities, and employ the right interventions to address the right user cohorts.
2 APPROACHES TO IDENTIFYING VULNERABILITIES TO MISINFORMATION
Various studies have investigated why people believe and share misinformation on the Internet. However, most of these studies addressed such questions with self-report measures, e.g., questionnaires, which are prone to limitations such as subjectivity. On the other hand, some research has proposed objective approaches to identifying users who may be susceptible to misinformation. Combining prior psychological findings with behavioural measures, these research efforts explicitly identify which users tend to believe and share misinformation, and when. In this paper, we discuss two promising approaches: detecting cognitive biases and profiling social media users.
2.1 Cognitive Bias Detection
Humans possess limited cognitive capacity and employ cognitive biases as mental shortcuts, which can lead to irrational judgements. One prominent example is “confirmation bias” (also known as selective exposure), the tendency to seek only information that supports one’s perspectives or expectations while ignoring dissenting information [24]. Amplified by social media platforms that feed users content reinforcing their viewpoints, confirmation bias is easily observed on social media, as many users tend to consume and spread content items that match their beliefs without checking their veracity [39].
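As a minimal, hypothetical illustration (not drawn from [24] or [39]), such selective exposure could be quantified from interaction logs by comparing how often a user engages with attitude-consistent items against a balanced baseline; the stance labels and the 0.5 baseline below are simplifying assumptions.

# Minimal sketch: quantify selective exposure from a user's consumption log.
# Stance labels and the balanced-feed baseline of 0.5 are simplifying assumptions.
def selective_exposure_score(consumed_stances, user_stance, baseline=0.5):
    """Positive values suggest the user over-selects attitude-consistent content
    relative to a balanced feed; 0 means no measurable skew."""
    if not consumed_stances:
        return 0.0
    consistent = sum(1 for stance in consumed_stances if stance == user_stance)
    return consistent / len(consumed_stances) - baseline

# Example: a user with stance "A" who consumed 8 pro-"A" and 2 pro-"B" items.
log = ["A"] * 8 + ["B"] * 2
print(selective_exposure_score(log, "A"))  # 0.3, i.e., skewed toward confirming content

Scores of this kind, combined with behavioural or physiological signals, could feed into the bias-detection systems discussed above, although reliably inferring a user's stance is itself an open measurement problem.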