Strategies and Vulnerabilities of Participants in Venezuelan Influence Operations
Ruben Recabarren Bogdan Carbunar Nestor Hernandez Ashfaq Ali Shafin
FIU
Abstract
Studies of online influence operations, coordinated efforts to disseminate and amplify disinformation, focus on forensic analysis of social networks or of publicly available datasets of trolls and bot accounts. However, little is known about the experiences and challenges of human participants in influence operations. We conducted semi-structured interviews with 19 influence operations participants who contribute to the online image of Venezuela, to understand their incentives, capabilities, and strategies to promote content while evading detection. To validate a subset of their answers, we performed a quantitative investigation using data collected over almost four months, from Twitter accounts they control.

We found diverse participants that include pro-government and opposition supporters, operatives and grassroots campaigners, and sockpuppet account owners and real users. While pro-government and opposition participants have similar goals and promotion strategies, they differ in their motivation, organization, adversaries, and detection avoidance strategies. We report the Patria framework, a government platform for operatives to log activities and receive benefits. We systematize participant strategies to promote political content, and to evade and recover from Twitter penalties. We identify vulnerability points associated with these strategies, and suggest more nuanced defenses against influence operations.
1 Introduction
Social networks have become the central medium for influence operations (IOs), enabling them to disseminate and amplify disinformation, and compromise the integrity of information posted by others. We observe a parallel between disinformation (information designed to mislead [96]) and malware (e.g., self-propagating worms), where disinformation runs on human minds instead of computers. From this perspective, influence operations seek to fraudulently boost the search rank of the content they distribute, and increase the number of human hosts exposed and infected.
Figure 1: Map of discovered strategies (gray rectangles) of influence operations participants (yellow ovals). Red circles represent influence operations vulnerabilities that we identified, and discuss in the context of Twitter changes, in § 5.
Goals of influence operations include manipulating or corrupting public debate, undermining trust in democratic processes and scientific evidence, and even influencing elections [25,36,67,82]. Such efforts are becoming increasingly prevalent: Facebook reported discovering 150 covert IOs on its site between 2017 and 2020, originating from countries all over the world [36].
Influence operations were shown to be well organized [37,54,60,61,82,94,96], to control many social network sockpuppet accounts [54,59,82], and to employ inauthentic behaviors [36,61,85,87,110]. This knowledge was collected through journalistic efforts [28,61,63,67,82,85,87] and forensic analysis of social networks [24,46,52,59,109,110] and released datasets [1,39,69,79,96,102].
To develop information assurance solutions that can control influence operations, however, we need to understand the experiences, challenges, and vulnerabilities of their contributors. We lack such information due to the difficulty of identifying, reaching, recruiting, and establishing trust with such participants.
In this paper, we investigate the perspective of participants in influence operations. For this, we leverage unique background and insights into Venezuela, a country where influence operations have replaced verified news [44,68,82,83]. Since
2013, Maduro’s regime has taken over the country’s institutions and its electoral and justice systems, and has censored standard news delivery solutions [18]. To bypass censorship, communicate, and organize, the opposition uses social networks and mobile apps [42]; conversely, the government uses them to distribute hyperpartisan news and disinformation [82].
In a first contribution, we developed a protocol to identify and recruit participants in Venezuelan influence operations. Our protocol uses Telegram groups and Twitter to identify candidates, and to contact them over direct messaging. Second, we recruited 19 relevant participants, and conducted semi-structured interviews to study the following key questions:
RQ1: What are the concrete (a) organization and communication mechanisms, (b) resources, capabilities, and limitations, (c) motivations, and (d) promotion strategies of participants in Venezuelan influence operations?

RQ2: Do they participate in influence operations that target other countries? (a) Are they willing to be hired to participate in external influence operations?

RQ3: What is the participant perception of disinformation? (a) Do they contribute to its distribution, or do they have strategies to avoid it?

RQ4: Are participants aware of, and affected by, social network defenses and penalties? (a) Have they developed strategies to circumvent and recover from detection? (b) Are these strategies effective?
Third, to validate participant claims, we performed a quantitative investigation with data collected over four months, from 34 Twitter accounts they control. Our findings include:
(1) Interview participants are diverse, e.g., pro-government vs. opposition supporters, paid operatives vs. grassroots campaigners, and sockpuppet owners vs. real users (§ 4.6). We found consistency with the “communication constitutes organization” perspective of organizational theory [77] (§ 4.5);
(2) Both pro-government and opposition participants revealed a history of contribution to foreign campaigns. Many on both sides are willing to be hired to participate in influence operations, including targeting US politics (§ 4.4);
(3) Participants claimed strategies to verify the information they post. However, we report concrete instances where pro-government participants distributed disinformation that received significant community engagement (§ 4.2);
(4) Adversarial environments: pro-government participants reported efforts by social networks to thwart their activities; the opposition revealed pro-government operative attacks against their accounts (§ 4.8). Both sides disclosed strategies to avoid detection and recover from account suspensions (§ 4.9);
(5) Pro-government and opposition participants differ in their motivation, organization, adversaries, and detection avoidance strategies. Based on our findings, we present the IO strategy map of Figure 1 (§ 4.7).
In a fourth contribution, we identify vulnerability points (VPs) associated with promotion strategies revealed by our participants (Figure 1), and suggest changes to social networks’ handling of influence operations (§ 5).
2 Background, Model and Goals
2.1 The Venezuelan Crisis
Venezuela is experiencing the worst economic crisis in its history [72]. The government controls every aspect of daily life, ranging from food to gas supply, while the country struggles with hyperinflation, unemployment, and poverty. In recent years, Maduro’s government has implemented a takeover of Venezuelan institutions, the electoral and justice systems, and the army. The government has used lethal force against protesters, has exiled critics, and has held political prisoners. Six million people have migrated to neighboring countries [88].
A large majority of Venezuelans oppose the current regime [19]. The government, however, has invested heavily in media censorship efforts [18]. This has led to a migration of anti-government movements to social media and mobile apps [44,68]. In turn, this was followed by the creation of a government-sponsored online army [83] to disrupt opposition activities and promote government propaganda.
Political allies of Venezuela provide support by distributing hyperpartisan news and disinformation through their Spanish-language news organizations [21,81], e.g., Russia Today (RT) Español [2], Sputnik Mundo [11], the Iranian Hispan TV [7], and Cuban news outlets [5,6].
2.2 Adversary
Influence Operations. Influence operations (IOs), also known as information campaigns [101] or strategic information operations [96], are coordinated efforts to manipulate or corrupt public debate for a strategic goal [36]. Influence operations were shown to have at least short-term effects, including changes in political beliefs and behavior [26], increased xenophobia [104], and increased uncertainty about vaccines [76].
In this work we distinguish between centralized influence operations and grassroots movements. While grassroots movements are bottom-up, often spontaneous decision-making efforts [107], centralized influence operations have a command and control (C&C) center (e.g., a government, institution, or interest group) that designs the operation’s goals and message.

We now define several types of participants in influence operations. Not all are adversarial; we discuss them here because they are used or manipulated by adversarial C&Cs.
IO Participants. To avoid detection, centralized operations were shown to emulate online grassroots movements [36,70]. They achieve this by recruiting real people, including operatives and grassroots campaigners. Operatives receive incentives to promote the operation’s message online [36,70]; grassroots campaigners believe the information they distribute, do not get paid, and are unwilling to be hired for activities that contradict their beliefs. Our study includes both types of participants. Such participants were shown to create and amplify posts that promote the operation’s message, and to communicate and coordinate activities [96].
In contrast, unwitting agents [29,96] are human participants who receive and occasionally engage with influence operations content, but who do not receive external incentives and do not coordinate activities.
Influence operations contributors also include trolls, who use anonymous accounts to post inflammatory and digressive messages designed to trigger conflict and disrupt online discussions [62,82]. Datasets of Russia’s Internet Research Agency (IRA) trolls on Twitter [40,102] revealed several types of troll accounts, each performing a specialized function [59]. While IRA tweets reached many users [109], they had minor impact in making content viral [109]. In contrast, we found that many of our participants accumulated significant community engagement for their posts, including disinformation.
Influence operations also use bots, automated accounts that require little human supervision [23,38,47]. Both trolls and bots use sockpuppet accounts to hide their identities. However, previous work has found that many IO participants are not bots, and that they manage their online identities in complex ways [46]. Most of our interview participants use their own identity to establish a personal brand and a follower base.
Coordination Apps. IO participants use apps to communicate and coordinate activities [96]. Previous work studied misinformation and fear speech in WhatsApp [48,51]. In particular, Javed et al. [48] analyzed the spread of information through WhatsApp, and documented information flows between WhatsApp and Twitter. Our participant recruitment process builds on a similar finding, that Venezuelan operators use communication apps to organize, coordinate, and disseminate messages to be promoted on Twitter (§ 3). We also found similar information flows in Venezuelan operations, between Telegram groups and Twitter.
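To illustrate how such cross-platform flows can be surfaced in data, the sketch below pairs Telegram messages with later, near-duplicate tweets. This is a minimal sketch under our own assumptions: the token-set Jaccard heuristic, the threshold, and all function names are illustrative, not the detection method used in this or prior work.

# Sketch: detecting Telegram -> Twitter information flows by text overlap.
# The Jaccard-similarity heuristic and all names are illustrative assumptions.
import re

def normalize(text):
    """Lowercase, strip URLs, and return the set of tokens (incl. hashtags)."""
    text = re.sub(r"https?://\S+", " ", text.lower())
    return set(re.findall(r"[#@\w]+", text))

def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 0.0

def find_flows(telegram_msgs, tweets, threshold=0.6):
    """Pair each (time, text) Telegram message with later, highly similar tweets."""
    flows = []
    for t_time, t_text in telegram_msgs:
        t_tokens = normalize(t_text)
        for w_time, w_text in tweets:
            if w_time > t_time and jaccard(t_tokens, normalize(w_text)) >= threshold:
                flows.append((t_text, w_text))
    return flows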
2.3 Research Goals
Social networks implement various techniques to address influence operations. They include mechanisms to detect [106], verify [53], and penalize accounts and activities that violate their terms of service. Twitter penalties include suspending accounts detected to post spam, suspected to be compromised, or reported to violate rules surrounding abuse [15]. Further, social networks were reported to shadowban, i.e., remove or limit the distribution or visibility of certain content [17,90].

Such mechanisms are often unable to address influence operations in real time [36], and some consider them to be censorship [90]. Our study confirms this.
Instead, in this work we seek to provide insights into influence operations, by studying the perspective of IO participants. We document the experiences, motivation, organization and communication mechanisms, capabilities, goals, and strategies of participants in Venezuelan influence operations, in order to understand their strengths and vulnerabilities, and to help inform future efforts to design more appropriate, inclusive, and effective solutions to address influence operations.

Algorithm 1: Pseudocode for the study protocol. The recruitment takes place on a social network SN (Twitter in our case). IOGroups lists influence operations communication groups used to seed the candidate search (from Telegram in our case).

StudyProtocol(SN: SocialNet, IOGroups: list)
  while (true) do {
    IOMembers = getSNAccounts(IOGroups);
    IOActive = followBack(IOMembers);
    IOFollowers = getFollowers(IOActive);
    candidates = getOpenToDM(IOFollowers);
    respondents = sendDM(candidates);
    groups = {};
    for each R in respondents {
      (answers, accounts, g) = interview(R);
      accountData = SN.collectData(accounts);
      validate(answers, g, accountData);
      groups = groups ∪ g; }
    if (groups ⊆ IOGroups) then break;
    IOGroups = IOGroups ∪ groups; }
3 Methodology
Our study consists of a qualitative exploration and a quantitative investigation into various aspects of participation in influence operations. We first detail the recruitment procedure and ethical considerations, then describe the studies.
3.1 Participant Recruitment
We focused recruitment efforts on identifying Twitter accounts with a verifiable history of participation in influence operations. Our recruitment protocol identifies active operatives by starting from a seed set of communication groups. To identify this set, we leveraged observations that Venezuelan operatives use Telegram to communicate about their goals and objectives. We used Telegram’s search (keyword “Twiteros activos”) to identify three Telegram groups and channels dedicated to Venezuelan influence operations.
Members of these groups often disclose their Twitter handles to follow one another. We selected a set of Twitter accounts that were revealed by members of these groups. We did not contact these accounts directly: to minimize the chance of interference with our study, we wanted to delay news of our efforts from reaching the influence operations command and control center (§ 2.2). Members of these Telegram groups may have communication channels with the command and control, and thus may quickly alert many participants and influence their perception of our study.
Instead, we followed these accounts from our lab’s Twitter account. For the accounts that followed us back, we collected their Twitter followers via breadth-first search, using the Twitter API. From these, we identified the accounts that were open to direct messaging from our account.
We sent an interview invitation to these accounts, over direct messaging (DM). We then sent personalized messages, including a consent form, to the accounts that replied. We inspected the accounts that accepted the consent form, and interviewed those that were active, were posting tweets on political topics, and had at least 500 followers.
During the interview, we also collected other accounts
claimed to be controlled by participants, and groups they
claimed to use for communications. We then iterated our
recruitment activities over these groups.
Algorithm 1 summarizes our study procedure, including the recruitment, interview, and data validation steps.
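As a concrete illustration of the Twitter-side steps (following candidates, collecting followers, and sending DM invitations), the sketch below uses the tweepy library. It is a minimal sketch under stated assumptions: we assume tweepy v4 with Twitter API v1.1 access, and the try-and-catch heuristic for detecting accounts open to DMs is our own illustration, not part of the study protocol.

# Sketch of the Twitter-side recruitment steps, assuming tweepy v4 and
# API v1.1 access. The DM-openness check (attempt and catch Forbidden)
# is our assumption for illustration.
import tweepy

auth = tweepy.OAuth1UserHandler("API_KEY", "API_SECRET", "TOKEN", "TOKEN_SECRET")
api = tweepy.API(auth, wait_on_rate_limit=True)

def follow_candidates(handles):
    """Follow candidate accounts from the (clearly labeled) lab account."""
    for h in handles:
        api.create_friendship(screen_name=h)

def collect_followers(handle, limit=5000):
    """Breadth-first step: gather followers of an account that followed back."""
    return list(tweepy.Cursor(api.get_followers, screen_name=handle).items(limit))

def send_invitations(users, invite_text):
    """Invite users over DM; accounts closed to DMs raise a 403 Forbidden."""
    respondents = []
    for u in users:
        try:
            api.send_direct_message(recipient_id=u.id, text=invite_text)
            respondents.append(u)
        except tweepy.errors.Forbidden:
            pass  # account not open to DMs from accounts it does not follow
    return respondents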
In total, we followed 1,543 Twitter accounts. From the 109
accounts that followed us back, we collected their 256,770
followers. We sent DMs to the subset of 2,843 accounts that
were open for DM, then sent personalized messages to the
99 accounts that replied. From the 35 Twitter accounts that
accepted the consent form, we selected 19 for interview.
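For context, these counts imply steep stage-to-stage attrition. The short computation below derives the conversion rates; the counts come directly from the text, and only the percentages are computed.

# Conversion rates implied by the recruitment funnel reported above.
followed, followed_back = 1543, 109
followers_collected, dm_open = 256770, 2843
replied, consented, interviewed = 99, 35, 19

rates = {
    "follow-back rate": followed_back / followed,             # 7.1%
    "followers open to DM": dm_open / followers_collected,    # 1.1%
    "DM reply rate": replied / dm_open,                       # 3.5%
    "consent rate among repliers": consented / replied,       # 35.4%
    "interviewed among consenting": interviewed / consented,  # 54.3%
}
for name, rate in rates.items():
    print(f"{name}: {100 * rate:.1f}%")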
3.2 Ethical Considerations
The study procedure was scrutinized and the full study was approved by the institutional review board of our university (IRB-20-0550). We followed ethical practices for conducting sensitive research with vulnerable populations [30]. For instance, we tailored the consent process to the participant, and re-confirmed consent. We sent the consent form link and obtained consent both during recruitment and at the beginning of the interview. We obtained consent both electronically and verbally. We accommodated participant requests for private payments: cryptocurrencies, intermediaries in Venezuela, and sending money over snail mail.
During recruitment and the interview, we clearly declared the identity of the researchers, the research objective, the data that we collect (including Twitter account data) and how we process it, and the potential impact on the participant. More specifically, our invitation and consent form made clear that our intention is to study political content promotion capabilities, resources, and behaviors on social platforms. During recruitment, we followed candidate accounts from our lab’s Twitter account, where we made clear its association with our lab and its use strictly for research purposes.
We also explained any risks that our research may pose to them and their work. We asked several times during the interview if they were comfortable discussing potentially sensitive topics, and told them that they could skip any question.
We were careful to hide participant identities. Following the account data collection and participant payment steps, we removed all participant PII from our data (e.g., names, IDs, handles, locations). From the participants’ Twitter accounts, we kept only account statistics, their posts, and their followers. We used multiple solutions to securely store the de-identified research data. The data was stored on a physically secure Linux server at our university, and accessed through encrypted channels, only from the authors’ password-protected laptops. Further, all data was processed only on the server.
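The sketch below illustrates one possible de-identification pass over collected account records. The field names and the choice of salted hashes as stable pseudonyms are our assumptions for illustration; the study states only that PII fields were removed.

# Sketch: de-identify collected account records before analysis.
# Field names and the salted-hash pseudonym scheme are illustrative
# assumptions; the study reports only that PII was removed.
import hashlib

PII_FIELDS = ("name", "user_id", "handle", "location")
SALT = b"replace-with-a-secret-random-salt"  # stored separately from the data

def pseudonym(handle):
    """Stable pseudonym so posts by the same account remain linkable."""
    return hashlib.sha256(SALT + handle.encode()).hexdigest()[:12]

def deidentify(record):
    out = {k: v for k, v in record.items() if k not in PII_FIELDS}
    out["account"] = pseudonym(record["handle"])
    return out

# Example: only statistics, posts, and follower counts survive.
raw = {"name": "X", "user_id": 1, "handle": "someuser", "location": "Caracas",
       "followers_count": 812, "tweets": ["..."]}
print(deidentify(raw))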
In § 6 we further revisit ethical considerations from the perspective of the impact of our findings.
Team Positionality Statement. The research team consists of Venezuelan and international investigators, all located outside Venezuela. The investigators support neither the Venezuelan government nor the opposition. The interviews were conducted by a politically neutral team member, who shared context with study participants, including the language and some knowledge of the country’s political and economic circumstances.
3.3 Qualitative Study
Interviews. We conducted semi-structured interviews with the recruited participants: one pilot interview to test our interview guide and method, then in-depth interviews with 19 participants. The interviews focused on participant (1) incentives to contribute to influence operations, (2) organization structure, (3) resources, capabilities, and limitations, (4) strategies employed to promote IO goals, (5) operations in which they participated and in which they are interested in participating, (6) perception of disinformation in influence operations, (7) perception of the impact of Twitter’s defenses on IO activities, and (8) strategies to evade and recover from detection.
The interviews were conducted over the phone, in Spanish,
by one author who is a native speaker. All audio interviews
were recorded with participant permission. The interviews
lasted between 17 and 98 minutes (M = 52, SD = 19.43). We
paid 2.5 USD for every 15 minutes spent in the interview.
Analysis Process. We analyzed responses using a grounded-theory open-coding process [99], performed by two co-authors: the one who conducted the interviews and a non-Spanish speaker. We conducted the interviews over six weeks. During this time, we also transcribed and anonymized the recorded interviews, then translated them into English. Given time constraints, we started the analysis after data collection was complete. Following each interview, the interviewer discussed impressions, observations, and findings with the rest of the team. This enabled us to detect when we reached data saturation [41]: the interviewer reported no new insights from the last two participants. We confirmed this during analysis.
In the preliminary analysis stage, we independently read five transcripts to establish a thematic framework of the interviews.
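Open coding by two coders is typically accompanied by an inter-coder agreement check. The section does not name the metric used; as an illustration, Cohen’s kappa over two coders’ code assignments can be computed as in the hypothetical sketch below (the code labels and excerpt data are invented for illustration).

# Sketch: inter-coder agreement via Cohen's kappa for two coders' code
# assignments on the same excerpts. The metric choice and the example
# labels are our assumptions; the section names no metric.
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    labels = set(coder_a) | set(coder_b)
    expected = sum((freq_a[l] / n) * (freq_b[l] / n) for l in labels)
    return (observed - expected) / (1 - expected)

# Hypothetical codes assigned to ten interview excerpts:
a = ["incentive", "strategy", "strategy", "evasion", "incentive",
     "strategy", "evasion", "incentive", "strategy", "evasion"]
b = ["incentive", "strategy", "evasion", "evasion", "incentive",
     "strategy", "evasion", "strategy", "strategy", "evasion"]
print(f"kappa = {cohens_kappa(a, b):.2f}")  # 0.70 for this toy data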