command and control center (§2.2). Members of these Telegram groups may have communication channels with the command and control, and may thus quickly alert many participants and influence their perception of our study.
Instead, we followed these accounts from our lab’s Twitter account. For the accounts that followed us back, we collected their Twitter followers via breadth-first search, using the Twitter API. From these followers, we identified the accounts that were open to direct messaging from our account.
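The follower-collection step can be sketched as follows. This is a minimal sketch assuming access to the Twitter API v1.1 through the tweepy library; LAB_HANDLE, SEED_HANDLES, the one-hop limit, and the friendships/show-based DM check are illustrative placeholders, not details of the actual pipeline.

# Sketch of the follower-collection step, assuming tweepy (Twitter API v1.1).
# Credentials, LAB_HANDLE, SEED_HANDLES, and MAX_DEPTH are placeholders, not
# values from the study; the can_dm check is one possible way to test whether
# an account accepts DMs, not necessarily the method used in the paper.
from collections import deque

import tweepy

auth = tweepy.OAuth1UserHandler("API_KEY", "API_SECRET", "ACCESS_TOKEN", "ACCESS_SECRET")
api = tweepy.API(auth, wait_on_rate_limit=True)       # sleep through rate limits

LAB_HANDLE = "our_lab_account"                        # placeholder lab account
SEED_HANDLES = ["seed_account_1", "seed_account_2"]   # accounts that followed us back
MAX_DEPTH = 1                                         # one hop: their direct followers


def collect_followers(seed_handles, max_depth=MAX_DEPTH):
    """Breadth-first collection of follower IDs, starting from the seed accounts."""
    seed_ids = {api.get_user(screen_name=h).id for h in seed_handles}
    seen = set()
    frontier = deque((sid, 0) for sid in seed_ids)
    while frontier:
        user_id, depth = frontier.popleft()
        if user_id in seen:
            continue
        seen.add(user_id)
        if depth >= max_depth:
            continue                                  # do not expand beyond the hop limit
        for follower_id in tweepy.Cursor(api.get_follower_ids, user_id=user_id).items():
            frontier.append((follower_id, depth + 1))
    return seen - seed_ids                            # followers only, seeds excluded


def open_to_dm(user_id):
    """Heuristic DM-availability check via the friendships/show can_dm field (assumed)."""
    source, _target = api.get_friendship(source_screen_name=LAB_HANDLE, target_id=user_id)
    return getattr(source, "can_dm", False)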
We sent an interview invitation to these accounts over direct message (DM). We then sent personalized messages, including a consent form, to the accounts that replied. We inspected the accounts that accepted the consent form, and interviewed those that were active, posted tweets on political topics, and had at least 500 followers.
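For illustration, these screening criteria can be encoded as a simple filter. The Candidate record and the 30-day activity window below are assumptions, not our exact operationalization.

# Illustrative encoding of the interview-eligibility criteria; the Candidate
# record and the 30-day activity window are assumptions.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone


@dataclass
class Candidate:
    handle: str                    # handle of a consenting account
    followers_count: int
    last_tweet_at: datetime
    posts_political_content: bool  # judged manually during account inspection


def eligible(c: Candidate, now: datetime) -> bool:
    """Active recently, posts political content, and has at least 500 followers."""
    active = (now - c.last_tweet_at) <= timedelta(days=30)   # assumed activity window
    return active and c.posts_political_content and c.followers_count >= 500


# Example: a candidate who tweeted yesterday and has 800 followers is eligible.
now = datetime.now(timezone.utc)
print(eligible(Candidate("example", 800, now - timedelta(days=1), True), now))  # True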
During the interviews, we also collected other accounts that participants claimed to control, and groups they claimed to use for communication. We then iterated our recruitment activities over these groups.
Algorithm 1 summarizes our study procedure, including recruitment, interviews, and data validation steps.
In total, we followed 1,543 Twitter accounts. From the 109
accounts that followed us back, we collected their 256,770
followers. We sent DMs to the subset of 2,843 accounts that were open to DMs, then sent personalized messages to the
99 accounts that replied. From the 35 Twitter accounts that
accepted the consent form, we selected 19 for interview.
3.2 Ethical Considerations
The study procedure was scrutinized and the full study was
approved by the institutional review board of our university
(IRB-20-0550). We followed ethical practices for conducting
sensitive research with vulnerable populations [30]. For instance, we tailored the consent process to each participant and re-confirmed consent: we sent the consent form link and obtained consent both during recruitment and at the beginning of the interview, and we obtained consent both electronically and verbally. We accommodated participant requests for private payments, including cryptocurrencies, intermediaries in Venezuela, and money sent by postal mail.
During recruitment and the interviews, we clearly disclosed the identity of the researchers, the research objective, the data we would collect (including Twitter account data) and how we would process it, and the potential impact on the participant. More specifically, our invitation and consent form made clear that our intention was to study political content promotion capabilities, resources, and behaviors on social platforms. During recruitment, we followed candidate accounts from our lab’s Twitter account, which clearly stated its association with our lab and its use strictly for research purposes.
We also explained the risks that participating in our research could pose to them and their work. We asked several times during the interview whether they were comfortable discussing potentially sensitive topics, and told them that they could skip any question.
We were careful to protect participant identities. Following the account data collection and participant payment steps, we removed all participant PII from our data (e.g., names, IDs, handles, locations). From the participants’ Twitter accounts, we kept only account statistics, their posts, and their followers.
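As a concrete illustration of this de-identification step, assuming accounts were collected as Twitter API v1.1-style user dictionaries (the field names and pseudonym scheme below are illustrative, not our exact schema), the retained record can be built as follows.

# De-identification sketch; field names and the pseudonym scheme are
# illustrative assumptions, not the study's exact schema.
KEPT_STATS = {"followers_count", "friends_count", "statuses_count",
              "favourites_count", "listed_count", "created_at"}


def deidentify(account, pseudonym):
    """Keep only account statistics, posts, and follower IDs; copy no PII."""
    return {
        "participant": pseudonym,                                   # e.g., "P07"
        "stats": {k: v for k, v in account.items() if k in KEPT_STATS},
        "tweets": [t["text"] for t in account.get("tweets", [])],   # assumed tweet format
        "follower_ids": list(account.get("follower_ids", [])),
        # names, numeric IDs, handles, locations, bios, and URLs are never copied
    }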
We used multiple safeguards to securely store the de-identified research data. The data was stored on a physically secure Linux server at our university and accessed over encrypted channels, only from the authors’ password-protected laptops. Further, all data was processed only on the server.
In §6 we further revisit ethical considerations from the perspective of the impact of our findings.
Team Positionality Statement. The research team consists of Venezuelan and international investigators, all located outside Venezuela. The investigators support neither the Venezuelan government nor the opposition. The interviews were conducted by a politically neutral team member who shared context with study participants, including the language and some knowledge of the country’s political and economic circumstances.
3.3 Qualitative Study
Interviews. We conducted semi-structured interviews with the recruited participants: one pilot interview to test our interview guide and method, then in-depth interviews with 19 participants. The interviews focused on participants’ (1) incentives to contribute to influence operations, (2) organization structure, (3) resources, capabilities, and limitations, (4) strategies employed to promote IO goals, (5) operations in which they participated and in which they would be interested in participating, (6) perception of disinformation in influence operations, (7) perception of the impact of Twitter’s defenses on IO activities, and (8) strategies to evade and recover from detection.
The interviews were conducted over the phone, in Spanish, by one author who is a native speaker. All interviews were audio-recorded with participant permission. The interviews lasted between 17 and 98 minutes (M = 52, SD = 19.43). We paid 2.5 USD for every 15 minutes spent in the interview.
Analysis Process. We analyzed responses using a grounded-theory open-coding process [99], performed by two co-authors: the one who conducted the interviews and a non-Spanish speaker. We conducted the interviews over 6 weeks. During this time we also transcribed and anonymized the recorded interviews, then translated them into English. Given time constraints, we started the analysis only after data collection was complete. Following each interview, the interviewer discussed impressions, observations, and findings with the rest of the team. This enabled us to detect when we reached data saturation [41]: the interviewer reported no new insights from the last two participants. We confirmed this during analysis.
In the preliminary analysis stage, we independently read
five transcripts to establish a thematic framework of the inter-