systematically tested setting up both iOS and macOS devices and documented all the steps required to disable settings.
We used this information as the basis of our qualitative interviews, with the aim of probing users’ understandings of
and experiences with these privacy configurations and devices. We asked our participants to use their own devices rather
than test devices provided by us, so that they would perform the tasks with i) familiar devices and ii) devices
containing the privacy configurations they had set up themselves. Our results therefore reflect users’ actual experiences with
privacy settings they configured on their own, rather than settings we pre-configured for them. Our work contributes by
revealing the difficulties of properly configuring privacy for features that are enabled by default, and the surprises and
tensions that users experience because of these settings.
Next, we discuss a body of research that has explored users’ privacy when using mobile apps and configuring privacy
settings.
User Privacy in Mobile Apps
Prior work has largely explored app permissions on mobile devices. For several years, researchers have focused on
app permissions in an attempt to improve users’ understanding and expectations when setting the privacy configurations
of apps. This focus was motivated by the many studies that demonstrated users’ difficulties in understanding the
privacy configurations of mobile apps
[Balebako et al.(2013), Ramokapane et al.(2019), Tan et al.(2014), Liccardi et al.(2014)].
A study of 308 Android users revealed that only 17% were attentive to the permissions prompted during app
installation, indicating that permission warnings are not sufficient for making informed security
decisions [Felt et al.(2011)]. A more recent study in 2021 of 4,636 Android users confirmed that the information
provided by the current system is not enough for users to make informed decisions about their privacy
[Shen et al.(2021)].
Other studies have also shown that users often either ignore or accept permissions without properly reading the details
[Felt et al.(2012a), Felt et al.(2012b), Ramokapane et al.(2019)].
Researchers have highlighted factors that contribute to users’ misunderstandings of privacy configurations. Several
factors make it harder for users to know what happens to their personal information when agreeing to permissions or
configurations; for example, unclear Privacy Policies and a lack of transparency about data practices. Earlier studies
investigated ways to present Privacy Policies to users more effectively [Kelley et al.(2009)]. However, recent studies
report that the unclear nature of apps’ Privacy Policies still contributes to the difficulty users have in grasping
what happens to their personal data [Alohaly and Takabi(2016), Coen et al.(2016), Kelly et al.(2012)]. As a result of
this unclarity, users rarely follow Privacy Policy links to read which parts of their information are
disclosed [Coen et al.(2016)]. A lack of transparency has also been suggested as a factor in the difficulty of
understanding what happens to personal data in apps [Liccardi et al.(2014), Van Kleek et al.(2017)]. One study found
that giving users more transparency about what happens to their personal data can make them more confident
in their app use [Van Kleek et al.(2017)]. To help users better understand how their personal data is handled, recent
work has explored several solutions that support more informed decisions
[Liccardi et al.(2014), Liu et al.(2014), Palmerino(2018), Thompson et al.(2013), Van Kleek et al.(2017)].
For instance, one study deployed machine learning to mitigate the burden of an increasing number of privacy
decisions [Smullen et al.(2020)]. Another proposed a prototype that adjusted the privileges given to apps on iOS
and allowed real data to be replaced with mock data [Lutaaya(2018)]. A third study analysed the settings of
4.8 million smartphone users and derived a number of profiles that aim to simplify the privacy decisions mobile
users have to make [Liu et al.(2014)].
Impacts of setting privacy configurations
A second line of research has focused on understanding users’ concerns
when setting privacy preferences
[Balebako et al.(2013), Ramokapane et al.(2019), Shen et al.(2021), Wijesekera et al.(2018)].
Research has found that users often have misconceptions about the data sharing that occurs in smartphone apps
[Balebako et al.(2013), Frik et al.(2022), Ramokapane et al.(2019), Tan et al.(2014)].
Misconceptions about the handling of personal data can create challenges for users. Studies suggest that users
may feel uncomfortable or confused when learning what happens to their data [Tan et al.(2014)]. For example,
users are often surprised when asked to share personal data collected by apps
[Balebako et al.(2013), Shklovski et al.(2014)]: users understood that data was used for purposes such as
marketing, but were surprised by the scope, frequency, and destination of the data sharing [Balebako et al.(2013)].
Users can also experience other emotions, such as confusion about certain personal data being requested, and sometimes
dismay or outrage [Shklovski et al.(2014)]. Studies that focused on the tracking of users by apps
[Coopamootoo et al.(2022), Ur et al.(2012), McDonald and Cranor(2010)] suggest that users can feel negatively when
they perceive they are being tracked; these feelings include anger, distrust, and anxiety. Under certain conditions,
however, users often accept the fact that they are tracked [Coopamootoo et al.(2022)]. The reactions users had upon
learning how their data is handled can be attributed to the insufficient information mobile apps provide to help users make informed