subject privacy within their systems. They found that, ir-
respective of engineers’ level of expertise, such guidelines
led to similar levels of incorporation of privacy practices
in the resulting designs. In addition, the study made clear that providing software engineers with a privacy guideline list positively affects design success, with a success rate of 75.12%.
Providing personalised assistants for software engineers
is likely to have an impact on compliance with PbD prac-
tices, which drive improvement of data subject privacy.
This is particularly likely given automated assistant systems’ proven ability to support clients efficiently [14].
The current work has incorporated as many PbD measures as possible to embrace the idea of “explainable privacy”. This is because, among the various types of PbD measures, privacy patterns are the most suitable for software engineers to implement, while other, more abstract levels of the scheme better describe the aims behind each practice. The
next section thus develops the concept that the knowl-
edge underlying PbD can be translated into a machine-
interpretable format in order to facilitate automation of
PbD recommendations for IoT systems.
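As a concrete illustration of the idea, PbD knowledge can be encoded so that recommendations follow mechanically from a description of the system. The sketch below is a minimal, hypothetical encoding in Python; the pattern names, goals, and trigger features are illustrative only, and an actual ontology-based approach would use semantic web technologies rather than plain data structures.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class PrivacyPattern:
    """A PbD measure: a concrete pattern plus the abstract aim behind it."""
    name: str
    goal: str
    applies_when: frozenset  # design features that trigger the pattern


# Hypothetical knowledge base; names and triggers are illustrative only.
KNOWLEDGE_BASE = [
    PrivacyPattern(
        name="Data minimisation",
        goal="Collect only the personal data the purpose requires",
        applies_when=frozenset({"collects_personal_data"}),
    ),
    PrivacyPattern(
        name="Encryption in transit",
        goal="Protect confidentiality between device and cloud",
        applies_when=frozenset({"sends_data_over_network"}),
    ),
]


def recommend(design_features):
    """Return every pattern whose triggers all appear in the design."""
    return [p for p in KNOWLEDGE_BASE if p.applies_when <= design_features]


# An IoT design described by its privacy-relevant features.
iot_design = {"collects_personal_data", "sends_data_over_network"}
for pattern in recommend(iot_design):
    print(f"{pattern.name}: {pattern.goal}")
```

Carrying both the concrete pattern name and its abstract goal through to the recommendation reflects the “explainable privacy” aim above: the engineer sees not only what to implement but why.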
3.3. Ontologies Representing Domain Knowledge
Ontologies, as a semantic web technology, offer an effective means of representing a knowledge base.
An ontology can thus represent a very wide range of con-
cepts, along with their relationships and interactions, in a
machine-readable format. For example, Dragoni et al. [15]
examined how advanced technology adopted into an individual’s lifestyle could support recommendations for personalised healthy practices, applying an ontology-centric decision support system called PerKApp. The proposed ontology provided expert knowledge and the information required to assist the user in developing healthy
practices. It also incorporated semantic rules to act as expert support for the user’s healthy practices, identifying any violations of those practices and notifying the user with motivational messages as required. The system was tested within the Key to Health project and found to be applicable in real-world scenarios.
In another example, Malone et al. [16] applied se-
mantic technologies to achieve data reproducibility in the
bioinformatics field. Their motivation was the view that data analysis results vary depending on the software used
for such analysis. To make data results more easily repro-
ducible, researchers thus need to know the details of the
software used to analyse the data. In order to build the required Software Ontology (SWO), they followed Agile methodology principles and involved various types of participants as ontology users. The resulting SWO was later merged with the EDAM [17] ontology,
which was designed to handle bioinformatics operations,
data types and identifiers, topics, and formats, and the
resultant joint ontology was used in various biomedical
applications, including the BioMedBridges software reg-
istry [18], eagle-I [19], and the Gene Expression Atlas Data
project [20]. These examples illustrate that ontologies are an effective technology for underpinning assistive systems. Hence, in this research, we develop the PARROT ontology to fulfil this purpose. Many methodologies exist to guide ontology developers, along with analytical studies comparing them [21, 22, 23]. For this work, the NeOn [24] and Chaware et al. [25] methodologies were adopted.
3.4. Ontologies for Privacy by Design
Multiple ontologies have been developed to support
increased rigour in the privacy field. Harshvardhan et
al. [26] attempted to address the complexity of under-
standing privacy policies by transforming such policies into
machine-readable data. They proposed an ontology design
pattern (ODP) that contains all details in a given privacy
policy document, such as those on collection, usage, stor-
age, and sharing of personal data, along with the relevant
processes and legal basis in the GDPR. This ODP would
thus have benefits above and beyond those of the GDPRov
[27] and GDPRtEXT [26] ontologies, which cover the vo-
cabulary, concepts, and terms within the GDPR. They
designed an ODP to answer a set of competency ques-
tions related to personal data; further competency ques-
tions about how personal data may be changed, deleted,
and obtained were not incorporated at that stage. The
authors thus acknowledged that the ODP required wider
patterns to include all information in a privacy policy doc-
ument in order to develop it into an ontology that could
allow the full manipulation and understanding of the use
of personal data. They modelled the information contained in privacy policies, whereas our ontology models PbD knowledge, that is, the structures to be followed in the design phase of software development.
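The competency-question mechanism discussed above can be sketched as pattern matching over assertions. The fragment below is a minimal illustration with hypothetical triples and predicate names; a real ontology would hold such facts in OWL/RDF and answer competency questions with SPARQL queries rather than Python sets.

```python
# Hypothetical (subject, predicate, object) facts, standing in for
# assertions an ontology would hold about personal data handling.
triples = {
    ("EmailAddress", "isA", "PersonalData"),
    ("EmailAddress", "collectedFor", "AccountCreation"),
    ("DeviceID", "isA", "PersonalData"),
    ("DeviceID", "collectedFor", "UsageAnalytics"),
    ("RoomName", "isA", "NonPersonalData"),
}


def match(s=None, p=None, o=None):
    """Return the triples matching the pattern; None acts as a wildcard."""
    return {t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)}


# Competency question: "Which personal data is collected, and why?"
for item, _, _ in sorted(match(p="isA", o="PersonalData")):
    for _, _, purpose in match(s=item, p="collectedFor"):
        print(f"{item} collected for {purpose}")
```

An ontology is considered adequate when queries of this kind can answer every competency question posed during its design; questions the model cannot answer, such as how personal data may be changed or deleted in the ODP above, expose where coverage is still missing.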
Gharib et al. [28] applied the PbD concept as a solution to privacy breaches, rather than focusing only on security requirements. However, they suggested that
the vagueness of this privacy concept confuses designers
and stakeholders, preventing them from making the right
design decisions. To address this, they suggested that ontologies offer a more robust means of conceptualising privacy concepts and their interrelations. To develop a relevant ontology, they systematically reviewed the literature to identify key concepts and relationships underlying general privacy requirements. From this review, they
identified 38 key concepts and relationships, which they grouped into four categories: 17 organisational factors, nine risks, five treatments, and seven privacy factors. Although their concern is also PbD requirements, their objective differs from that of the PARROT ontology: they aim to provide the software developer with generic key privacy concepts, whereas we provide explainable PbD measures that match, and should be applied in, the corresponding IoT system.