2.1 Taxonomies of Deceptive Patterns
As far back as 2010, Conti and Sobiesk described malicious interface designs and called for joint work between the security and human-computer interaction communities to address this issue [
16]. In addition, their paper, based on three extensive surveys, formalized a first taxonomy of different types of deceptive patterns, which served as the foundation for further research in this area. Another early taxonomy of deceptive patterns was presented by Brignull et al. [13, 14]; Brignull was among the first to raise awareness not only in the research community but also among designers and users through his website on deceptive patterns. Further research quickly emerged that aimed at gaining a deeper understanding of what deceptive patterns are and how to classify them properly. In 2013, Zagal et al. first applied the concept of deceptive patterns to the design of games, coining the term "Dark Game Design Pattern" and defining it as "a pattern used intentionally by a game creator to cause negative experiences for players which are against their best interest and likely to happen without their consent" [
56]. Their exemplary deceptive game design patterns include temporal deceptive patterns, such as
Grinding, monetary deceptive patterns, such as
Pay-to-Skip, and social capital-based deceptive patterns, such as
Social Pyramid Schemes. These kinds of deceptive patterns, as well as psychological deceptive patterns, are even found in mobile games for children aged 0 to 5 years [
49]. More recently, Aagaard et al. performed interviews and workshops with both players and designers of mobile games, finding that developers are often pressured into integrating deceptive patterns into their games to drive engagement [
1]. Bösch et al. directly aligned their classification with Hoepman's privacy design strategies [
11,
26]. The idea was that privacy deceptive strategies, the underlying basis for privacy deceptive patterns, could be described as a direct reversal of privacy strategies, originally aimed at increasing data privacy. The result is a rather abstract set of terms, such as
maximize or
obscure. Building on this, in what is regarded as one of the most influential deceptive pattern taxonomies to this day, Gray et al. managed to strike a balance between over-abstraction and over-specification [
22]. Their taxonomy consists of five main categories that maintain a consistent level of abstraction while remaining concrete enough for readers to relate the terms to specific examples. A slightly different approach was presented by Mathur et al., who classified deceptive patterns with respect to their influence on user decision-making [
37]. The authors describe their approach as "offering a set of shared higher-level attributes that could descriptively organize instances of deceptive patterns in the literature" [
38]. Most recently, and after our gameplay design and studies had already concluded, Gray et al. presented an ontology of deceptive patterns that aims to unify the varying perspectives and clarify the relationships between them [
21].
2.2 Understanding Users’ Vulnerability to Deceptive Patterns
To understand why users are actually vulnerable to deceptive patterns, researchers have applied different theories and conceptual models. Xiao and Benbasat identified affective and cognitive mechanisms associated with certain deceptive information practices and integrated them into an overall theoretical model [
55]. Bösch et al. applied Kahneman's dual-process theory of System 1 and System 2 thinking [
27], which is based on the understanding that humans have two modes of thinking, a fast one (System 1, unconscious, automatic, less rational) and a slow one (System 2, conscious, rational). The assumption of Bösch et al. is that deceptive patterns systematically exploit users’ System 1 thinking. Lewis tried to establish a link between deceptive patterns and psychological motivators, based on Reiss’s Desires theory [
45].
Several studies have explored users' perceptions when faced with deceptive patterns. Luguri and Strahilevitz found that mild deceptive patterns are more effective and that less educated users are more susceptible to them [
34]. In subscription decisions, decision architecture appeared to be even more important than material differences like price [
34]. While moderately aware of deceptive patterns, users have a resigned attitude towards them, as they believe themselves dependent on the very services that employ deceptive patterns [
35]. Users may struggle to identify deceptive patterns [
17], but even when users are aware of and able to identify deceptive patterns, they lack knowledge of the specific harms these patterns might cause and of how to oppose them [
10]. More recently, Mildner et al. developed and evaluated a five-question tool to help users assess the presence of deceptive patterns within social media interfaces [
40]. While the results show promise in that a majority of users were able to discern deceptive interfaces from non-deceptive ones, ratings of the "darkness" of the deceptive interfaces remained comparatively low, indicating some difficulty [
40].
2.3 Serious Games to Raise Awareness for Data Privacy Issues
Becker defines serious games as "games designed specifically for purposes other than or in addition to pure entertainment" [
8]. Alvarez et al. further specify that such games may include aspects of tutoring, teaching, training, communication, or information [
6]. Games themselves can, in essence, be defined as closed, interactive environments with a set of specific rules and one or more goals for the player [
8]. Serious games have been shown to increase learning outcomes for visual and spatial processing, complex concepts and abstract thinking, and deduction and hypothesis testing [
18]. Serious games can do more than increase learning outcomes, however: in the form of persuasive games, they can be highly effective at promoting behavior change [
42]. In our recent work, we argue that serious and persuasive games are a promising means of bolstering resistance against deceptive patterns, with the specifics of game design, game presentation, and study design being crucial aspects to consider for an effective outcome [
30].
The effectiveness of serious games for learning can be ascribed to multiple aspects. For one, we know that experiencing situations in which decisions have direct consequences fosters emotional learning [
48]. Serious games provide such an environment, in which, through game design, decisions can lead to immediate consequences. Immediate consequences also matter in the context of data privacy: the less prominent risks are, the more likely users are to disclose information [
29]. A game also makes it possible to include failure and to offer a learning experience from it. Again, the literature shows that such experiences, associated with negative emotions, can be highly influential [
46], and accordingly, negative emotions linked to privacy violations may also trigger privacy protection behavior [
This persuasive strategy of allowing players to observe the cause and effect of their behaviors is called Simulation, and it is one of the most frequently used strategies in persuasive games [
41].
On the other hand, narrative elements increase immersion in and engagement with a serious game and motivate players to learn more than serious games without narrative elements do [
41]. Naul and Liu also find that the intrinsic integration of narrative, gameplay, and learning content provides both educational and motivational benefits to serious games [
41]. Such motivational benefits of intrinsic integration were also identified by Habgood and Ainsworth [
24]. Moreover, Naul and Liu note that humans interact with virtual agents in much the same way as they do with real people, developing emotional responses, even negative ones, towards these agents [
41].
Existing serious games in the data privacy context have approached the topic quite differently. Akinyemi proposed the game "Dark Cookie", which aims to train users to spot deceptive patterns in cookie banners [
4]. To do so, it embeds cookie banners in a kind of cover story (four bears and a raccoon) in which the game tries to trick the user into accepting deceptive cookies. PrivaCity is a chatbot game with a focus on smart cities [
9]. Similar to classic text-based adventures, it offers users choices related to data privacy in a fictitious smart city scenario.
The approach by Gupta et al., which aims to train cybersecurity professionals, falls between the categories of serious games and gamification [
23]. It confronts players with realistic threat scenarios such as phishing and threat hunting and adds gamification elements such as a scoreboard and time scarcity.
Maragkoudaki and Kalloniatis explored the idea of a virtual reality escape room to provide an environment for privacy awareness [
36]. While the overall escape room story seems unrelated to privacy at first sight, the authors slipped in specific data privacy examples, such as very long privacy policies or a computer's search history page. Depending on their behavior, players are rewarded with additional time to solve the riddles and leave the escape room. In addition, the game confronts players with subtextual messages, e.g., written on walls, that aim to raise privacy awareness.
Hart et al. follow yet another approach with the physical tabletop game "Riskio", which also adapts a very specific data privacy scenario and allows players to explore it in typical board game style, with game mechanics such as card decks and turn-based game phases [
25]. In a similar, physical approach, Tjostheim introduces a board game called "Dark Pattern" designed to raise awareness about deceptive data-sharing practices in apps [
50]. Players are tasked with installing apps while trying to minimize the personal data they share. Tjostheim found that while knowledge about deceptive patterns increased, the impact on behavioral intention to protect privacy remained weak [
50]. Staying with board games, Nyvoll designed an interactive social deduction board game in which players try to deduce which player is a "CEO" deploying deceptive patterns while they use a smartphone.
Overall, these games have in common that the implemented narrative, scenario, or privacy-related game situations provide a direct adaptation of the real-world situation, most often by placing the real-world interface component (e.g., a cookie banner or a deceptive pattern) quite literally in a game environment. While such learning environments can show positive effects in the short term, we also know from educational research that approaches which require more reflection and thought and involve powerful emotional experiences might be more effective in the long term, through the creation of new neural links for reflection and understanding [
5,
32].
In contrast to previous approaches, we aim to design a narrative-driven game with the following attributes, based on the lessons learned in the literature:
•
a compelling narrative to increase motivation and educational value [41],
•
a simulation environment that allows players to experience the immediate consequences of their actions and thus triggers emotional learning [48],
•
opportunities for powerful emotional experiences and responses, such as interaction with a virtual agent, to facilitate reflection and understanding [5, 41],
•
an intrinsic integration of deceptive patterns into the game world [24].
In addition, we were interested in a gameplay concept that provides utmost flexibility with regard to how deceptive patterns can be adapted within the game. By this we mean that the gameplay concept should lend itself to implementing the same deceptive pattern in different forms. In the short term, this allowed us to a) reduce the need for compromise when designing a gamified deceptive pattern to make it fit the overall narrative and gameplay concept, and b) provide the basis for future research and design iterations, i.e., we may be able to keep the gameplay concept in future versions of the game while still redesigning or exchanging the specific adaptations of a deceptive pattern.