1 Introduction
Social media platforms frequently address online interpersonal harm, such as harassment, through content moderation; this involves reviewing user-submitted content for appropriateness and sanctioning contributors who violate the platform’s rules [42, 94]. However, despite efforts in research and industry to improve moderation practices in recent years, the number of people experiencing severe forms of harassment continues to grow. In 2014, 15% of Americans reported experiencing severe harassment, including physical threats, stalking, sexual harassment, and sustained harassment [33]. That number grew to 18% in 2017 and 25% in 2021. Further, many people report simultaneously experiencing multiple forms of severe harassment [34, 125]. Research shows that online harms are insufficiently addressed by platforms’ and communities’ current approaches [23, 35, 77, 126], and 32% of Americans say that social media companies are doing a poor job of addressing online harassment on their platforms [125]. Alternative approaches are desperately needed, but what principles should guide them, and how would they work in practice?
Restorative justice is a framework that argues for repairing harm and restoring individuals and communities after harm has occurred. In this article, our goal is to draw from restorative justice philosophy and practice to study how an online gaming community currently addresses—and might alternatively address—interpersonal harm. We focus on restorative justice here because it has an established offline practice and has been successfully institutionalized to address harm in other contexts, such as schools and prisons [6, 73, 75]. In recent years, the HCI and CSCW communities have also explored its utility in addressing online harm [12, 69, 106], and we build upon these efforts as well.
Restorative justice focuses on providing care and support and on otherwise meeting people’s needs after harm has occurred. It has three major principles: (1) identify and address the victim’s needs related to the harm, (2) support the offender in taking accountability and working to repair the harm, and (3) engage the community in the process to support victims and offenders and heal collectively [80, 132]. In practice, restorative justice addresses harm differently than more common punitive models. The main tool for action in a punitive justice model, as embodied in content moderation, is punishing the rule violator. In contrast, in restorative justice, it is communication among the harmed person, the offender, and the community. For instance, in a common restorative justice practice called a victim-offender conference, the victim and offender meet to discuss the harm and how to address it under the guidance of a facilitator; interested community members are also invited to join this conversation since the conference aims at addressing the needs and obligations of all three parties involved. A follow-up process may include apologies or community service by the offenders [132].
This article uses the three restorative justice principles described above and its common practices (e.g., the victim-offender conference) as a vehicle to study the perspectives and practices of victims, offenders, and moderators during instances of online harm. First, we use the principles to evaluate current practices for addressing interpersonal harm and identify the potential need for restorative justice practices. We focus on the experiences of victims, offenders, and moderators, who are key stakeholders and participants in restorative justice conferences (with the moderator acting as facilitator). Second, we use the victim-offender conference to identify the benefits and challenges of practicably implementing restorative justice practices in an online setting.
We study harm cases in the Overwatch gaming community, which spans two major platforms: the Overwatch platform, on which the game is played, and the Discord platform, on which gaming discussions, teammate selection, and match organization occur. Online gaming communities have long suffered from severe and frequent incidents of online harm [2, 9, 51]. Our analysis of Overwatch, a multi-player game, lets us explore such harm in the context of different types of user relationships, including competition and collaboration. We interviewed self-identified victims (people who have been harmed), offenders (people who have harmed others), and moderators who dealt with the cases being discussed. Our interview protocol resembles the restorative justice practice of pre-conferencing, which is used to learn people’s history and preferences, explain restorative justice to them, and evaluate the appropriateness of holding a victim-offender conference [132]. Additionally, given that restorative justice has been chiefly developed through practice [122], we deepened our understanding of its principles and practices by conducting two interviews with offline restorative justice practitioners.
We find that current, punitive online moderation processes do not effectively stop the perpetuation of harm. First, content moderation is offender-centered and does not address victims’ needs, such as receiving support or healing from harm. Though victims may report individual offenders, they continue to experience harm in a community where abuse is prevalent. Second, content moderation directs offenders’ attention to the punishment they receive instead of the damage they cause. When punishment is ineffective, as is often the case, there are no alternative ways to hold offenders accountable. Finally, community members with a punitive mindset may further perpetuate harm by not acknowledging the harmed person’s experiences or by reacting punitively toward perpetrators or victims, particularly when harm cases are complex and layered.
Our findings show that some current moderation practices align with restorative justice, and a few participants have attempted to implement restorative justice practices in their own online communities. Some victims and offenders also expressed needs that align with restorative justice values. However, applying restorative justice online is not straightforward: there are structural, cultural, and resource-related obstacles to implementing a new approach within the existing punitive framework. We elaborate on the potential challenges of implementing restorative justice online and propose ways to design and embed its practices in online communities.
Our work contributes to a growing line of research that applies alternative justice frameworks to address online harm [23, 28, 47, 48, 86, 103, 106]. By evaluating the moderation practice of the Overwatch gaming community through a restorative justice lens and identifying key stakeholders’ preferences for the justice framework, our work sheds light on ways to address online harm that go beyond simply maintaining healthy content and working within a perpetrator-oriented model. We highlight how restorative justice has the potential to reduce the continuance of harm and improve community culture in the long run.
4 Methods
In total, we interviewed 23 participants from the Overwatch gaming community for this study (Table 1). To understand Overwatch gamers’ perspectives on the restorative justice process, we interviewed victims (those harmed), offenders (those who harm), and Discord moderators who dealt with the cases being discussed. Some participants fall into more than one of these three groups. We could not include Overwatch moderators in our study because they are commercial content moderators [95] who remain anonymous and constrained by non-disclosure agreements.
Restorative justice primarily evolved through practice rather than as an academic discipline. To more deeply understand how its principles might be applied online, we conducted two additional expert interviews with facilitators from a restorative justice center at the University of California, Berkeley. Further, the first author attended 30 hours of restorative justice training courses to learn how it is practiced in local communities. These interviews and the training helped ground our research in restorative justice values and practices.
4.1 Recruitment
We recruited participants using a combination of convenience sampling and snowball sampling [10, 96]. First, we joined multiple Discord communities focused on Overwatch and reached out to moderators, sending private messages to request an interview. After building rapport with moderators through interviews, we asked their permission to publish recruitment surveys in their communities to find victims and offenders of harm. Some moderators referred us to their fellow moderators for interviews and invited us to other Overwatch Discord communities they were involved in. In total, we recruited participants from five Overwatch Discord “server” communities. In addition, we recruited two facilitators for expert interviews from a training program the first author participated in. We recruited victims and offenders separately through two surveys. In the survey for victims, we described our recruitment criteria as people who have experienced online harm on Overwatch Discord or in an Overwatch game. During the interviews, some victims referred us to their friends who had experienced harm or been banned in the Overwatch gaming community, and we included them as participants. For the second survey, we did not describe participants as “offenders” or “people who have caused harm” since prior research suggests that people may not want to associate themselves with those categories, especially when there has been no opportunity to discuss what has happened [58, 60]. Therefore, we described the recruitment criteria as people who have been warned or banned on Overwatch Discord or in the game.
In the recruitment surveys, we asked participants to briefly describe a harm case they had experienced. We selected participants from this survey based on the time order of their replies. Additionally, we conducted preliminary data analysis to categorize the types of harm (e.g., on Discord vs. Overwatch; between friends/strangers; within the moderation team/between end-users/between end-users and moderators). We then prioritized participants who had experienced different types of harm for interviews. Table 1 describes the demographic information of our participants.
4.2 Interview Procedure
Through interviews with victims and offenders, we wanted to understand both their current experiences with harm cases and their perspectives on a restorative justice process for those cases. We adapted our interview questions from restorative justice pre-conferences, where facilitators meet one-on-one with victims and offenders to solicit their perceptions of using a restorative justice process to address harm [132]. During the first stage of the interviews, we asked participants questions about the harm case they had experienced or caused, including how it was handled, its impact, and the need to address the harm.
During the second stage, we introduced participants to restorative justice principles and the victim-offender conference. We focused on this conference format because it is a widely used practice that embeds the core restorative justice principles. We included questions frequently posed in preparation for victim-offender conferences, such as what information participants would like to convey in the conference and their expectations and concerns regarding the process. If time allowed, we asked participants to reflect on more than one harmful incident. Further, some victims and offenders were also moderators of the community. We first asked about the harm from the perspective of their primary self-identified role and then asked about their secondary role(s).
Our interviews with Discord moderators were intended to assess how they deal with harm in their communities and their attitudes toward using restorative justice to repair the harm. Additionally, since the facilitator is essential in offline restorative justice practices, we explored the possibility of creating a corresponding role in online scenarios. Because moderators currently occupy the role closest to that of a facilitator, we sought to learn their perspectives on assuming this role. During interviews with them, we first asked about the harm cases they had handled in their communities and their decision rationales. We then introduced the idea of a victim-offender conference and asked them to (1) reflect on potentially using it as an alternative approach for addressing harm cases they had handled in their communities and (2) share their thoughts about serving as restorative justice facilitators on those cases.
It is challenging to elicit people’s perspectives on a hypothetical process or a process they lack previous knowledge about. To make restorative justice concepts more concrete, we asked participants to imagine a restorative justice process based on actual harm cases they had experienced or handled. Additionally, we answered their follow-up questions and corrected any misconceptions we identified in our discussions. We continued analyzing our interview data as we recruited and interviewed more participants. We ceased recruiting when our analysis reached theoretical saturation [21].
We conducted two expert interviews with restorative justice facilitators to elicit their insights about using restorative justice in online settings. We introduced these facilitators to Discord and Overwatch moderation mechanisms and described examples of how harm cases were handled based on our interviews with victims and offenders. Here, we stayed close to our raw data and described the harm cases through the perspectives of our participants. The facilitators evaluated the current moderation practices through the restorative justice lens and envisioned the future of restorative justice on Discord and Overwatch. We did not intend to reach theoretical saturation for this population [21]; we nevertheless incorporated these interviews because they provided valuable insights into how these cases could be alternatively handled in a restorative justice context [104].
We conducted our interviews from February to July 2020. The interviews lasted one to two hours each, and participants received compensation of $25 to $50 (USD) based on interview duration. We conducted 21 interviews using Discord’s voice call function, and two participants (P1, P13) chose to be interviewed using Discord’s text chat. Before each interview, we negotiated its length based on participants’ availability and the number of harm cases they wanted to share. We also conducted two in-person interviews with the facilitators (P5, P6). Our study was approved by the Institutional Review Board (IRB) at the University of California, Berkeley.
4.3 Data Analysis
We conducted interpretive data analysis on our interview transcripts [21]. We began with a round of initial coding [102], applying short phrases as codes to our data line-by-line to keep the codes close to the data. Examples of first-level codes included “impact of harm,” “creating new account,” and “banning.” Next, we conducted focused coding [102] by identifying frequently occurring themes and forming higher-level descriptions; second-level codes included “notion of justice” and “sense of community.” Coding was done iteratively: the first author frequently moved between interview transcripts and codes and discussed emergent themes with the other authors. After these initial and focused coding rounds, we applied restorative justice principles and values as a lens to guide our interpretations. Finally, we established connections between our themes to arrive at our findings, which we categorized according to participants’ roles (offenders, victims, moderators, and facilitators). We coded the expert interviews using the same codebook we used for the other interviews since it helped us compare facilitators’ views of harm to other participants’ opinions during the analysis.
4.4 Methodological Limitations
We used convenience and snowball sampling to recruit participants in a single gaming community. In addition, our research examines harm from an interpersonal perspective. As a result, the experiences participants report and the solutions we design may not be representative of, or applicable to, all forms of gaming communities or online harm. Further, though Overwatch has a multinational and multicultural user base, most participants were English speakers from Western cultures. Finally, our recruitment method did not let us recruit offenders who were neither warned nor banned.
As researchers, we lack a full picture of what occurred in the harm cases we studied. To address this, we adopted Whitney Phillips’s approach [89], which involved observing how our participants presented themselves to us and drawing conclusions from their performance, which may have been choreographed. As a result, we present our findings as subjective perspectives of Overwatch members rather than as objective truths. During the interviews, all victims believed that they were the party that had received harm; all offenders believed they had caused harm, but some assumed only partial accountability since they were also harmed in the process. We analyzed our interviews based on these self-described roles and views of participants in each harm case.
4.5 Positionality Statement
As researchers working in the sensitive space of understanding online harm and imagining interventions that may help address it, we briefly reflect on our position on this topic. All article authors feel deeply concerned that online harm is a persistent social problem that disproportionately affects marginalized and vulnerable people [34]. Some authors also come from marginalized groups and are survivors of harm. Our prior research on online harassment has helped us understand the inherent limitations of existing moderation mechanisms, which has shaped our desire to look for alternative approaches. We celebrate the growing interest in applying alternative justice theories [8, 54, 67, 81, 127], e.g., racial justice [106] and transformative justice [25], to reduce harm to victims, offenders, communities, and societies. We are particularly enthused by the success of restorative justice in addressing offline harm and its potential to provide agency and care for vulnerable groups who are often ignored and harmed in a punitive justice model.
While we embrace the values of restorative justice and see its potential, we do not seek to advocate for restorative justice in this work uncritically. We all live in cultures where punitive justice is the dominant approach to addressing harm, and we have actively worked to learn about restorative justice through research and by practicing its values in our own lives. This education and experience have shaped our interpretations of our findings and our understanding of their implications. At the same time, we acknowledge that restorative justice processes alone cannot address deep-rooted, systemic cultural and social issues such as racism and sexism (see transformative justice [25]).
We recognize that restorative justice is voluntary, and forcing this process may further harm the victims. Thus, we attempt to present an impartial account of our participants’ perspectives on the victim-offender conference and restorative justice principles. Designing and implementing tools for restorative justice practices, and experimenting with these practices in online communities, are crucial areas for further understanding the utility of restorative justice in the online context.
5 Findings: Current Moderation Models Through A Restorative Justice Lens
We use the three restorative justice principles (see Section 1) to understand the experiences of victims, offenders, and the community during instances of online harm within the current content moderation landscape. With our sampling methods, we included two victim-offender pairs in our study — P17 and P25 as well as P22 and P24. To illustrate our findings, we present a relevant harm case from our data in each section.
We warn readers that the following sections contain offensive ideas and language. However, we believe that such sensitive content can help readers understand the nature of online harm.
5.1 Victims Have Limited Means to Heal from Past Harm or Stop the Continuation of Harm
Restorative justice centers on victim needs. In offline restorative justice processes, victims usually share their stories, receive emotional support, and provide input on what is needed to repair the harm. The process aims to help them heal from harm and to stop its continuation [132]. Our analysis indicates that the main tool available to online victims to address harm is reporting their offenders to moderators or the moderation system. However, it also shows that reporting such incidents does not effectively address their need for emotional healing or prevent future harm.
To demonstrate the harm participants receive in online gaming communities, we first relate an example in Case 1.
Several participants told us that they frequently experience harm, such as offensive name-calling or sexual harassment, on Overwatch or Discord. We find that such cases are often related to structural issues such as sexism, misogyny, transphobia, and homophobia—offenders often target their victims’ gender or gender identity. Victims also received negative comments about their gaming skills. Women and gender non-conforming gamers sometimes experience compounding harms due to both of these patterns. P11 offered some examples of the comments she had heard while gaming:
“Go make me a sandwich, calling me a bitch, telling me that I should go make food for my husband, I probably have kids and cats [...] getting called bitch, slut, whore, cunt, just every derogatory name for women in the book.”
On both Overwatch and Discord, the main tool for victims to address harm is reporting to the moderation system or the moderators. However, victims often felt left out of the decision-making process after reporting a harm case and did not find the process helpful for healing from the harm. Several participants told us that after reporting, they either do not hear back or receive a vague message that indicates the moderation decision but offers no procedural details: “[The moderation system] just tells me that ‘something’ happened, and my report led to that happening” (P11). P9 believes that the moderation system is intentionally opaque to gamers: “They don’t want players understand[ing] the system.”
In addition, though reporting the harm may result in punishment of the offenders, it does not directly help victims heal from the emotional impact of the harm. P11 talked about the emotional toll of the incident: “I’ve cried about it a few times [...and I told my friend], ‘I don’t even feel like I want to live in a world where there are people like that.’”
Many victims indicated that though they tried to report their offenders, they continued to be harmed in the community, where a culture of harm is prevalent. Even if they do not encounter the same offender again, incidents of harm are so frequent that they come across new offenders repeatedly: “At least 5 out of 10 games, I get somebody saying some sort of crude comment about sexually harassing me” (P25). When those experiences of harm accumulate, victims anticipate being frequently subjected to harm, and they accept that there are no effective ways to address it. P9 said, “[Harm] happens frequently enough that you just get used to it.” P11 said, “I feel pretty helpless that I have to endure that every time I play a game.” P12 noted that reporting itself becomes labor intensive: “[Harm] happens so often. I wouldn’t want to report every single person I’ve talked to.”
Victims indicated that they need to consciously avoid harm in the community, which impacts their gaming experiences. For example, some participants stopped using voice chat out of fear that it would reveal their gender, even though communication with teammates is important for winning. P13 fears engaging with strangers after having experienced ongoing harmful behavior: “I will never play a game without a friend I trust, or talk in the game without my friends around because I don’t trust that there will be even one person that will defend me when it gets bad.” Some participants also leave a game or Discord community where they have experienced harm.
In sum, these findings show that the intensity and frequency of online harm can substantially impair users’ online experiences and cause long-lasting emotional damage. Although the current moderation systems on platforms like Overwatch and Discord offer socio-technical mechanisms like reporting to address cases of online harm, the goals of these mechanisms, together with their lack of transparency and follow-up, do little to meet victims’ need to heal. In addition, the current reporting mechanisms do not substantially reduce offenses within the gaming community because they do not effectively change the culture: even when participants can avoid the original offender, they continue to be harmed by new ones.
5.2 Offenders Are Not Supported in Learning the Impact of Their Actions or Taking Accountability
In a restorative justice view, offenders can address harm by acknowledging their wrongdoing, making amends, and changing their future behaviors. Through practices like victim-offender conferencing, offenders learn the impact of their actions by listening to the victim’s side of the story. Afterward, they can repair the harm and learn through actions, such as acknowledging harm (e.g., apologizing), taking anger management courses, or doing community service [27, 132].
As our participants report, current moderation approaches often embody graduated sanctions [70]. That is, moderation teams in Overwatch and Discord tend not to ban offenders permanently after their first offense. Instead, they apply more lenient punishments first to give offenders a chance to change their behaviors. Our interviews show that many Discord moderators have elements of a restorative mindset: they want to help offenders recognize their wrongdoing and change their behavior by giving them second chances and providing explanations for their sanctions. Several Discord moderators explained their rationale for graduated sanctions as “giving people a second chance.” As moderator P14 noted, “We don’t want [the moderation decision] to be a surprise [...] and we will actually really want to encourage them to improve.” Some Discord moderators also provide explanations of their moderation decisions. Such messages typically explain the rules and the consequences of breaking them and can be posted by a moderator or a pre-configured bot.
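To make this graduated-sanctions pattern concrete, the following minimal Python sketch encodes an escalation ladder of the kind our moderator participants described, along with the rule-based explanation a bot might post. The tier names, message wording, and escalation policy are our own illustrative assumptions, not any platform’s actual implementation.

```python
from dataclasses import dataclass

# Illustrative escalation ladder; real communities configure their own tiers.
LADDER = ["warning", "temporary mute", "temporary ban", "permanent ban"]

@dataclass
class OffenseRecord:
    user_id: str
    count: int = 0  # prior moderated offenses by this user

def next_sanction(record: OffenseRecord) -> str:
    """Escalate one tier per offense, capping at the harshest tier."""
    tier = min(record.count, len(LADDER) - 1)
    record.count += 1
    return LADDER[tier]

def rule_based_explanation(sanction: str, rule: str) -> str:
    # The kind of rule-and-consequence message a moderator or bot might post.
    return (f"Your message violated our rule on {rule}. "
            f"Consequence: {sanction}. Repeat violations escalate.")

record = OffenseRecord(user_id="u123")
print(rule_based_explanation(next_sanction(record), "harassment"))  # warning
print(rule_based_explanation(next_sanction(record), "harassment"))  # mute
```

Note that everything in this sketch is keyed to the offense and the rule; nothing in it references the victim, which is precisely the gap the findings below describe.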
Though some moderators have a restorative mindset, the punitive moderation system and rule-based explanations they rely on may fail to provide learning and support for offenders, and they may not effectively stop the perpetuation of harm. We show an example in Case 2.
In Case 2, the moderators made efforts to negotiate with the transgender woman, hoping that she would change, by giving her multiple chances and providing explanations for their sanctions. However, their explanations mainly emphasized that her actions violated the rules and were prohibited, and the moderation decisions were all punitive. We argue that this model directs offenders’ attention away from the victims and does not further their understanding of the impacts of their actions on the victim. Instead, the punishment, warnings, and restating of rules position offenders against the moderation system and in defense of their own behavior.
Restorative justice recognizes that offenders of harm have often been victims of harm in other cases. As activist Mariame Kaba says: “No one enters violence the first time by committing it” [46]. In Case 2, the harm that the transgender woman caused was a reaction to the harm she experienced elsewhere from cisgender men. Similarly, several offenders in our sample revealed how they had been constantly harmed in gaming communities: “I have been called countless names that are vulgar, offensive, and stuff like that” (P21). Restorative justice practices assume that although the past harms experienced by offenders do not absolve them of responsibility for committing an offense, it can be hard to stop offensive behavior without addressing their sense of victimization through support and healing. Punishment, on the contrary, usually reinforces the sense of victimization [132].
The moderators in Case 2 stopped the transgender woman from sharing her story. Instead, she received a denial, which worsened the damage and led her to defend herself more aggressively. In addition, when harm such as discrimination is a systemic issue in the gaming community or the broader society, stopping the perpetration of harm may require work to change the culture in addition to work to change individual offenders. Facilitator P6 explained how a restorative justice approach would strive to provide support and look at the root cause instead of punishing a particular offender:
“Rather than just zooming in on that one case and trying to blame this individual for that action, acknowledging that [they were] also harmed by the community and pushed to act in that way [...] it’s a much harder conversation to engage with, but then, that takes away from saying that person is wrong. It’s more about [...] how do we understand this collectively, and then, how do we address this collectively? ”
We also find that when punishment is used as the only tool to stop offenses, it loses its function when offenders are not actually punished. Several moderators, victims, and offenders mentioned the convenience of using alternative accounts once one account has been moderated: “It is so easy to make new accounts in Discord that ‘reporting’ people don’t really work” (P1). P17 used to commit multiple offenses in games. He pointed out the fallacy of regulating through banning when there is no cost to creating and using another account in the game:
“Everyone right now is happy with that system because, if I was a normal player [...] I heard that [the offenders] got banned, I would be like, ‘Wonderful, they got banned. I’m never going to see them again,’ but in actuality, when I got banned I’m going to say, ‘I don’t give a fuck, I’m just going to log into my second account.’”
Despite these shortcomings, we found evidence that the current punitive approach contributed to maintaining the health of community content and sometimes stopped the continuation of harm. Moderators we interviewed told us that some offenders would stop their misbehavior after receiving a warning or would not return to the same community after being banned. However, when punishment is the only means of addressing harm, it cannot always be effective.
5.3 Gaming Communities Create Challenges for Victims and Offenders to Address Harm
Harmful actions disrupt relationships within a community and potentially affect all its members. Thus, it is important for community members to acknowledge the harm and participate in redressing it collectively [132]. Restorative justice defines a micro-community as the secondary victims who are affected by the harm and the people who can support the victims and offenders in addressing the harm (e.g., family or friends) [80]. In online gaming communities, the micro-community includes not only victims, offenders, and moderators but also bystanders, friends of victims and offenders, and gamers more generally.
However, we found that the relevant stakeholders had no shared sense of community during many instances of online harm. As a result, the harm was often considered “someone else’s problem” and remained unaddressed. Additionally, when micro-community members got involved, they often created secondary effects that further harmed victims and offenders.
5.3.1 Victims, Offenders, and Moderators Do Not Have a Shared Sense of Community.
Restorative justice appeals to the mutual obligations and responsibilities of all community members to each other as necessary to address harm. However, in current moderation systems, we found that victims, offenders, and moderators lack a shared sense of community.
Many victims in our sample relate to and show care toward other gamers the offenders might harm. For example, P11 said, “I usually am just sad not for myself really, but just that other people have to deal with those people.” Some victims even care about their offenders and what may have led them to perpetrate harm. However, offenders lack this shared sense of connection, which makes it hard for them to care about their victims and may lead them to fail to see the impact of their actions on others. The anonymous and ephemeral nature of online conversations deters offenders from relating to or caring about the community or the victims. P17 used to harm other gamers but has since reformed himself. Reflecting on his previous mindset as an offender, he pointed out: “[The victims’] day is legitimately ruined because of what [offenders] said, but these people aren’t going to think about what they said twice [...] They’re not going to reflect because it doesn’t affect them.”
Additionally, Overwatch randomly pairs up gamers when they do not join as a team. As a result, the offenders do not need to interact with their victims after a game has finished. This absolves offenders from feeling accountable for their actions and in fact creates an environment where repeating harm comes at no cost. P21, who has attacked others in Overwatch games, noted: “In a game where you know somebody for 20 minutes and you see that they’re bad so you flame [insult] them, and then you just move on. You never see them again.”
As mentioned in Section 5.2, many gamers we interviewed have multiple online identities in games and do not feel particularly attached to any one of them. P17 used to attack others online using his anonymous accounts. He reflected, “No one can recognize what accounts we’re on so we can do whatever we want.” Given the limited information and social cues online, it is harder for people to form mutual relationships and connections. P21 cited this to justify offenders’ behavior:
“In games you don’t know these people. You don’t know their personalities. You don’t know their backstory [...] All you know is that they’re doing bad [at gaming], and so that’s what you use against them. And, there’s no way of fixing that. And if you’re getting offended by some of these words, then just mute them.”
On many Discord servers, moderators regulate a confined, public space — the general chat. Several moderators told us they do not intervene in harm cases that happen outside this general chat. Moderator P4 noted: “For the most part, we just tell people we can’t moderate things that happen outside of our community, and at that point it’s on them to block people.” For harm cases in the general chat, moderators do not help offenders and victims mitigate problems; instead, they punish the offender or ask both parties to resolve problems themselves. Moderator P19 would move contentious conversations to a private space: “I’ll delete all messages that they’ve put through to each other, put them both into the group chat, ban them from talking in general or whatever and keep them in this private chat and get it all solved in there.”
In a restorative justice view, a sense of community is essential for offenders to care about the harm they cause to the victims and may even stop them from conducting harm in the first place. However, we found that the anonymous, ephemeral nature of online interactions makes offenders careless about breaking relationships with victims. Additionally, the unclear distinction between public and private online spaces creates challenges for moderators to support victims, and the ensuing lack of support may leave victims vulnerable.
5.3.2 The Community Creates Secondary Harm Against the Victims.
In some offline practices of restorative justice conferencing [132], community members who care about the harm that occurred can choose to join the conference. Community members can support victims by listening to their stories, acknowledging the harm that occurred, and providing emotional support [132]. Analyzing our interview data, we found that due to the absence of such a victim-centered structure online, community members often create secondary harm. Though it takes courage for victims to share their stories, finding an audience that supports and cares about them can be difficult. Instead, victims may be challenged, humiliated, or blamed.
P12 was sexually harassed by a male member in the gaming community when she was underage. She chose to reveal the member’s behavior during a voice chat when other community members were present: “I felt like more people should know about it since most of the people in [the Discord community] are underage.” Though P12 expected to get support from her friends, she received multiple challenges from them after this revelation: “Some people were on my side [...] but some other people were on his side and said I just wanted attention, and I should have known he was just joking.” Later, the man denied all accusations and accused P12 instead. According to P12, “He was trying to make me look like a bad person.” P12 felt so unsupported and hurt by the community’s reaction that she eventually chose to leave it.
Community members often ignore the harms they witness, which can further fuel harm. P13 was a victim herself and has witnessed harm in the Overwatch community. She talked about how she was disappointed by the bystanders who did not speak up for the victims:
“Most people just think that the game is over; there is no point in trying to help even your teammates, and think they will just end the game and try again another game. Since it is online, most people don’t realize that it can actually hurt people what someone says.”
Further, community members can feed into harm by rewarding offenders with attention and positive reactions. P17 talked about why he used to harm intentionally to get attention from friends:
“All my friends found it funny. The reason I said those things was not for [the victim ...] It’s for them [my friends] to laugh. It’s for the reaction. It’s the adrenaline rush of being the center of attention, you know?”
These findings suggest that community members are not neutral bystanders when harm occurs. They may create secondary harm for victims by ignoring the harm, challenging their stories, or encouraging offenders’ behavior. In contrast, restorative justice enables the community to build a culture of responsibility and care, where community members collectively prevent and address harm.
5.3.3 The Community has a Punitive Mindset Toward Offenders.
Restorative justice encourages the community to facilitate change in offenders’ behaviors. Community members can share how the harm has impacted them, express willingness to support offenders through the change, and acknowledge any changes offenders have made. However, in punitive settings, community members who disagree with the offenders’ behavior want to punish them, as is the case in online moderation systems. Case 3 shows an example of such an occurrence and its effects in Overwatch.
In Case 3, victim P25 had a positive experience when sharing her experiences of harm on Twitter. However, most other gamers decided to punish P17 instead of telling him how his actions caused harm. As we mentioned in Section 5.2, the punishment only further harmed P17: “It’s just anxiety. That’s what’s constantly going through your head.” This punishment stopped P17 from offending again, but he did not learn to care about his victims until his friend reached out and supported him.
Additionally, though P17 apologized to P25, stopped those behaviors, and learned the impact of his actions afterward, he did not have a chance to demonstrate his change to the community. He left the community and abandoned his goal of becoming a professional gamer. This is not what his victim, P25, had wished would happen. She wanted the offender to have a chance to learn and demonstrate his change: “Nobody’s perfect, everybody makes mistakes. We’re all human [...] I don’t think that just because you did one bad thing a year ago means that you just don’t have a chance anymore.” The community response also runs counter to restorative justice views, which maintain that offenders will be welcomed back to the community after their reform [132].
In general, we found that some community members have a sense of responsibility and care about the victims and the harm that occurs inside their community. However, how they address harm is largely shaped by the punitive justice values prevalent in the gaming community and broader society. Their punitive responses to offenders can create further harm in the community and do not help offenders who may choose to reform themselves.
7 Discussion
As noted throughout the article, current content moderation systems predominantly address online harm using a punitive approach. Analyzing them through the alternative lens of restorative justice values, we found cases where this approach does not effectively meet the needs of victims or offenders of harm and can even further perpetuate the harm. Restorative justice provides a set of principles and practices that have the potential to address these issues. Table 2 compares our sample’s punitive content moderation approach with a restorative justice one. As the table shows, the latter provides alternative ways to achieve justice from the perspectives of victims, offenders, and community members, who are often absent in current content moderation.
Applying restorative justice practices to the governance of online social spaces is not straightforward. Victims, offenders, and moderators currently have unmet needs and concerns about the process. In this section, we first discuss the challenges of implementing restorative justice. We then offer possible ways for interested online communities to implement the approach. Finally, we reflect on the relationship between punitive content moderation approaches and restorative justice.
7.1 Challenges in Implementing Restorative Justice Online
We now explore the potential structural, cultural, and resource-related challenges of implementing restorative justice online. We also discuss some possible ways to address them.
7.1.1 Labor of Restorative Justice.
Restorative justice practices and the process of shifting to them will likely require significant labor from online communities. Restorative justice conferencing can be time intensive for all stakeholders involved. Before victims and offenders can meet, facilitators must negotiate with each party in pre-conferences, sometimes through multiple rounds of discussion. The collective meeting involves a sequence of procedures, including sharing, negotiating, and reaching a consensus.
Stakeholders, in particular facilitators, must expend emotional as well as cognitive labor. In offline practices, facilitators are usually paid by host organizations such as schools [41]. However, the voluntary nature of moderation on social media sites like Discord means that online facilitators may be asked to do additional unpaid work. This issue can be particularly salient when moderators already expend extensive labor with a growing community [32, 94, 113]. Labor is also involved in training the facilitators. Unlike punitive justice, restorative justice is not currently a societal norm that people can experience and learn about on a daily basis. Aspiring facilitators need additional training to learn restorative justice principles and practices and to implement them successfully to avoid creating additional harm.
We expect that resources for addressing the aforementioned labor needs could be attained in both a top-down and a bottom-up fashion. A top-down process could draw resources from companies that host online communities. There is precedent for platforms making such investments; in recent years, social media companies such as Discord have hosted training for their community moderators. A bottom-up process could engage users with preexisting knowledge about restorative justice to first introduce the process in their communities and gradually expand the restorative culture and practice from there. In our sample, two moderators attempted or practiced online restorative justice within their own communities; they showed enthusiasm for expanding its practice to other online gaming communities. It is possible that resources from companies and practitioners could collectively begin to address the labor problem.
Additionally, implementing online restorative justice requires reallocating labor instead of merely adding labor. Many moderators we interviewed already practiced different elements of restorative justice. Some aim to support victims and give offenders a second chance but lack the proper tools or procedures to achieve that. Other moderators have practices that embed elements of restorative justice, such as talking with offenders and victims. Rather than necessarily requiring new procedures, restorative justice requires a shift of purpose in existing processes – from determining the point of offense to caring for victims and offenders.
Importantly, if online restorative justice could stop the perpetuation of harm more frequently than punitive justice, it could reduce the need for moderation labor in the long term. While research has shown that offline restorative justice has successfully reduced re-offense rates [27, 75, 84], evaluating the effectiveness of restorative justice practices in online communities is an important area for future work.
7.1.2 Individuals’ Understanding of Justice Aligns with the Existing Punitive Justice Model.
Although people have needs that the current system does not address, we found that their mindsets and understanding of potential options often align with what the current system can already achieve through its punitive approach. As our research shows, many moderators and victims think that punishing offenders is the most or the best they can do, and some offenders also expect to receive punishment. Community members also further perpetuate the punishment paradigm. This mindset is not only a result of the gaming community’s culture; it is pervasive throughout society, including in prisons, workplaces, and schools [37, 68].
Given the lack of sufficient information about and experiences with restorative justice, people may misunderstand and mistrust this new justice model. Some offenders in our interviews still expected to receive bans after the process, but restorative justice usually serves as an alternative to punishment, not an addition to it. Some participants wanted to implement alternative justice models in their own communities but received resistance from users who argued that the current moderation system works for them while disregarding its limitations for others. We found that such perspectives usually lead to quick rejections of the notion of implementing restorative justice online.
Before people can imagine and implement alternative justice frameworks to address harm, they must be aware of them. Crucial steps in this direction are information and education. Helping people understand the diversity of restorative justice processes and how their aim is restoration instead of punishment may address their doubts and open more opportunities for change. This is especially important since an incomplete understanding of restorative justice may cause harm in its implementation. For example, enforcing “successful outcomes” may disempower victims and result in disingenuous responses from offenders. Adapting restorative justice to online communities may require changes in the format and procedure of how harm is handled, but prioritizing its core values should help avoid additional unintentional harm.
Restorative justice has developed and rapidly evolved in worldwide practice [26, 132]. Future research can build on and expand restorative justice beyond the three principles and the victim-offender conference. In addition, people need to experience it to understand it, adapt it to their needs, and learn about its effectiveness. By experiencing it offline, several participants in our sample came to see it as a natural tool adaptable to online communities.
In future work, we plan to collaborate with online communities to implement and test restorative justice practices. We want to pilot online restorative justice sessions and run survey studies to understand the types of harm and types of processes most likely to benefit from these practices. Building on that research, we aim at providing more precise empirical guidelines about how restorative justice can be embedded in moderation systems based on the socio-technical affordances of various online communities. To manifest a future of restorative justice, “We will figure it out by working to get there” [64].
While this research focuses on the restorative justice approach, it is not the only alternative, and it has its limitations. As some of our participants mentioned, many harms are rooted in structural issues such as sexism and racism. Transformative justice [25] and community accountability [82] are frameworks of justice developed to address such issues. Procedural justice emphasizes the importance of a fair moderation decision-making process [52] and is a key objective in restorative justice practices [4]. Future work should explore the potential of these different justice frameworks to address online harm.
7.1.3 Offender Accountability.
Another challenge may be in motivating offenders to take accountability for their wrongdoing – a persistent moderation problem regardless of the justice model implemented. In our interviews, we found that given the finite scope of moderation in many contexts and the limited technical affordances of online communities, offenders can often easily avoid punishment. The harm may happen in a place without moderation or clear rules of moderation, e.g., during a private Discord chat or across multiple platforms. Some participants also noted that maintaining multiple identities or accounts on Overwatch or Discord is easy. Thus, when punishment cannot actually reach offenders, punitive justice loses its effectiveness.
Punishment is not the only way of holding offenders accountable. Restorative justice holds that people are also connected through relationships. Our interview data, as well as the restorative justice literature, suggest the importance of a sense of community: if offenders perceive themselves as members of a shared community with victims, they will be more likely to take accountability for addressing harm [65, 90]. However, in our interviews, we found that such a sense of community is largely absent. Offenders may not expect to meet their victims again or may hide behind multiple anonymous accounts. Moderators typically moderate the confined space of the general chat, which can leave harm unaddressed anywhere outside it.
Therefore, it is vital that we inquire into what accountability means to the community and how to hold people accountable within the current moderation system. If offenders can simply avoid any punitive consequences of causing harm and do not feel a sense of belonging to the community where the harm occurs, it will be challenging to engage them in any process, punitive or restorative, that holds them accountable.
7.1.4 Emotion Sharing and Communication in Restorative Justice.
The limited modes of interaction and the often anonymous member participation on online platforms may influence the effectiveness of restorative justice. Many online interactions are restricted to text or voice, which limits victims’ and offenders’ ability to share emotions and may give rise to disingenuous participation. Emotional engagement by victims and offenders is essential for empathy and remorse [114]. Face-to-face sharing lets victims and offenders see each other’s body language and facial expressions.
Implementing offline restorative justice involves a series of physical structures and meeting procedures designed to elicit genuine, equal sharing. For example, participants sit in a circle; the individual who speaks holds a talking stick; and rituals at the beginning of the conference build connections among participants and mark the circle as a unique space for change [90]. These rituals for emotion sharing are hard to replicate in an online space. For example, if an offender messages an apology through text, it can be harder to discern a genuine apology from a disingenuous one.
The issue of computer-mediated communication and emotion sharing has long been discussed in the HCI and CSCW literature. In recent years, increasingly advanced technologies have been developed to facilitate civic engagement and communication in online systems. For example, Kriplean et al. built a platform, ConsiderIt, to support reflective interpersonal deliberation [71]. REASON (Rapid Evidence Aggregation Supporting Optimal Negotiation) is a Java applet developed to improve information pooling and arrive at consensus decisions [55]. Many researchers have attempted to model human-like characteristics and emotional awareness in chatbots [1, 112].
In the context of the restorative justice approach, Hughes has developed a tool, Keeper, for implementing online restorative justice circles [53]. Such existing systems can be leveraged to develop advanced tools that facilitate emotion sharing and communication in online restorative justice processes. Online platforms such as Overwatch and Discord could add such tools to improve emotional sharing and communication, which are necessary conditions for implementing restorative justice.
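To illustrate one mechanic such a tool might enforce, the sketch below models a hypothetical “talking piece” for an online circle, loosely in the spirit of tools like Keeper [53]: only the current holder may post, and the piece passes around the circle in order. The class and method names are our own assumptions, not Keeper’s actual design.

```python
from collections import deque

class Circle:
    """A hypothetical talking-piece mechanic for an online restorative
    justice circle: only the holder of the piece may speak."""

    def __init__(self, participants: list[str]):
        self.queue = deque(participants)  # speaking order around the circle

    @property
    def holder(self) -> str:
        return self.queue[0]

    def may_speak(self, participant: str) -> bool:
        return participant == self.holder

    def pass_piece(self) -> str:
        self.queue.rotate(-1)  # hand the piece to the next participant
        return self.holder

circle = Circle(["facilitator", "victim", "offender", "community_member"])
assert circle.may_speak("facilitator") and not circle.may_speak("offender")
circle.pass_piece()  # the victim now holds the piece
assert circle.holder == "victim"
```

A chat platform could surface this as muting everyone except the current holder, translating at least the turn-taking ritual, if not its embodied aspects, into a text or voice channel.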
7.2 Applying Restorative Justice in Online Moderation
We now discuss possible ways to adapt current moderation practices to implement restorative justice online. While we have used victim-offender conferencing as a vehicle to interrogate the opportunities and challenges of implementation, restorative justice includes a set of values and practices that extend beyond the conference. It is important to meet online communities and platforms where they are and design restorative justice mechanisms based on their resources, culture, and socio-technical affordances. For example, compared to Overwatch, Discord provides more flexibility in communication and ways of maintaining social connection after a game. Some Discord servers have a greater sense of community and moderation resources than others. We suggest that online communities or platforms can begin with partial restorative justice practices that involve only a few stakeholders (e.g., pre-conferencing) or adapt some moderation practices to embed restorative justice values (e.g., moderation explanations) and implement victim-offender conferencing when its preconditions are met.
7.2.1 Embed Restorative Justice Language in Moderation Explanations.
Moderators can embed restorative justice language in explanations of moderation decisions. Prior work has shown that providing such explanations can reduce re-offense in the same community; however, many moderated offenders dismiss such explanations and continue re-offending [56, 57]. Our research shows that explanations often focus on describing community guidelines or presenting community norms to users, not on encouraging offenders to reflect on the impact of their actions. These explanations usually indicate the actual or potential punitive consequence, which may direct offenders’ attention to the moderation system instead of their victims.
We suggest a shift in language from a rule-based explanation to an explanation that highlights the possible impact offenders may cause to victims and supports the offender in taking accountability. Facilitator P5 provided an example of how she would communicate with the offenders if she were an online facilitator: “This post had this emotional impact [...] this is how you’ve caused harm. This is the feedback from the community, and we want to work with you to create change.”
Victims are often left out of the moderation process in the online communities we studied. However, some victims want information on the moderation process being used and the results of moderator interventions. While prior research has discussed how moderation transparency prevents offenders from re-offending [57], our work highlights the importance of providing information to victims in the moderation process, since they are the ones who need support and healing. We suggest that a note of care may help victims feel heard and validated and help them recover.
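To show what this shift in language could look like in a bot’s message templates, the sketch below contrasts a rule-based explanation with an impact-focused one modeled on facilitator P5’s example, plus a hypothetical note of care for the victim. The wording and field names are our own illustrations, not text from any platform we studied.

```python
# Rule-based framing of the kind we observed: rule, sanction, threat.
RULE_BASED = (
    "Your message violated rule {rule_id} ({rule_name}). "
    "Consequence: {sanction}. Repeat violations may lead to a ban."
)

# Impact-focused framing, modeled on facilitator P5's example above.
RESTORATIVE = (
    "Your post had this emotional impact: {impact}. This is how you've "
    "caused harm. This is the feedback from the community: {feedback}. "
    "We want to work with you to create change."
)

# A note of care for the victim, who is otherwise left out of the process.
VICTIM_NOTE = (
    "Thank you for your report. What happened was not okay. Here is what "
    "we did: {outcome}. Reply here if you would like support or more "
    "information."
)

print(RESTORATIVE.format(
    impact="a teammate left the match feeling unsafe and demeaned",
    feedback="several members said this made the space unwelcoming",
))
```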
7.2.2 Restorative Justice Conferences with Victims or Offenders.
In our interviews, we found that some victims or offenders may not be available or willing to meet and have a conversation to address the harm. When a collective conference is not possible, it is possible to apply a partial restorative justice process that includes offenders or victims alone [73, 80]. Some Discord moderators already talk with victims and offenders before making moderation decisions, a practice similar to restorative justice pre-conferencing. In offline pre-conferencing, facilitators ask questions to help victims and offenders reflect on the harm, providing them with opportunities for healing and learning.
We propose that pre-conferencing provides opportunities to meet some of the needs our participants identified. For example, when an offender wants to apologize to the victim, the victim may prefer to communicate their feelings through the facilitator rather than meet the offender directly.
In offline restorative justice, pre-conferencing is a preparation step for a potential victim-offender conference. Thus, if both the victim and the offender indicate in the pre-conference that they are willing to meet, the moderators can organize a victim-offender conference, where both parties share their perspectives and collectively decide how to address the harm. Moderators have the responsibility to maintain a safe space for sharing: for example, to halt the process when harm seems likely to happen, to ensure a power balance between the victim and the offender, and to work with participants to establish agreements about behavior and values to adhere to throughout the process [15]. In cases where restorative justice does not succeed, preventing the continuation of harm must be prioritized. Because facilitating these processes is difficult, we discuss the challenges and opportunities for moderators to facilitate in the following section.
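The sketch below models this pipeline as a simple state machine, under the assumption that a community wires pre-conferencing, the joint conference, and a punitive fallback into one moderation workflow. The stage names and transition rules are our own illustration of the process described above, not an established protocol.

```python
from enum import Enum, auto

class Stage(Enum):
    REPORTED = auto()
    PRE_CONFERENCE = auto()     # facilitator meets each party separately
    CONFERENCE = auto()         # joint victim-offender conversation
    AGREEMENT = auto()          # repair actions agreed on and followed up
    PUNITIVE_FALLBACK = auto()  # e.g., mute or ban when RJ is not possible

def advance(stage: Stage, victim_willing: bool, offender_willing: bool,
            safe_to_proceed: bool) -> Stage:
    """Move one step through the pipeline, falling back whenever consent
    or safety is missing, since participation must stay voluntary."""
    if not safe_to_proceed:
        return Stage.PUNITIVE_FALLBACK
    if stage is Stage.REPORTED:
        return Stage.PRE_CONFERENCE
    if stage is Stage.PRE_CONFERENCE:
        if victim_willing and offender_willing:
            return Stage.CONFERENCE
        return Stage.PUNITIVE_FALLBACK
    if stage is Stage.CONFERENCE:
        return Stage.AGREEMENT
    return stage

stage = advance(Stage.REPORTED, True, True, True)   # to pre-conference
stage = advance(stage, True, False, True)           # offender declines
assert stage is Stage.PUNITIVE_FALLBACK
```

Writing the flow down this way makes voluntariness and safety checks explicit, auditable decision points rather than ad hoc moderator judgment.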
A victim-support group is another form of restorative justice conferencing [29]. In our sample, many victims indicated a need to receive emotional support. We propose that online communities offer a space for victims to share the harm they experienced and receive support. Systems with similar goals have previously been built in online spaces. For example, Hollaback is a platform that allows victims of street harassment to empower each other through storytelling [30]. Heartmob enables victims to describe their experiences of harm and solicit advice and emotional support from volunteer Heartmobers [13]. These platforms offer templates for how victim support systems can be built. However, support on these platforms is distant from where the harm occurs, and it is also important to think about how online communities can support victims by motivating community members to provide support.
7.3 Situating Restorative Justice in the Moderation Landscape
We have illustrated possible ways to implement restorative justice in the current moderation system. Yet, we do not seek to advocate for restorative justice as a wholesale replacement of the current moderation approach. We propose that restorative justice goals and practices be embedded as part of an overall governance structure in online spaces.
Restorative justice conferencing should be used in select cases only because it is effective only when it is voluntary and the parties involved are committed to engaging. Our findings show that individuals have different conceptualizations of justice. Schoenebeck et al. also found that people’s preferences for punitive or restorative outcomes vary with their identity or social media behaviors [106]. It is thus important to attend to victims’ and offenders’ preferences in individual harm cases.
A larger governance structure should also take into account what to do if restorative justice processes fail. For instance, if it is determined at the pre-conference stage that a restorative justice approach cannot be applied, actions such as removing access by muting or banning might be used to prevent the continuation of harm. This is consistent with offline restorative justice practices, where the community or court clearly defines the types of cases that go through a restorative justice process and the action to take if a restorative justice process is not possible [41, 123].
Ultimately, whether an online community decides to apply restorative justice is a question of values. While restorative justice and punitive justice processes share the goal of addressing harm and preventing its perpetuation, they have different processes and orientations toward when justice is achieved. Content moderation—closer to a punitive justice approach—addresses harm by limiting the types of content that remain on the site and punishing offenders in proportion to their offense. In contrast, restorative justice aims at ensuring that victims’ needs are met and that offenders learn, repair harm, and stop offending in the future. Thus, the primary reason for applying restorative justice in moderation is not to achieve the current goals of effectively removing inappropriate content and punishing offenders but to benefit online communities through restorative justice values and goals.
Seering et al. found that community moderators have diverse values regarding what moderation should achieve [108]. While some have a more punitive mindset and hope to be a “governor,” others align more with restorative justice values and hope to be a “facilitator” or “gardener.” Thus, online communities must reflect on their values and goals and decide on what mechanisms (e.g., punitive or restorative) help realize those values. Recent research has argued that social media platforms are responsible for incorporating ethical values in moderation instead of merely optimizing to achieve commercial goals [50]. In particular, some researchers have proposed values and goals that align with restorative justice, such as centering victims’ needs in addressing harm [13, 106], democracy [107], and education [128]. Our work adds to this line of research and envisions how restorative justice may benefit online communities in addressing some severe forms of online harm, such as harassment.
Finally, communities should be cautious about expecting or enforcing a positive outcome. Enforcing forgiveness from victims may undermine their needs and put them in a vulnerable position, and expecting a change in offenders’ behavior may induce disingenuous responses from offenders [5]. Online communities should allow for partial success or no success without enforcing an ideal outcome, especially at the early stage of implementation when there are insufficient resources or commitments. Instead, they may focus on how victims, offenders, and the entire community could holistically benefit from the process.