DOI: 10.1145/3613904.3642946

“I finally felt I had the tools to control these urges”: Empowering Students to Achieve Their Device Use Goals With the Reduce Digital Distraction Workshop

Published: 11 May 2024

Abstract

Digital self-control tools (DSCTs) help people control their time and attention on digital devices, using interventions like distraction blocking or usage tracking. Most studies of DSCTs’ effectiveness have focused on whether a single intervention reduces time spent on a single device. In reality, people may require combinations of DSCTs to achieve more subjective goals across multiple devices. We studied how DSCTs can address individual needs of university students (n = 280), using a workshop where students reflect on their goals before exploring relevant tools. At 1-3 month follow-ups, 95% of respondents still used at least one type of DSCT, typically applied across multiple devices, and there was substantial variation in the tool combinations chosen. We observed a large increase in self-reported digital self-control, suggesting that providing a space to articulate goals and self-select appropriate DSCTs is a powerful way to support people who struggle to self-regulate digital device use.

1 Introduction

Smartphones, laptops, and related digital technologies give people effortless access to an enormous range of functionality. Yet, amidst all this digital freedom many people find it difficult to control their time and attention without being distracted by notifications, infinite recommendation feeds, or compulsive urges to check their devices [38, 52, 111]. This can have serious negative effects on work performance, social relationships, and mental health [17, 20, 43, 85].
‘Digital self-control tools’ (DSCTs) are a potential solution. These apps, browser extensions, and system settings help people use digital devices in line with their goals and avoid distraction [96]. They provide interventions such as blocking or delaying access to distracting apps, tracking and visualising how people spend time on their devices, or rewarding intended use [73]. Some even change the user interface of specific services directly, such as browser extensions that hide recommended videos on YouTube, or the newsfeed on Facebook [1, 67, 74].
Hundreds of DSCTs are now publicly available, and many have tens of thousands, or even millions, of users [72]. The emerging research on their effectiveness suggests that they have a clear potential to change behaviour and help people feel more in control [32, 56, 74, 96, 100]. Can these tools help us address the immediate need for support in populations such as students and information workers, where challenges with self-regulation of device use are widespread [4, 24, 97]?
To realise the potential for large-scale impact of DSCTs, important research-to-practice gaps need to be filled: in the real world, a person who wants to use DSCTs needs to identify and apply tool(s) that match their personal device ecosystem and goals for changed use. Typically, their device ecosystem involves multiple devices and their goals go beyond ‘reduce screen time’ [38, 45, 80, 83]. Moreover, people often have limited knowledge of which DSCTs are available, and/or find it difficult to navigate the available options and find the right tool(s) for their situation [13, 72, 76].
Existing research on DSCTs, however, has mainly focused on how a single intervention on a single device affects screen time or self-reported ‘addiction’ [95]. This has yielded core evidence on the potential of specific interventions [49], but the practical challenge of how to scaffold a process by which people can browse multiple DSCTs and apply the options most relevant to their personal goals across their device ecosystem, has so far received limited attention [55, 80].
Moreover, recognising the limitations of screen time and addiction metrics [5, 38, 45], researchers have called for measures that are better grounded in local and culturally-dependent notions of digital wellbeing [39, 65, 72, 76, 79]. Empirical investigations of such notions have tended to focus only on the smartphone [38, 45, 84, 95, 109, 111] and/or a single platform [11, 68, 92, 93]. Finding ways to evaluate DSCTs’ overall impact on self-regulated use across one’s device ecosystem is likely to be practically useful for stakeholders who wish to provide broadly applicable interventions to support digital wellbeing.
To help fill these research-to-practice gaps, our paper explores the following general question: How can we empower people to identify their personal challenges and goals around digital device use and then select appropriate DSCTs to help?
We explore this question in the context of university students, where prior work has documented widespread challenges with self-regulating digital device use [7, 44, 97, 116]. We present results from a collaboration with the counselling service at the University of Oxford, where increasing numbers of students in their conversations with counsellors expressed frustration over inability to control their device use. To help address this local challenge, we developed an online workshop intervention in which students reflected on their concerns and goals around digital device use, explored relevant DSCTs, and were supported in applying them on their devices. Since its wider introduction in March 2021, the workshop has become the most popular of any offering ever provided by the counselling service.
We use data from these workshops to address three specific research questions:
RQ1: How do students want to change their digital device use? For their ecosystem of digital devices, what positive goals for use do students want to achieve?
RQ2: Can an intervention that combines reflection on personal goals for change with self-selection of relevant DSCTs improve digital self-control? Given urgent and widespread needs for support, might this style of intervention provide a simple and accessible solution for student counselling services and other stakeholders?
RQ3: How, if at all, do students’ preferences for different types of DSCTs vary? Are certain types of tools generally useful, or are students highly individual in what they choose and benefit from?
We explore these questions using open trial data from 280 students who participated in workshops between March 2021 and November 2022. Using a digital self-control measure adapted from relevant psychological research, we assessed self-regulated device use before and 1-3 months after the workshop, as well as which DSCTs the participants tried and how they had been useful.
We found that most participants wanted to make changes to, or be more in control of, both their smartphone and computer use. Overall, they wanted to use their devices more intentionally, and get better at balancing time spent on different activities, by having clearer boundaries around time and context of device use, reducing overall use, and limiting the ‘addictiveness’ of their digital environments.
At the follow-up (52% of participants responded), 95% of respondents still used at least one of the DSCT strategies they tried. The median respondent had tried three different strategies and was still using two, typically applied across devices (mainly smartphone + laptop computer). Respondents showed a significant quantitative improvement in digital self-control, with a large effect size (Cohen’s d = 0.93), which was supported by the qualitative data. Self-reported time spent also declined on laptop computer, smartphone, and tablet. Most participants applied a unique combination of the strategies included in the workshop. We observed a combination of general usefulness and individual variability: strategies related to blocking or hiding distractions, as well as focus timers, were generally rated as useful. Other strategies, such as grey scale or setting up rewards and punishments (typically using apps such as Forest), saw a large amount of variation.
Our paper identifies key design considerations for combining reflection and self-experimentation to support self-regulated device use via DSCTs. We also contribute the first direct comparison of the subjective usefulness of multiple DSCTs, which may help inform theoretical discussions in DSCT research, as well as help stakeholders such as educational institutions provide practical advice. Finally, with our participants’ full consent, we contribute an anonymised qualitative dataset (over 62,000 words) on university students’ challenges and goals, and how useful they found the DSCTs they tried. This open dataset may help accelerate the development of empirically grounded measures of digital wellbeing, and of DSCTs to support it [69, 79, 114], and can be found on the Open Science Framework (https://osf.io/zmt78/).

2 Background

To address widespread concern over negative effects of digital device use on outcomes such as productivity and mental wellbeing, researchers and policymakers have proposed changes to the incentive structure of the attention economy [112, 118], the development of age-appropriate design directives [2], and requirements for social media companies to make data available for researchers to independently assess the effects of their platforms [101].
These initiatives have the potential for systemic impact, but are also likely to take years to implement. In the meantime, many user groups face an immediate need for guidance, including families, students, and information workers [4, 24, 97]. The interventions that HCI researchers are investigating in the form of digital self-control tools (DSCTs) are readily available and could help address this need. However, to provide practical guidance on use of DSCTs, important gaps need to be bridged between the current state of research and the challenges individuals face in daily life.

2.1 Aligning evaluation of DSCTs with people’s usage goals in multi-device contexts

Most research on DSCTs has focused on use of a single device, typically the smartphone (see Roffarello and De Russis [95] for a review). In practice, people’s challenges with self-regulating use often involve multi-device contexts [18, 81, 96], where unwanted use curbed by a DSCT on one device might ‘spill over’ to another [56]. Some researchers have started to explore ways to take multi-device use into account [47, 56, 82, 93]. This work suggests that DSCTs implemented on one device can indeed — when assessed on screen time metrics — help people make progress towards their goals without causing negative second-order effects [56]. However, which evaluation measures to use remains an open question [8].
Many DSCTs are explicitly designed to limit time spent [e.g., 48, 51, 52], and the most common outcome measure in efficacy studies is time spent overall or in specific apps [95]. This to some extent aligns with users’ goals, as surveys of smartphone use, as well as individual platforms such as YouTube, suggest that goals to reduce time spent are common [38, 52, 68]. However, time spent is often an unreliable indicator of whether a DSCT is helpful [39, 65, 88, 89]. Lock-out mechanisms, a common feature in DSCTs, illustrate this conundrum: whereas locking users out of devices after a time limit may seem effective when assessed by reducing time spent, it can lead to strongly negative responses when users find themselves in ‘out-of-routine’ situations where they need to use their devices [48].
The most frequently used subjective outcome measures in DSCT research are questionnaires related to ‘addiction’ (e.g., the ‘Smartphone Addiction Scale’ [59, 95]). This, too, to some extent aligns with users’ goals: analyses of publicly available user reviews suggest that the most frequent purpose of use for DSCTs is to help oneself stay focused on demanding or boring tasks amidst readily available digital distractions. This capacity is captured to some extent by addiction scales [72]. However, the ‘addiction’ framing has been criticised as imprecise, and for pathologising everyday self-control struggles [5, 14, 41, 60, 87, 110]. One recently proposed alternative (which we adopt in the present paper, see Section 3.2.2) is to adapt measures from basic psychology research on everyday self-regulation to the context of digital device use [71]. This might both help capture global ability to self-regulate behaviour in a more precise manner than the scales related to behavioural addiction, and help fill the ‘theoretical gap’ in DSCT research, by grounding measures in relevant psychological research [37, 73, 95].
A key consideration, as researchers explore measures of digital wellbeing to evaluate DSCTs against [8], is to ensure that those measures accurately capture specific populations’ usage goals and preferences for support [79]. Since most recent empirical investigations of the goals that DSCTs should serve have focused only on smartphone use, it would be useful to collect additional data on these goals and preferences in cross-device contexts. Moreover, open data sharing is rare in DSCT research, but might accelerate the development of empirically grounded measures, and allow for easier comparison between populations [69, 114].
Hence, our first use of data from the workshop focuses on the following question (RQ1): in the context of their personal ecosystem of digital devices, how do students want to change their use?

2.2 Helping people identify and apply DSCTs relevant to their goals

People vary substantially in the way they use digital devices, both compared to others and to themselves over time [16]. To empower people to influence their own behaviour using DSCTs, we therefore need effective ways to elicit individualised goals and apply DSCTs in bespoke ways to support those goals [86].
Work on ‘digital self-nudging’ suggests that empowering people to co-create their own DSCTs (e.g., using the Shortcuts automation app on iOS) can support a sense of agency, accomplishment, and perceived usefulness [92]. Active involvement is likely to be particularly important for DSCTs, as they often involve interventions that restrict the user. This can be an effective way to influence behaviour, but also cause psychological ‘reactance’, in which the user is motivated to circumvent or otherwise rebel against the intervention [74, 106]. This is likely to be reduced if the user freely commits themselves to a DSCT because it supports their personal goals and values, as opposed to having it assigned by experimental condition or algorithmic allocation [66].
The research to date, however, has almost invariably focused on a single intervention on a single device or service, pre-assigned to research participants [95]. Few studies have investigated the practical challenge for which stakeholders such as university counselling services urgently need advice: how can we help people identify and apply the right combination of DSCT(s) for their personal needs (see also [80])?
One notable exception is HabitLab [55, 57, 58]. This project deployed an in-the-wild intervention that helped more than 12,000 daily Chrome and Android users meet time-limiting goals on websites and apps via a wide range of interventions, from hiding content feeds to displaying timers. However, a limitation of this work was that it measured success mainly in terms of limiting time spent, which is likely to have captured only a subset of people’s actual goals [68]. Broader HCI work on individualised strategies for self-monitoring and self-experimentation in behaviour change (specifically, ‘n-of-1’ studies for managing health conditions requiring individualised insights [42, 63]), suggests that this might be a problem: if individualised goals are not carefully elicited, people may end up with solutions that encourage behaviour that runs counter to their goals [86, 115]. Similarly, analyses of user reviews for DSCTs suggest that people are often frustrated that the tools they try do not match their goals for self-regulated use [72].
In sum, there is an opportunity to extend previous work with an approach that elicits individual goals and lets users actively choose and apply DSCTs. Hence, our second use of workshop data focuses on the following practical question (RQ2): Can an intervention that combines reflection on personal goals for change with self-selection of relevant DSCTs improve digital self-control?

2.3 Understanding individual variation in the usefulness of digital self-control tools

Whereas DSCT research has focused on average effects of one specific intervention, many studies have also reported that people can vary substantially in their preference for, and benefits derived from, those interventions [47, 48, 76]. For example, Mark et al. [76] noticed in an exploratory study of information workers that the benefits of distraction blocking seemed to apply only to participants who at baseline scored lower on self-control. Getting a clear picture of the variance between people is highly practically relevant: should stakeholders like university counselling services direct most students towards a limited number of interventions that are universally useful, or should they support each student through a process to identify the tools that best meet their specific needs?
Psychological research on self-control more broadly suggests that the answer might be the former: applying the ‘process model’ of self-control, Duckworth et al. [26] found that strategies intervening earlier in the cycle of impulse generation (i.e., avoiding exposure to temptation as opposed to trying to overcome it only after unwanted impulses have arisen) are generally more effective [24]. This might also hold true for DSCTs. However, few studies have explicitly compared how multiple DSCT interventions vary in usefulness [95].
Again, one exception is HabitLab: Kovacs et al. [58]’s supplementary materials included a comparison of twelve different interventions on Facebook, finding that the most effective (auto-closing the tab after 60 seconds) was twice as effective as the second-best (blocking the site after a user-determined duration). However, the outcome measure was solely reduction in time spent. Given the limitations of this measure (according to which, the most effective solution would be to simply ban an app or website from being used altogether), interpreting their findings would be helped by triangulation with users’ subjective assessment of usefulness.
Hence, our third use of workshop data focuses on the following question (RQ3): How, if at all, do students’ preferences for different types of DSCTs vary?

3 Methods

We developed the workshop informed by prior research on personal informatics, behaviour change, n-of-1 methodologies, and the psychological mechanisms of self-regulation [12, 30, 40, 62, 86], combined with the domain expertise of our counselling service collaborators and feedback from participants in pilot workshops. The study was reviewed and granted ethics approval by the Computer Science Departmental Research Ethics Committee at the University of Oxford (only authors from this institution had access to the raw data).
In the following, we describe the workshop methods, including our adaptation of a commonly used self-control scale from psychology as a measure of digital self-control.

3.1 Preparatory workshop development

We iteratively developed the workshop in collaboration with the counselling service at the University of Oxford, which works one-on-one with nearly 3,000 students each year. Between May 2019 and March 2020 we conducted five pilots of an in-person workshop. We had students engage in small-group discussions about their challenges around digital device use, and explore different types of DSCTs to address these challenges using a card sorting task [78, 105]. A website provided examples of how to apply DSCTs on specific devices. At the end of the workshop, participants verbally committed to try specific DSCTs.
After the onset of the COVID-19 pandemic, we used our experience from these pilots to develop an online version of the workshop. We conducted this version on Zoom, and used the online whiteboard tool Miro (https://miro.com) for components that required participant interaction. We conducted five pilots of the online version in November 2020. In March 2021, we made minor tweaks based on these pilots, made the final workshop version more widely available at the University of Oxford, and began collecting data for the present paper.

3.2 Workshop methods for the version used in the present paper

3.2.1 Overall structure.

The workshop had three parts: reflection, exploration, and commitment, and lasted approximately 70 minutes. All workshops were delivered online (using Zoom), facilitated by the first author, and typically had 4-7 participants. Participants carried out the interactive tasks on a shared online whiteboard (using Miro). If a workshop had more than 4 participants, the reflection and exploration sections took place in breakout rooms, so as to make participants feel more comfortable asking for help in a smaller group. With this exception, the workshop process was identical irrespective of the number of participants. Participants were anonymous on the board, allowing them to see and discuss each other’s reflections without disclosing who contributed any specific concern.
Table 1: Timetable for the workshop.

Time | Task | Description
00:00 | Welcome and workshop outline |
00:10 | Reflection | The facilitator presents a simple framework for thinking about challenges with digital device use in terms of ‘external’ triggers (factors in one’s digital environment, e.g., notifications) and ‘internal’ triggers (personal psychological factors, e.g., habits, or emotional states like boredom). The facilitator presents the reflection task and gives participants ~10 minutes to complete it.
00:30 | Exploration | The facilitator introduces four categories of digital self-control strategies that can be applied using DSCTs (blocking/removing distractions, self-tracking, goal reminders, and making intended use more attractive). The facilitator presents the card sorting task and accompanying website, and gives participants ~12 minutes to explore.
00:50 | Commitment | Participants write down which strategies they would like to commit themselves to try, and how they will do so (e.g., installing a specific browser extension).
01:00 | Exit survey and Q&A |
Reflection. The first part aimed to help participants articulate actionable and realistic goals for changing their digital device use [12, 86]. The facilitator suggested that participants think about their current challenges in terms of two kinds of ‘triggers’ [77]: external triggers in the digital environment (e.g., notifications or clickbait-filled recommender feeds), and internal triggers related to their own emotions, impulses, or habits (e.g., using devices in undesirable ways when feeling sad, anxious, or bored). Then, participants wrote down their thoughts in response to four questions [62]:
1. What concerns you about your use of the internet / your laptop / your phone?
2. What external triggers (e.g., notifications) and internal triggers (e.g., emotions) drive the uses you’re concerned about?
3. What have you tried to address your concerns? How did that go?
4. Imagine your use of digital devices is exactly as you want it to be. What does that look like? (Be specific about context, time of day, apps...)
Exploration. The second part aimed to provide specific, actionable strategies for how participants could apply DSCTs to address their challenges [12, 62], while letting them filter options according to their needs and preferences [66, 72, 76].
The facilitator introduced four categories of strategies: blocking or removing distractions (e.g., blocking distracting apps or hiding distracting website elements), tracking yourself (e.g., visualising time spent, or using productivity timers), keeping your goals in mind (e.g., using browser extensions to place to-do lists on new browser tabs), and making your goals attractive (e.g., using apps that provide rewards for not using one’s smartphone).
Participants then sorted 13 cards that represented specific strategies within these categories (see Table 3 in the Appendix). For each card, they indicated whether they had tried the strategy already, and whether it had been/might be useful (Figure 1b).
The four categories were drawn from Lyngs et al. [73]’s categorisation of features of DSCTs on the Apple App, Google Play, and Chrome Web stores. The 13 specific strategies within those categories represented our attempt at condensing the range of interventions in publicly available DSCTs, based on Lyngs et al. [73] and related papers [21, 72, 74, 94], as well as advice provided by the Center for Humane Technology [19] and related tech blogs (e.g., [50]).
Finally, participants investigated how to use specific DSCTs to apply the strategies they liked, using the workshop website (Figure 1c; screenshots and details available on https://osf.io/zmt78/). On the website, participants could click each strategy to see specific ways to apply it with browser extensions, apps, or system settings, and filter the options by device, operating system, and browser type.
Figure 1: Summary of the workshop materials.
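To make the website’s filtering mechanic concrete, below is a minimal sketch in R (the language used for the paper’s quantitative analyses). The rows, column names, and values are invented for illustration and are not taken from the actual site (the real catalogue is in the OSF materials); the strategy names reuse labels that appear later in the paper.

```r
# Illustrative sketch of the workshop website's filtering mechanic.
# All rows and column values below are HYPOTHETICAL examples.
options <- data.frame(
  strategy = c("Block distracting websites or apps",
               "Hide distracting features on websites",
               "Limit notifications"),
  how      = c("app", "browser extension", "system setting"),
  device   = c("smartphone", "computer", "smartphone"),
  os       = c("Android", "any", "iOS"),
  browser  = c(NA, "Chrome", NA)
)

# Filter by a participant's device ecosystem, as the site's
# device / operating-system / browser filters did
subset(options, device == "computer" & (is.na(browser) | browser == "Chrome"))
```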
Commitment. The last part aimed to get participants to set a specific and realistic intention for what to do next [30]. Participants were asked to choose 1 or 2 strategies they would commit themselves to try. They indicated their choice by placing a ‘commitment card’ under their chosen strategy or strategies on the Miro board. They wrote on the card how they, specifically, would apply the strategy (e.g., “Installing the Facebook/YouTube "recommended content" blocker”, P147).

3.2.2 The Brief Digital Self-Control Scale.

As a quantitative indicator of the workshop’s usefulness we adapted the 13-item Brief Self-Control Scale (BSCS [108]), one of the most commonly used instruments for measuring general trait self-control [64, 75]. Example scale items include “I have a hard time breaking bad habits”, “I am good at resisting temptation”, and “I am able to work effectively toward long-term goals”. All items are answered on a scale from 1 (not at all) to 5 (very much). The scale is typically used as a unidimensional measure where a participant gets an overall self-control score by summing all 13 items (most are reverse-scored). A large number of studies have found the overall score on the scale practically useful for predicting a range of outcomes from job search behaviour to eating patterns [10, 23].
We adapted the 13-item BSCS into a 12-item state measure of global digital self-control, i.e., people’s ability to override, change or interrupt undesired impulses and behavioural tendencies across all of the digital devices they regularly use [108]. To adapt the BSCS, we changed its opening prompt to get respondents to indicate which devices they regularly use (smartphone, laptop computer, etc.), and then to fill in the scale based on their experience using those devices over the past week (following Ernala et al. [29]). We adjusted each item to focus on digital device use, while keeping the past week in mind (e.g., the BSCS item “Pleasure and fun sometimes keep me from getting work done” became “In the past week, pleasure and fun on my digital device(s) sometimes kept me from getting work done”).
We iteratively piloted candidate items using the research platform Prolific (with 40 online survey participants). Afterwards, we investigated item distributions and conducted confirmatory factor analysis for the initial version of the scale, using data collected on Prolific (n = 294) and in workshop deployments at the University of Oxford (n = 251). Finally, we made minor adjustments to the scale (e.g., item 6 was dropped due to redundancy with item 1), and conducted confirmatory factor analysis using data from additional workshop deployments (n = 205).
We refer to the resulting 12-item scale as the ‘Brief Digital Self-Control Scale’. All items in the final scale are shown in the Appendix (Table 2). For full details of our scale development and validation process, see the supplementary materials on https://osf.io/zmt78/.
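To make the scoring concrete, here is a minimal sketch in R. This is not the authors’ code: the reverse-keyed item indices below are placeholders, since the actual keying is given in the Appendix (Table 2) and the OSF materials rather than in the text above.

```r
# Minimal scoring sketch for the 12-item Brief Digital Self-Control Scale.
# Assumes `items` is a data frame with columns q1..q12, each answered on a
# 1 (not at all) to 5 (very much) scale. The reverse-keyed indices are
# HYPOTHETICAL placeholders; see the paper's Appendix / OSF for the keying.
reverse_keyed <- c(2, 4, 5, 7, 9, 11)

items_scored <- items
items_scored[reverse_keyed] <- 6 - items_scored[reverse_keyed]  # maps 1<->5, 2<->4

# Total score: sum of the 12 (re)scored items, ranging from 12 to 60,
# with higher scores indicating greater digital self-control
dsc_score <- rowSums(items_scored)
```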

3.2.3 Surveys & interviews.

All surveys were deployed using the open-source survey framework form{r} [9].
Opening survey. Included a consent form, demographics, digital devices used & self-estimated daily time spent, motivation for signing up to the workshop, and the Brief Digital Self-Control Scale.
Exit survey. Participants indicated whether they had found strategies in the workshop that seemed like good solutions to their challenges, re-stated which ones they would try and how confident they were that these would help, and provided feedback on how relevant and accessible they found the workshop tasks.
Follow-up survey. Participants indicated which strategies they had tried, how useful they were, how they applied them, and whether they still used them. They also again provided self-estimated daily time spent on their devices and filled in the Brief Digital Self-Control Scale.
Interviews. In the follow-up survey, participants could opt in to a semi-structured interview. In the interview, they were asked how the workshop had affected them, and how they had applied DSCTs after the workshop.

3.2.4 Participants & recruitment.

We recruited participants using announcements in university newsletters, physical posters at the university counselling and wellbeing services, and word-of-mouth from previous participants. The recruitment materials referred to the workshop as the ‘Reducing Digital Distraction (ReDD) Workshop’, and included direct links to book a slot. The wording used to describe the workshop is shown in Appendix A.

3.2.5 Study procedure.

Figure 2: The study procedure and number of participants who contributed data at each step. Colouring indicates the different modes of data collection: survey (blue), interactive workshop (tan), or interview (green).
The procedure is summarised in Figure 2. All workshops were facilitated by, and all interviews were conducted by, the first author via Zoom.
Two days before the workshop, participants were emailed a link to the opening survey. The exit survey was filled in as the final step of the workshop, if possible (occasionally, participants had to leave the workshop early). One week after the workshop, participants were emailed a reminder of what they decided to try, generated from their own notes in the exit survey. Either ~1 or ~3 months after the workshop, participants were emailed a link to the follow-up survey. Interviews were conducted 1-2 weeks after participants had filled in the follow-up survey.
Our study was an open trial without a control group. We investigated our research questions by triangulating quantitative and qualitative data from the workshops, surveys, and interviews.

3.2.6 Data and analysis approach.

For the qualitative data (reflection notes from the workshop, free text responses in the surveys, and interview transcriptions), we conducted inductive thematic analysis following Braun et al. [15]’s ‘reflexive’ approach.
The full qualitative data from the workshop and surveys were iteratively analysed. We split up the dataset into three parts, keeping each participant’s full data together (i.e., each part contained data from ~90 participants). For the first part of the dataset, the first author (UL) and another author (CT), independently read through all responses and did initial coding of recurrent patterns relevant to the research questions. They then iteratively discussed codes, recoded excerpts, and discussed emerging themes. Next, the first author analysed the second part of the dataset, with another co-author (LC). This followed the same process, with the exception that the first author started with the set of codes generated through the previous analysis. Finally, the third part of the dataset was similarly analysed by the first author (UL), with a co-author (HA).
The interview data was analysed by the first author and a co-author (LA), following a similar process: first, the two authors independently read through all transcripts and did initial coding of recurrent patterns of meaning. Subsequently, they iteratively discussed their codes, recoded excerpts, and discussed emerging themes. The first author’s coding of the interviews was informed by his coding of the workshop and survey data, whereas LA’s coding was solely informed by the interview data.
We used the GDPR-compliant transcription tool Konch (https://www.konch.ai) to transcribe the 25 participant interviews. Thematic coding was conducted using NVivo v1.7.1 (workshop and survey data) and Delve (interview data; https://delvetool.com). Quantitative analyses were conducted using R v4.3.1 (see osf.io/zmt78 for R package versions).

4 Results

Participant demographics. Between March 2021 and November 2022, 280 students who took part in a workshop consented to have their data used for research. The data were derived from 62 workshops conducted across six academic terms. The median number of participants in a workshop was 5 (interquartile range = 4 to 7, max = 17). 57% identified as women, 39% as men, 1.7% (n=5) as non-binary, 1.4% (n=4) preferred to self-describe, and 1% (n=2) preferred not to disclose. Participants were a mix of PhD (40%), undergraduate (35%) and master’s students (25%). In terms of age, 44% were 18-23 years old, 38% were 24-29, 12% 30-35, and 7% 36 or older.
Device usage. 98% of participants used a smartphone (62% iPhone, 38% Android), 99% a laptop and/or desktop computer (Mac: 51%, Windows: 46%, Linux: 4%), and 31% a tablet. The modal self-estimated time spent per day actively using a laptop or desktop computer was 7-8 hours, smartphone 2-3 hours, and tablet 1-2 hours (distributions shown in Figure 3 a; the limitations of self-reported time are discussed in section 5.4).
Figure 3: Summary of participants’ time spent on digital devices (a) and their motivation for signing up to the workshop (b-d). The prompt for time spent was “In the past week, on average, approximately how much time PER DAY have you spent actively using your [computer/smartphone/tablet]?”. In terms of motivation for signing up (b), mental health concerns almost invariably included productivity concerns, but the inverse was not the case. In terms of device types (“On which of the following devices do you want to change, or be in better control of, your use?”), nearly all participants (93%) wanted to change how they use their smartphones, but only just over a third (37%) solely wanted to change their smartphone use.
Participant motivations. When asked in the opening survey about the kind of concerns motivating them to sign up to the workshop (Figure 3 b), 48% of participants chose ‘productivity’, and 49% chose both ‘productivity’ and ‘mental health’ (only 3% chose ‘mental health’ on its own). 36% of participants found it ‘moderately important’ to change or be in better control of their use, 44% found it ‘very important’, and 15% found it ‘extremely important’.
Strategy selection. At the follow-up, the median number of strategies that respondents had tried was 3 (IQR = 2 to 5, min = 0, max = 14; only one respondent had not tried anything). The summary statistics of number of strategies tried were nearly identical for 1-month and 3-month follow-ups, except for the max value, which was 10 for the 1-month follow-up. This suggests most experimentation happened in the first month after the workshop, which was supported by the interview data. There was no correlation between the number of participants in a workshop and the number of strategies tried (r = -.02, p = .77). At the follow-up, 95% of respondents still used at least one of the strategies they had tried, and the median number of strategies still used was 2 (IQR = 2 to 4, min = 0, max = 10). The number of strategies still used was slightly higher for 1-month than for 3-month follow-ups (1-month: median = 3, IQR = 2 to 6, min = 0, max = 8; 3-month: median = 2, IQR = 2 to 4, min = 0, max = 10).
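As a minimal illustration of how these descriptives can be computed (this is not the authors’ script; the data frame `fu` and its column names are hypothetical):

```r
# Illustrative sketch; `fu` is a hypothetical data frame with one row per
# follow-up respondent and columns `n_tried`, `n_still_used`, `workshop_size`.
median(fu$n_tried)                        # reported: 3
quantile(fu$n_tried, c(.25, .75))         # reported IQR: 2 to 5
mean(fu$n_still_used >= 1)                # reported: 95% still used >= 1 strategy

# Pearson correlation between workshop size and number of strategies tried
cor.test(fu$workshop_size, fu$n_tried)    # reported: r = -.02, p = .77
```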
As shown in Figure 5, strategies were typically applied on a specific type of device. For example, grey scale was almost exclusively applied on smartphone, and hiding distracting features on websites was mainly applied on a computer. However, most participants (79%) tried strategies on multiple devices, usually by applying different strategies on different devices (e.g., Focus in bursts with a timer on computer, and Limit notifications on smartphone). 21% of the time, the same strategy was applied on multiple devices, most commonly on both smartphone and laptop computer. The strategy most likely to be applied on multiple devices was Understand how you use your devices, which was applied across multiple devices 35% of the time. 13% of respondents applied strategies only on smartphone; 8% applied them only on a computer.

4.1 RQ1: How do students want to change their use of digital devices?

In terms of device types, 93% of participants said in the opening survey that they wanted to “change, or be more in control of”, their smartphone use. However, only a little more than one third (37%) were solely concerned with their smartphone use. 63% of participants wanted to change their use of other devices, mainly laptop / desktop computers (59%), with a small minority (7%) mentioning tablets (Figure 3 c).
The final reflection prompt, after participants had written down their concerns, triggers, and what they had previously tried, was “Imagine your use of digital devices is exactly as you want it to be. What does that look like? (Be specific about context, time of day, apps...)”. We captured participants’ responses to this question with two higher-level goals, alongside three more specific goals that we interpreted as means of achieving the higher-level goals. The goals were closely related to participants’ concerns, and the triggers that drove them, a summary of which is included in the Appendix, section D.

4.1.1 Higher-level goals.

Use devices intentionally and regain a sense of control (66% of participants). Participants commonly described that they wanted to use their devices more intentionally (“Actively using digital devices for a purpose. Not mindlessly scrolling”, P14). Intentional use included feeling more in control, being able to sustain focus on a specific task (especially tasks related to working / studying), and being comfortable leaving one’s devices elsewhere for a bit (“don’t feel the need to constantly check my phone and can e.g. sit through a class without being distracted”, P108). This goal mirrored participants’ common concern about excessively checking devices, getting ‘sucked in’, and feeling unable to stop use once started (section D.1).
Importantly, the prominence of this goal also suggested that the Brief Digital Self-Control Scale was a meaningful quantitative measure to assess if the workshop was helpful.
Make device use fit into life in a balanced way (19%). Participants wanted to achieve a balance in which their use of social media and other routinely ‘distracting’ functionality had a natural place in their daily life without disrupting work, hobbies, or in-person socialising (“a better ratio of ‘doing things that are important’ and ‘doing things not important’”, P120; “I spend more time in a day reading books than I do on social media”, P123). This goal mirrored a general concern expressed by almost every participant (95%), namely that digital device use disproportionately diverted their time and attention away from personally meaningful activities, thereby interfering with their ability to be productive or focus on their hobbies and social life.

4.1.2 Specific goals.

Having clear boundaries around time and context of use (63%). More than half of participants expressed goals about confining device use to specific times and contexts. This particularly related to boundaries between work and leisure, where participants wanted distraction-free work sessions, and clear, dedicated times for the kinds of use they believed might distract from their productivity (“I focus on work in the mornings and don’t check email or social media before noon”, P136). Participants commonly wished to limit their use specifically in the mornings and/or evenings (“not actively using or checking my digital devices an hour before bed and before I have gotten ready in the morning”, P59).
Limit amount or frequency of use (34%). Participants often described wanting to limit the amount of time they spent on their devices overall, in specific functionality, or in a specific session (“Using my phone for less than 1 hour a day”, P49; “never binge for more than 2 hours”, P50). Similarly, they often described wanting to limit the frequency with which they checked their devices or a specific functionality (“check phone only every 3 hours after successful work session”, P26; “only checking emails once a day”, P12).
Having a healthier digital environment where self-regulation is easier (14%). Participants also wanted to change digital environments to make them more on ‘their side’ (“a device that respects me”, P7). This sometimes took the form of wanting to get rid of specific apps or services altogether (“Permanent deletion of instagram and tiktok - these provide me with very little joy”, P145; “No insta/FB or the like - but diff when you want to stay in touch with friends/family/work”, P68). However, it was also expressed as goals to change within-service environments. This could either be within the constraints of a service’s ordinary functionality (“On Instagram, I follow accounts that make me feel happy”, P100), or wishing to more directly change features that tempted them to behave in unintended ways (“bypass the way that apps are designed to make me addicted and keep on going even though it won’t be good for me”, P84).
When comparing goals between participants who wanted to change smartphone use only, and those who wanted to change use of multiple devices, the emphasis varied only slightly: keeping clear boundaries around time and context of use was slightly more prevalent for participants wanting to change use of multiple devices (67% vs 54% of participants); limiting amount or frequency of use was slightly more prevalent for participants wanting to change only smartphone use (40% vs 28%).

4.2 RQ2: Can an intervention that combines reflection on personal goals for change with self-selection of relevant DSCTs improve digital self-control?

Figure 4: Quantitative summary of the workshop’s usefulness as assessed by the end of the workshop (a-b), and at the follow-up (c-d). The workshop’s main elements were overwhelmingly perceived as ‘completely’ or ‘mostly’ useful (a), and participants thought the workshop provided good solutions to their concerns about digital device use (b). Respondents to the follow-up survey showed a large increase in digital self-control, d = 0.93 (c). When assigning non-respondents the same digital self-control score as in the opening survey, the estimated effect size was d = 0.56. There was also a statistically significant reduction in self-reported time on laptop computer (but not on desktop), smartphone, and tablet (d).
Among participants who filled in the follow-up survey (52% of total), we observed a large increase in self-reported digital self-control, compared to before the workshop (Cohen’s d = 0.93, mean increase = 7.9 points, scale ranges from 12 to 60, t(145) = 11.2, p < .00001; Figure 4 c). The effect size was larger for follow-ups after 1 month (d = 1.13, n = 36) than after 3 months (d = 0.87, n = 110).
As is the case for all open trials, the increase in digital self-control scores could be influenced by a number of factors, including response bias (students who find the workshop more useful might be more likely to respond) and regression to the mean (students might attend a workshop at the point in time where they struggle the most with digital distraction). When adjusting for response bias by assigning non-respondents the same score as in their opening survey (‘last observation carried forward’ [33]), the estimated increase in digital self-control remained medium-sized, d = 0.56 (t(279) = 9.4, p < .00001).
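For concreteness, here is a minimal sketch of these two analyses in R (not the authors’ script; the data frame `d` and its columns are hypothetical names). Note that with Cohen’s d computed as the mean difference divided by the standard deviation of the differences, d equals t/√n, consistent with the values reported above.

```r
# Minimal sketch; `d` is a hypothetical data frame with columns `dsc_pre`
# and `dsc_post` (12-60 scale scores; `dsc_post` is NA for non-respondents).

# Complete-case analysis among follow-up respondents
resp  <- subset(d, !is.na(dsc_post))
t.test(resp$dsc_post, resp$dsc_pre, paired = TRUE)  # reported: t(145) = 11.2
diffs <- resp$dsc_post - resp$dsc_pre
mean(diffs) / sd(diffs)                             # paired Cohen's d (= t / sqrt(n)); reported: 0.93

# Sensitivity check: 'last observation carried forward' (LOCF), i.e.,
# non-respondents keep their opening-survey score (difference = 0)
locf      <- ifelse(is.na(d$dsc_post), d$dsc_pre, d$dsc_post)
diffs_all <- locf - d$dsc_pre
t.test(locf, d$dsc_pre, paired = TRUE)              # reported: t(279) = 9.4
mean(diffs_all) / sd(diffs_all)                     # reported: d = 0.56
```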
Moreover, the quantitative data suggested that discovering appropriate DSCTs, rather than simply regression to the mean, was related to digital self-control improvement: among participants who had found at most one ‘very useful’ strategy (n = 108), the effect size of the digital self-control increase was d = 0.84. However, the effect size was substantially larger (d = 1.21) among participants who had managed to find two or more ‘very useful’ strategies (n = 38). Number of ‘very useful’ strategies discovered was unrelated to the number of participants in a workshop (r = -0.002, p = .98), but strongly correlated to the number of strategies participants had tried (r = .53, p < .001).
Finally, as indicated in Figure 4 d, participants’ self-reported daily time spent declined on laptop computer (r = .29, p = .0016, median difference = -60 minutes, interquartile range of differences = -120 to 60 minutes), smartphone (r = .29, p = .004, median difference = 0, IQR = -60 to 45 minutes), and tablet (r = .38, p = .04, median difference = -25 minutes, IQR = -60 to 0 minutes), but not on desktop computer (p = .9).
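The paper does not state which test produced these statistics, but paired medians/IQRs with an r effect size are consistent with a Wilcoxon signed-rank test where r = Z/√N. A minimal sketch under that assumption, for one device type (`time_pre` and `time_post` are hypothetical paired vectors of self-estimated daily minutes):

```r
# Minimal sketch, ASSUMING a Wilcoxon signed-rank test was used.
wt <- wilcox.test(time_post, time_pre, paired = TRUE, exact = FALSE)

# Effect size r = Z / sqrt(N), recovering |Z| from the two-sided p-value
z <- qnorm(wt$p.value / 2, lower.tail = FALSE)
r <- z / sqrt(length(time_pre))

median(time_post - time_pre)                 # e.g., reported -60 minutes for laptop
quantile(time_post - time_pre, c(.25, .75))  # IQR of differences
```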

4.2.1 Qualitative themes.

Corroborating the quantitative data, participants’ qualitative reports in the interviews and surveys described three themes of positive changes they had noticed after the workshop: First, participants described being more aware and mindful of their device use (“I noticed that I’m more aware of how digital devices distract me. I built a habit to stop and think about what I’m going to do on my phone and laptop before diving into apps and emails and webpages”, P59). Second, participants described feeling more empowered and in control, because the workshop provided them with specific tools they could apply when needed (“I finally felt that I have the tools to kind of control this, these urges that are coming on and my habits that I consider unhealthy”, P42). Third, participants said they had managed to make their device use better balanced with other valued activities (“I still have a lot of progress to do, but I have been able to reconnect with reading a bit more since the workshop, which has been good for me”, P78).
Participants also pointed out that the workshop was not a panacea: echoing the quantitative finding that the effect size was larger after 1-month than 3-month follow-ups, some participants noticed improvements after the workshop, when they felt motivated to act and try out different tools, but said that the effect faded over time (“In the beginning it was much easier to control my use of digital devices. I felt in control of it again, and felt that it was an important step in the right direction (...) but I’ve let it slip since”, P42, 3-month follow-up).

4.2.2 Active elements of the workshop.

The interview data suggested that a basic benefit of the workshop was that it helped participants set aside the time to reflect and get motivated to act (“sometimes it’s just somebody forcing you to sit down and take stock which is really what you need (...) like, ‘you know, you don’t like being distracted. You know you want to do something about it. So right now, you’re required for 30 minutes to actually think about this”’, P62). An important part of the motivational benefit was that the workshop helped participants see and feel that they were not the only ones struggling to control their digital device use (“just seeing other people in the workshop and knowing that, yeah, I’m not alone in this digital disaster right now”, P59).
In terms of more specific elements, the reflection, card sorting, and website were all overwhelmingly found ‘completely’ or ‘mostly’ useful according to the exit survey (Figure 4 a). In the interviews, participants shared that the reflection helped clarify their challenges and initiate an active exploration process (“I think just by writing down what the problem was that made me sort of, you know, a small like light bulb moment which made you go, oh yeah, that is what the problem is”, P32). Thus, some participants described that their reflection process continued after the workshop, as they explored different ways to address their challenges (“the reflection in the workshop helped me begin to identify stuff. But then afterwards, like, I kind of built on it and spent a lot more time kind of thinking about it”, P11).
Second, the card sorting of strategies, combined with the website’s provision of specific DSCTs to apply them, turned the abstract challenge of digital self-control into an actionable problem, with concrete steps to follow (“It can feel very, like, out of my control. Like, there’s nothing I can really do about it. Having those different kind of things, and especially split up very clearly into the different ways of managing it, I think was very useful in terms like, ’Oh, there are things that I can do”’, P182). Thus, participants often mentioned that the workshop introduced them to tools they were unaware of and unlikely to have discovered on their own (“YouTube was my weak spot and the plug-in I installed in the workshop really helped me cut that out (...) I would otherwise never have thought of searching for something like that”, P10). Similarly, when asked in the exit survey if they had found strategies that seemed like good solutions to their challenges, the median response on a scale from 1 (‘not at all’) to 5 (‘very much’) was 4 (44% scored 5, 48% scored 4, 6% scored 3; Figure 4 b). Moreover, 74% felt ‘moderately confident’ the strategies they decided to try would be helpful (12% felt ‘highly confident’, and 13% ‘slightly confident’).

4.3 RQ3: How much do students’ preferences for different types of digital self-control tools vary?

Figure 5: The number of participants who tried each strategy, how useful they found them, which device(s) they applied them on, and whether they were still using them at the time when they filled in the follow-up survey.
Participants’ assessment of the usefulness of the strategies is shown in Figure 5. The strategies were not created equal: the proportion of respondents who rated a strategy ‘moderately useful’ or higher ranged from 26% (Replace distractions on the web with a to-do list) to 84% (Limit notifications), and the proportion of specifically ‘very useful’ ratings ranged from 5% (Replace distractions on the web with a to-do list) to 55% (Hide distracting features on websites). Conversely, the proportion of ‘not at all useful’ ratings ranged from 0% (Reduce your device to the tools you need) to 37% (Replace distractions on the web with a to-do list).
The top 6 strategies all seemed generally useful, receiving ‘moderately’ to ‘very useful’ ratings from at least 76% of respondents, and ‘not at all useful’ from at most 10%. All but one of these strategies were from the blocking or removing distractions category. The exception was Focus in bursts with a timer (which ranked second on proportion of ‘very useful’ ratings, at 49%).
The remaining strategies saw greater amounts of variation. For example, grey scale was the second-most frequently tried strategy, but had a somewhat uniform distribution of usefulness ratings and the second-lowest retention rate (54%). Importantly, no strategy was outright rejected: even the ‘worst’ strategy, ‘Replace distractions on the web with a to-do list’, which was found ‘not at all useful’ by 37%, was still considered at least ‘moderately useful’ by 26% of those who tried it.
We also observed a high degree of variation in how participants combined strategies. Among the 78% of respondents who still used 2 or more strategies at the follow-up (n = 114), 91% of the specific combinations of strategies were unique to a single participant. Thus, even the most frequent unique combination (Block distracting websites or apps + Hide distracting features on websites) was applied by just 4 respondents.
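A minimal sketch of this combination analysis (illustrative only; `use` is a hypothetical data frame with columns `participant` and `strategy`, one row per strategy still used at follow-up):

```r
# Build each participant's combination as a canonical, sorted string
combos <- tapply(as.character(use$strategy), use$participant,
                 function(s) paste(sort(unique(s)), collapse = " + "))

# Restrict to respondents still using two or more strategies
multi <- combos[lengths(strsplit(combos, " \\+ ")) >= 2]

tab <- table(multi)
sum(tab == 1) / length(tab)      # proportion of combinations unique to one participant (reported: 91%)
sort(tab, decreasing = TRUE)[1]  # most frequent combination and its count (reported: 4)
```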

4.4 Design considerations for supporting digital self-control via reflection and self-experimentation

Based on the survey and interview data from the workshop, we identified several design considerations for researchers and practitioners working in the domain of digital self-control.
Illustrate full capability range of DSCTs to empower participants to identify tailored solutions. Compared to other domains of behaviour change, DSCTs have a unique power to not simply remind or recommend actions, but also to change the digital environment (an exercise app cannot support you to take the stairs by removing the elevator at your workplace, but a DSCT can hide an unwanted newsfeed [55]). However, most people’s considerations for changing digital device use are anchored by simple all-or-nothing approaches (blocking an app entirely or not at all) [111], a notion reinforced by the limitations of pre-installed screen time tools.
Therefore, a key benefit of the workshop, which participants repeatedly stressed, was that it made them aware of the range of actionable ways in which they could take back control. This empowered them to search for ways to implement a strategy tailored to their situation, even if the specific solution they needed was not among the DSCTs included on the workshop website. For example, knowing that it is possible to use browser extensions to modify websites, P11 deleted the Instagram app off their Android phone and used an extension to remove the search area when using Instagram in the browser. This effectively helped them use Instagram for messaging without getting distracted, but was not among the tools on the website.
Demonstrating the range of what is possible is therefore key, but complexity should be carefully managed to avoid overwhelming participants. For example, P198 felt that the number of strategies included was “too many. I think you should have done eight”. Others wished they had more time to review the options before being asked to commit. Our workshop implementation attempted to reduce cognitive load by using layers of abstraction, combined with decision steps that gradually honed in on specific options. For example, we grouped DSCTs into 13 strategies, which themselves were grouped into just 4 main categories; the card sorting task supported initial reduction of options; and we limited the number of ‘commitment cards’ on the Miro board to just two for each participant, to help funnel towards action. While not perfect, this approach seemed sufficient to manage information overload for most (“the strategies part and definitely, I think, the commitment part (...) those two together sort of symbiotically were high impact for me, the commitment specifically because it forced me to narrow down my choices”, P118).
Scaffold self-experimentation in a way that is sensitive to how DSCTs differ in attrition challenges. Researchers have suggested that DSCTs face a general challenge of attrition over time [57, 58, 84]. Our interview data suggest, however, that the nature of this challenge differs between tools. For tools that require active initiation, like focusing in bursts with a timer, participants described the usual challenges of sustaining use over time. However, strategies that were set-it-and-forget-it could benefit from inertia (“the list app replacing the YouTube newsfeed. That’s not something I have to go up and change again every morning, which made it very easy to stick with”, P118) and help them habituate to a desired state of the world (“it doesn’t even cross my mind anymore when I use these apps, it’s like I just don’t have a newsfeed anymore”, P86). This benefit seemed to apply specifically to the aspects of a set-it-and-forget-it intervention that involved removing distracting elements (“The motivational quotes themselves weren’t particularly helpful or inspirational, but the fact that they covered my Facebook and YouTube feeds was far more useful.”, P237).
From our participants’ descriptions of the post-workshop period, we conjecture that strategies which involve active initiation should be tried one at a time, to avoid feeling overwhelmed and to allow time for a new habit to form (e.g., P59 applied multiple things at once, but described in the interview that this approach had led him to not give each a proper try). We also note that whereas some DSCTs are naturally distinct in this respect, many can be used in both an actively initiated and a set-it-and-forget-it way. For example, tools for blocking distracting websites or apps can often be used either actively to initiate a delineated blocking session, or be prospectively programmed to block distractions during particular hours and/or days of the week. Our data suggest that people in the set-up phase should be encouraged to consider set-it-and-forget-it options where relevant (e.g., setting up a weekly blocking schedule), to make inertia work to their advantage.
Prioritise self-report measures that capture individual differences in goals. Most participants wanted to change both their smartphone and laptop use, and had varying goals including gaining a sense of control, establishing clearer usage boundaries, and reducing the duration or frequency of use in various ways. Instead of introducing multiple distinct measures to capture this wide range of goals, which would make surveys overly long, we included a broader measure applicable across multiple goals and devices. Considering the range of goals our participants reported, the self-control scale we adapted from basic psychological research seemed a better fit than commonly used alternatives in DSCT research, which tend to focus on a single device (e.g., smartphone addiction scale [59]), a specific system (e.g., the NASA-TLX task load scale [35]), and/or a narrower goal (e.g., cognitive absorption [6]). Our study is a first use of this type of scale in DSCT research. We expect this or similar scale adaptations of standard self-regulation measures from psychology to be useful for other researchers in this field looking for alternatives to ‘addiction’ measures [71].
To interpret the scores from this scale, we suggest triangulating with other data. In our case, we used quantitative and qualitative survey data on the usefulness of specific DSCTs, as well as self-reported time spent on devices. Although our surveys did not ask participants to provide any objective measures (e.g., a screenshot of their devices’ screen time report), including such measures may aid researchers in interpreting the self-report data. For instance, the decline in subjective time spent on devices at the follow-up might not correspond to actual time spent [27, 28, 99].
However, we strongly suggest that objective usage measures such as screen time be deployed mainly in the service of the subjective outcome measures. One reason is that assessing DSCTs on objective outcomes such as screen time may lead to overly narrow conclusions; interventions that block devices from being used altogether may be highly effective at reducing screen time, but interfere with intended use. Moreover, there are no universally accepted guidelines as to what constitutes ‘healthy’ screen time for adults [113], so we instead recommend prioritising self-assessments of whether device use is aligned with one’s goals. Indeed, even if we did have objective standards for how adults ‘should’ be using digital devices, there would be strong reasons to keep a focus on subjective sense of control and self-efficacy within digital environments [70]: self-determination theory posits autonomy as a basic human need, and common principles in HCI design guidelines encourage support of user control as a goal in its own right [22, 68, 98, 102].

5 Discussion

To sum up, a little more than a third of participants wanted to change their smartphone use only, while the rest also wanted to change their use of other devices (in particular, laptop/desktop computers). Their overall goals related to using devices more intentionally and achieving a better balance in the way they spent time. They wanted to achieve this via clearer boundaries around time and context of use, a reduced amount or frequency of use, and digital environments that supported self-control.
1-3 months after the workshop, the median participant had tried three different digital self-control strategies using DSCTs and was still using two, typically in unique combinations across multiple devices (mainly smartphone + computer). The follow-up suggested a significant improvement in digital self-control, with a large effect size in our open trial (Cohen’s d = 0.93). Moreover, participants’ self-reported time spent declined on laptop computer, smartphone, and tablet. The qualitative data corroborated the quantitative findings, with participants reporting feeling more aware and mindful of their device use, and more empowered and in control; and that their use had become better balanced with other valued activities in their life.
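As a point of reference for readers, one common way to compute a pre-post effect size of this kind (our illustration; the paper does not state here which estimator was used) is to divide the mean change by the pooled standard deviation:

$$ d \;=\; \frac{\bar{x}_{\text{follow-up}} - \bar{x}_{\text{baseline}}}{s_{\text{pooled}}}, \qquad s_{\text{pooled}} \;=\; \sqrt{\frac{s_{\text{baseline}}^{2} + s_{\text{follow-up}}^{2}}{2}} $$

By Cohen's conventional benchmarks (0.2 small, 0.5 medium, 0.8 large), d = 0.93 counts as a large effect.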
The 13 strategies varied greatly in average usefulness: DSCTs for blocking or hiding distractions, as well as focus timers, were generally useful. Others, such as ‘go grey scale’, exhibited a high degree of interpersonal variation. In the following, we discuss implications for future work leveraging DSCTs to empower end users.

5.1 Confirming the effectiveness of interventions to empower people to identify DSCTs that support their personal needs

HabitLab demonstrated that applying a range of relevant DSCTs cross-device (Chrome on PC + Android) can effectively reduce time spent on websites and apps that a person wishes to use less [56]. Our study supplements this work (which was based on objective screen time measures) with self-report data, and confirms the potential to support digital wellbeing via bespoke, cross-device applications of DSCTs [82].
Our findings were generated by triangulating quantitative and qualitative self-report data from a relatively large sample of students. However, as is the case for most studies of DSCTs [69, 95], our study did not have a control group. To assess the magnitude of the potential benefits, follow-up studies with, at minimum, a waitlist control group are needed [61, 107]. Such work could also use automated rather than self-reported screen time measures to obtain a more accurate indication of usage [27, 28, 99].
Moreover, as in previous studies of DSCTs, our data suggested that the effect of the workshop waned over time. Bringing about longitudinal behavioural change is a general challenge for behaviour change interventions [56, 90], but long-term evaluations are rare in DSCT research [95]. The practical approach of the present work, where we attempted to address a local challenge in a student population actively seeking out the workshop, may represent an opportunity in this respect: in the interviews, participants commonly requested that the workshop be expanded with recurrent group support (with suggestions ranging from weekly to quarterly follow-ups) to help them stay motivated and overcome technical challenges. Deploying regular workshops and/or ‘booster’ sessions might be an effective way to undertake longitudinal studies of self-experimentation and habit formation with DSCTs [36].
Finally, we designed the workshop to holistically integrate components that previous research suggested would be effective for behaviour change [12, 30, 62, 86, 103]. The qualitative data suggested that the components worked well together: the group setting normalised digital self-control struggles, the reflection clarified participants’ goals, and the exploration and application of DSCTs provided actionable steps. However, controlled studies could assess the relative contributions of different elements and facilitate iterative development of variant interventions [49]. For example, group support is often a crucial factor in behaviour change, and in our study it helped normalise digital self-control struggles. However, a self-guided version might be valuable for people who are not willing, or able, to take part in a live group setting, and/or for stakeholders who lack the resources to train a facilitator. How might the workshop elements be effectively delivered in a form that does not require a live facilitator and/or group context? Investigating such questions will be of high practical relevance for stakeholders such as university counselling services seeking cost- and time-effective interventions to support digital wellbeing [80].

5.2 Understanding variation in DSCT usefulness to guide effective interventions

Comparing how different DSCTs fare in practical settings may provide important data for crafting interventions that effectively guide people towards the strategies most likely to be useful [25]. To this end, our work provides one of the first direct comparisons of subjective usefulness of multiple DSCTs.
One takeaway is that hiding distracting features on websites is one of the most consistently useful DSCT strategies (highest proportion of ‘very useful’ ratings in our study). This confirms suggestions from previous work, which has encouraged researchers to study ways to provide ‘internal support’, i.e., changing problematic user interfaces directly, as opposed to simply providing ‘external support’ such as usage tracking or blocking [68, 74, 117]. Similarly, while not directly comparable, our ranking of strategies’ usefulness showed similarities to Kovacs et al. [58]’s comparison of 12 different interventions’ influence on time spent on Facebook. There, the five most effective interventions all related to blocking or removing distractions, from force-closing the tab after 60 seconds to removing the newsfeed or comments.
Viewed through a theoretical lens, the patterns we observe broadly align with the ‘process model’ of self-control widely used in psychology [26, 31]. Our overall pattern, in which strategies from the ‘block or remove distractions’ category were most likely to be found useful and strategies within the ‘goal reminders’ category least likely, mirrors the finding that self-control strategies which help people avoid exposure to temptation tend to be more effective than those which try to overcome unwanted urges only after they have arisen [24]. Intervention designers might wish to keep this in mind.
However, our data also suggest that additional theoretical lenses, such as the dual systems model previously used in DSCT research [48, 73, 74], are important: focus in bursts with a timer had the second-highest proportion of ‘very useful’ ratings, even though it was not about reducing exposure to temptation. Rather, this strategy seemed to help people maintain control by breaking up their focus into smaller segments that feel manageable and are easier to get started with. From a dual-systems perspective, this can be analysed as a way to support conscious System 2 control, by increasing confidence in one’s own ability to maintain focus [73].
We encourage further research to directly compare the practical usefulness of different DSCTs, to better understand their relative challenges and opportunities, and how each may fit within a unique personal device ecosystem and behaviour change process [73, 91]. In turn, this may help researchers generate recommender systems to support self-experimentation with DSCTs (e.g., ‘People who reported challenges similar to yours have found this useful’), for which we hope our open dataset can provide a useful starting point.

5.3 Encouraging digital infrastructures that enable DSCTs to have large-scale public impact

Overall, our participants expressed three more practical goals: keeping boundaries around time and context of use, reducing amount and frequency of use, and changing digital environments such that self-regulation is easier in the first place. Support for the first two goals is well underway: most DSCTs focus on limiting time spent and/or frequency of use [95], and support for scheduled or geofenced notification limiting or app blocking is both possible and an active area of research [3, 46, 104].
As mentioned, however, research on DSCTs that provide ‘internal support’ and directly reorganise problematic within-service features has been more sporadic, even though this is one of the most potent strategies at our disposal [68, 74, 95, 117]. We want to draw attention to a major infrastructural challenge in this respect: the ‘hide distracting features’ strategy, with which people can directly address problematic aspects of their digital environment, was almost exclusively applied on laptop or desktop computers. This was not accidental. In principle, participants could apply this strategy on their smartphones. But on mobile, people are used to accessing services such as social media via apps, browser extensions are less well supported (Chrome does not support extensions on Android), and some services (such as Snapchat) are not accessible in a web browser at all. Meanwhile, no simple solution currently exists for users to adjust user interfaces in mobile apps according to their personal needs, beyond what the developer chooses to provide by design [53, 54].
Lukoff et al. [67] recently demonstrated the benefits that this might bring: changing the YouTube mobile app such that users can turn recommendations on or off increases sense of agency, satisfaction, and goal alignment. However, it requires substantial technical effort to re-implement even a limited version of the YouTube app for a single research study. Other recent work has explored technical prototypes that allow end-users to change apps by themselves, and discussed the risks and opportunities if regulators introduced a ‘right to repair’ for mobile apps (i.e., a right for end-users to exercise granular control over their user interfaces) [53]. Our findings suggest that pursuing this avenue of work could be highly valuable. We encourage researchers and regulators alike to discuss how an equivalent to browser extensions might be created for mobile apps, which could help address an infrastructural bottleneck for large-scale impact of DSCTs.

5.4 Limitations and future work

Sampling: All participants actively chose to sign up for the workshop. Hence, the distributions of challenges and goals that we observed, and of participants’ choices and usefulness assessments of DSCTs, may not generalise to students, or other target populations, at large.
Follow-up response rate: 52% of participants filled in the follow-up survey. We conservatively adjusted for response bias by carrying forward the last observed digital self-control scores from non-respondents [33], but future work may consider how to increase response rate (e.g., by collecting data at recurrent in-person group support sessions, rather than solely in follow-up emails).
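For clarity, the last-observation-carried-forward adjustment described above can be expressed in a few lines. The sketch below uses a hypothetical record shape of our own; it is not the authors' analysis code.

```typescript
// Last-observation-carried-forward (LOCF) imputation: non-respondents are
// conservatively assumed to be unchanged, so their baseline score is carried
// forward as their follow-up value. The record shape is hypothetical.
interface ParticipantScores {
  id: string;
  baseline: number;        // digital self-control score before the workshop
  followUp: number | null; // null if the participant did not respond
}

function imputeLocf(data: ParticipantScores[]): ParticipantScores[] {
  return data.map((p) =>
    p.followUp === null ? { ...p, followUp: p.baseline } : p
  );
}
```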
Subjective outcome measures: Our follow-up data included subjective measures only, collected via surveys and interviews. Self-reported screen time, in particular, can correlate poorly with actual time spent [27, 28, 99]. We do believe subjective sense of time spent is valuable in its own right, not merely as a flawed proxy for objective time spent: if a person has a goal to reduce time spent on their devices, it is helpful to know whether a subjective increase in digital self-control is accompanied by a subjective decrease in time spent. Nevertheless, in the best of all worlds, researchers may wish to collect both objective and subjective data, for example by asking survey respondents to provide numbers from the Screen Time or Digital Wellbeing features on their devices, which could be triangulated with the qualitative data.
Open trial: Our study did not include a control condition, and so we were unable to quantify the influence of factors like regression to the mean or response bias. Future work could more accurately estimate the benefits of the intervention by comparing with a waitlist condition, and/or partial or modified versions.
Selection of DSCTs: Our grouping of DSCTs into 13 strategies represented our best attempt at summarising design features in current tools, in a way that was manageable for participants to consider in a card sorting exercise. The DSCTs could be grouped in other ways, and different specific tools could have been chosen for the workshop website. The tools we included in the present version of the workshop can be found on the Open Science Framework (osf.io/zmt78) and in Appendix C.

6 Conclusion

Digital self-control tools can help people take back control of their device use. To facilitate real-world impact of these tools, we attempted to address current research-to-practice gaps: existing research on DSCTs has focused on how a single intervention on a single device affects screen time, but the practical challenge end-users face is to browse multiple DSCTs and apply a bespoke combination in a multi-device context. By collaborating with the University of Oxford counselling service, we explored how to empower students to achieve their personal device use goals by combining reflection and self-experimentation with relevant DSCTs across their personal devices. Our results suggest that this approach can not only support digital self-control, but also generate data at scale for evaluating different DSCTs and anchor outcome measures in a deep understanding of users’ goals. We hope the materials, data, and design considerations generated by our work may inspire further research and practical interventions to give people the tools they need to thrive in their digital life.

Acknowledgments

This work was funded by an Oxford Visiting Fellowship from the Carlsberg Foundation [grant number CF20-0678] to UL. UL also acknowledges kind support from the Lucy Halsall Fund at Linacre College, University of Oxford, and the Copenhagen Center for Social Data Science’s DISTRACT project [‘The Political Economy of Distraction in Digitized Denmark’, supported by the H2020 European Research Council, grant number 834540].

A Recruitment Text

The recruitment materials contained variations of the following:
Smartphones, computers, and tablets are powerful and essential tools for us to study, socialise, and connect to the outside world. But they can also be a source of endless distraction that undermines our capacity to focus and leads to long stretches of unproductive or unrewarding time. If you want to take back control, this workshop can help! You will be supported to: reflect on your relationship with digital devices; identify the role you want them to play in your life; and get support to make real, practical changes.
Up until the autumn of 2022, the recruitment materials included a reference to the pandemic, such as “During covid, you may have become more dependent on your digital devices than ever before”.

B The Brief Digital Self-Control Scale

See Table 2. All items are summed (after reverse-scored items have been reversed) for a single, overall score on current digital self-control ability (min: 12, max: 60).
Item # | Final version of BDSCS | Response options | Scoring
 | Which of the following digital devices do you regularly use? | checkbox: smartphone, tablet, laptop computer, desktop computer, smartwatch, gaming console, internet-connected TV | 
 | Think about how you use your {devices the participant checked}. Please indicate how much each of the following statements reflects your experience using these digital devices in the past week: | | 
1 | In the past week, I was good at resisting temptation on my digital devices | 1 (not at all), 2, 3, 4, 5 (very much) | 
2 | In the past week, I had a hard time breaking bad habits related to my digital device use | as above | reversed
3 | In the past week, my digital devices made me lazy | as above | reversed
4 | In the past week, I used my digital devices at times when I shouldn’t have | as above | reversed
5 | In the past week, I followed distractions on my digital devices, if they were fun | as above | reversed
6 | Based on the past week, I wish I had more self-discipline in how I use my digital devices | as above | reversed
7 | Based on the past week, people would say that I have iron self-discipline over how I use my digital devices | as above | 
8 | In the past week, pleasure and fun on my digital devices sometimes kept me from getting work done | as above | reversed
9 | In the past week, my digital devices made it difficult for me to concentrate | as above | reversed
10 | In the past week, my digital devices distracted me from my long-term goals | as above | reversed
11 | In the past week, sometimes I couldn’t stop myself from doing something on my digital devices, even when I knew it was wrong | as above | reversed
12 | In the past week, I often acted without thinking on my digital devices | as above | reversed
Table 2: The Brief Digital Self-Control Scale
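To make the scoring rule above concrete, the following is a minimal sketch of how the BDSCS could be scored programmatically, assuming responses are supplied as an array of twelve 1-5 ratings in item order.

```typescript
// Scoring sketch for the Brief Digital Self-Control Scale (Table 2).
// Items 2-6 and 8-12 are reverse-scored on the 1-5 scale, then all twelve
// items are summed (range 12-60; higher = greater digital self-control).
const REVERSED_ITEMS = new Set([2, 3, 4, 5, 6, 8, 9, 10, 11, 12]);

function scoreBdscs(responses: number[]): number {
  if (responses.length !== 12) throw new Error('Expected 12 item responses');
  return responses.reduce((sum, raw, i) => {
    const item = i + 1; // items are numbered 1-12
    const scored = REVERSED_ITEMS.has(item) ? 6 - raw : raw; // reverse on a 1-5 scale
    return sum + scored;
  }, 0);
}
```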

C Digital Self-Control Strategies Included in the Workshop

See Table 3.
Strategy | Explanation | Tools
Block or remove distractions
Block distracting websites or apps | Blocking access to distracting functionality, typically on a session basis (e.g. ‘block YouTube the next 2 hours’) or a scheduled basis (e.g. blocking Facebook between 10am and 3pm on weekdays). | Cold Turkey Blocker, 1Focus, AppBlock, LeechBlock, Screen Time, Digital Wellbeing
Reduce your device to the tools you need | Blocking on an allow-list basis (i.e., choosing apps/websites to allow with all else blocked, e.g., reducing laptop to Word + Wikipedia for two hours), or deleting unnecessary apps. | Cold Turkey Blocker, Micromanager, 1Focus, Cold Turkey Writer, Screen Time
Hide distracting features on websites | Using browser extensions to customise distracting websites, for example by removing recommended videos on YouTube. | Newsfeed Eradicator, #BlockIt, Newsfeed toggle for Facebook, Facebook Demetricator, Twitter Demetricator, Unhook - Remove YouTube Recommended Videos, “No distractions” for YouTube, Inbox When Ready, Click to Remove Element, Remove HTML Elements
Use focus mode | Using app and system settings to reduce visual distraction, for example by using Microsoft Word’s Focus View to get a distraction-free interface. | Distraction Free Mode — Google Docs & Slides, MS Word Focus mode, Typora, JotterPad, Full screen mode
Limit notifications | Allowing only necessary notifications, at necessary times. For example, turning off notifications from gaming apps or scheduling Do Not Disturb. | Clean up which notifications you receive, Control when you receive notifications
Track yourself
Understand how you use your devices | Using tracking tools to see how much time is spent on one’s devices. For example, RescueTime provides a weekly overview of how time is spent on computer, categorised by ‘productive’ and ‘distracting’ use. | Digital Wellbeing, Screen Time, RescueTime, StayFree
Focus in bursts with a timer | Using countdown timers to focus for a specific amount of time (the ‘pomodoro’ technique). Numerous tools support it, e.g. menu bar tools that make it effortless to start a timer on one’s computer. | Be Focused, FocusMe, Marinara: Pomodoro Assistant, Pomodoro clock extension, Minimalist Pomodoro Timer
Keep your goals in mind
Put motivational quotes or to-do’s on new tabs | Using browser extensions to replace the content of newly opened browser tabs from the typical default of ‘favourite’ websites to content that instead serves as a reminder of one’s goals. | Todo Tab, Daily Motivation, Daily Motivational Tab, Motivation
Replace distractions on the web with a to-do list | Using browser extensions to replace distracting content on websites with reminders of one’s goals. For example, replacing recommended videos on YouTube with a to-do list. | Todobook
Redirect yourself away from distracting sites | Setting automatic redirects from one website to another. For example, getting automatically redirected to Wikipedia when trying to access Reddit (see the sketch after this table). | Timewarp, Nudge
Make your goals attractive
Set up reward or punishment | Changing incentives around use. For example, in the app Forest, one nurtures a virtual tree by not using one’s smartphone – or by not using specific websites – while a timer counts down. | Forest, Flora
Go grey scale | Turning one’s device (or just distracting apps) to greyscale when colour is not needed, to reduce visual distraction. | Grey scale
Move distracting apps out of sight | Rearranging apps (typically on smartphone) to make it more tedious to find and initiate use of distractions. | Moving apps off home screen
Write under time pressure | Using writing tools where one must continue working until a goal is reached (e.g. writing 200 words, or writing for 5 minutes), to force oneself into flow. If one stops for more than a few seconds, all is deleted. | The Most Dangerous Writing App
Table 3: The digital self-control strategies included in the workshop.
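As an example of how the ‘redirect yourself away from distracting sites’ strategy works mechanically, the sketch below maps distracting hosts to alternative destinations. It is a conceptual illustration with arbitrary example URLs, not the implementation of Timewarp or Nudge.

```typescript
// Conceptual sketch of the redirect strategy: navigations to distracting
// hosts are resolved to alternative destinations. In a real browser
// extension this logic would hook into navigation events.
const REDIRECTS: Record<string, string> = {
  'www.reddit.com': 'https://en.wikipedia.org/', // example pairing only
  'twitter.com': 'https://en.wikipedia.org/',    // example pairing only
};

function resolveNavigation(requested: string): string {
  const host = new URL(requested).hostname;
  return REDIRECTS[host] ?? requested; // unlisted hosts pass through unchanged
}

console.log(resolveNavigation('https://www.reddit.com/r/all')); // → Wikipedia
```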

D Participants’ Concerns About Their Digital Device Use, and the Triggers Driving Those Concerns

In the following, we summarise the themes we developed to capture participants’ reflections on the concerns that brought them to the workshop (prompt: “What concerns you about your use of the internet / your laptop / your phone?”), and which internal and external triggers they thought were the cause of those concerns (“What external triggers (e.g., notifications) and internal triggers (e.g., emotions) drive the uses you’re concerned about?”).

D.1 Concerns

Device use diverts time and attention away from personally meaningful activities (95% of participants). This was most commonly expressed in relation to productivity (68% of participants), but also in relation to other valued activities such as personal hobbies or social interactions (22%). Thus, participants were concerned about the amount of time they wasted on meaningless activities on their devices (34%), because it left less time available for other things (“Waste time that could be used writing thesis (...) prevents me from investing time in learning new skills”, P117). They were also concerned that the frequency with which they, e.g., checked social media, interrupted their focus on work and studying (“Disrupting my work; making it take longer than it should do, and it being of a lesser quality”, P47), as well as on social activities (“They prevent me from connecting to the present moment / to other people (in person)”, P98).
Feeling unable to control, and dependent on, device use (52% of participants). Participants described that they found themselves using or checking their devices excessively without awareness (“Finding myself browsing social media without thinking”, P142), and/or that they felt ‘sucked in’ and unable to stop using their devices (“Looking at YouTube starts with something for work/study and then you can go down a rabbit hole because of clickbait videos”, P68, “Once I start, I can’t stop”, P61). Many felt uneasy about the dependency and pervasiveness of digital devices in their lives (14%, “I feel like I cannot leave the house without my phone”, P31), and were worried that their behaviour over time made them addicted to stimulation (7%).
Device use negatively impacts mood or mental health (43%). Participants were concerned that the content on, in particular, social media or news negatively affected their mood (“the content of the social media platforms can sometimes make me anxious or in a ‘negative space’”, P273, “Social media makes me feel inadequate”, P14). They also expressed feeling overwhelmed by the sheer amount of content/notifications/emails/messages they had to respond to (“I always feel behind on messages & guilty [...] It makes me anxious that people can contact me at any time”, P190; “Too much information that is not relevant”, P67).
Device use harms sleep or physical health (31%). Finally, participants were concerned that device use disrupted their sleep (20%, “I spend a lot of time on YouTube before bed, and that has been taking away my scheduled sleep time”, P59). Similarly, time spent on digital devices might distract them from healthier physical activities (“It ends up disrupting my exercise, food and sleep schedule”, P48), or even negatively affect their eyesight or posture (“I’m worried about my eyes because I spend so much time looking at a screen”, P25).

D.2 Triggers

D.2.1 External triggers.

Receiving notifications and communication from others (48%). Participants commonly mentioned that receiving notifications from their apps or otherwise receiving communication led to disrupted focus and/or longer time spent on their devices than they intended (“getting notifications while I am trying to work!”, P29, “I like to keep my notifications clear - so I check every little one & then spend ages on that app.”, P14).
Ubiquitous, easy & persuasive digital distractions (34%). Participants felt their devices made it difficult to avoid procrastination, because potential distractions were always just “one click away” (P81). Moreover, many complained that potential distractions, such as social media, also contained design features that made it difficult to stay on track (“endless newsfeed = difficult to put limitations in place”, P63). Sometimes participants complained that institutional rules, such as mandatory two-factor authentication, increased their exposure to distractions (“the authentication request for library access - that is so ironic - I have to use my phone at the beginning of a time i’m meant to be focusing!”, P258).

D.2.2 Internal triggers.

Attempting to avoid, or remedy, a negative internal state (88%). The most prominent theme related to inner triggers was negative emotional states, such as boredom or anxiety, feeling overwhelmed or stressed, feeling sad or lonely, or simply feeling tired. When in these states, digital devices provided easy access to functionality that could be used as an escape (“Feeling lonely at uni, or overwhelmed emotionally. Wanting to distract from said emotions.”, P144). However, they were also used as an active remedy to those feelings (“I use my phone to access inspiration (thoughts and images) when I feel bored or unsure of myself”, P53).
Negative emotions such as boredom or restlessness often arose in response to working on difficult, tedious, or vaguely defined tasks (“When I feel stuck on a problem. I have had to tell myself before ‘you won’t find the answer on social media’”, P14). In these situations, alternative activities on their devices provided the route of least effort (“Not knowing what to do next in terms my of actual work and finding youtube/social media easier to deal with”, P142).
Using digital devices when feeling down could lead to negative spirals (“when i feel anxious i go on my phone to escape but that makes it worse”, P94). For example, device use driven by avoidance of emotions elicited by work would reinforce participants’ concern that their use of digital devices undermined their productivity and leave them feeling worse (“I waste time at work by checking my phone and feel very unfulfilled after”, P256).
Expectations of availability and of what one might be missing out on (38%). Participants commonly described their challenges as driven by others’, or their own, expectations of availability. That is, participants often gave examples of (real or imagined) pressure from others to be instantly available (“Feeling like if I don’t respond immediately to friends they will become distant”, P97), and how they in turn also wanted to be available in case someone needed them (“thinking people might need me and I won’t be there”, P82).
We included within this theme people’s broader expectations of what they might be missing out on via their devices. This was often mentioned as a driver of social media use, typically in relation to staying informed or involved with social activities (“I don’t want to miss out on things if my friends are making plans etc via social media”, P74). It was also related to a desire to stay up-to-date more broadly (“Fear of missing something interesting on Twitter - usually sport or politics related - leads to endless scrolling”, P34).
Spontaneous and habitual urges to check (15%). Finally, participants often reported that they found themselves using their devices in unwanted ways out of habit (“Habitual checking of email and social media accounts for new notifications”, P107). These habits were commonly described as happening without conscious awareness (“fingers automatically click to distractions when I’m not thinking”, P238).

Footnotes

1
In psychology, the term ‘self-regulation’ is used broadly about all processes by which we regulate behaviour in line with our goals, including via automatic habits. ‘Self-control’ is sometimes used more restrictively about instances of self-regulation that involve deliberate and conscious effort [34]. The terms are also often used synonymously, as we do in this paper: by ‘digital self-control’ (which may equally be referred to as ‘digital self-regulation’) we will refer generally to people’s ability to self-regulate their use of digital devices. Following psychological research on the strategies by which people can improve self-control [24, 25], we focus on ways in which people can use DSCTs as strategies for digital self-control. This includes use of DSCTs to facilitate conscious and deliberate control, for example by reminding oneself of one’s goals; as well as use of DSCTs to reduce the need for conscious control in the first place, for example by removing potential distractions from the user interface [73].
2
We condensed the main ways in which current DSCTs support self-control into 13 different types of interventions (e.g., ‘Block distracting websites or apps’) that we refer to as ‘strategies’.
3
Incidentally, this is similar to psychological research on self-control, where a 2018 review noted that “almost no research directly compares interventions from diverse traditions”, and called for “direct comparisons of the efficacy, scalability, and cost effectiveness of different approaches to reducing self-control failures” [25, p. 119].
4
We explored the longevity of benefits from the workshop by varying the length of time before the follow-up: in the first four academic terms (75% of participants), we followed up in the subsequent term (~3 months later; median time from when participants filled in the opening survey to when they filled in the follow-up survey = 87 days, IQR = 73 to 97 days; the gap was longest for students who participated in the term before the summer break). In the last two academic terms, we followed up within the same term, ~1 month after the workshop (median = 36 days, IQR = 32 to 40 days).
5
Response rate for 3-month follow-up: 53%. Response rate for 1-month follow-up: 51%.
6
Wilcoxon signed-rank test.
7
The follow-up survey’s inquiry on retention was limited to “Do you still use this strategy?” (yes/no), which provided only coarse information (e.g., a participant might answer ‘yes’ if they use a strategy once every three weeks). Therefore, we rely here mainly on participants’ qualitative descriptions of the post-workshop period in the interviews.
8
Some participants tried multiple things at once after the workshop and saw what stuck (“I did it all in one go because I thought if I don’t do it now, I’m never going to do it”, P179), whereas others experimented in an iterative fashion (e.g., P39 spent the first week only using time tracking to get an accurate picture of their current usage, and afterwards began to try specific interventions sequentially).

Supplemental Material

MP4 File - Video Presentation (with transcript)

References

[1]
[n. d.]. Unhook - Remove YouTube Recommended Videos and More. https://unhook.app/.
[2]
2023. Introduction to the Children’s Code. https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/childrens-information/childrens-code-guidance-and-resources/introduction-to-the-childrens-code/.
[3]
2023. Use Focus on Your iPhone or iPad. https://support.apple.com/en-us/HT212608.
[4]
Jesper Aagaard. 2015. Drawn to Distraction: A Qualitative Study of off-Task Use of Educational Technology. Computers & Education 87 (Sept. 2015), 90–97. https://doi.org/10.1016/j.compedu.2015.03.010
[5]
Jesper Aagaard. 2021. Beyond the Rhetoric of Tech Addiction: Why We Should Be Discussing Tech Habits Instead (and How). Phenomenology and the Cognitive Sciences 20, 3 (July 2021), 559–572. https://doi.org/10.1007/s11097-020-09669-z
[6]
Ritu Agarwal and Elena Karahanna. 2000. Time Flies When You’re Having Fun: Cognitive Absorption and Beliefs about Information Technology Usage. MIS Quarterly 24, 4 (2000), 665–694.
[7]
Sami Abdo Radman Al-Dubai, Kurubaran Ganasegeran, Mustafa Ahmed Mahdi Al-Shagga, Hematram Yadav, and John T. Arokiasamy. 2013. Adverse Health Effects and Unhealthy Behaviors among Medical Students Using Facebook. https://www.hindawi.com/journals/tswj/2013/465161/. https://doi.org/10.1155/2013/465161
[8]
Reem S. Al-Mansoori, Dena Al-Thani, and Raian Ali. 2023. Designing for Digital Wellbeing: From Theory to Practice a Scoping Review. Human Behavior and Emerging Technologies 2023 (Aug. 2023), 1–24. https://doi.org/10.1155/2023/9924029
[9]
Ruben C. Arslan, Matthias P. Walther, and Cyril S. Tata. 2020. Formr: A Study Framework Allowing for Automated Feedback Generation and Complex Longitudinal Experience-Sampling Studies Using R. Behavior Research Methods 52, 1 (Feb. 2020), 376–387. https://doi.org/10.3758/s13428-019-01236-y
[10]
Pieter E. Baay, Denise T.D. de Ridder, Jacquelynne S. Eccles, Tanja van der Lippe, and Marcel A.G. van Aken. 2014. Self-Control Trumps Work Motivation in Predicting Job Search Behavior. Journal of Vocational Behavior 85, 3 (Dec. 2014), 443–451. https://doi.org/10.1016/j.jvb.2014.09.006
[11]
Eric Baumer and Phil Adams. 2013. Limiting, Leaving, and (Re) Lapsing: An Exploration of Facebook Non-Use Practices and Experiences. Chi 2013 (2013), 3257–3266. https://doi.org/10.1145/2470654.2466446
[12]
Eric P.S. Baumer, Vera Khovanskaya, Mark Matthews, Lindsay Reynolds, Victoria Schwanda Sosik, and Geri Gay. 2014. Reviewing Reflection: On the Use of Reflection in Interactive System Design. In Proceedings of the 2014 Conference on Designing Interactive Systems(DIS ’14). Association for Computing Machinery, New York, NY, USA, 93–102. https://doi.org/10.1145/2598510.2598598
[13]
Daniel Biedermann, Stella Kister, Jasmin Breitwieser, Joshua Weidlich, and Hendrik Drachsler. 2023. Use of Digital Self-Control Tools in Higher Education – a Survey Study. Education and Information Technologies (Sept. 2023). https://doi.org/10.1007/s10639-023-12198-2
[14]
Joël Billieux, Pierre Philippot, Cécile Schmid, Pierre Maurage, Jan De Mol, and Martial Van der Linden. 2015. Is Dysfunctional Use of the Mobile Phone a Behavioural Addiction? Confronting Symptom-Based Versus Process-Based Approaches. Clinical Psychology & Psychotherapy 22, 5 (2015), 460–468. https://doi.org/10.1002/cpp.1910
[15]
Virginia Braun, Victoria Clarke, Nikki Hayfield, and Gareth Terry. 2018. Thematic Analysis. In Handbook of Research Methods in Health Social Sciences, Pranee Liamputtong (Ed.). Springer Singapore, Singapore, 1–18. https://doi.org/10.1007/978-981-10-2779-6_103-1
[16]
Miriam Brinberg, Nilam Ram, Xiao Yang, Mu-Jung Cho, S. Shyam Sundar, Thomas N. Robinson, and Byron Reeves. 2021. The Idiosyncrasies of Everyday Digital Lives: Using the Human Screenome Project to Study User Behavior on Smartphones. Computers in Human Behavior 114 (Jan. 2021), 106570. https://doi.org/10.1016/j.chb.2020.106570
[17]
Hilarie Cash, Cosette D Rae, Ann H Steel, and Alexander Winkler. 2012. Internet Addiction: A Brief Summary of Research and Practice. Current Psychiatry Reviews 8, 4 (Nov. 2012), 292–298. https://doi.org/10.2174/157340012803520513
[18]
Marta E. Cecchinato and Anna L. Cox. 2020. Boundary Management and Communication Technologies.
[19]
Center for Humane Technology. [n. d.]. Take Control - Center for Humane Technology. https://humanetech.com/resources/take-control/.
[20]
Justin Cheng, Moira Burke, and Elena Goetz Davis. 2019. Understanding Perceptions of Problematic Facebook Use: When People Experience Negative Life Impact and a Lack of Control. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems(CHI ’19). ACM, New York, NY, USA, 199:1–199:13. https://doi.org/10.1145/3290605.3300429
[21]
Emily I M Collins, Anna L Cox, Jon Bird, and Cassie Cornish-Tresstail. 2014. Barriers to Engagement with a Personal Informatics Productivity Tool. Proceedings of the 26th Australian Computer-Human Interaction Conference on Designing Futures the Future of Design - OzCHI ’14 (2014), 370–379. https://doi.org/10.1145/2686612.2686668
[22]
David Coyle, James Moore, Per Ola Kristensson, Paul Fletcher, and Alan Blackwell. 2012. I Did That! Measuring Users’ Experience of Agency in Their Own Actions. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems(CHI ’12). Association for Computing Machinery, New York, NY, USA, 2025–2034. https://doi.org/10.1145/2207676.2208350
[23]
Denise T. D. de Ridder, Gerty Lensvelt-Mulders, Catrin Finkenauer, F. Marijn Stok, and Roy F. Baumeister. 2012. Taking Stock of Self-Control. Personality and Social Psychology Review 16, 1 (2012), 76–99. https://doi.org/10.1177/1088868311418749
[24]
A. L. Duckworth, T. S. Gendler, and J. J. Gross. 2016. Situational Strategies for Self-Control. Perspectives on Psychological Science 11, 1 (2016). https://doi.org/10.1177/1745691615623247
[25]
Angela L. Duckworth, Katherine L. Milkman, and David Laibson. 2018. Beyond Willpower: Strategies for Reducing Failures of Self-Control. Psychological Science in the Public Interest 19, 3 (Dec. 2018), 102–129. https://doi.org/10.1177/1529100618821893
[26]
Angela L. Duckworth, Rachel E. White, Alyssa J. Matteucci, Annie Shearer, and James J. Gross. 2016. A Stitch in Time: Strategic Self-Control in High School and College Students. Journal of Educational Psychology 108, 3 (April 2016), 329–341. https://doi.org/10.1037/edu0000062
[27]
David A. Ellis, Brittany I. Davidson, Heather Shaw, and Kristoffer Geyer. 2019. Do Smartphone Usage Scales Predict Behavior? International Journal of Human-Computer Studies (May 2019). https://doi.org/10.1016/j.ijhcs.2019.05.004
[28]
David A. Ellis, Linda K. Kaye, Thomas D.W. Wilcockson, and Francesca C. Ryding. 2018. Digital Traces of Behaviour Within Addiction: Response to Griffiths (2017). International Journal of Mental Health and Addiction 16, 1 (Feb. 2018), 240–245. https://doi.org/10.1007/s11469-017-9855-7
[29]
Sindhu Kiranmai Ernala, Moira Burke, Alex Leavitt, and Nicole B. Ellison. 2020. How Well Do People Report Time Spent on Facebook?: An Evaluation of Established Survey Questions with Recommendations. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. ACM, Honolulu HI USA, 1–14. https://doi.org/10.1145/3313831.3376435
[30]
Peter M. Gollwitzer and Paschal Sheeran. 2006. Implementation Intentions and Goal Achievement: A Meta-analysis of Effects and Processes. Advances in Experimental Social Psychology 38 (2006), 69–119. https://doi.org/10.1016/S0065-2601(06)38002-1
[31]
James J. Gross and Angela L. Duckworth. 2021. Beyond Willpower. Behavioral and Brain Sciences 44 (2021). https://doi.org/10.1017/s0140525x20000722
[32]
David J. Grüning, Frederik Riedel, and Philipp Lorenz-Spreen. 2023. Directing Smartphone Use through the Self-Nudge App One Sec. Proceedings of the National Academy of Sciences 120, 8 (2023), e2213114120. https://doi.org/10.1073/pnas.2213114120
[33]
Sandeep K. Gupta. 2011. Intention-to-treat concept: A review. Perspectives in Clinical Research 2, 3 (2011), 109–112. https://doi.org/10.4103/2229-3485.83221
[34]
Martin S Hagger, Chantelle Wood, Chris Stiff, and Nikos L D Chatzisarantis. 2010. Ego Depletion and the Strength Model of Self-Control: A Meta-Analysis. Psychological Bulletin 136, 4 (2010), 495–525. https://doi.org/10.1037/a0019486
[35]
Sandra G. Hart and Lowell E. Staveland. 1988. Development of NASA-TLX (Task Load Index): Results of Empirical and Theoretical Research. In Human Mental Workload, Peter A. Hancock and Najmedin Meshkati (Eds.). Advances in Psychology, Vol. 52. North-Holland, 139–183.
[36]
Gillian R. Hayes. 2011. The Relationship of Action Research to Human-computer Interaction. ACM Trans. Comput.-Hum. Interact. 18, 3 (Aug. 2011), 15:1–15:20. https://doi.org/10.1145/1993060.1993065
[37]
Eric B Hekler, Predrag Klasnja, Jon E Froehlich, and Matthew P Buman. 2013. Mind the Theoretical Gap: Interpreting, Using, and Developing Behavioral Theory in HCI Research. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems - CHI ’13. 3307–3316. https://doi.org/10.1145/2470654.2466452
[38]
Alexis Hiniker, Sungsoo Ray Hong, Tadayoshi Kohno, and Julie A Kientz. 2016. MyTime: Designing and Evaluating an Intervention for Smartphone Non-Use. (2016), 4746–4757. https://doi.org/10.1145/2858036.2858403
[39]
Alexis Hiniker, Jenny S. Radesky, Sonia Livingstone, and Alicia Blum-Ross. 2019. Moving Beyond "The Great Screen Time Debate" in the Design of Technology for Children. In Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems (New York, NY, USA) (CHI EA ’19). ACM, panel01:1–panel01:6. https://doi.org/10.1145/3290607.3311745
[40]
Michael Inzlicht, Kaitlyn M. Werner, Julia L. Briskin, and Brent W. Roberts. 2021. Integrating Models of Self-Regulation. Annual Review of Psychology 72, 1 (2021), 319–345. https://doi.org/10.1146/annurev-psych-061020-105721
[41]
Daniel Kardefelt-Winther, Alexandre Heeren, Adriano Schimmenti, Antonius van Rooij, Pierre Maurage, Michelle Carras, Johan Edman, Alexander Blaszczynski, Yasser Khazaal, and Joël Billieux. 2017. How Can We Conceptualize Behavioural Addiction without Pathologizing Common Behaviours? Addiction 112, 10 (2017), 1709–1715. https://doi.org/10.1111/add.13763
[42]
Ravi Karkar, Jasmine Zia, Roger Vilardaga, Sonali R Mishra, James Fogarty, Sean A Munson, and Julie A Kientz. 2015. A framework for self-experimentation in personalized health. Journal of the American Medical Informatics Association 23, 3 (Dec. 2015), 440–448. https://doi.org/10.1093/jamia/ocv150
[43]
Betul Keles, Niall McCrae, and Annmarie Grealish. 2020. A Systematic Review: The Influence of Social Media on Depression, Anxiety and Psychological Distress in Adolescents. International Journal of Adolescence and Youth 25, 1 (2020), 79–93. https://doi.org/10.1080/02673843.2019.1590851
[44]
Jiraporn Khumsri, Rungmanee Yingyeun, Mereerat Manwong, Nitt Hanprathet, and Muthita Phanasathit. 2015. Prevalence of Facebook Addiction and Related Factors Among Thai High School Students. Journal of the Medical Association of Thailand = Chotmaihet Thangphaet 98 Suppl 3 (April 2015), S51–60.
[45]
Inyeop Kim, Hwarang Goh, Nematjon Narziev, Youngtae Noh, and Uichin Lee. 2020. Understanding User Contexts and Coping Strategies for Context-aware Phone Distraction Management System Design. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 4, 4 (Dec. 2020), 134:1–134:33. https://doi.org/10.1145/3432213
[46]
Inyeop Kim, Gyuwon Jung, Hayoung Jung, Minsam Ko, and Uichin Lee. 2017. Let’s FOCUS: Mitigating Mobile Phone Use in College Classrooms. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 1, 3 (Sept. 2017), 63:1–63:29. https://doi.org/10.1145/3130928
[47]
Jaejeung Kim, Chiwoo Cho, and Uichin Lee. 2017. Technology Supported Behavior Restriction for Mitigating Self-Interruptions in Multi-device Environments. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 1, 3 (Sept. 2017), 1–21. https://doi.org/10.1145/3130932
[48]
Jaejeung Kim, Hayoung Jung, Minsam Ko, and Uichin Lee. 2019. GoalKeeper: Exploring Interaction Lockout Mechanisms for Regulating Smartphone Use. 3, 1 (2019), 16:1–16:29. https://doi.org/10.1145/3314403
[49]
Predrag Klasnja, Eric B. Hekler, Elizabeth V. Korinek, John Harlow, and Sonali R. Mishra. 2017. Toward Usable Evidence: Optimizing Knowledge Accumulation in HCI Research on Health Behavior Change. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems(CHI ’17). ACM, New York, NY, USA, 3071–3082. https://doi.org/10.1145/3025453.3026013
[50]
Jake Knapp. 2013. The Distraction-Free iPhone (or ‘Why I’m Happier since I Disabled Safari’).
[51]
Minsam Ko, Seungwoo Choi, Koji Yatani, and Uichin Lee. 2016. Lock N’ LoL: Group-Based Limiting Assistance App to Mitigate Smartphone Distractions in Group Activities. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (New York). ACM, 998–1010. https://doi.org/10.1145/2858036.2858568
[52]
Minsam Ko, Kyong-Mee Chung, Subin Yang, Joonwon Lee, Christian Heizmann, Jinyoung Jeong, Uichin Lee, Daehee Shin, Koji Yatani, and Junehwa Song. 2015. NUGU: A Group-based Intervention App for Improving Self-Regulation of Limiting Smartphone Use. Proceedings of the 18th ACM Conference on Computer Supported Cooperative Work & Social Computing - CSCW ’15 (2015), 1235–1245. https://doi.org/10.1145/2675133.2675244
[53]
Konrad Kollnig, Siddhartha Datta, Thomas Serban Von Davier, Max Van Kleek, Reuben Binns, Ulrik Lyngs, and Nigel Shadbolt. 2023. ‘We Are Adults and Deserve Control of Our Phones’: Examining the Risks and Opportunities of a Right to Repair for Mobile Apps. In Proceedings of the 2023 ACM Conference on Fairness, Accountability, and Transparency(FAccT ’23). Association for Computing Machinery, New York, NY, USA, 22–34. https://doi.org/10.1145/3593013.3593973
[54]
Konrad Kollnig, Siddhartha Datta, and Max Van Kleek. 2021. I Want My App That Way: Reclaiming Sovereignty Over Personal Devices. In Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems(CHI EA ’21). Association for Computing Machinery, New York, NY, USA, 1–8. https://doi.org/10.1145/3411763.3451632
[55]
Geza Kovacs. 2019. HabitLab: In-the-Wild Behavior Change Experiments at Scale. https://www.gkovacs.com/phd_thesis/thesis.pdf
[56]
Geza Kovacs, Drew Mylander Gregory, Zilin Ma, Zhengxuan Wu, Golrokh Emami, Jacob Ray, and Michael S. Bernstein. 2019. Conservation of Procrastination: Do Productivity Interventions Save Time Or Just Redistribute It?. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems(CHI ’19). ACM, New York, NY, USA, 330:1–330:12. https://doi.org/10.1145/3290605.3300560
[57]
Geza Kovacs, Zhengxuan Wu, and Michael S Bernstein. 2018. Rotating Online Behavior Change Interventions Increases Effectiveness But Also Increases Attrition. 2 (2018), 95:1–95:25. Issue CSCW. https://doi.org/10.1145/3274364
[58]
Geza Kovacs, Zhengxuan Wu, and Michael S. Bernstein. 2021. Not Now, Ask Later: Users Weaken Their Behavior Change Regimen Over Time, But Expect To Re-Strengthen It Imminently. arXiv:2101.11743 [cs] (Jan. 2021).
[59]
Min Kwon, Joon-Yeop Lee, Wang-Youn Won, Jae-Woo Park, Jung-Ah Min, Changtae Hahn, Xinyu Gu, Ji-Hye Choi, and Dai-Jin Kim. 2013. Development and Validation of a Smartphone Addiction Scale (SAS). PLOS ONE 8, 2 (Feb. 2013), e56936. https://doi.org/10.1371/journal.pone.0056936
[60]
Simone Lanette and Melissa Mazmanian. 2018. The Smartphone "Addiction" Narrative Is Compelling, but Largely Unfounded. In Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems(CHI EA ’18). ACM, New York, NY, USA, LBW023:1–LBW023:6. https://doi.org/10.1145/3170427.3188584
[61]
David M Levy, Jacob O Wobbrock, Alfred W Kaszniak, and Marilyn Ostergren. 2012. The Effects of Mindfulness Meditation Training on Multitasking in a High-Stress Information Environment. (2012), 8.
[62]
Ian Li, Anind Dey, and Jodi Forlizzi. 2010. A Stage-Based Model of Personal Informatics Systems. In Proceedings of the 28th International Conference on Human Factors in Computing Systems - CHI ’10. 557. https://doi.org/10.1145/1753326.1753409
[63]
Elizabeth O Lillie, Bradley Patay, Joel Diamant, Brian Issell, Eric J Topol, and Nicholas J Schork. 2011. The n-of-1 clinical trial: the ultimate strategy for individualizing medicine? Personalized Medicine 8, 2 (March 2011), 161–173. https://doi.org/10.2217/pme.11.7
[64]
Christoph Lindner, Gabriel Nagy, and Jan Retelsdorf. 2015. The Dimensionality of the Brief Self-Control Scale—An Evaluation of Unidimensional and Multidimensional Applications. Personality and Individual Differences 86 (Nov. 2015), 465–473. https://doi.org/10.1016/j.paid.2015.07.006
[65]
Kai Lukoff. 2019. Digital Wellbeing Is Way More than Just Reducing Screen Time. https://uxdesign.cc/digital-wellbeing-more-than-just-reducing-screen-time-46223db9f057
[66]
Kai Lukoff, Ulrik Lyngs, and Lize Alberts. 2022. Designing to Support Autonomy and Reduce Psychological Reactance in Digital Self-Control Tools. Self-Determination Theory in HCI: Shaping a Research Agenda. Workshop at the ACM CHI Conference on Human Factors in Computing Systems (CHI’22) (May 2022), 6.
[67]
Kai Lukoff, Ulrik Lyngs, Karina Shirokova, Raveena Rao, Larry Tian, Himanshu Zade, Sean A. Munson, and Alexis Hiniker. 2023. SwitchTube: A Proof-of-Concept System Introducing “Adaptable Commitment Interfaces” as a Tool for Digital Wellbeing. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems(CHI ’23). Association for Computing Machinery, New York, NY, USA, 1–22. https://doi.org/10.1145/3544548.3580703
[68]
Kai Lukoff, Ulrik Lyngs, Himanshu Zade, J. Vera Liao, James Choi, Kaiyue Fan, Sean A. Munson, and Alexis Hiniker. 2021. How the Design of YouTube Influences User Sense of Agency. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. ACM, Yokohama Japan, 1–17. https://doi.org/10.1145/3411764.3445467
[69]
Ulrik Lyngs. 2021. Ulysses in Cyberspace: Examining the Effectiveness of Design Patterns for Digital Self-Control. Ph. D. Dissertation. University of Oxford.
[70]
Ulrik Lyngs, Reuben Binns, Max Van Kleek, and Nigel Shadbolt. 2018. "So, Tell Me What Users Want, What They Really, Really Want!". In Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems (New York, NY, USA) (CHI EA ’18). ACM, alt04:1–alt04:10. https://doi.org/10.1145/3170427.3188397
[71]
Ulrik Lyngs and Kai Lukoff. 2022. Leveraging Self-Regulation Research When Designing for Digital Wellbeing. Workshop Paper. International Conference on Advanced Visual Interfaces, Designing for Meaningful Interactions and Digital Wellbeing, AVI 2022.
[72]
Ulrik Lyngs, Kai Lukoff, Laura Csuka, Petr Slovák, Max Van Kleek, and Nigel Shadbolt. 2022. The Goldilocks Level of Support: Using User Reviews, Ratings, and Installation Numbers to Investigate Digital Self-Control Tools. International Journal of Human-Computer Studies 166 (2022), 102869. https://doi.org/10.1016/j.ijhcs.2022.102869
[73]
Ulrik Lyngs, Kai Lukoff, Petr Slovak, Reuben Binns, Adam Slack, Michael Inzlicht, Max Van Kleek, and Nigel Shadbolt. 2019. Self-Control in Cyberspace: Applying Dual Systems Theory to a Review of Digital Self-Control Tools. In CHI Conference on Human Factors in Computing Systems Proceedings (CHI 2019). ACM, New York, NY, USA. https://doi.org/10.1145/3290605.3300361
[74]
Ulrik Lyngs, Kai Lukoff, Petr Slovak, William Seymour, Helena Webb, Marina Jirotka, Jun Zhao, Max Van Kleek, and Nigel Shadbolt. 2020. ’I Just Want to Hack Myself to Not Get Distracted’: Evaluating Design Interventions for Self-Control on Facebook. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. ACM, Honolulu HI USA, 1–15. https://doi.org/10.1145/3313831.3376672
[75]
Patrick D. Manapat, Michael C. Edwards, David P. MacKinnon, Russell A. Poldrack, and Lisa A. Marsch. 2019. A Psychometric Analysis of the Brief Self-Control Scale. Assessment 28, 2 (Dec. 2019), 395–412. https://doi.org/10.1177/1073191119890021
[76]
Gloria Mark, Mary Czerwinski, and Shamsi T Iqbal. 2018. Effects of Individual Differences in Blocking Workplace Distractions. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems(CHI ’18). ACM, New York, NY, USA, 92:1—-92:12. https://doi.org/10.1145/3173574.3173666
[77]
Gloria Mark, Victor M. Gonzalez, and Justin Harris. 2005. No Task Left behind?: Examining the Nature of Fragmented Work. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, Portland Oregon USA, 321–330. https://doi.org/10.1145/1054972.1055017
[78]
Bella Martin and Bruce Hanington. 2012. Universal Methods of Design: 100 Ways to Research Complex Problems, Develop Innovative Ideas, and Design Effective Solutions. Rockport Publishers.
[79]
John McAlaney, Manal Aldhayan, Mohamed Basel Almourad, Sainabou Cham, and Raian Ali. 2020. On the Need for Cultural Sensitivity in Digital Wellbeing Tools and Messages: A UK-China Comparison. In Trends and Innovations in Information Systems and Technologies(Advances in Intelligent Systems and Computing), Álvaro Rocha, Hojjat Adeli, Luís Paulo Reis, Sandra Costanzo, Irena Orovic, and Fernando Moreira (Eds.). Springer International Publishing, Cham, 723–733. https://doi.org/10.1007/978-3-030-45691-7_68
[80]
David C. Mohr, Heleen Riper, and Stephen M. Schueller. 2018. A Solution-Focused Research Approach to Achieve an Implementable Revolution in Digital Mental Health. JAMA Psychiatry 75, 2 (Feb. 2018), 113. https://doi.org/10.1001/jamapsychiatry.2017.3838
[81]
Alberto Monge Roffarello and Luigi De Russis. 2021. Coping with Digital Wellbeing in a Multi-Device World. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems(CHI ’21). Association for Computing Machinery, New York, NY, USA, 1–14. https://doi.org/10.1145/3411764.3445076
[82]
Alberto Monge Roffarello and Luigi De Russis. 2021. Towards Multi-device Digital Self-control Tools. In Human-Computer Interaction – INTERACT 2021(Lecture Notes in Computer Science), Carmelo Ardito, Rosa Lanzilotti, Alessio Malizia, Helen Petrie, Antonio Piccinno, Giuseppe Desolda, and Kori Inkpen (Eds.). Springer International Publishing, Cham, 122–131. https://doi.org/10.1007/978-3-030-85610-6_8
[83]
Alberto Monge Roffarello and Luigi De Russis. 2023. Teaching and learning “Digital Wellbeing”. Future Generation Computer Systems 149 (2023), 494–508. https://doi.org/10.1016/j.future.2023.08.003
[84]
Alberto Monge Roffarello and Luigi De Russis. 2024. Hey StepByStep! Can You Teach Me How to Use My Phone Better? International Journal of Human-Computer Studies 183 (March 2024), 103195. https://doi.org/10.1016/j.ijhcs.2023.103195
[85]
Megan Moreno, Karyn Riddle, Marina C. Jenkins, Ajay Paul Singh, Qianqian Zhao, and Jens Eickhoff. 2022. Measuring Problematic Internet Use, Internet Gaming Disorder, and Social Media Addiction in Young Adults: Cross-sectional Survey Study. JMIR Public Health and Surveillance 8, 1 (Jan. 2022), e27719. https://doi.org/10.2196/27719
[86]
Sean A. Munson, Jessica Schroeder, Ravi Karkar, Julie A. Kientz, Chia-Fang Chung, and James Fogarty. 2020. The Importance of Starting With Goals in N-of-1 Studies. Frontiers in Digital Health 2 (2020). https://doi.org/10.3389/fdgth.2020.00003
[87]
Amy Orben. 2019. Teens, Screens and Well-Being: An Improved Approach. Ph. D. Dissertation. University of Oxford, Oxford.
[88]
Amy Orben, Pete Etchells, and Andy Przybylski. 2018. Three Problems with the Debate around Screen Time. The Guardian (Aug. 2018). https://www.theguardian.com/science/head-quarters/2018/aug/09/three-problems-with-the-debate-around-screen-time
[89]
Amy Orben and Andrew K. Przybylski. 2019. The Association between Adolescent Well-Being and Digital Technology Use. Nature Human Behaviour 3 (2019), 173–182. https://doi.org/10.1038/s41562-018-0506-1
[90]
Charlie Pinder, Jo Vermeulen, Benjamin R. Cowan, and Russell Beale. 2018. Digital Behaviour Change Interventions to Break and Form Habits. ACM Transactions on Computer-Human Interaction 25, 3 (2018), 1–66. https://doi.org/10.1145/3196830
[91]
James O. Prochaska, Carlo C. DiClemente, and John C. Norcross. 1993. In Search of How People Change: Applications to Addictive Behaviors. Journal of Addictions Nursing 5, 1 (1993), 2–16. https://doi.org/10.3109/10884609309149692
[92]
Aditya Kumar Purohit, Torben Jan Barev, Sofia Schöbel, Andreas Janson, and Adrian Holzer. 2023. Designing for Digital Wellbeing on a Smartphone: Co-creation of Digital Nudges to Mitigate Instagram Overuse. In Hawaii International Conference on System Sciences (HICSS). Maui, Hawaii, USA.
[93]
Aditya Kumar Purohit, Kristoffer Bergram, Louis Barclay, Valéry Bezençon, and Adrian Holzer. 2023. Starving the Newsfeed for Social Media Detox: Effects of Strict and Self-regulated Facebook Newsfeed Diets. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. ACM, Hamburg Germany, 1–16. https://doi.org/10.1145/3544548.3581187
[94]
Alberto Monge Roffarello and Luigi De Russis. 2019. The Race Towards Digital Wellbeing: Issues and Opportunities. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19). ACM, New York, NY, USA, 386:1–386:14. https://doi.org/10.1145/3290605.3300616
[95]
Alberto Monge Roffarello and Luigi De Russis. 2022. Achieving Digital Wellbeing Through Digital Self-Control Tools: A Systematic Review and Meta-Analysis. ACM Transactions on Computer-Human Interaction (Nov. 2022). https://doi.org/10.1145/3571810
[96]
Alberto Monge Roffarello, Luigi De Russis, Danielle Lottridge, and Marta E. Cecchinato. 2023. Understanding Digital Wellbeing within Complex Technological Contexts. International Journal of Human-Computer Studies (March 2023), 103034. https://doi.org/10.1016/j.ijhcs.2023.103034
[97]
Larry D. Rosen, L. Mark Carrier, and Nancy A. Cheever. 2013. Facebook and Texting Made Me Do It: Media-induced Task-Switching While Studying. Computers in Human Behavior 29, 3 (2013), 948–958. https://doi.org/10.1016/j.chb.2012.12.001
[98]
Richard M. Ryan and Edward L. Deci. 2017. Self-Determination Theory: Basic Psychological Needs in Motivation, Development, and Wellness. The Guilford Press, New York, NY, US. xii, 756 pages. https://doi.org/10.1521/978.14625/28806
[99]
Michael Scharkow. 2016. The Accuracy of Self-Reported Internet Use - A Validation Study Using Client Log Data. Communication Methods and Measures 10, 1 (Jan. 2016), 13–27. https://doi.org/10.1080/19312458.2015.1118446
[100]
Desirée Schmuck. 2020. Does Digital Detox Work? Exploring the Role of Digital Detox Applications for Problematic Smartphone Use and Well-Being of Young Adults Using Multigroup Analysis. Cyberpsychology, Behavior and Social Networking 23, 8 (Aug. 2020), 526–532. https://doi.org/10.1089/cyber.2019.0578
[101]
Science and Technology Committee. 2019. Impact of Social Media and Screen-Use on Young People’s Health. Technical Report. UK Parliament.
[102]
Ben Shneiderman and Catherine Plaisant. 2004. Designing the User Interface: Strategies for Effective Human-Computer Interaction (4th Edition). Pearson Addison Wesley.
[103]
Petr Slovak, Christopher Frauenberger, and Geraldine Fitzpatrick. 2017. Reflective Practicum: A Framework of Sensitising Concepts to Design for Transformative Reflection. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. ACM, New York, NY, USA, 2696–2707.
[104]
Cold Turkey Software. [n. d.]. Cold Turkey Blocker. http://getcoldturkey.com/.
[105]
Donna Spencer. 2009. Card Sorting: Designing Usable Categories. Rosenfeld Media.
[106]
Christina Steindl, Eva Jonas, Sandra Sittenthaler, Eva Traut-Mattausch, and Jeff Greenberg. 2015. Understanding Psychological Reactance. Zeitschrift für Psychologie 223, 4 (Oct. 2015), 205–214. https://doi.org/10.1027/2151-2604/a000222
[107]
Christiane Steinert, Katja Stadter, Rudolf Stark, and Falk Leichsenring. 2017. The Effects of Waiting for Treatment: A Meta-Analysis of Waitlist Control Groups in Randomized Controlled Trials for Social Anxiety Disorder: The Effects of Waiting for Treatment. Clinical Psychology & Psychotherapy 24, 3 (May 2017), 649–660. https://doi.org/10.1002/cpp.2032
[108]
June P. Tangney, Roy F. Baumeister, and Angie Luzio Boone. 2004. High Self-Control Predicts Good Adjustment, Less Pathology, Better Grades, and Interpersonal Success. Journal of Personality 72, 2 (2004), 271–324. https://doi.org/10.1111/j.0022-3506.2004.00263.x
[109]
Naa Terzimehić and Sarah Aragon-Hahner. 2022. I Wish I Had: Desired Real-World Activities Instead of Regretful Smartphone Use. In Proceedings of the 21st International Conference on Mobile and Ubiquitous Multimedia (MUM ’22). Association for Computing Machinery, New York, NY, USA, 47–52. https://doi.org/10.1145/3568444.3568465
[110]
Nitasha Tiku. 2018. The WIRED Guide to Internet Addiction. WIRED.
[111]
Jonathan A. Tran, Katie S. Yang, Katie Davis, and Alexis Hiniker. 2019. Modeling the Engagement-Disengagement Cycle of Compulsive Phone Use. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19). ACM, New York, NY, USA, 312:1–312:14. https://doi.org/10.1145/3290605.3300542
[112]
U.S. Senate Committee on Commerce, Science, & Transportation. 2019. Optimizing for Engagement: Understanding the Use of Persuasive Technology on Internet Platforms. https://www.commerce.senate.gov/public/index.cfm/2019/6/optimizing-for-engagement-understanding-the-use-of-persuasive-technology-on-internet-platforms.
[113]
Matti Vuorre and Andrew K. Przybylski. 2023. Global Well-Being and Mental Health in the Internet Age. Clinical Psychological Science (2023), 21677026231207791. https://doi.org/10.1177/21677026231207791
[114]
Chat Wacharamanotham, Lukas Eisenring, Steve Haroz, and Florian Echtler. 2020. Transparency of CHI Research Artifacts: Results of a Self-Reported Survey. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI ’20). Association for Computing Machinery, New York, NY, USA, 1–14. https://doi.org/10.1145/3313831.3376448
[115]
Peter West, Max Van Kleek, Richard Giordano, Mark J. Weal, and Nigel Shadbolt. 2018. Common Barriers to the Use of Patient-Generated Data across Clinical Settings. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. ACM. https://doi.org/10.1145/3173574.3174058
[116]
Shan Xu, Zheng (Joyce) Wang, and Prabu David. 2016. Media Multitasking and Well-being of University Students. Computers in Human Behavior 55 (Feb. 2016), 242–250. https://doi.org/10.1016/j.chb.2015.08.040
[117]
Mingrui Ray Zhang, Kai Lukoff, Raveena Rao, Amanda Baughan, and Alexis Hiniker. 2022. Monitoring Screen Time or Redesigning It? Two Approaches to Supporting Intentional Social Media Use. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems (New Orleans, LA, USA) (CHI ’22). Association for Computing Machinery, New York, NY, USA, Article 60, 19 pages. https://doi.org/10.1145/3491102.3517722
[118]
Shoshana Zuboff. 2015. Big Other: Surveillance Capitalism and the Prospects of an Information Civilization. Journal of Information Technology 30, 1 (March 2015), 75–89. https://doi.org/10.1057/jit.2015.5

Cited By

  • (2024) Digital Self-Control in Danish High Schools: A Pilot Intervention Study. Adjunct Proceedings of the 2024 Nordic Conference on Human-Computer Interaction, 1–6. https://doi.org/10.1145/3677045.3685423. Online publication date: 13-Oct-2024.
  • (2024) Realizing the potential of mobile interventions for education. npj Science of Learning 9, 1. https://doi.org/10.1038/s41539-024-00289-9. Online publication date: 12-Dec-2024.
  • (2024) Treatment and Prevention of Internet Use Disorders in Children and Adolescents. Handbook of Children and Screens, 203–209. https://doi.org/10.1007/978-3-031-69362-5_28. Online publication date: 6-Dec-2024.

Published In

CHI '24: Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems
May 2024, 18961 pages
ISBN: 9798400703300
DOI: 10.1145/3613904
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.

Publisher

Association for Computing Machinery, New York, NY, United States

Publication History

Published: 11 May 2024

Badges

  • Honorable Mention

Author Tags

  1. attention
  2. digital self-control
  3. digital wellbeing
  4. distraction

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Funding Sources

  • Carlsberg Foundation

Conference

CHI '24

Acceptance Rates

Overall Acceptance Rate 6,199 of 26,314 submissions, 24%
