1 Introduction
Smartphones, laptops, and related digital technologies give people effortless access to an enormous range of functionality. Yet, amidst all this digital freedom, many people find it difficult to control their time and attention without being distracted by notifications, infinite recommendation feeds, or compulsive urges to check their devices [38, 52, 111]. This can have serious negative effects on work performance, social relationships, and mental health [17, 20, 43, 85].
‘Digital self-control tools’ (DSCTs) are a potential solution. These apps, browser extensions, and system settings help people use digital devices in line with their goals and avoid distraction [96]. They provide interventions such as blocking or delaying access to distracting apps, tracking and visualising how people spend time on their devices, or rewarding intended use [73]. Some even change the user interface of specific services directly, such as browser extensions that hide recommended videos on YouTube, or the newsfeed on Facebook [1, 67, 74].
Hundreds of DSCTs are now publicly available, and many have tens of thousands, or even millions, of users [72]. The emerging research on their effectiveness suggests that they have a clear potential to change behaviour and help people feel more in control [32, 56, 74, 96, 100]. Can these tools help us address the immediate need for support in populations such as students and information workers, where challenges with self-regulating device use are widespread [4, 24, 97]?
To realise the potential for large-scale impact of DSCTs, important research-to-practice gaps need to be filled: in the real world, a person who wants to use DSCTs needs to identify and apply tool(s) that match their personal device ecosystem and goals for changed use. Typically, their device ecosystem involves multiple devices and their goals go beyond ‘reduce screen time’ [38, 45, 80, 83]. Moreover, people often have limited knowledge of which DSCTs are available, and/or find it difficult to navigate the available options and find the right tool(s) for their situation [13, 72, 76].
Existing research on DSCTs, however, has mainly focused on how a single intervention on a single device affects screen time or self-reported ‘addiction’ [95]. This has yielded core evidence on the potential of specific interventions [49], but the practical challenge of how to scaffold a process by which people can browse multiple DSCTs and apply the options most relevant to their personal goals across their device ecosystem has so far received limited attention [55, 80].
Moreover, recognising the limitations of screen time and addiction metrics [5, 38, 45], researchers have called for measures that are better grounded in local and culturally-dependent notions of digital wellbeing [39, 65, 72, 76, 79]. Empirical investigations of such notions have tended to focus only on the smartphone [38, 45, 84, 95, 109, 111] and/or a single platform [11, 68, 92, 93]. Finding ways to evaluate DSCTs’ overall impact on self-regulated use across one’s device ecosystem is likely to be practically useful for stakeholders who wish to provide broadly applicable interventions to support digital wellbeing.
To help fill these research-to-practice gaps, our paper explores the following general question: How can we empower people to identify their personal challenges and goals around digital device use and then select appropriate DSCTs to help?
We explore this question in the context of university students, where prior work has documented widespread challenges with self-regulating digital device use [7, 44, 97, 116]. We present results from a collaboration with the counselling service at the University of Oxford, where increasing numbers of students had expressed frustration to counsellors over their inability to control their device use. To help address this local challenge, we developed an online workshop intervention in which students reflected on their concerns and goals around digital device use, explored relevant DSCTs, and were supported in applying them on their devices. Since its wider introduction in March 2021, the workshop has become the most popular offering ever provided by the counselling service.
We use data from these workshops to address three specific research questions:
• RQ1: How do students want to change their digital device use? For their ecosystem of digital devices, what positive goals for use do students want to achieve?
• RQ2: Can an intervention that combines reflection on personal goals for change with self-selection of relevant DSCTs improve digital self-control? Given urgent and widespread needs for support, might this style of intervention provide a simple and accessible solution for student counselling services and other stakeholders?
• RQ3: How, if at all, do students’ preferences for different types of DSCTs vary? Are certain types of tools generally useful, or are students highly individual in what they choose and benefit from?
We explore these questions using open trial data from 280 students who participated in workshops between March 2021 and November 2022. Using a digital self-control measure adapted from relevant psychological research, we assessed self-regulated device use before and 1-3 months after the workshop, as well as which DSCTs the participants tried and how they had been useful.
We found that most participants wanted to make changes to, or be more in control of, both their smartphone and computer use. Overall, they wanted to use their devices more intentionally, and get better at balancing time spent on different activities, by having clearer boundaries around time and context of device use, reducing overall use, and limiting the ‘addictiveness’ of their digital environments.
At the follow-up (52% of participants responded), 95% of respondents still used at least one of the DSCT strategies they tried. The median respondent had tried three different strategies and was still using two, typically applied across devices (mainly smartphone + laptop computer). Respondents showed a significant quantitative improvement in digital self-control, with a large effect size (Cohen’s d = 0.93), which was supported by the qualitative data. Self-reported time spent also declined on laptop computer, smartphone, and tablet. Most participants applied a unique combination of the strategies included in the workshop. We observed a combination of general usefulness and individual variability: strategies related to blocking or hiding distractions, as well as focus timers, were generally rated as useful. Other strategies, such as grey scale or setting up rewards and punishments (typically using apps such as Forest), saw a large amount of variation.
Our paper contributes an identification of key design considerations for combining reflection and self-experimentation to support self-regulated device use via DSCTs. We also contribute the first direct comparison of multiple DSCTs in terms of subjective usefulness, which may help inform theoretical discussions in DSCT research, as well as help stakeholders such as educational institutions provide practical advice. Finally, with our participants’ full consent, we contribute an anonymised qualitative dataset (over 62,000 words) on university students’ challenges and goals, and how useful they found the DSCTs they tried. This open dataset may help accelerate the development of empirically grounded measures of digital wellbeing, and of DSCTs to support it [69, 79, 114], and can be found on the Open Science Framework (https://osf.io/zmt78/).
2 Background
To address widespread concern over negative effects of digital device use on outcomes such as productivity and mental wellbeing, researchers and policymakers have proposed changes to the incentive structure of the attention economy [112, 118], the development of age-appropriate design directives [2], and requirements for social media companies to make data available for researchers to independently assess the effects of their platforms [101].
These initiatives have the potential for systemic impact, but are also likely to take years to implement. In the meantime, many user groups face an immediate need for guidance, including families, students, and information workers [4, 24, 97]. The interventions that HCI researchers are investigating in the form of digital self-control tools (DSCTs) are readily available and could help address this need. However, to provide practical guidance on the use of DSCTs, important gaps need to be bridged between the current state of research and the challenges individuals face in daily life.
2.1 Aligning evaluation of DSCTs with people’s usage goals in multi-device contexts
Most research on DSCTs has focused on use of a single device, typically the smartphone (see Roffarello and De Russis [95] for a review). In practice, people’s challenges with self-regulating use often involve multi-device contexts [18, 81, 96], where unwanted use curbed by a DSCT on one device might ‘spill over’ to another [56]. Some researchers have started to explore ways to take multi-device use into account [47, 56, 82, 93]. This work suggests that DSCTs implemented on one device can indeed, when assessed on screen time metrics, help people make progress towards their goals without causing negative second-order effects [56]. However, which evaluation measures to use remains an open question [8].
Many DSCTs are explicitly designed to limit time spent [e.g., 48, 51, 52], and the most common outcome measure in efficacy studies is time spent overall or in specific apps [95]. This to some extent aligns with users’ goals, as surveys of smartphone use, as well as of individual platforms such as YouTube, suggest that goals to reduce time spent are common [38, 52, 68]. However, time spent is often an unreliable indicator of whether a DSCT is helpful [39, 65, 88, 89]. Lock-out mechanisms, a common feature in DSCTs, illustrate this conundrum: whereas locking users out of devices after a time limit may seem effective when assessed by reduction in time spent, it can lead to strongly negative responses when users find themselves in ‘out-of-routine’ situations where they need to use their devices [48].
The most frequently used subjective outcome measures in DSCT research are questionnaires related to ‘addiction’ (e.g., the ‘Smartphone Addiction Scale’ [59, 95]). This, too, to some extent aligns with users’ goals: analyses of publicly available user reviews suggest that the most frequent purpose of using DSCTs is to help oneself stay focused on demanding or boring tasks amidst readily available digital distractions, a capacity captured to some extent by addiction scales [72]. However, the ‘addiction’ framing has been criticised as imprecise, and for pathologising everyday self-control struggles [5, 14, 41, 60, 87, 110]. One recently proposed alternative (which we adopt in the present paper, see Section 3.2.2) is to adapt measures from basic psychology research on everyday self-regulation to the context of digital device use [71]. This might both help capture global ability to self-regulate behaviour more precisely than scales related to behavioural addiction, and help fill the ‘theoretical gap’ in DSCT research by grounding measures in relevant psychological research [37, 73, 95].
A key consideration, as researchers explore measures of digital wellbeing against which to evaluate DSCTs [8], is to ensure that those measures accurately capture specific populations’ usage goals and preferences for support [79]. Since most recent empirical investigations of the goals that DSCTs should serve have focused only on smartphone use, it would be useful to collect additional data on these goals and preferences in cross-device contexts. Moreover, open data sharing is rare in DSCT research, but might accelerate the development of empirically grounded measures, and allow for easier comparison between populations [69, 114].
Hence, our first use of data from the workshop focuses on the following question (RQ1): in the context of their personal ecosystem of digital devices, how do students want to change their use?
2.2 Helping people identify and apply DSCTs relevant to their goals
People vary substantially in the way they use digital devices, both compared to others and to themselves over time [16]. To empower people to influence their own behaviour using DSCTs, we therefore need effective ways to elicit individualised goals and apply DSCTs in bespoke ways to support those goals [86].
Work on ‘digital self-nudging’ suggests that empowering people to co-create their own DSCTs (e.g., using the Shortcuts automation app on iOS) can support a sense of agency, accomplishment, and perceived usefulness [92]. Active involvement is likely to be particularly important for DSCTs, as they often involve interventions that restrict the user. This can be an effective way to influence behaviour, but can also cause psychological ‘reactance’, in which the user is motivated to circumvent or otherwise rebel against the intervention [74, 106]. Reactance is likely to be reduced if the user freely commits to a DSCT because it supports their personal goals and values, as opposed to having it assigned by experimental condition or algorithmic allocation [66].
The research to date, however, has almost invariably focused on a single intervention on a single device or service, pre-assigned to research participants [95]. Few studies have investigated the practical challenge for which stakeholders such as university counselling services urgently need advice: how can we help people identify and apply the right combination of DSCT(s) for their personal needs (see also [80])?
One notable exception is HabitLab [55, 57, 58]. This project deployed an in-the-wild intervention that helped more than 12,000 daily Chrome and Android users meet time-limiting goals on websites and apps via a wide range of interventions, from hiding content feeds to displaying timers. However, a limitation of this work was that it measured success mainly in terms of limiting time spent, which is likely to have captured only a subset of people’s actual goals [68]. Broader HCI work on individualised strategies for self-monitoring and self-experimentation in behaviour change (specifically, ‘n-of-1’ studies for managing health conditions requiring individualised insights [42, 63]) suggests that this might be a problem: if individualised goals are not carefully elicited, people may end up with solutions that encourage behaviour that runs counter to their goals [86, 115]. Similarly, analyses of user reviews for DSCTs suggest that people are often frustrated that the tools they try do not match their goals for self-regulated use [72].
In sum, there is an opportunity to extend previous work with an approach that elicits individual goals and lets users actively choose and apply DSCTs. Hence, our second use of workshop data focuses on the following practical question (RQ2): Can an intervention that combines reflection on personal goals for change with self-selection of relevant DSCTs improve digital self-control?
2.3 Understanding individual variation in the usefulness of digital self-control tools
Whereas DSCT research has focused on average effects of one specific intervention, many studies have also reported that people vary substantially in their preference for, and benefits derived from, those interventions [47, 48, 76]. For example, Mark et al. [76] noticed in an exploratory study of information workers that the benefits of distraction blocking seemed to apply only to participants who at baseline scored lower on self-control. Getting a clear picture of the variance between people is of high practical relevance: should stakeholders like university counselling services direct most students towards a limited number of interventions that are universally useful, or should they support each student through a process to identify the tools that best meet their specific needs?
Psychological research on self-control more broadly suggests that the answer might be the former: applying the ‘process model’ of self-control, Duckworth et al. [26] found that strategies intervening earlier in the cycle of impulse generation (i.e., avoiding exposure to temptation as opposed to trying to overcome it only after unwanted impulses have arisen) are generally more effective [24]. This might also hold true for DSCTs. However, few studies have explicitly compared how multiple DSCT interventions vary in usefulness [95].
Again, one exception is HabitLab: Kovacs et al. [58]’s supplementary materials included a comparison of twelve different interventions on Facebook, finding that the most effective (auto-closing the tab after 60 seconds) was twice as effective as the second-best (blocking the site after a user-determined duration). However, the outcome measure was solely reduction in time spent. Given the limitations of this measure (according to which the most effective solution would be to simply ban an app or website from being used altogether), interpreting their findings would be helped by triangulation with users’ subjective assessment of usefulness.
Hence, our third use of workshop data focuses on the following question (RQ3): How, if at all, do students’ preferences for different types of DSCTs vary?
4 Results
Participant demographics. Between March 2021 and November 2022, 280 students who took part in a workshop consented to have their data used for research. The data were derived from 62 workshops conducted across six academic terms. The median number of participants in a workshop was 5 (interquartile range = 4 to 7, max = 17). 57% identified as women, 39% as men, 1.7% (n=5) as non-binary, 1.4% (n=4) preferred to self-describe, and 1% (n=2) preferred not to disclose. Participants were a mix of PhD (40%), undergraduate (35%) and master’s students (25%). In terms of age, 44% were 18-23 years old, 38% were 24-29, 12% 30-35, and 7% 36 or older.
Device usage. 98% of participants used a smartphone (62% iPhone, 38% Android), 99% a laptop and/or desktop computer (Mac: 51%, Windows: 46%, Linux: 4%), and 31% a tablet. The modal self-estimated time spent per day actively using a laptop or desktop computer was 7-8 hours, smartphone 2-3 hours, and tablet 1-2 hours (distributions shown in Figure 3a; the limitations of self-reported time are discussed in Section 5.4).
Participant motivations. When asked in the opening survey about the kind of concerns motivating them to sign up to the workshop (Figure 3b), 48% of participants chose ‘productivity’, and 49% chose both ‘productivity’ and ‘mental health’ (only 3% chose ‘mental health’ on its own). 36% of participants found it ‘moderately important’ to change or be in better control of their use, 44% found it ‘very important’, and 15% found it ‘extremely important’.
Strategy selection. At the follow-up, the median number of strategies that respondents had tried was 3 (IQR = 2 to 5, min = 0, max = 14; only one respondent had not tried anything). The summary statistics of number of strategies tried were nearly identical for 1-month and 3-month follow-ups, except for the max value, which was 10 for the 1-month follow-up. This suggests most experimentation happened in the first month after the workshop, which was supported by the interview data. There was no correlation between the number of participants in a workshop and the number of strategies tried (r = -.02, p = .77). At the follow-up, 95% of respondents still used at least one of the strategies they had tried, and the median number of strategies still used was 2 (IQR = 2 to 4, min = 0, max = 10). The number of strategies still used was slightly higher for 1-month than for 3-month follow-ups (1-month: median = 3, IQR = 2 to 6, min = 0, max = 8; 3-month: median = 2, IQR = 2 to 4, min = 0, max = 10).
As shown in Figure 5, strategies were typically applied on a specific type of device. For example, grey scale was almost exclusively applied on smartphone, and hiding distracting features on websites was mainly applied on a computer. However, most participants (79%) tried strategies on multiple devices, usually by applying different strategies on different devices (e.g., Focus in bursts with a timer on computer, and Limit notifications on smartphone). 21% of the time, the same strategy was applied on multiple devices, most commonly on both smartphone and laptop computer. The strategy most likely to be applied on multiple devices was Understand how you use your devices, which was applied across multiple devices 35% of the time. 13% of respondents applied strategies only on smartphone; 8% applied them only on a computer.
4.1 RQ1: How do students want to change their use of digital devices?
In terms of device types, 93% of participants said in the opening survey that they wanted to “change, or be more in control of”, their smartphone use. However, only a little more than one third (37%) were solely concerned with their smartphone use. 63% of participants wanted to change their use of other devices, mainly laptop / desktop computers (59%), with a small minority (7%) mentioning tablets (Figure 3c).
The final reflection prompt, after participants had written down their concerns, triggers, and what they had previously tried, was “Imagine your use of digital devices is exactly as you want it to be. What does that look like? (Be specific about context, time of day, apps...)”. We captured participants’ responses to this question with two higher-level goals, alongside three more specific goals that we interpreted as means of achieving the higher-level goals. The goals were closely related to participants’ concerns, and the triggers that drove them, a summary of which is included in the Appendix, Section D.
4.1.1 Higher-level goals.
Use devices intentionally and regain a sense of control (66% of participants). Participants commonly described that they wanted to use their devices more intentionally (“Actively using digital devices for a purpose. Not mindlessly scrolling”, P14). Intentional use included feeling more in control, being able to sustain focus on a specific task (especially tasks related to working / studying), and being comfortable leaving one’s devices elsewhere for a bit (“don’t feel the need to constantly check my phone and can e.g. sit through a class without being distracted”, P108). This goal mirrored participants’ common concern about excessively checking devices, getting ‘sucked in’, and feeling unable to stop use once started (Section D.1).
Importantly, the prominence of this goal also suggested that the Brief Digital Self-Control Scale was a meaningful quantitative measure to assess if the workshop was helpful.
Make device use fit into life in a balanced way (19%). Participants wanted to achieve a balance in which their use of social media and other routinely ‘distracting’ functionality had a natural place in their daily life without disrupting work, hobbies, or in-person socialising (“a better ratio of ‘doing things that are important’ and ‘doing things not important’”, P120; “I spend more time in a day reading books than I do on social media”, P123). This goal mirrored a general concern expressed by almost every participant (95%), namely that digital device use disproportionately diverted their time and attention away from personally meaningful activities, thereby interfering with their ability to be productive or focus on their hobbies and social life.
4.1.2 Specific goals.
Having clear boundaries around time and context of use (63%). More than half of participants expressed goals around delineating device use at specific times and contexts. This was in particular related to boundaries between work and leisure, where participants wanted distraction-free work sessions, and clear, dedicated times for use that they believed might distract from their productivity (“I focus on work in the mornings and don’t check email or social media before noon”, P136). Participants commonly wished to limit their use specifically in the mornings and/or evenings (“not actively using or checking my digital devices an hour before bed and before I have gotten ready in the morning”, P59).
Limit amount or frequency of use (34%). Participants often described wanting to limit the amount of time they spent on their devices overall, in specific functionality, or in a specific session (“Using my phone for less than 1 hour a day”, P49; “never binge for more than 2 hours”, P50). Similarly, they often described wanting to limit the frequency with which they checked their devices or a specific functionality (“check phone only every 3 hours after successful work session”, P26; “only checking emails once a day”, P12).
Having a healthier digital environment where self-regulation is easier (14%). Participants also wanted to change digital environments to make them more on ‘their side’ (“a device that respects me”, P7). This sometimes took the form of wanting to get rid of specific apps or services altogether (“Permanent deletion of instagram and tiktok - these provide me with very little joy”, P145; “No insta/FB or the like - but diff when you want to stay in touch with friends/family/work”, P68). However, it was also expressed as goals to change within-service environments. This could either be within the constraints of a service’s ordinary functionality (“On Instagram, I follow accounts that make me feel happy”, P100), or wishing to more directly change features that tempted them to behave in unintended ways (“bypass the way that apps are designed to make me addicted and keep on going even though it won’t be good for me”, P84).
When comparing goals between participants who wanted to change smartphone use only, and those who wanted to change use of multiple devices, the emphasis varied only slightly: keeping clear boundaries around time and context of use was slightly more prevalent for participants wanting to change use of multiple devices (67% vs 54% of participants); limiting amount or frequency of use was slightly more prevalent for participants wanting to change only smartphone use (40% vs 28%).
4.2 RQ2: Can an intervention that combines reflection on personal goals for change with self-selection of relevant DSCTs improve digital self-control?
Among participants who filled in the follow-up survey (52% of total), we observed a large increase in self-reported digital self-control compared to before the workshop (Cohen’s d = 0.93, mean increase = 7.9 points on a scale ranging from 12 to 60, t(145) = 11.2, p < .00001; Figure 4c). The effect size was larger for follow-ups after 1 month (d = 1.13, n = 36) than after 3 months (d = 0.87, n = 110).
As is the case for all open trials, the increase in digital self-control scores could be influenced by a number of factors, including response bias (students who find the workshop more useful might be more likely to respond) and regression to the mean (students might attend a workshop at the point in time where they struggle the most with digital distraction). When adjusting for response bias by assigning non-respondents the same score as in their opening survey (‘last observation carried forward’ [33]), the estimated increase in digital self-control remained medium-sized, d = 0.56 (t(279) = 9.4, p < .00001).
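For concreteness, the following minimal sketch illustrates how the complete-case paired effect size and the LOCF-adjusted estimate can be computed. The score arrays are hypothetical stand-ins, not our analysis code or data:

```python
import numpy as np
from scipy import stats

# Hypothetical data: one row per participant; NaN in `post` marks non-respondents.
rng = np.random.default_rng(0)
pre = rng.integers(20, 45, size=280).astype(float)   # opening-survey scores (12-60 scale)
post = pre + rng.normal(8, 9, size=280)              # follow-up scores
post[rng.random(280) > 0.52] = np.nan                # ~48% did not respond

# Complete-case analysis: paired t-test and Cohen's d for paired samples
# (here computed as mean difference / SD of differences, one common definition).
responded = ~np.isnan(post)
diff = post[responded] - pre[responded]
t, p = stats.ttest_rel(post[responded], pre[responded])
d = diff.mean() / diff.std(ddof=1)

# Sensitivity analysis with last observation carried forward (LOCF):
# non-respondents keep their opening-survey score, i.e., zero change.
diff_locf = np.where(responded, post - pre, 0.0)
d_locf = diff_locf.mean() / diff_locf.std(ddof=1)
print(f"complete-case d = {d:.2f}, LOCF-adjusted d = {d_locf:.2f}")
```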
Moreover, the quantitative data suggested that discovering appropriate DSCTs, rather than simply regression to the mean, was related to digital self-control improvement: among participants who had found at most one ‘very useful’ strategy (n = 108), the effect size of the digital self-control increase was d = 0.84. However, the effect size was substantially larger (d = 1.21) among participants who had managed to find two or more ‘very useful’ strategies (n = 38). Number of ‘very useful’ strategies discovered was unrelated to the number of participants in a workshop (r = -0.002, p = .98), but strongly correlated to the number of strategies participants had tried (r = .53, p < .001).
Finally, as indicated in Figure 4d, participants’ self-reported daily time spent declined on laptop computer (r = .29, p = .001, median difference = -60 minutes, interquartile range of differences = -120 to 60 minutes), smartphone (r = .29, p = .004, median difference = 0, IQR = -60 to 45 minutes), and tablet (r = .38, p = .04, median difference = -25 minutes, IQR = -60 to 0 minutes), but not on desktop computer (p = .9).
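These r values and median differences are consistent with matched-pairs statistics from a Wilcoxon signed-rank test. As a minimal sketch, assuming hypothetical before/after arrays of self-reported daily minutes (not our data), the effect size r = |Z| / sqrt(N) can be recovered from the normal approximation to the two-sided p-value:

```python
import numpy as np
from scipy import stats

def signed_rank_effect(before, after):
    """Wilcoxon signed-rank test on paired self-reports, with the matched-pairs
    effect size r = |Z| / sqrt(N) recovered from the two-sided p-value."""
    before, after = np.asarray(before, float), np.asarray(after, float)
    w, p = stats.wilcoxon(after, before)      # paired, two-sided by default
    z = stats.norm.isf(p / 2)                 # |Z| implied by the p-value
    diff = after - before
    return {"p": p,
            "r": z / np.sqrt(len(before)),
            "median_diff": np.median(diff),
            "iqr_diff": np.percentile(diff, [25, 75])}

# Hypothetical usage: minutes of daily laptop use before and after the workshop.
rng = np.random.default_rng(1)
before = rng.integers(240, 600, size=100)
after = before + rng.integers(-150, 60, size=100)
print(signed_rank_effect(before, after))
```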
4.2.1 Qualitative themes.
Corroborating the quantitative data, participants’ qualitative reports in the interviews and surveys described three themes of positive changes they had noticed after the workshop: First, participants described being more aware and mindful of their device use (“I noticed that I’m more aware of how digital devices distract me. I built a habit to stop and think about what I’m going to do on my phone and laptop before diving into apps and emails and webpages”, P59). Second, participants described feeling more empowered and in control, because the workshop provided them with specific tools they could apply when needed (“I finally felt that I have the tools to kind of control this, these urges that are coming on and my habits that I consider unhealthy”, P42). Third, participants said they had managed to make their device use better balanced with other valued activities (“I still have a lot of progress to do, but I have been able to reconnect with reading a bit more since the workshop, which has been good for me”, P78).
Participants also pointed out that the workshop was not a panacea: echoing the quantitative finding that the effect size was larger after 1-month than 3-month follow-ups, some participants noticed improvements after the workshop, when they felt motivated to act and try out different tools, but said that the effect faded over time (“In the beginning it was much easier to control my use of digital devices. I felt in control of it again, and felt that it was an important step in the right direction (...) but I’ve let it slip since”, P42, 3-month follow-up).
4.2.2 Active elements of the workshop.
The interview data suggested that a basic benefit of the workshop was that it helped participants set aside the time to reflect and get motivated to act (“sometimes it’s just somebody forcing you to sit down and take stock which is really what you need (...) like, ‘you know, you don’t like being distracted. You know you want to do something about it. So right now, you’re required for 30 minutes to actually think about this”’, P62). An important part of the motivational benefit was that the workshop helped participants see and feel that they were not the only ones struggling to control their digital device use (“just seeing other people in the workshop and knowing that, yeah, I’m not alone in this digital disaster right now”, P59).
In terms of more specific elements, the reflection, card sorting, and website were all overwhelmingly found ‘completely’ or ‘mostly’ useful according to the exit survey (Figure 4a). In the interviews, participants shared that the reflection helped clarify their challenges and initiate an active exploration process (“I think just by writing down what the problem was that made me sort of, you know, a small like light bulb moment which made you go, oh yeah, that is what the problem is”, P32). Thus, some participants described that their reflection process continued after the workshop, as they explored different ways to address their challenges (“the reflection in the workshop helped me begin to identify stuff. But then afterwards, like, I kind of built on it and spent a lot more time kind of thinking about it”, P11).
Second, the card sorting of strategies, combined with the website’s provision of specific DSCTs to apply them, turned the abstract challenge of digital self-control into an actionable problem, with concrete steps to follow (“It can feel very, like, out of my control. Like, there’s nothing I can really do about it. Having those different kind of things, and especially split up very clearly into the different ways of managing it, I think was very useful in terms like, ’Oh, there are things that I can do”’, P182). Thus, participants often mentioned that the workshop introduced them to tools they were unaware of and unlikely to have discovered on their own (“YouTube was my weak spot and the plug-in I installed in the workshop really helped me cut that out (...) I would otherwise never have thought of searching for something like that”, P10). Similarly, when asked in the exit survey if they had found strategies that seemed like good solutions to their challenges, the median response on a scale from 1 (‘not at all’) to 5 (‘very much’) was 4 (44% scored 5, 48% scored 4, 6% scored 3; Figure 4b). Moreover, 74% felt ‘moderately confident’ that the strategies they decided to try would be helpful (12% felt ‘highly confident’, and 13% ‘slightly confident’).
4.3 RQ3: How much do students’ preferences for different types of digital self-control tools vary?
Participants’ assessment of the usefulness of the strategies is shown in Figure 5. The strategies were not created equal: the proportion of respondents who rated a strategy ‘moderately useful’ or higher ranged from 26% (Replace distractions on the web with a to-do list) to 84% (Limit notifications), and the proportion of specifically ‘very useful’ ratings ranged from 5% (Replace distractions on the web with a to-do list) to 55% (Hide distracting features on websites). Conversely, the proportion of ‘not at all useful’ ratings ranged from 0% (Reduce your device to the tools you need) to 37% (Replace distractions on the web with a to-do list).
The top 6 strategies all seemed generally useful, receiving ‘moderately’ to ‘very useful’ ratings from at least 76% of respondents, and ‘not at all useful’ from at most 10%. All but one of these strategies were from the blocking or removing distractions category. The exception was Focus in bursts with a timer (which ranked second on proportion of ‘very useful’ ratings, at 49%).
The remaining strategies saw greater amounts of variation. For example, grey scale was the second-most frequently tried strategy, but had a somewhat uniform distribution of usefulness ratings and the second-lowest retention rate (54%). Importantly, no strategy was outright rejected: even the ‘worst’ strategy, ‘Replace distractions on the web with a to-do list’, which was found ‘not at all useful’ by 37%, was still considered at least ‘moderately useful’ by 26% of those who tried it.
We also observed a high degree of variation in how participants combined strategies. Among the 78% of respondents who still used 2 or more strategies at the follow-up (n = 114), 91% of the specific combinations of strategies were unique to a single participant. Thus, even the most common combination (Block distracting websites or apps + Hide distracting features on websites) was shared by just 4 respondents.
4.4 Design considerations for supporting digital self-control via reflection and self-experimentation
Based on the survey and interview data from the workshop, we identified several design considerations for researchers and practitioners working in the domain of digital self-control.
Illustrate full capability range of DSCTs to empower participants to identify tailored solutions. Compared to other domains of behaviour change, DSCTs have a unique power to not simply remind or recommend actions, but also to change the digital environment (an exercise app cannot support you in taking the stairs by removing the elevator at your workplace, but a DSCT can hide an unwanted newsfeed [55]). However, most people’s considerations for changing digital device use are anchored in simple all-or-nothing approaches (blocking an app entirely or not at all) [111], a notion reinforced by the limitations of pre-installed screen time tools.
Therefore, a key benefit of the workshop, which participants repeatedly stressed, was that it made them aware of the range of actionable ways in which they could take back control. This empowered them to search for ways to implement a strategy tailored to their situation, even if the specific solution they needed was not among the DSCTs included on the workshop website. For example, knowing that it is possible to use browser extensions to modify websites, P11 deleted the Instagram app off their Android phone and used an extension to remove the search area when using Instagram in the browser. This effectively helped them use Instagram for messaging without getting distracted, but was not among the tools on the website.
Demonstrating the range of what is possible is therefore key, but complexity should be carefully managed to avoid overwhelming participants. For example, P198 felt that the number of strategies included was “too many. I think you should have done eight”. Others wished they had more time to review the options before being asked to commit. Our workshop implementation attempted to reduce cognitive load by using layers of abstraction, combined with decision steps that gradually homed in on specific options. For example, we grouped DSCTs into 13 strategies, which themselves were grouped into just 4 main categories; the card sorting task supported initial reduction of options; and we limited the number of ‘commitment cards’ on the Miro board to just two for each participant, to help funnel towards action. While not perfect, this approach seemed sufficient to manage information overload for most (“the strategies part and definitely, I think, the commitment part (...) those two together sort of symbiotically were high impact for me, the commitment specifically because it forced me to narrow down my choices”, P118).
Scaffold self-experimentation in a way that is sensitive to how DSCTs differ in attrition challenges. Researchers have suggested that DSCTs face a general challenge of attrition over time [57, 58, 84]. Our interview data suggest, however, that the nature of this challenge differs between tools. For tools that require active initiation, like focusing in bursts with a timer, participants described the usual challenges of sustaining use over time. However, strategies that were ‘set-it-and-forget-it’ could benefit from inertia (“the list app replacing the YouTube newsfeed. That’s not something I have to go up and change again every morning, which made it very easy to stick with”, P118) and help users habituate to a desired state of the world (“it doesn’t even cross my mind anymore when I use these apps, it’s like I just don’t have a newsfeed anymore”, P86). This benefit seemed to apply specifically to the aspects of a set-it-and-forget-it intervention that involved removing distracting elements (“The motivational quotes themselves weren’t particularly helpful or inspirational, but the fact that they covered my Facebook and YouTube feeds was far more useful.”, P237).
From our participants’ descriptions of the post-workshop period, we conjecture that strategies which involve active initiation should be tried one at a time, to avoid feeling overwhelmed and to allow time for a new habit to form (e.g., P59 applied multiple things at once, but described in the interview that this approach had led him to not give each a proper try). We also note that whereas some DSCTs are naturally distinct in this respect, many can be used in both an actively initiated as well as a set-it-and-forget-it way. For example, tools for blocking distracting websites or apps can often be used either actively, to initiate a delineated blocking session, or be prospectively programmed to block distractions during particular hours and/or days of the week. Our data suggest that people in the set-up phase should be encouraged to consider set-it-and-forget-it options where relevant (e.g., setting up a weekly blocking schedule), to make inertia work to their advantage.
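To illustrate the prospective, set-it-and-forget-it variant mentioned above, a weekly blocking schedule can be represented as a simple mapping from weekdays to blocking windows. The sketch below is purely illustrative (real DSCTs provide their own scheduling interfaces):

```python
from datetime import datetime

# Hypothetical weekly schedule: weekday (Monday = 0) -> list of (start_hour,
# end_hour) windows during which distracting sites/apps are blocked.
SCHEDULE = {
    0: [(9, 12), (13, 17)],   # Monday work blocks
    1: [(9, 12), (13, 17)],
    2: [(9, 12)],
    3: [(9, 12), (13, 17)],
    4: [(9, 12)],
    # Saturday (5) and Sunday (6): no blocking
}

def is_blocked(now: datetime) -> bool:
    """Return True if distractions should be blocked at the given moment."""
    return any(start <= now.hour < end
               for start, end in SCHEDULE.get(now.weekday(), []))

print(is_blocked(datetime(2022, 11, 7, 10, 30)))  # Monday 10:30 -> True
```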
Prioritise self-report measures that capture individual differences in goals. Most participants wanted to change both their smartphone and laptop use, and had varying goals including gaining a sense of control, establishing clearer usage boundaries, and reducing the duration or frequency of use in various ways. Instead of introducing multiple distinct measures to capture this wide range of goals, which would make surveys overly long, we included a broader measure applicable across multiple goals and devices. Considering the range of goals our participants reported, the self-control scale we adapted from basic psychological research seemed a better fit than commonly used alternatives in DSCT research, which tend to focus on a single device (e.g., the Smartphone Addiction Scale [59]), a specific system (e.g., the NASA-TLX task load scale [35]), and/or a narrower goal (e.g., cognitive absorption [6]). Our study is a first use of this type of scale in DSCT research. We expect this or similar scale adaptations of standard self-regulation measures from psychology to be useful for other researchers in this field looking for alternatives to ‘addiction’ measures [71].
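As a brief illustration of how such a scale produces the totals analysed above (12 items rated 1-5, giving a 12-60 range), consider the scoring sketch below. The reverse-keyed item positions are placeholders, not the actual instrument:

```python
# Hypothetical positions of negatively worded items that must be reverse-scored.
REVERSE_KEYED = {1, 4, 7, 10}

def score_scale(responses: list[int]) -> int:
    """Sum 12 Likert responses (1-5), reverse-scoring negatively keyed items."""
    assert len(responses) == 12 and all(1 <= r <= 5 for r in responses)
    return sum(6 - r if i in REVERSE_KEYED else r
               for i, r in enumerate(responses))

print(score_scale([4, 2, 5, 3, 4, 4, 2, 5, 3, 4, 4, 3]))  # -> 37
```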
To interpret the scores from this scale, we suggest triangulating with other data. In our case, we used quantitative and qualitative survey data on the usefulness of specific DSCTs, as well as self-reported time spent on devices. Although our surveys did not ask participants to provide any objective measures (e.g., a screenshot of their devices’ screen time report), including them may aid researchers in interpreting the self-report data. For instance, the decline in subjective time spent on devices at the follow-up might not correspond to actual time spent [27, 28, 99].
However, we strongly suggest that objective usage measures such as screen time are mainly deployed in the service of the subjective outcome measures. One reason for this is that assessing DSCTs based on objective outcome measures such as screen time may lead to overly narrow conclusions; interventions that block devices from being used altogether may be highly effective for reducing screen time, but interfere with intended use. Moreover, there are no universally accepted guidelines as to what constitutes ‘healthy’ screen time for adults [113], so we instead recommend prioritising self-assessments of whether device use is aligned with one’s goals. Indeed, even if we did have objective standards for how adults ‘should’ be using digital devices, there would be strong reasons to keep a focus on subjective sense of control and self-efficacy within digital environments [70]: self-determination theory posits autonomy as a basic human need, and common principles in HCI design guidelines encourage support of user control as a goal in its own right [22, 68, 98, 102].
5 Discussion
To sum up, a little more than a third of participants solely wanted to change their smartphone use, but most also wanted to change their use of other devices (in particular laptop / desktop computer). Their overall goals related to using devices more intentionally, and achieving a better balance in the way they spent time. They wanted to achieve this via clearer boundaries around time and context of use, reduced amount or frequency of use, and having digital environments that supported self-control.
1-3 months after the workshop, the median participant had tried three different digital self-control strategies using DSCTs and was still using two, typically in unique combinations across multiple devices (mainly smartphone + computer). The follow-up suggested a significant improvement in digital self-control, with a large effect size in our open trial (Cohen’s d = 0.93). Moreover, participants’ self-reported time spent declined on laptop computer, smartphone, and tablet. The qualitative data corroborated the quantitative findings, with participants reporting feeling more aware and mindful of their device use, more empowered and in control, and that their use had become better balanced with other valued activities in their life.
The 13 strategies varied greatly in average usefulness: DSCTs for blocking or hiding distractions, as well as focus timers, were generally useful. Others, such as grey scale, exhibited a high degree of interpersonal variation. In the following, we discuss implications for future work leveraging DSCTs to empower end users.
5.1 Confirming the effectiveness of interventions to empower people to identify DSCTs that support their personal needs
HabitLab demonstrated that applying a range of relevant DSCTs cross-device (Chrome on PC + Android) can effectively reduce time spent on websites and apps that a person wishes to use less [56]. Our study supplements this work (which was based on objective screen time measures) with self-report data, and confirms the potential to support digital wellbeing via bespoke, cross-device applications of DSCTs [82].
Our findings were generated by triangulating quantitative and qualitative self-report data from a relatively large sample of students. However, as is the case for most studies of DSCTs [69, 95], our study did not have a control group. To assess the magnitude of the potential benefits, follow-up studies with, at minimum, a waitlist control group are needed [61, 107]. Such work could also use automated rather than self-reported screen time measures to obtain a more accurate indication of usage [27, 28, 99].
Moreover, as has been found in other studies of DSCTs, our data suggested that the effect of the workshop waned over time. Bringing about longitudinal behavioural change is a general challenge for behaviour change interventions [56, 90], but long-term evaluations are rare in DSCT research [95]. The practical approach of the present work, where we attempted to address a local challenge in a student population actively seeking out the workshop, may represent an opportunity in this respect: in the interviews, participants commonly requested that the workshop be expanded with recurrent group support (with suggestions ranging from weekly to quarterly follow-ups) to help themselves stay motivated and overcome technical challenges. Deploying regular workshops and/or ‘booster’ sessions might be an effective way to undertake longitudinal studies of self-experimentation and habit-formation with DSCTs [36].
Finally, we designed the workshop to holistically integrate components that previous research suggested would be effective for behaviour change [12, 30, 62, 86, 103]. The qualitative data suggested that the components worked well together: the group setting normalised digital self-control struggles, the reflection clarified participants’ goals, and the exploration and application of DSCTs provided actionable steps. However, controlled studies could assess the relative contributions of different elements and facilitate iterative development of variant interventions [49]. For example, group support is often a crucial factor in supporting behaviour change, and in our study helped normalise digital self-control struggles. However, a self-guided version might be valuable for people who are not willing, or able, to take part in a live group setting, and/or for stakeholders who lack the resources to train a facilitator. How might the workshop elements be effectively delivered in a form that does not require a live facilitator and/or group context? Investigating such questions will be highly relevant for stakeholders such as university counselling services seeking to implement cost- and time-effective interventions to support digital wellbeing [80].
5.2 Understanding variation in DSCT usefulness to guide effective interventions
Comparing how different DSCTs fare in practical settings may provide important data for crafting interventions that effectively guide people towards the strategies most likely to be useful [25]. To this end, our work provides one of the first direct comparisons of the subjective usefulness of multiple DSCTs.
One takeaway is that hiding distracting features on websites is one of the most consistently useful strategies in DSCTs (it received the highest proportion of ‘very useful’ ratings in our study). This confirms suggestions from previous work, which has encouraged researchers to study ways to provide ‘internal support’, i.e., changing problematic user interfaces directly, as opposed to simply providing ‘external support’ such as usage tracking or blocking [68, 74, 117]. Similarly, while not directly comparable, our ranking of strategies’ usefulness showed similarities to Kovacs et al. [58]’s comparison of 12 different interventions’ influence on time spent on Facebook. Here, the five most effective interventions all related to blocking or removing distractions, from force-closing the tab after 60 seconds to removing the newsfeed or comments.
When applying a theoretical lens to understand the patterns we observed, our findings broadly align with the ‘process model’ of self-control widely used in psychology [26, 31]. Thus, our overall pattern, in which strategies from the ‘block or remove distractions’ category were most likely to be found useful and strategies within the ‘goal reminders’ category least likely, mirrors the finding that self-control strategies which help people avoid exposure to temptation, rather than trying to overcome unwanted urges only after they have arisen, tend to be more effective [24]. Intervention designers might wish to keep this in mind.
However, our data also suggest that additional theoretical lenses, such as the dual systems model previously used in DSCT research [48, 73, 74], are important: focus in bursts with a timer had the second-highest proportion of ‘very useful’ ratings, even though it was not about reducing exposure to temptation. Rather, this strategy seemed to help people maintain control by breaking up their focus into smaller segments that feel manageable and easier to get started with. From a dual-systems perspective, this can be analysed as a way to support conscious System 2 control by increasing confidence in one’s own ability to maintain focus [73].
We encourage further research to directly compare the practical usefulness of different DSCTs, to better understand their relative challenges and opportunities, and how each may fit within a unique personal device ecosystem and behaviour change process [73, 91]. In turn, this may help researchers build recommender systems to support self-experimentation with DSCTs (e.g., ‘People who reported challenges similar to yours have found this useful’), for which we hope our open dataset can provide a useful starting point.
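To make the recommender idea concrete, the sketch below shows a minimal user-based collaborative filter over a strategy-usefulness matrix. The matrix, strategy names, and neighbourhood size are hypothetical, not derived from our dataset:

```python
import numpy as np

# Rows = participants, columns = strategies; entries are usefulness ratings
# (1-5), NaN = strategy not tried. All values here are made up.
STRATEGIES = ["Block distractions", "Hide distracting features",
              "Focus timer", "Grey scale", "Limit notifications"]
R = np.array([[5, 4, np.nan, 2, 4],
              [4, 5, 3, np.nan, 5],
              [np.nan, 4, 5, 1, 4],
              [2, np.nan, 4, 4, 3]])

def recommend(user, R, k=2):
    """Suggest untried strategies by averaging ratings from the k users most
    similar to `user` (cosine similarity over co-rated strategies)."""
    sims = []
    for row in R:
        both = ~np.isnan(user) & ~np.isnan(row)
        if not both.any():
            sims.append(0.0)
            continue
        u, v = user[both], row[both]
        sims.append(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9))
    neighbours = R[np.argsort(sims)[-k:]]
    scores = np.nanmean(neighbours, axis=0)   # may warn for all-NaN columns
    order = np.argsort(scores)[::-1]          # highest predicted rating first
    return [(STRATEGIES[i], round(float(scores[i]), 2))
            for i in order if np.isnan(user[i]) and not np.isnan(scores[i])]

# Hypothetical new user: found blocking very useful, grey scale not useful.
print(recommend(np.array([5, np.nan, np.nan, 1, np.nan]), R))
```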
5.3 Encouraging digital infrastructures that enable DSCTs to have large-scale public impact
Overall, our participants expressed three more practical goals: keeping boundaries around time and context of use, reducing amount and frequency of use, and changing digital environments such that self-regulation is easier in the first place. Support for the first two goals is well underway: most DSCTs focus on limiting time spent and/or frequency of use [95], and support for scheduled or geofenced notification limiting or app blocking is both possible and an active area of research [3, 46, 104].
As mentioned, however, research on DSCTs that provide ‘internal support’ and directly reorganise problematic within-service features has been more sporadic, even though this is one of the most potent strategies at our disposal [68, 74, 95, 117]. We want to draw attention to a major infrastructural challenge in this respect: the ‘hide distracting features’ strategy, with which people can directly address problematic aspects of their digital environment, was almost exclusively applied on laptop or desktop computer. This was not accidental. In principle, participants could apply this strategy on smartphone. But on mobile, people are used to accessing services such as social media via apps, browser extensions are less well supported (Chrome does not support extensions on Android), and some services (such as Snapchat) are not accessible in a web browser at all. Moreover, no simple solution currently exists for users to adjust user interfaces in mobile apps according to their personal needs, beyond what the developer chooses to provide by design [53, 54].
Lukoff et al. [67] recently demonstrated the benefits that this might bring: changing the YouTube mobile app such that users could turn recommendations on or off increased sense of agency, satisfaction, and goal alignment. However, substantial technical effort was required to re-implement even a limited version of the YouTube app for a single research study. Other recent work has explored technical prototypes that allow end-users to change apps by themselves, and discussed the risks and opportunities if regulators introduced a ‘right to repair’ for mobile apps (i.e., a right for end-users to exercise granular control over their user interfaces) [53]. Our findings suggest that pursuing this avenue of work could be highly valuable. We encourage researchers and regulators alike to discuss how an equivalent to browser extensions might be created for mobile apps, which could help address an infrastructural bottleneck for large-scale impact of DSCTs.
5.4 Limitations and future work
Sampling: All participants actively chose to sign up to the workshop. Hence, the distributions we observed of challenges and goals, or of DSCT choices and usefulness assessments, may not generalise to students, or other target populations, at large.
Follow-up response rate: 52% of participants filled in the follow-up survey. We conservatively adjusted for response bias by carrying forward the last observed digital self-control scores from non-respondents [33], but future work may consider how to increase the response rate (e.g., by collecting data at recurrent in-person group support sessions, rather than solely via follow-up emails).
Subjective outcome measures: Our follow-up data included subjective measures only, collected via surveys and interviews. Self-reported screen time, in particular, can correlate poorly with actual time spent [27, 28, 99]. We do believe subjective sense of time spent is valuable in its own right, not merely as a flawed proxy of objective time spent. That is, if a person has a goal to reduce time spent on their devices, it is helpful to know whether a subjective increase in digital self-control is accompanied by a subjective decrease in time spent. Nevertheless, researchers may ideally wish to collect both objective and subjective data, for example by asking survey respondents to provide numbers from the Screen Time or Digital Wellbeing features on their devices, which could be triangulated with the qualitative data.
Open trial: Our study did not include a control condition, and so we were unable to quantify the influence of factors like regression to the mean or response bias. Future work could more accurately estimate the benefits of the intervention by comparing with a waitlist condition, and/or partial or modified versions.
Selection of DSCTs: Our grouping of DSCTs into 13 strategies represented our best attempt at summarising design features in current tools in a way that was manageable for participants to consider in a card sorting exercise. The DSCTs could be grouped in other ways, and different choices of specific tools could have been made for the workshop website. The tools included in the present version of the workshop can be found on the Open Science Framework (osf.io/zmt78) and in Appendix C.
D Participants’ Concerns About Their Digital Device Use, and the Triggers Driving Those Concerns
In the following, we summarise the themes we developed to capture participants’ reflections on the concerns that brought them to the workshop (prompt: “What concerns you about your use of the internet / your laptop / your phone?”), and which internal and external triggers they thought were the cause of those concerns (“What external triggers (e.g., notifications) and internal triggers (e.g., emotions) drive the uses you’re concerned about?”).
D.1 Concerns
Device use diverts time and attention away from personally meaningful activities (95% of participants). This was most commonly expressed in relation to productivity (68% of participants), but also in relation to other valued activities such as personal hobbies or social interactions (22%). Thus, participants were concerned about the amount of time they wasted on meaningless activities on their devices (34%), because it left less time available for other things (“Waste time that could be used writing thesis (...) prevents me from investing time in learning new skills”, P117). They were also concerned that the frequency with which they, e.g., checked social media, interrupted their focus on work and studying (“Disrupting my work; making it take longer than it should do, and it being of a lesser quality”, P47), as well as on social activities (“They prevent me from connecting to the present moment / to other people (in person)”, P98).
Feeling unable to control, and dependent on, device use (52% of participants). Participants described that they found themselves using or checking their devices excessively without awareness (“Finding myself browsing social media without thinking”, P142), and/or that they felt ‘sucked in’ and unable to stop using their devices (“Looking at YouTube starts with something for work/study and then you can go down a rabbit hole because of clickbait videos”, P68, “Once I start, I can’t stop”, P61). Many felt uneasy about the dependency and pervasiveness of digital devices in their lives (14%, “I feel like I cannot leave the house without my phone”, P31), and were worried that their behaviour over time made them addicted to stimulation (7%).
Device use negatively impacts mood or mental health (43%). Participants were concerned that the content on, in particular, social media or news negatively affected their mood (“the content of the social media platforms can sometimes make me anxious or in a ‘negative space’”, P273, “Social media makes me feel inadequate”, P14). They also expressed feeling overwhelmed by the sheer amount of content/notifications/emails/messages they had to respond to (“I always feel behind on messages & guilty [...] It makes me anxious that people can contact me at any time”, P190; “Too much information that is not relevant”, P67).
Device use harms sleep or physical health (31%). Finally, participants were concerned that device use disrupted their sleep (20%, “I spend a lot of time on YouTube before bed, and that has been taking away my scheduled sleep time”, P59). Similarly, time spent on digital devices might distract them from healthier physical activities (“It ends up disrupting my exercise, food and sleep schedule”, P48), or even negatively affect their eyesight or posture (“I’m worried about my eyes because I spend so much time looking at a screen”, P25).
D.2 Triggers
D.2.1 External triggers.
Receiving notifications and communication from others (48%). Participants commonly mentioned that receiving notifications from their apps or otherwise receiving communication led to disrupted focus and/or longer time spent on their devices than they intended (“getting notifications while I am trying to work!”, P29, “I like to keep my notifications clear - so I check every little one & then spend ages on that app.”, P14).
Ubiquitous, easy & persuasive digital distractions (34%). Participants felt their devices made it difficult to avoid procrastination, because potential distractions were always just “one click away” (P81). Moreover, many complained that potential distractions, such as social media, also contained design features that made it difficult to stay on track (“endless newsfeed = difficult to put limitations in place”, P63). Sometimes participants complained that institutional rules, such as mandatory two-factor authentication, increased their exposure to distractions (“the authentication request for library access - that is so ironic - I have to use my phone at the beginning of a time i’m meant to be focusing!”, P258).
D.2.2 Internal triggers.
Attempting to avoid, or remedy, a negative internal state (88%). The most prominent theme related to inner triggers was negative emotional states, such as boredom or anxiety, feeling overwhelmed or stressed, feeling sad or lonely, or simply feeling tired. When in these states, digital devices provided easy access to functionality that could be used as an escape (“Feeling lonely at uni, or overwhelmed emotionally. Wanting to distract from said emotions.”, P144). However, they were also used as an active remedy to those feelings (“I use my phone to access inspiration (thoughts and images) when I feel bored or unsure of myself”, P53).
Negative emotions such as boredom or restlessness often arose in response to working on difficult, tedious, or vaguely defined tasks (“When I feel stuck on a problem. I have had to tell myself before ‘you won’t find the answer on social media’”, P14). In these situations, alternative activities on their devices provided the route of least effort (“Not knowing what to do next in terms of my actual work and finding youtube/social media easier to deal with”, P142).
Using digital devices when feeling down could lead to negative spirals (“when i feel anxious i go on my phone to escape but that makes it worse”, P94). For example, device use driven by avoidance of emotions elicited by work would reinforce participants’ concern that their use of digital devices undermined their productivity and leave them feeling worse (“I waste time at work by checking my phone and feel very unfulfilled after”, P256).
Expectations of availability and of what one might be missing out on (38%). Participants commonly described their challenges as driven by others’, or their own, expectations of availability. That is, participants often gave examples of (real or imagined) pressure from others to be instantly available (“Feeling like if I don’t respond immediately to friends they will become distant”, P97), and how they in turn also wanted to be available in case someone needed them (“thinking people might need me and I won’t be there”, P82).
We included within this theme people’s broader expectations of what they might be missing out on on their devices. This was often mentioned as a driver of social media use, typically in relation to staying informed or involved with social activities (“I don’t want to miss out on things if my friends are making plans etc via social media”, P74). It was also related to a desire to stay up-to-date more broadly (“Fear of missing something interesting on Twitter - usually sport or politics related - leads to endless scrolling”, P34).
Spontaneous and habitual urges to check (15%). Finally, participants often reported that they found themselves using their devices in unwanted ways out of habit (“Habitual checking of email and social media accounts for new notifications”, P107). These habits were commonly described as happening without conscious awareness (“fingers automatically click to distractions when I’m not thinking”, P238).