Received: 1 November 2022 | Accepted: 30 April 2023
DOI: 10.1002/poi3.342

RESEARCH ARTICLE

The changing role of nation states in online content governance: A case of Google's handling of government removal requests

Soyoung Park¹ | Yoonmo Sang²

¹ Humanities Research Institute, Chung‐Ang University, Seoul, South Korea
² Department of Media Communication, Sungshin Women's University, Seoul, South Korea

Correspondence
Yoonmo Sang, Department of Media Communication, Sungshin Women's University, 2, Bomun‐ro 34 da‐gil, Seongbuk‐gu, Seoul 02844, South Korea.
Email: ymsang@sungshin.ac.kr

Funding information
National Research Foundation of Korea, Grant/Award Number: NRF‐2017S1A6A3A01078538; Ministry of Education
Abstract
Building upon previous studies that divide the governance of digital platforms into three eras (Bowers & Zittrain, 2020; Flew, 2021), this study investigates how one of the most influential digital platforms, Google, has handled removal requests from governments. By sketching Google's regulatory terrain, the current study seeks a more balanced understanding of content moderation. The study took an exploratory case study approach using Google's Transparency Reports and the accompanying public data sets on governments' content-removal requests filed for all Google products from 2009 to 2021. The findings reveal a growing influence of nation states on moderating online content, with 2016 marking the point at which that influence surged. Beyond the sheer increase in take-down requests across various areas, since 2016 government interventions have outpaced those of courts; during the same period, Google, which had been more compliant with court decisions than with requests from governmental entities, showed similar compliance with government requests. The results also demonstrate how practices of requesting content removal differed by political system, singling out Russia's distinctive characteristics. This study sheds light on the role of nation states in shaping online environments in the era of platformization.

KEYWORDS
content moderation, digital platform, Google, platformization, regulation
Policy Internet. 2023;1–19. wileyonlinelibrary.com/journal/poi3 © 2023 Policy Studies Organization.
INTRODUCTION
As evidenced by international platform inquiries, policymakers across the globe have recently intensified scrutiny of digital platforms to address the accumulating harms associated with them (Popiel, 2022). There has been growing concern that digital platforms benefit from the spread of disinformation and hatred, as they fall short of monitoring and regulating the illegal or harmful content circulated on them (Calo & Hartzog, 2021). Some even argue that digital platforms have exercised "almost unfettered, sovereign power at global scale" (Gorwa, 2019, p. 861).
With rampant image‐based sexual abuse materials, extremist content, and mis‐ and
disinformation hosted and amplified through digital platforms, we see a regulatory
return across the globe (Flew et al., 2021). Among legislatures and policy agencies,
there have been numerous discussions on the necessity of changing laws to hold
digital platforms more responsible for real‐world harms (Flew & Wilding, 2020; Gillespie
et al., 2020). As a result, new or updated laws, such as Australia's Online Safety Act 2021, the UK's proposed Online Safety Bill, and Germany's NetzDG, have been enacted or introduced in national legislatures to combat illegal and/or harmful content
hosted on digital platforms. In the United States, there has also been a heated debate
around modifying the scope of immunity granted to platforms by amending Section 230
of the Communications Decency Act (Smith & Van Alstyne, 2021).
Building upon previous studies that divide the governance of digital platforms into
three eras, namely, the eras of “rights”, “public health,” and “process” in Bowers and
Zittrain (2020) and the stages of “libertarian Internet,” “platformized Internet,” and
“regulated Internet” in Flew (2021), this study investigates how one of the most
influential digital platforms, Google, has handled government removal requests from
2009 to 2021.1 Google has become an integral part of our lives, albeit with varying
degrees of influence in different countries, and as Haider and Rödl (2023) aptly noted,
“the platform must be understood as a key participant in the creation of meaning in
society” (p. 2). By sketching the regulatory terrain of Google as a case study, this study
seeks a more balanced understanding of content moderation. The number of content‐
removal requests Google receives from countries across the globe continues to grow
(Brown, 2022).
The major scholarly discussions (Bowers & Zittrain, 2020; Flew, 2021) have laid out a "big picture" of digital governance on a global scale, and some previous studies (e.g., Garbe et al., 2021) have explored how nation states with different political regimes engage in content regulation differently. However, from an empirical perspective, relatively little is known about how Google and other global digital platforms handle content-removal requests from governments. Additionally, it is worth taking a closer look within each of these three eras to develop a more robust theory of Internet governance. To fill this gap in the literature, we selected
an exploratory case study approach using as our primary data sources Google's
Transparency reports and accompanying public data sets relating to governments'
content‐removal requests directed at all Google products from 2009 to 2021. With a
world‐class search engine and its leading content‐hosting platform YouTube, Google
could be a good case study given its international influence over the global digital
public sphere and the long‐standing struggle between internal governance and external
oversight to mitigate potential harms caused by online content.
19442866, 0, Downloaded from https://onlinelibrary.wiley.com/doi/10.1002/poi3.342 by University of Canberra Library, Wiley Online Library on [11/05/2023]. See the Terms and Conditions (https://onlinelibrary.wiley.com/terms-and-conditions) on Wiley Online Library for rules of use; OA articles are governed by the applicable Creative Commons License
LITERATURE REVIEW
The governance of digital platforms
Our online activities are increasingly dependent upon digital platforms. As digital platforms
mediate our daily lives, their corporate purveyors have effectively accumulated platform
power and sought economic, cultural, and political influences in society. It is worth noting
that Google Search has come under scrutiny by researchers who contend that it lacks
algorithmic transparency and sometimes creates “data voids” (Haider & Rödl, 2023; Steiner
et al., 2022). However, these corporations have argued that their platforms are simply “the
conduits for communication activities of others” (Flew, Martin, & Suzor, 2019, p. 45). This
argument has enabled such companies to evade strict regulatory oversight that applies to
traditional media and telecom companies. Distrust of digital platforms has galvanized policy
debates about how to effectively regulate illegal or harmful content produced and distributed
on digital platforms (Popiel & Sang, 2021).
The claim that digital platforms should be exempted from liability is increasingly under
challenge (Napoli, 2019). As Napoli and Caplan (2017) argued, the framing of digital
platforms “purely as technology companies marginalizes the increasingly prominent political
and cultural dimensions of their operation, which grow more pronounced as these platforms
become central gatekeepers of news and information in the contemporary media
ecosystem” (para. 45).
Previous studies have argued that a laissez‐faire regulatory approach to digital platforms
is posing risks and real harms to both users and society as a whole (Ananny &
Gillespie, 2016; Flew et al., 2019; Napoli & Caplan, 2017). As Fay (2019) aptly noted, “while
platforms are pervasive in everyday life, the governance across the scale of their activities is
ad hoc, incomplete and insufficient” (para. 2). Based on continued criticism over limited
regulatory oversight, the focus of the discussion has shifted from whether or not to regulate,
to who should be in charge of regulating harmful online content and what form of
accountability and transparency should be required of digital platforms in relation to content‐
moderation decisions. Flew, Gillett, et al. (2021) found that the global regulatory
environment is increasingly turning towards greater external regulation.
The regulatory shift in content moderation
Previous studies have argued that digital platforms are not doing enough to monitor and
regulate illegal or harmful content online. As Picard and Pickard (2017) noted, it is also true
that digital platforms are “increasingly monitoring, regulating, and deleting content, and
restricting and blocking some users, functions that are very akin to editorial choices” (p. 6).
At the same time, across the globe, there has been a growing array of platform inquiries
(Popiel, 2022).
As digital platforms have seen their influence grow in various aspects of our society, the
policy debates around platform governance have also evolved. Bowers and Zittrain (2020)
proposed that Internet governance can be divided into three eras: the era of “Rights,” the era
of “Public Health,” and the era of “Process” (p. 1). During the “Rights” era from the early
1990s to about 2010, the regulatory approach was to protect then‐nascent Internet
intermediaries from regulatory oversight. During the “Public Health” era (since about 2010),
digital platforms were expected to mitigate and address aggregate harms resulting from user
interactions on a massive scale. Finally, they argued that we are on the verge of or aiming
for the “Process” era that pays attention to the legitimacy of content‐moderation decisions by
seeking broad consensus around those decisions and their implementation.
Flew (2021) also discussed three stages of Internet governance. The first is the open
Internet or “libertarian internet,” spanning from 1990 to 2005. In this period, there was little
intervention by governments or digital platforms themselves. The second stage, from 2006
to the present, is the “platformized internet,” where big tech companies dominate users'
online activities. In this period, there has been a call for greater regulation of online activities
to mitigate social harms as well as demands for antitrust intervention. Our society
is only now entering the third stage, the “regulated internet.” In this period, nation
states are “becoming increasingly important actors in shaping online environments”
(Flew, 2021, p. 12).
Most of the existing research on content moderation focuses on the US or Western democracies. For the rest of the world, empirical research is lacking on how content moderation on platforms has been performed over time, especially on how these practices fit within a larger paradigm. In addition, little is known about how regime-specific factors influence regulatory decisions (Garbe et al., 2021).
We still do not know much about how the world's most influential platforms, such as Google, make content-moderation decisions and conform to government content-removal requests from different parts of the world. One recent study found that countries where social vulnerability is high and levels of freedom of speech are low are likely to send more content-removal requests to Google (Min et al., 2021). Min et al. (2021) also reported that different patterns are observed between democratic and nondemocratic societies in terms of what factors lead to variation in the number of government content-removal requests.
Departing from Min et al.'s (2021) study, this study focuses on the evolution of the roles of governments in regulating online content and shaping online environments. There is still a dearth of empirical research on the roles nation states play in shaping online regulatory environments in tandem with digital platforms such as Google. This study aims to fill that gap by examining the following research questions: (1) How have nation states engaged in making content-removal requests to Google to regulate harmful content online, and how has Google responded to these requests over time? (2) How do government content-removal requests vary across different political systems over time? In doing so, we pay attention to changes in recent times, namely the era of "Public Health" or the "Platformized Internet," which provides a starting point for substantiating each "big picture" with concrete evidence as well as a glimpse into the transition to a new era.
METHODS
Data set
Content take‐down requests to platform giants like Google are becoming a common way for governments to regulate online content. Since launching its first Transparency Report in 2010, Google has received over 250,000 removal requests across all of its products from government bodies in a total of 154 countries, comprising courts, national or local government agencies, and law enforcement professionals (Google, n.d. a, b).2 Google reviews those requests and decides whether to comply with them based on each country's laws as well as Google's product policies and community guidelines. Records of these requests, reviews, and subsequent actions have been made available to the public on a 6‐monthly basis from the second half of 2009 to the first half of 2021. The study uses this entire set of government content‐removal request data from 2009 to 2021 (Google, 2022), which corresponds to the "Public Health" era and the "Platformized Internet" period.
Content analysis
Definition and measures
Drawing on Google's public data sets of government content-removal requests, the content analysis examined the growing role of nation states over time, using measures such as the number of requests, the number of items requested for removal, and the percentage of requests Google complied with in each country. Each measure is broken down by judicial and executive branch, a classification we applied because executive and judicial authorities function in many countries as separate powers (Vibert, 2007). Although it depends on the type of governance arrangement (e.g., parliamentarism, presidentialism, and semi-presidentialism, which rely on interdependence with the legislature), the executive generally implements laws and exercises policy-making powers; if highly concentrated, this power can lead to undemocratic rule and severe political conflict (OECD, 2022). The judiciary, in turn, interprets and applies the provisions of laws. Grounded in the principle of independence, courts offer a kind of alternative democratic forum in which public or private bodies can contest the state or other public/private entities (CCJE, 2015). We also highlight other contextual factors, such as the rationale for each request. The unit of analysis was the count and percentage value assessed under each stated item over a 6‐monthly period.
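For illustration, the per-period, per-branch compliance measure described above can be computed as follows. The records and field names here are hypothetical stand-ins, not the actual schema of Google's published data sets:

```python
from collections import defaultdict

# Hypothetical records mirroring the general shape of the Transparency
# Report data: (period, country, branch, items_requested, items_removed).
# All numbers below are made up for illustration.
records = [
    ("2016H1", "Brazil", "court", 120, 90),
    ("2016H1", "Brazil", "executive", 200, 80),
    ("2016H2", "Brazil", "court", 150, 120),
    ("2016H2", "Brazil", "executive", 310, 140),
]

def compliance_by_branch(rows):
    """Return {(period, branch): compliance_rate}, where the rate is
    items removed divided by items requested, as a percentage."""
    requested = defaultdict(int)
    removed = defaultdict(int)
    for period, _country, branch, req, rem in rows:
        requested[(period, branch)] += req
        removed[(period, branch)] += rem
    return {key: 100 * removed[key] / requested[key] for key in requested}

rates = compliance_by_branch(records)
print(rates[("2016H1", "court")])      # 75.0
print(rates[("2016H1", "executive")])  # 40.0
```

The same aggregation generalizes to per-country or worldwide rates by changing the grouping key.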
First, the study separately sampled the top 25 countries with the most removal requests out of all 154 countries on record, in addition to examining all countries as a whole (Figure 1). Narrowing the scope in this way helps articulate the changing influence of nation states over time, because not all 154 countries share the same conditions or are at the same stage when it comes to regulating content on Google.3 Given that countries with few requests, or that began requesting late, might skew the signal of change, we compared the global trends with those of the top 25 countries.
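This sampling step amounts to ranking countries by their cumulative request totals and keeping the top of the list. The totals below are hypothetical placeholders (only Russia's outsized volume reflects the pattern reported later in this study):

```python
# Hypothetical cumulative request totals per country; the real values
# come from Google's Transparency Report exports.
totals_by_country = {
    "Russia": 123000, "India": 9000, "South Korea": 8000,
    "Turkey": 7500, "Brazil": 7000, "United States": 5000,
    # ... the remaining countries of the 154 on record ...
    "Iceland": 12,
}

# Keep the countries with the most removal requests (top 3 here for
# brevity; the study uses the top 25 of 154).
top = sorted(totals_by_country, key=totals_by_country.get, reverse=True)[:3]
print(top)  # ['Russia', 'India', 'South Korea']
```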
FIGURE 1  Global government requests for content removal from Google, 2009–2019.
Second, we consider the type of political system as a factor that characterizes the ways nation states exert their power. Depending on whether a country is democratic or autocratic, the form of national intervention in content removal may vary, as reflected in factors such as the type of requester, the primary reason for the request, and the pattern and intensity of requests over time. Although it has been argued that a long history of Internet control applies to most countries regardless of the characteristics of the political system (Flew, 2021), it remains unanswered how these controls were exercised and patterned by different types of political regimes, especially in the realm of content moderation.
According to Lührmann et al. (2018), political systems can be broadly classified into four types: closed autocracy, electoral autocracy, electoral democracy, and liberal democracy. They define democracies as "regimes that hold de‐jure multiparty elections and… six institutional guarantees (elected officials, free and fair elections, freedom of expression, alternative sources of information, associational autonomy, and inclusive citizenship)" (p. 62). By this definition, liberal democracies are distinguished from electoral democracies by inclusive citizenship, under which citizens enjoy individual and minority rights. On this basis, the study employed the classification of the "Regimes of the World" data (https://ourworldindata.org/regimes-of-the-world-data), which measures democracy by expert evaluation according to the above criteria. The 25 sampled countries were categorized accordingly in Table 1.4
Our final note points to one extraordinary case: Russia. According to Nocetti (2015), Russia has been considered a country that supports "increased state control over online space—even though this would ultimately result in limitations on the open network concept that made the internet possible" (p. 117). When sampling the countries placing the most content-removal requests, Russia stands out for the sheer number of its requests (see Hovyadinov, 2019): its removal requests almost always outnumbered those of all other countries combined. Since this scale may obscure the changes in other countries, this study treated the Russian case separately in subsequent analyses and analyzed temporal trends for the other 24 sample countries.
Analysis procedures: Visualizing temporal narratives of changes
For the first research question, the study observed overall temporal changes in the number and characteristics of requests for all countries recorded in the data set, including the authority from which the requests originated (e.g., court or government) and how Google's compliance changed in response over time. The study also examined the stated reason for each removal request to add further context. In doing so, we
TABLE 1  25 sample countries by political regime.

Political regime        Countries
Closed autocracy        China, Libya, Thailand, Vietnam (4)
Electoral autocracy     India, Kazakhstan, Pakistan, Turkey, Russia (5)
Electoral democracy     Argentina, Brazil, Indonesia, Ukraine (4)
Liberal democracy       United States, United Kingdom, Germany, France, South Korea, Italy, Israel, Spain, Canada, Japan, Australia, Netherlands (12)

Note: Regimes are ordered from least to most democratic.
also compared the world case to the 24 sampled countries with the most requests to capture a more distinct and refined trend in how national governance responds to content moderation.
Note that, as more countries began submitting removal requests to Google over the years, the number of samples (i.e., countries) in each collection period is not uniform. Requests from newly involved countries are generally fewer, so reporting the analysis based on country means may skew the overall trend. Hence, the worldwide analysis is mainly based on cumulative sums; mean-based analysis is used where the number of sample countries is constant (i.e., the 24 countries).
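A minimal sketch, with made-up numbers, of why the cumulative sum is more robust than the per-period country mean when the panel of reporting countries grows over time:

```python
# Early periods have few countries with many requests; later periods
# add new countries that file only a handful, dragging the mean down
# even as total activity rises. All figures are illustrative.
periods = [
    {"A": 100, "B": 80},                   # two long-standing requesters
    {"A": 120, "B": 90, "C": 3},           # country C joins with few requests
    {"A": 140, "B": 100, "C": 5, "D": 2},  # country D joins as well
]

means = [sum(p.values()) / len(p) for p in periods]
totals = []
running = 0
for p in periods:
    running += sum(p.values())
    totals.append(running)

print(means)   # [90.0, 71.0, 61.75] -- mean falls although activity grows
print(totals)  # [180, 393, 640]     -- cumulative sum shows the true trend
```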
The second question was addressed by examining how the efforts of nation states to
moderate online content through removal requests differ based on the political system. To
this end, the study classified the 24 sample countries according to the types of the political
systems consisting of closed autocracy, electoral autocracy, electoral democracy, and
liberal democracy. As mentioned, due to the overwhelming number and uniqueness of
requests, Russia was treated and analyzed as a separate political system. Drawing on these
characteristics, the study investigated the practices of different country groups for content
moderation to determine whether there are common or distinct characteristics under each
political regime.
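This grouping step can be encoded and checked programmatically. The dictionary below is our own encoding of Table 1 (the regime labels follow the "Regimes of the World" classification cited above), not part of the study's released data:

```python
from collections import Counter

# Country-to-regime mapping transcribed from Table 1.
REGIME = {
    "China": "closed autocracy", "Libya": "closed autocracy",
    "Thailand": "closed autocracy", "Vietnam": "closed autocracy",
    "India": "electoral autocracy", "Kazakhstan": "electoral autocracy",
    "Pakistan": "electoral autocracy", "Turkey": "electoral autocracy",
    "Russia": "electoral autocracy",
    "Argentina": "electoral democracy", "Brazil": "electoral democracy",
    "Indonesia": "electoral democracy", "Ukraine": "electoral democracy",
    "United States": "liberal democracy", "United Kingdom": "liberal democracy",
    "Germany": "liberal democracy", "France": "liberal democracy",
    "South Korea": "liberal democracy", "Italy": "liberal democracy",
    "Israel": "liberal democracy", "Spain": "liberal democracy",
    "Canada": "liberal democracy", "Japan": "liberal democracy",
    "Australia": "liberal democracy", "Netherlands": "liberal democracy",
}

# Group sizes should match those reported in Table 1: 4, 5, 4, and 12.
counts = Counter(REGIME.values())
print(counts["closed autocracy"], counts["electoral autocracy"],
      counts["electoral democracy"], counts["liberal democracy"])
```

With Russia analyzed separately, the electoral-autocracy group used in the trend comparisons has four members rather than five.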
RESULTS
Changes in government take‐down requests and Google's compliance
over time (RQ1)
Changes in the number of requests
Using the Google case, the study drew insights regarding how governments' actions toward content moderation have shifted during the past decade. First of all, Figure 2 shows how many countries worldwide started submitting removal requests to Google each year for content regulation. This number grew in the early 2010s, and several new countries continued to join.

FIGURE 2  Number of countries starting to submit removal requests by year (2009–2021).
While the number of requesting countries has gradually flattened, the number of requests has risen over the past decade (blue dashed line in Figure 3). In the top 24 countries, which account for most of the world trend, the number shows three phases of uptrend: mid-2013, mid-2018, and late 2020, with around 3.5, 7, and 9 K total requests, respectively. Notably, these same periods coincided with a marked increase in policy-based removal requests (green). Policy-based requests rose temporarily in mid-2013, but it was in 2016 that they began to consistently outpace court-driven removal requests. Comparing worldwide requests (left) to those from the top 24 countries (right) also shows that requests to Google for content removal are dominated by a few dozen countries.
Since 2016, the number of items requested for deletion has also increased accordingly. Worldwide, the number of items government agencies asked to have deleted surged to about 180,000 in the second half of 2017, nearly nine times the previous level of 20,000.

The presence of nation states also appears distinct in the number of items requested for removal. As Figure 4 shows, government executive branches requested the take-down of 140 K and 200 K items in 2010 and 2017, respectively, compared with 100 K items driven by court decisions in 2016. In the top 24 countries alone, the number reached 400 K, double the 2017 record, driven mainly by the government executive. These surges are examined further later in this section.
FIGURE 3  The number of removal requests to Google between 2009 and 2021, worldwide (left) and top 24 countries (right). These trends do not include Russian observations.

FIGURE 4  Google compliance between 2009 and 2021, worldwide (left) and top 24 countries (right).

Changes in Google's compliance

Our study shows that Google appears to be more responsive to the demands of the judiciary than of the executive. In the world view in Figure 4 (left), court-ordered removal requests were more likely to be accepted by Google, except during the 2017 period. Nevertheless, Google gradually complied with more policy-led requests in the late 2010s than in the early 2010s. The gap between Google's compliance with court requests and its compliance with executive requests gradually narrowed, reaching its smallest point over the 2016–2018 period. A similar pattern appears in the top 24 countries: greater compliance with judicial requests than with executive requests. With the exception of the same 2016–2018 window, Google's compliance rates for executive requests are generally lower than those for judicial requests, and the two increasingly diverged in 2021. Although Google's average compliance for these high-request countries was higher for most of the study period (by around 10%), the recent compliance gap between the executive and judicial branches suggests that Google may be returning to its "normal" status. Considering the current trend, in which the executive's take-down requests to Google are clearly increasing, closer attention should be paid to how Google responds in the future.
Changes in underlying rationale of requests
Additionally, we can further contextualize the increased direct nation‐state intervention in
content regulation via the aforementioned executive‐led actions by exploring the underlying
reasons for making such a request.
The most common reasons for removal requests were the claims of defamation,
violations of user privacy or personal information, large‐scale security threats, and
obscenity/nudity (Table 2). Specifically, during the early years between 2013 and 2014,
TABLE 2  Top 10 reasons for removal requests.

Rank  World (except Russia)                 24 most-regulated countries
1     Defamation (29,497)                   Defamation (28,153)
2     Privacy and Security (17,830)         Privacy and Security (16,511)
3     National Security (6569)              National Security (6297)
4     Obscenity/Nudity (4110)               Obscenity/Nudity (4000)
5     Other (3465)                          Other (3172)
6     Regulated Goods and Services (3220)   Regulated Goods and Services (2933)
7     Copyright (3024)                      Fraud (2931)
8     Fraud (3016)                          Government Criticism (2818)
9     Government Criticism (2985)           Religious Offense (2473)
10    Religious Offense (2566)              Copyright (2414)
FIGURE 5  Removal requests in the top 24 countries by reasons, 2010–2021.
indecency was the primary reason for content removal. Defamation and privacy invasion then became the principal reasons between 2017 and 2018. Toward the late 2010s, threats to national security and other cybercrimes related to fraud and illegal goods and services emerged as major drivers of requests (Figure 5).
Next, we can look at the main reasons why the executive and the judiciary, respectively, requested content take-downs (Figure 6). For the courts, requests throughout the 2010s appear to stem primarily from defamation and privacy breaches. The executive bodies' requests, by contrast, were made for diverse reasons: their focus was mainly on regulating lewd content in the early 2010s, but the second half of the decade was spent combating various issues of concern, such as fraud, breaches of national security, religious offense, violence, and government criticism, in addition to defamation and invasions of privacy.
Comparing government take‐down requests by political systems (RQ2)
Differences in the number of government requests and Google's compliance
Figure 7 compares how the pattern of government content-removal requests differed across political systems over time and how Google complied with them. The left chart shows the change over time in the total number of requests, in the following order: electoral autocracies, liberal democracies, electoral democracies, and closed autocracies. Although take-down requests increased significantly in most political systems during 2018–2019, there were relatively few government requests in electoral democracies and closed autocracies. A plausible explanation has to do with political incentives to regulate voice online. In countries whose political systems do not guarantee people the legal right to elect who governs them (i.e., closed autocracies), regulating political voice online may be less of a concern than in countries with elections. Other reasons may be a strong separation of powers and political vocalness. Electoral democracies, unlike electoral autocracies, are likely to have firm systems to prevent the abuse of political power in the online political arena; at the same time, unlike liberal democracies, which must be more responsive to diverse public voices, they are likely to have less "vocal" political cultures. These factors may have contributed to the relatively fewer take-down requests.
Regarding Google's compliance (right), rates for electoral autocracies tend to be low, mostly below 40%, especially during the second half of the 2010s, in contrast to liberal democracies, whose rates are the highest, in the range of 60%–80% for most of the period. While Google and other US-based technology companies (Amazon, Meta, Apple, and Microsoft) have long been accused of "favoring commercial benefits and private interests over public ones" rooted in neoliberal market values (van Dijck, 2020, p. 2), this result shows that their value systems might also have a deeper ideological root aligned with "American's view of what is appropriate" (Roberts, 2019, p. 64) even while operating in different political regimes.

FIGURE 6  Removal requests in the top 24 countries by reason, 2010–2019: from the judicial branch (left) or the executive branch (right).

FIGURE 7  Removal requests and Google's compliance by political systems, 2009–2021. Left: mean requests; right: compliance rate.

FIGURE 8  State of removal requests from Russia: number of requests (left), compliance rate (right).
Although its political system is classified as an electoral autocracy, the Russian case is
dealt with separately below because its scale overwhelms the changes in other countries
(Figure 8). In Russia, the notable changes appear to have started in 2016, and most of them
can be explained by the actions of government authorities. The year 2016 is interesting in
that it signals the Russian government turning its attention to internet regulation right
after the 5‐year period from 2011 to 2016, when “the (Russian) government forced changes
of ownership over several significant newsrooms with pan‐Russian reach, all of them
previously associated with independent reporting” (Schwarz, 2022, pp. 164–165). At its
peak, the number of removal requests from Russia alone was about 20,000, twice that of
all other countries combined. Google's overall compliance with Russian requests is almost
identical to its compliance with requests from the Russian government, and since 2013 it
has been around 60%–90%, higher than compliance with other systems or countries.
To examine possible differences in compliance across the five political regimes, we
conducted a two‐way analysis of variance. Simple main effects analysis demonstrated a
statistically significant difference in compliance according to the type of political system
(F(4, 348) = 27.49, p = 0.001).
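As a sketch of this kind of comparison: the snippet below runs a one‐way ANOVA across regime types followed by Tukey's HSD using scipy. The group means are invented for illustration, not the study's values, and the paper's actual model was a two‐way design, so this is an approximation of the simple main effect only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical yearly compliance rates (%) for each regime type;
# the means below are invented, not taken from the study.
groups = {
    "closed_autocracy":    rng.normal(55, 10, 30),
    "electoral_autocracy": rng.normal(38, 10, 30),
    "electoral_democracy": rng.normal(62, 10, 30),
    "liberal_democracy":   rng.normal(70, 10, 30),
    "russia":              rng.normal(75, 10, 30),
}

# Omnibus test: do mean compliance rates differ across regime types?
anova = stats.f_oneway(*groups.values())

# Post hoc pairwise comparisons with family-wise error control,
# analogous to the contrasts reported in Table 3.
tukey = stats.tukey_hsd(*groups.values())

print(f"F = {anova.statistic:.2f}, p = {anova.pvalue:.3g}")
print(tukey)
```

The Tukey result reports, for each pair of groups, the mean difference and an adjusted p value, which is the shape of the contrasts shown in Table 3.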
Specifically, we performed Tukey's test for multiple comparisons to see how compliance
rates differed across political systems. Table 3 reveals that Google has been less compliant
in Electoral Autocracy systems than in Closed Autocracy systems (−16.67 ± 4.82, p = 0.005).
In summary, Google's compliance is lowest in Electoral Autocracy countries, followed by
Closed Autocracy and Electoral Democracy, and is highest in Liberal Democracy countries
and Russia. Surprisingly, though not significantly so, Google's compliance with Russia was
higher than with Liberal Democracy countries.
Differences in underlying rationale of requests
The main bases for removal requests in each system are shown in Table 4 and Figure 9.
While closed autocracy systems primarily sought to prevent government criticism and
violence, the other systems most often raised defamation issues, followed by claims of
invasion of privacy. In electoral autocracies, most requests were made on the grounds of
defamation, along with various other reasons, such as content obscenity in 2013 and
national security, fraud, and religious offense during the late 2010s. Removal requests in
liberal democracy systems were also made for similar reasons.
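A ranking such as Table 4 can be rebuilt from per‐reason totals with a grouped sort. As before, the table shape is a hypothetical sketch; a few counts echo Table 4 for flavor.

```python
import pandas as pd

# Toy per-reason totals for two regime types (counts echo Table 4).
counts = pd.DataFrame({
    "regime": ["closed_autocracy"] * 3 + ["liberal_democracy"] * 3,
    "reason": ["Government Criticism", "Violence", "Defamation",
               "Defamation", "Privacy and Security", "National Security"],
    "n": [1977, 957, 53, 13741, 9858, 3306],
})

# Top reasons per regime: sort by count descending, then keep the first
# k rows of each group (k=2 here for brevity; Table 4 uses k=10).
top = (
    counts.sort_values("n", ascending=False)
          .groupby("regime", sort=False)
          .head(2)
)
print(top)
```

Sorting before the `groupby(...).head(k)` is what makes `head` return the k largest rows per group, since `head` keeps rows in their current order.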
The increase in removal requests appears similar in Russia, but the main reasons for the
requests are somewhat different (Table 4 and Figure 10). Between 2016 and 2017, the
Russian government appeared almost obsessed with removals for national security reasons.
In 2018, Russian authorities turned to claims of infringement of various local laws, including
the illegal sale, trade, or advertising of pharmaceuticals, alcohol, tobacco, fireworks,
weapons, gambling, prostitution, and/or health and medical devices or services; removal
requests on copyright grounds have also increased steadily since then.
DISCUSSION AND CONCLUSION
Based on the observations above about government content‐removal requests between
2009 and 2021, we can identify three main phases that offer an in‐depth understanding of
the “Public Health era,” or the phase of the “Platformized Internet.”
TABLE 3 Multiple comparisons of the compliance rate.

Comparison                                      Contrast    S.E.    p Value
Electoral Autocracy v. Closed Autocracy         −16.666     4.822   0.005
Electoral Democracy v. Closed Autocracy           7.711     4.848   0.504
Liberal Democracy v. Closed Autocracy            13.630     4.137   0.009
Russia v. Closed Autocracy                       20.089     6.879   0.030
Electoral Democracy v. Electoral Autocracy       24.378     4.251   0.000
Liberal Democracy v. Electoral Autocracy         30.296     3.418   0.000
Russia v. Electoral Autocracy                    36.755     6.473   0.000
Liberal Democracy v. Electoral Democracy          5.919     3.454   0.427
Russia v. Electoral Democracy                    12.378     6.492   0.315
Russia v. Liberal Democracy                       6.459     5.979   0.817
TABLE 4 Top 10 reasons for removal requests by political systems.

Rank  Closed autocracy                    Electoral autocracy           Electoral democracy           Liberal democracy                     Russia
1     Government Criticism (1,977)        Defamation (9,436)            Defamation (4,923)            Defamation (13,741)                   National Security (45,399)
2     Violence (957)                      Privacy and Security (4,307)  Privacy and Security (2,311)  Privacy and Security (9,858)          Copyright (41,489)
3     National Security (187)             Obscenity/Nudity (3,360)      Electoral Law (828)           National Security (3,306)             Regulated Goods and Services (23,484)
4     Regulated Goods and Services (109)  National Security (2,702)     Other (530)                   Regulated Goods and Services (2,406)  Drug Abuse (11,570)
5     Other (86)                          Religious Offense (2,411)     Adult Content (440)           Fraud (1,762)                         Hate Speech (3,485)
6     Hate Speech (55)                    Fraud (1,029)                 Impersonation (280)           Other (1,754)                         Fraud (3,149)
7     Defamation (53)                     Reason Unspecified (920)      Copyright (270)               Bullying/Harassment (1,520)           Violence (3,096)
8     Privacy and Security (35)           Other (820)                   Reason Unspecified (210)      Copyright (1,405)                     Other (2,408)
9     Reason Unspecified (18)             Copyright (720)               Trademark (168)               Trademark (904)                       Suicide Promotion (2,215)
10    Copyright (17)                      Government Criticism (666)    Hate Speech (150)             Violence (810)                        Government Criticism (1,656)
FIGURE 9 Reasons for removal requests by political systems.
The first phase is the growth of moral censorship in the early 2010s, when governments'
content regulation was mainly driven by moral concerns, such as blocking illegal sexual
materials and other indecent content. The next juncture points to stronger protection of
individual rights, reducing the risks of privacy infringement and defamation, which prevailed
in the mid‐2010s as awareness grew of the vulnerabilities of the “free” online information
environment. This trend continued through the late 2010s, developing into a movement to
strengthen the overall power of the state, enabling it to “police” the Internet for various
reasons. During this phase, content was requested for removal over national security
concerns or claims of criticizing government policy or politicians. Notably, state authority
extended further into the policing of other illegal or criminal activities such as fraud, drugs,
and other regulated content or services. Across these processes, the overarching trend of
the decade boils down to increased coordination among national sovereignty, judicial
systems, and platforms, centered on the growing presence of government authorities. The
growing size and scope of government requests, and the fact that over time the compliance
rate of the once court‐friendly platform has come to match that for nation‐state requests,
are also indicative of the growing importance of nation states in online content governance.
The late 2010s, specifically the year 2016, could be considered a historic marker of nation
states' accelerating impetus for regulating online content. We cannot know exactly how
much governments were influenced by the case of Russian interference in the 2016 US
presidential election; given that the take‐down request trend began to reverse in the first
half of 2016, government officials may have started to become aware of online harms before
that. Still, given the surge in government requests to take down content during late 2017,
when the US government formalized an investigation into the case as a result of
congressional disputes, we can at least speculate that this event may have served as a
catalyst for tightening government control over the Internet.

FIGURE 10 Reasons for removal requests from Russia, 2011–2021.
This “national turn” signals what the future might hold for our right to freedom of
expression. Beyond the sheer size and scope of regulation, the results of this study show
how differently online content is regulated by governments and courts across the globe.
Our observations suggest that courts have focused on regulating a relatively small number
of key areas, while the executive branch has gradually expanded its scope of regulation into
a variety of content sectors. The expansion of these executive interventions, which implies
the potential for unchecked and undemocratic power to be exercised when highly
concentrated, suggests growing uncertainty about the status of freedom of expression.
Moreover, the Russian case offers another example of how even a “platformized” internet
can be tamed when a nation is strongly motivated to keep a platform under control. Although
not statistically significant, Google's rate of compliance with Russian requests was higher
than in liberal democracies, let alone other types of political systems. More democracy does
not always guarantee more compliance from the platform, as evidenced by compliance with
electoral autocracies being lower than with the least democratic closed autocracies. Still,
the general tendency for Google to be more compliant in democracies than in autocracies
suggests that the value system of this US‐based platform company, even while operating in
different political systems, has a deeper ideology consistent with “the American view of
what is appropriate” (Roberts, 2019, p. 64).
Our findings may be somewhat limited by the nature of the case study approach.
Although Google is one of the key players in global content moderation pipelines, other
similar actors also deserve further scrutiny. Likewise, our selection of the 25 countries with
the most take‐down requests might not fully capture subtle changes in many other countries
with different backgrounds and circumstances. It should also be noted that governments'
requests for content removal are just one form of content regulation that nation states can
enforce. Although such requests constitute an indispensable case, observable through
long‐term cumulative evidence of governments' content‐regulation activities in connection
with a giant platform, there are many other ways in which that power is exercised or even
abused, such as passing laws abridging the freedom of speech, controlling access to certain
messages, and blocking the Internet, among others. The implications of our findings may
not extrapolate to these other forms of regulation. Lastly, the reasons why content is
removed are sometimes unclear (e.g., ambiguous categorizations such as the co‐occurrence
of “obscenity/nudity” and “adult content”), which reflects Google's limited commitment to
transparency about how its classification system has changed over time. Nonetheless, the
results of this study provide deeper insight into how the presence of nation states has been
magnified in tackling problematic online content, gradually constituting a composite
regulatory structure of governments and markets. Above all, by illustrating how this structure
can follow different trajectories depending on the characteristics of the political system, our
results provide further theoretical insight into the regulation of content on the Internet.
ORCID
Yoonmo Sang
http://orcid.org/0000-0003-3439-0417
ENDNOTES
1. Google has made available the government content‐removal requests data set since 2009.
2. As a side note, the top 10 Google products with the most government content‐removal requests from 2009 to 2021 were YouTube (26.3%), Google Search (18.5%), Blogger (13.6%), Google+ (5.0%), Google Images (4.9%), Gmail (3.9%), Google Docs (3.3%), Google Play Apps (2.8%), Google Sites (2.2%), and Google Photos (2.2%). (Data collected in March 2022 from: https://transparencyreport.google.com/government-removals/overview).
3. Google has categorized 20 reasons for removal requests, including national security, defamation, copyright, regulated goods and services, and privacy and security, among others. For further details: https://support.google.com/transparencyreport/answer/7347744?hl=en#zippy=%2Chow-do-you-define-reasons-for-removal-requests.
4. See Herre (2021), The “Regimes of the World” data: How do researchers measure democracy?, Our World In Data. Available at: https://ourworldindata.org/regimes-of-the-world-data. Our World In Data is the flagship output of the Oxford Martin Program on Global Development and Global Change Data Lab.
REFERENCES
Ananny, M., & Gillespie, T. (2016). Public platforms: Beyond the cycle of shocks and exceptions. IPP2016 The
Platform Society. https://blogs.oii.ox.ac.uk/ipp-conference/sites/ipp/files/documents/anannyGillespie-public
Platforms-oii-submittedSept8.pdf
Bowers, J., & Zittrain, J. (2020). Answering impossible questions: Content governance in an age of disinformation.
Harvard Kennedy School Misinformation Review, 1(1), 1–8. https://doi.org/10.37016/mr-2020-005
Brown, E. (2022). The top reasons countries ask Google to remove content. Zdnet.com. January 31. https://www.
zdnet.com/article/the-top-reasons-countries-ask-google-to-remove-content/
Calo, R., & Hartzog, W. (2021). Op‐Ed: Banning Trump from Twitter and Facebook isn't enough. Los Angeles
Times. https://www.latimes.com/opinion/story/2021-01-15/facebook-twitter-extremism-donald-trump-violence
Consultative Council of European Judges (CCJE). (2015, October 16). On the position of the judiciary and its
relation with the other powers of state in a modern democracy. Opinion no. 18, Council of Europe, London.
https://rm.coe.int/16807481a1
Fay, R. (2019). Digital platforms require a global governance framework. Centre for International Governance
Innovation. https://www.cigionline.org/articles/digital-platforms-require-global-governance-framework/
Flew, T. (2021). Regulating platforms. Polity Press.
Flew, T., Martin, F., & Suzor, N. (2019). Internet regulation as media policy: Rethinking the question of digital
communication platform governance. Journal of Digital Media & Policy, 10(1), 33–50. https://doi.org/10.1386/
jdmp.10.1.33_1
Flew, T., & Wilding, D. (2020). The turn to regulation in digital communication: The ACCC's digital platforms
inquiry and Australian media policy. Media, Culture & Society, 43(1), 48–65. https://doi.org/10.1177/
0163443720926044
Flew, T., Gillett, R., Martin, F., & Sunman, L. (2021). Return of the regulatory state: A stakeholder analysis of
Australia's digital platforms inquiry and online news policy. The Information Society, 37(2), 128–145.
Garbe, L., Selvik, L.‐M., & Lemaire, P. (2021). How African countries respond to fake news and hate speech.
Information, Communication & Society, 10(1). https://doi.org/10.1080/1369118X.2021.1994623
Gillespie, T., Aufderheide, P., Carmi, E., Gerrard, Y., Gorwa, R., Matamoros‐Fernández, A., Roberts, S. T.,
Sinnreich, A., & Myers West, S. (2020). Expanding the debate about content moderation: Scholarly research
agendas for the coming policy debates. Internet Policy Review, 9(4), 1–29. https://doi.org/10.14763/2020.
4.1512
Google. (n.d. a). Government requests to remove content. Retrieved August 20 2022, from https://
transparencyreport.google.com/government-removals/government-requests
Google. (n.d. b). Google transparency report. Retrieved October 23 2022, from https://transparencyreport.google.
com/about
Google. (2022). Government requests to remove content. [Data set]. Google. https://storage.googleapis.com/
transparencyreport/google-government-removals.zip
Gorwa, R. (2019). What is platform governance? Information, Communication & Society, 22(6), 854–871. https://
doi.org/10.1080/1369118X.2019.1573914
Haider, J., & Rödl, M. (2023). Google search and the creation of ignorance: The case of the climate crisis. Big Data
& Society. https://doi.org/10.1177/20539517231158997
Herre. (2021, December 2). The ‘Regimes of the World’ data: How do researchers measure democracy? Our World
In Data. https://ourworldindata.org/regimes-of-the-world-data#:~:text=Regimes%20of%20the%20World%
20distinguishes%20four%20types%20of,the%20government%20or%20the%20legislature%20through%
20multi-party%20elections
Hovyadinov, S. (2019). Toward a more meaningful transparency: Examining Twitter, Google, and Facebook's
transparency reporting and removal practices in Russia. https://doi.org/10.2139/ssrn.3535671
Lührmann, A., Tannenberg, M., & Lindberg, S. I. (2018). Regimes of the world (RoW): Opening new avenues for the
comparative study of political regimes. Politics and Governance, 6(1), 60–77.
Min, C., Shen, F., & Yu, W. (2021). Removing incivility from Google: What determines the number of government
content take‐down requests? Government Information Quarterly, 38(1), 101542. https://doi.org/10.1016/j.giq.
2020.101542
Napoli, P. (2019). Social media and the public interest: Media regulation in the disinformation age. Columbia
University Press.
Napoli, P., & Caplan, R. (2017). Why media companies insist they're not media companies, why they're wrong, and
why it matters. First Monday, 22(5). https://doi.org/10.5210/fm.v22i5.7051
Nocetti, J. (2015). Contest and conquest: Russia and global internet governance. International Affairs, 91(1),
111–130. https://doi.org/10.1111/1468-2346.12189
Organisation for Economic Co‐operation and Development (OECD). (2022, February 28). Constitutions in OECD
Countries: A comparative study. Background report in the context of Chile's constitutional process, OECD
iLibrary. https://doi.org/10.1787/ccb3ca1b-en
Picard, R. G., & Pickard, V. (2017). Essential principles for contemporary media and communications policy
making. Reuters Institute for the Study of Journalism.
Popiel, P. (2022). Regulating datafication and platformization: Policy silos and tradeoffs in international platform
inquiries. Policy & Internet, 14, 28–46. https://doi.org/10.1002/poi3.283
Popiel, P., & Sang, Y. (2021). Platforms' governance: Analyzing digital platforms' policy preferences. Global
Perspectives, 2(1), 1–13.
Roberts, S. T. (2019). Behind the screen. Yale University Press.
Schwarz, A. (2022). The hitchhiker's guide to web‐mediated text: Method handbook for quantification of online
linguistic data in a country‐specific context. Official research report, Linguistic Explorations of Societies (Work
Package 1). University of Gothenburg.
Smith, M., & Van Alstyne, M. (2021). It's time to update section 230. Harvard Business Review. https://hbr.org/
2021/08/its-time-to-update-section-230
Steiner, M., Magin, M., Stark, B., & Geiß, S. (2022). Seek and you shall find? A content analysis on the diversity of
five search engines' results on political queries. Information, Communication & Society, 25(2), 217–241.
https://doi.org/10.1080/1369118X.2020.1776367
van Dijck, J. (2020). Governing digital societies: Private platforms, public values. Computer Law & Security
Review, 36, 105377.
Vibert, F. (2007). The rise of the unelected: Democracy and the new separation of powers. Cambridge University
Press.
How to cite this article: Park, S., & Sang, Y. (2023). The changing role of nation
states in online content governance: A case of Google's handling of government
removal requests. Policy & Internet, 1–19. https://doi.org/10.1002/poi3.342