Ronald Deibert and Masashi Crete-Nishihata
Citizen Lab and Canada Centre for Global Security Studies, Munk School of Global
Affairs, University of Toronto
Introduction
In December 2008, researchers from the University of Bonn and RWTH Aachen
University presented a talk at the 25th Chaos Communication Congress. The
researchers revealed that they had the ability to take over command and control (C&C)
functions of the Storm botnet (a huge network of compromised computers), including
instructing compromised computers to download and execute code. Although
the Storm botnet was being used to engage in illegal activities worldwide, the
security group cautioned that intervening in ways that affect compromised computers
is prohibited in many jurisdictions. They ultimately decided to monitor and
document the botnet's activities, but not to interfere with them (Danchev, 2009).
In April 2011, the Federal Bureau of Investigation and the U.S. Justice Department
obtained a court order that allowed investigators to take over the
Coreflood botnet, which had used stolen user credentials to steal an estimated $100
million from victims. With the order permitting them to take over the C&C
servers, investigators sent commands directly to compromised machines, stopping the
bots from contacting the C&C servers and effectively disabling the botnet. Government
filings claimed that the commands sent would not cause any damage to
compromised machines and would not provide officials with access to any user data.
However, some questioned how investigators could be certain that this approach
would not damage sensitive equipment, while others questioned whether it would
open the floodgates to other requests to disrupt the computers of individuals
engaged in questionable activity for a wide range of reasons (Richmond, 2011;
Zetter, 2011).
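For readers less familiar with the mechanics involved, the sketch below illustrates the general shape of such an intervention: a server standing in for seized C&C infrastructure answers each bot check-in with a benign instruction to stop rather than with new tasking. The wire format, port, and command name here are assumptions made purely for illustration; they do not describe Coreflood's actual protocol or the commands investigators actually sent.

```python
# Hypothetical sinkhole server: answers every bot check-in with a benign
# "stop" command instead of new tasking. The wire format and command name
# are invented for illustration and do not reflect Coreflood's protocol.
import socketserver

STOP_COMMAND = b"STOP\n"  # assumed command understood by the hypothetical bot

class SinkholeHandler(socketserver.BaseRequestHandler):
    def handle(self):
        # Read and discard whatever beacon the bot sends when it checks in.
        self.request.recv(4096)
        # Reply with the neutralizing command and close the connection.
        self.request.sendall(STOP_COMMAND)
        print(f"answered check-in from {self.client_address[0]}")

if __name__ == "__main__":
    # Listen where the seized C&C domain now resolves (address is illustrative).
    with socketserver.TCPServer(("0.0.0.0", 8080), SinkholeHandler) as server:
        server.serve_forever()
```

Even a reply this simple executes on machines whose owners never consented to it, which is precisely where the ethical and legal questions raised by the Coreflood case begin.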
These two examples illustrate the growing number of complex ethical and legal
issues surrounding cyberspace research, particularly in the area of security. Our
research group, the Citizen Lab (http://www.citizenlab.org/), has been involved in
several cases of cyber security research in which ethical and legal issues like these
have confronted us directly.1 The cases include documenting major global cyber
espionage networks infecting thousands of computers in hundreds of countries
(Information Warfare Monitor, 2009) and recovering copies of hundreds of classified
documents that were exfiltrated from victims in national security establishments.
Research of this kind frequently yields data that may contain national security
secrets. With whom and how should the data be shared? How should the
data be stored? What about data that is owned by a government and classified as
“restricted” or “secret,” or proprietary information that is stolen from a business?
Researchers may ask themselves under what conditions sensitive or otherwise
confidential or private information should be published, or redacted in some manner,
questions that bear upon self-censorship and academic freedom. A decision
has to be rendered as to whether the publication of such information constitutes a
“public good,” and whether it needlessly endangers individuals or organizations, or
violates their rights to privacy. In some cases, such as the handling of personally
identifiable information, there are clear guidelines from other areas of research and
public law that should be employed in the domain of cyberspace research. In other
areas, such as whether to publish national security secrets or proprietary information,
there is little prior precedent from which to draw guidance. As several
recent high-profile leaks and hacking attacks attest, the area as a whole is highly
turbulent, and many sensitive and proprietary documents have been pushed into the
public domain.
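As one illustration of a guideline that can be carried over from established research practice, the sketch below shows a conventional way of pseudonymizing personally identifiable fields with a keyed hash before a dataset is shared or published. The field names, the key handling, and the choice of HMAC-SHA-256 are assumptions made for this example rather than practices prescribed in this article.

```python
# Minimal sketch of pseudonymizing personally identifiable fields before a
# dataset is shared. Field names and the keyed-hash scheme are illustrative.
import hashlib
import hmac

SECRET_KEY = b"replace-with-a-key-kept-separate-from-the-published-data"
SENSITIVE_FIELDS = {"email", "ip_address"}  # assumed field names for this sketch

def pseudonymize(value: str) -> str:
    """Replace an identifying value with a stable, non-reversible token."""
    digest = hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]

def redact_record(record: dict) -> dict:
    """Return a copy of a record with identifying fields tokenized."""
    return {key: pseudonymize(value) if key in SENSITIVE_FIELDS else value
            for key, value in record.items()}

if __name__ == "__main__":
    sample = {"email": "victim@example.org", "ip_address": "192.0.2.10", "country": "CA"}
    print(redact_record(sample))
```

A keyed hash preserves the ability to correlate records about the same victim within a report while keeping the raw identifier out of the public domain; the key itself must then be protected with the same care as the original data.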
Notifying hosting companies of compromised or malicious machines may help rid
the Internet of malicious activity, but it could also amount to a form of extrajudicial
vigilantism, whereby users are severed from the Internet without due process
because of actions taken by researchers in coordination with hosting companies.
Conclusion
The rapid and extensive growth of cyberspace has made the domain a global
environment that increasingly pervades all human activity. This expansion has
brought with it steadily growing digital footprints and information saturation. We
now live in the era of “big data.” As the deluge of information continues, so do efforts
to develop data analysis and fusion capabilities to monitor and understand the
properties and interactions of cyberspace. At the forefront of these efforts are
technology companies and government agencies that have clear interests and
extensive resources to advance analytical tools and techniques (Lazer et al., 2009).
The demand for such development has created a major commercial sector; estimates
put the global cyber security market at between USD 80 billion and 140 billion
annually (Seetharaman, 2010). As cyberspace continues to expand and malicious
activity increases, the questions outlined in this paper will only grow in number and
become more critical.
Although best practices for cyberspace research will evolve over time, there are
several steps that can be taken in the meantime to nurture good research
while guarding against unethical and even illegal choices.
Further engagement on all of the issues stated earlier, through interdisciplinary
workshops and case studies of real-world research projects, will help illuminate the
ethical and legal puzzles facing researchers in this area. Cyberspace researchers
should engage with ethics review committees, and research ethics should be introduced
into computer security education and conferences to help promote dialogue
and increased knowledge transfer between these communities.3
The Need for Explicit Research Rationales and the Use of Research Warrants
In lieu of a research ethics board review for specific projects, which may not be
appropriate in every circumstance of cyberspace research (for example, technical
research that does not involve human subjects), researchers should aim to set the
highest possible standards for every step of their research, including methods, data
handling, and notification and outreach. The best way to ensure that these stan-
dards are met is through careful, clear, and explicit documentation of research
methods and justification of the choices that are made along each step of a particu-
lar project. Doing so can build up reference points and a knowledge base for future
research. In the reports in which the Citizen Lab has faced vexing ethical and legal
questions, we have employed “research warrants,” written by the principal investi-
gator, that outline the nature and justification for all aspects of the research, which
are then incorporated into the text of the published reports. Researchers facing
similar uncertainties may want to employ the same “research warrant” model as a
way to build up case studies and lessons learned for future research.
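Citizen Lab research warrants are written as prose and folded into the text of published reports. Purely as an illustration, the sketch below renders the kinds of questions such a document might answer as a structured checklist, so that no section is silently omitted; the field names are assumptions about what a warrant could cover, not a fixed or official format.

```python
# Illustrative checklist for a "research warrant." The fields are assumptions
# about what such a document might cover, not a fixed or official format.
from __future__ import annotations
from dataclasses import dataclass, fields

@dataclass
class ResearchWarrant:
    project: str
    principal_investigator: str
    research_questions: str
    data_sources: str          # what is collected, and from whom or where
    collection_methods: str    # how it is collected, and under what authority
    data_handling: str         # storage, access control, retention, redaction
    notification_plan: str     # who will be told, when, and through which channels
    justification: str         # why the expected public good outweighs the risks

def missing_sections(warrant: ResearchWarrant) -> list[str]:
    """List any sections left blank, so that gaps are caught before publication."""
    return [f.name for f in fields(warrant) if not getattr(warrant, f.name).strip()]
```

Keeping the warrant in a structured form of this kind makes it straightforward to confirm, before a report is released, that every part of the research has actually been documented and justified.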
Research Advocacy
Research Autonomy
Acknowledgements
We are grateful for the input and assistance of Rafal Rohozinski, Adam Senft, Howard
Simkevitz, Nart Villeneuve, and our colleagues at the OpenNet Initiative and the Information
Warfare Monitor projects. We would like to especially thank all the participants of the CTRL X
Ethics workshop held on January 17–18, 2011 at the Munk School of Global Affairs (University
of Toronto) for engaging the issues with us and sharing their thoughts and feedback. Support
for this project was provided by the Canada Centre for Global Security Studies, the John D. and
Catherine T. MacArthur Foundation, and the SecDev Group (Ottawa).
Notes
1 Dr. Rafal Rohozinski, Senior Scholar at the Canada Centre for Global Security Studies, Munk School
of Global Affairs (University of Toronto), and CEO of the SecDev Group and Psiphon Inc., is a principal
investigator of the Information Warfare Monitor and the OpenNet Initiative. Dr. Rohozinski was
instrumental in all of these projects as well as the workshop from which this paper draws.
2 A synthesis and analytical report on the workshop is available at http://www.infowar-monitor.net/
ethicsreport
3 There has been some recent progress in this area. For example, the Symposium on Usable Privacy and
Security now requires that article submissions outline “how the authors addressed any ethical consid-
erations applicable to the research and user studies, such as passing an Institutional review” (see
http://cups.cs.cmu.edu/soups/2011/cfp.html). In other efforts, the U.S. Department of Homeland Secu-
rity has convened a multi-stakeholder working group to draft ethical guidelines for cyber security
research (see Kenneally et al., 2011).
References
Danchev, D. (2009, January 16). Legal concerns stop researchers from disrupting the Storm Worm botnet. ZDNet.
Retrieved from http://www.zdnet.com/blog/security/legal-concerns-stop-researchers-from-disrupting-the-
storm-worm-botnet
Deibert, R., & Rohozinski, R. (2011, March 28). The new cyber military-industrial complex. Globe and Mail.
Retrieved from http://www.theglobeandmail.com/news/opinions/opinion/the-new-cyber-military-
industrial-complex/article1957159
Dittrich, D., Bailey, M., & Dietrich, S. (2010). Building an active computer security ethics community. IEEE
Security and Privacy, 99, 88–93.
Garfinkel, S. L. (2008). IRBs and security research: Myths, facts and mission creep. In Proceedings of USENIX
Usability, Privacy, and Security 2008, April 14, San Francisco, CA. Retrieved from http://www.usenix.org/
events/upsec08/tech/full_papers/garfinkel/garfinkel.pdf
Information Warfare Monitor. (2009). Tracking GhostNet: Investigating a cyber espionage network. Retrieved from
http://infowar-monitor.net/ghostnet
Information Warfare Monitor. (2010). Shadows in the cloud: Investigating cyber espionage 2.0. Retrieved from
http://shadows-in-the-cloud.net
Kenneally, E., Stavrou, A., McHugh, J., & Christin, N. (2011). Moving forward, building an ethics community (panel
statements). In Proceedings of the 2nd Workshop on Ethics in Computer Security Research 2011. Retrieved
from http://www.caida.org/ . . . /moving_forward_building_ethics/moving_forward_building_ethics.pdf
Ku, V. (2004–2005). A critique of the Digital Millennium Copyright Act’s exemption on encryption research: Is
the exemption too narrow? Yale Journal of Law & Technology, 7, 466–490.
Lazer, D., Pentland, A., Adamic, L., Aral, S., Barabási, A., Brewer, D., et al. (2009). Computational social science.
Science, 323, 721–723.
Richmond, R. (2011, April 14). U.S. says it shut down password theft network. New York Times. Retrieved
from http://bits.blogs.nytimes.com/2011/04/14/u-s-says-it-shut-down-password-theft-network/?scp=4&
sq=rustock&st=cse
Seetharaman, D. (2010, September 10). Arms makers turn focus from bombs to bytes. Reuters. Retrieved from http:
//www.reuters.com/article/2010/09/10/us-aero-arms-summit-cybersecurity-idUSTRE6893EI20100910
Villeneuve, N. (2008). Breaching trust: An analysis of surveillance and security practices on China’s TOM-Skype platform.
Information Warfare Monitor. Retrieved from http://infowar-monitor.net/breachingtrust
Villeneuve, N. (2010). Koobface: Inside a crimeware network. Information Warfare Monitor. Retrieved from
http://infowar-monitor.net/koobface
Zetter, K. (2011, April 13). With court order, FBI hijacks “Coreflood” botnet, sends kill signal. Wired Threat Level.
Retrieved from http://www.wired.com/threatlevel/2011/04/coreflood/