Creating Repeatable Computer Science and Networking Experiments on Shared, Public Testbeds

Published: 20 January 2015

Abstract

There are many compelling reasons to use a shared, public testbed such as GENI, Emulab, or PlanetLab to conduct experiments in computer science and networking. These testbeds support creating experiments with a large and diverse set of resources. Moreover, these testbeds are built to inherently support the repeatability of experiments, as required for scientifically sound research. Finally, the artifacts a researcher needs to repeat their own experiment can be shared so that others can readily repeat the experiment in the same environment.
However, using a shared, public testbed differs from conducting experiments on resources owned by the experimenter or by someone the experimenter knows. Experiments on shared, public testbeds are more likely to use large topologies, rely on scarce resources, and need to be tolerant of outages and maintenance in the testbed. In addition, experimenters may not have access to low-level debugging information.
This paper describes a methodology that helps new experimenters write and deploy repeatable, sharable experiments that address these challenges by: having a clear plan; automating the execution and analysis of an experiment by following best practices from software engineering and system administration; and building scalable experiments. The paper also describes a case study run on the GENI testbed that illustrates this methodology.
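To make the automation theme of the abstract concrete, the sketch below shows the general pattern only, not the paper's actual tooling: drive each reserved testbed node over SSH from a version-controlled script, repeat the experiment a fixed number of times, and save the raw output for separate analysis. The hostnames, the on-node script path, and the trial count are hypothetical placeholders.

```python
#!/usr/bin/env python3
"""Minimal sketch: automate repeated runs of a testbed experiment over SSH.

Assumptions (hypothetical, not taken from the paper): the nodes are already
reserved, reachable with an SSH key, and carry an experiment script at
/local/run_trial.sh.
"""
import subprocess
from pathlib import Path

NODES = ["node0.example-testbed.net", "node1.example-testbed.net"]  # hypothetical hostnames
TRIALS = 5                         # repeat the experiment to expose run-to-run variability
RESULTS = Path("results")          # raw output is kept for later, separate analysis


def ssh(node: str, command: str) -> str:
    """Run a command on a node over SSH and return its stdout."""
    out = subprocess.run(
        ["ssh", "-o", "BatchMode=yes", node, command],
        capture_output=True, text=True, check=True,
    )
    return out.stdout


def main() -> None:
    RESULTS.mkdir(exist_ok=True)
    for trial in range(TRIALS):
        for node in NODES:
            # The same scripted steps run on every node in every trial, so the
            # experiment can be repeated exactly on a fresh reservation.
            output = ssh(node, "bash /local/run_trial.sh")
            (RESULTS / f"trial{trial}_{node}.log").write_text(output)


if __name__ == "__main__":
    main()
```

Keeping the orchestration in a script under version control, rather than in an interactive shell session, is what makes a run repeatable: the script, together with the topology description, is the artifact that can be shared so others can rerun the experiment in the same environment.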




Published In

ACM SIGOPS Operating Systems Review, Volume 49, Issue 1
Special Issue on Repeatability and Sharing of Experimental Artifacts
January 2015
155 pages
ISSN: 0163-5980
DOI: 10.1145/2723872

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 20 January 2015
Published in SIGOPS Volume 49, Issue 1


Author Tags

  1. GENI
  2. experiment methodology
  3. networking testbeds
  4. repeatable experiments
  5. system administration

Qualifiers

  • Research-article


Cited By

  • (2023) A survey on network simulators, emulators, and testbeds used for research and education. Computer Networks, 237:110054. DOI: 10.1016/j.comnet.2023.110054. Online publication date: Dec-2023.
  • (2020) Impact of etcd deployment on Kubernetes, Istio, and application performance. Software: Practice and Experience, 50(10):1986-2007. DOI: 10.1002/spe.2885. Online publication date: 7-Aug-2020.
  • (2018) Reproducibility in Scientific Computing. ACM Computing Surveys, 51(3):1-36. DOI: 10.1145/3186266. Online publication date: 16-Jul-2018.
  • (2018) Computer-Aided Reproducibility. 2018 International Conference on Computing, Networking and Communications (ICNC), pages 127-133. DOI: 10.1109/ICCNC.2018.8390274. Online publication date: Mar-2018.
  • (2018) Semantic driven code generation for networking testbed experimentation. Enterprise Information Systems, 12(8-9):1083-1099. DOI: 10.1080/17517575.2018.1509135. Online publication date: 28-Aug-2018.
  • (2017) Taking the Edge off with Espresso. Proceedings of the Conference of the ACM Special Interest Group on Data Communication, pages 432-445. DOI: 10.1145/3098822.3098854. Online publication date: 7-Aug-2017.
  • (2017) Learning Reproducibility with a Yearly Networking Contest. Proceedings of the Reproducibility Workshop, pages 9-13. DOI: 10.1145/3097766.3097769. Online publication date: 11-Aug-2017.
  • (2017) Improving the performance and reproducibility of experiments on large-scale testbeds with k-cores. Computer Communications, 110(C):35-47. DOI: 10.1016/j.comcom.2017.05.016. Online publication date: 15-Sep-2017.
  • (2016) The Experimenter's View of GENI. The GENI Book, pages 349-379. DOI: 10.1007/978-3-319-33769-2_15. Online publication date: 1-Sep-2016.
  • Dynamic virtual network restoration with optimal standby virtual router selection. NOMS 2016 - 2016 IEEE/IFIP Network Operations and Management Symposium, pages 973-978. DOI: 10.1109/NOMS.2016.7502935.
