Abstract
Emulation testbeds are increasingly used to promote repeatable experiments in distributed systems and networking research. In this paper we study how different design choices, e.g., the use of specific tools, affect the repeatability of experiments on an emulation testbed (in our case, one based on the Emulab software).
Our study is based on multiple experiments that are checked for stability and consistency, e.g., by repeating the same experiment and measuring the mean and standard deviation of our metrics. The results indicate that repeatability of quantitative results is possible, within a degree of expected statistical variation. The event scheduling mechanism of Emulab proves to be accurate down to sub-second granularity. On the other hand, we demonstrate that traffic generation tools differ significantly in how consistently they recreate a predefined traffic pattern, and therefore in how well they support experiment repeatability.
The main contribution of this study is experimental evidence that Emulab can serve as a platform for scientifically rigorous networking experiments. New users of Emulab can benefit from this study by understanding that Emulab's scheduling mechanism, its built-in packet generators, and Iperf can sufficiently support repeatable experiments, while TCPreplay cannot; in the latter case an alternative tool, i.e., TCPivo, should be used.
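The repeatability check described above (repeating the same experiment and comparing the mean and standard deviation of a metric across runs) can be sketched as follows. This is a minimal illustration, not the authors' actual analysis code; the function name and the sample throughput values are hypothetical.

```python
import statistics

def repeatability_summary(runs):
    """Summarize a metric measured across repeated runs of the same
    experiment: mean, sample standard deviation, and the coefficient
    of variation (stdev/mean) as a simple repeatability indicator."""
    mean = statistics.mean(runs)
    stdev = statistics.stdev(runs)  # sample standard deviation (n-1)
    return {"mean": mean, "stdev": stdev, "cv": stdev / mean}

# Hypothetical throughput measurements (Mbit/s) from five repetitions
# of the same traffic-generation experiment.
runs = [94.1, 93.8, 94.3, 94.0, 93.9]
print(repeatability_summary(runs))
```

A low coefficient of variation across repetitions would indicate that the tool under test recreates the traffic pattern consistently; a high one would flag the kind of inconsistency the paper reports for TCPreplay.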
References
Pawlikowski, K., Jeong, H.-D.J., Lee, J.-S.R.: On credibility of simulation studies of telecommunication networks. IEEE Communications Magazine 40, 132–139 (2002)
Benzel, T., Braden, R., Kim, D., Neuman, C., Joseph, A., Sklower, K., Ostrenga, R., Schwab, S.: Design, deployment, and use of the deter testbed. In: DETER: Proceedings of the DETER Community Workshop on Cyber Security Experimentation and Test on DETER Community Workshop on Cyber Security Experimentation and Test 2007, p. 1. USENIX Association, Berkeley (2007)
Neville, S.W., Li, K.F.: The rational for developing larger-scale 1000+ machine emulation-based research test beds. In: International Conference on Advanced Information Networking and Applications Workshops, pp. 1092–1099 (2009)
Emulab Bibliography, http://www.emulab.net/expubs.php/
White, B., Lepreau, J., Stoller, L., Ricci, R., Guruprasad, S., Newbold, M., Hibler, M., Barb, C., Joglekar, A.: An integrated experimental environment for distributed systems and networks. In: Proc. of the Fifth Symposium on Operating Systems Design and Implementation, pp. 255–270. USENIX Association, Boston (2002)
DETER. cyber-DEfense Technology Experimental Research laboratory Testbed, http://www.isi.edu/deter/
Mirkovic, J., Hussain, A., Fahmy, S., Reiher, P.L., Thomas, R.K.: Accurately measuring denial of service in simulation and testbed experiments. IEEE Trans. Dependable Sec. Comput. 6(2), 81–95 (2009)
Anderson, D.S., Hibler, M., Stoller, L., Stack, T., Lepreau, J.: Automatic online validation of network configuration in the emulab network testbed. In: ICAC 2006: Proceedings of the 2006 IEEE International Conference on Autonomic Computing, pp. 134–142. IEEE Computer Society, Washington, DC (2006)
Chertov, R., Fahmy, S., Shroff, N.B.: Fidelity of network simulation and emulation: A case study of TCP-targeted denial of service attacks. ACM Trans. Model. Comput. Simul. 19(1), 1–29 (2008)
Guglielmi, M., Fovino, I.N., Garcia, A.P., Siaterlis, C.: A preliminary study of a wireless process control network using emulation testbed. In: Proc. of the 2nd International Conference on Mobile Lightweight Wireless Systems. ICST, Barcelona (2010)
ISI, Network simulator ns-2, http://www.isi.edu/nsnam/ns/
Rizzo, L.: Dummynet: a simple approach to the evaluation of network protocols. SIGCOMM Comput. Commun. Rev. 27(1), 31–41 (1997)
Perez Garcia, A., Masera, M., Siaterlis, C.: Testing the fidelity of an emulab testbed. In: Proc. of the 2nd Workshop on Sharing Field Data and Experiment Measurements on Resilience of Distributed Computing Systems, Genova, Italy (June 2010)
Turner, A.: Tcpreplay tool, http://tcpreplay.synfin.net/trac/
Feng, W.-C., Goel, A., Bezzaz, A., Feng, W.-C., Walpole, J.: Tcpivo: a high-performance packet replay engine. In: MoMeTools 2003: Proceedings of the ACM SIGCOMM Workshop on Models, Methods and Tools for Reproducible Network Research, pp. 57–64. ACM, New York (2003)
NLANR/DAST, Iperf: The TCP/UDP bandwidth measurement tool, http://sourceforge.net/projects/iperf/
Cho, K.: WIDE-TRANSIT 150 Megabit Ethernet Trace 2008-03-18 (Anonymized) (collection), http://imdc.datcat.org/collection/1-05L8-9=WIDE-TRANSIT+150+Megabit+Ethernet+Trace+2008-03-18+%28Anonymized%29
Copyright information
© 2012 ICST Institute for Computer Science, Social Informatics and Telecommunications Engineering
Cite this paper
Perez-Garcia, A., Siaterlis, C., Masera, M. (2012). Designing Repeatable Experiments on an Emulab Testbed. In: Tomkos, I., Bouras, C.J., Ellinas, G., Demestichas, P., Sinha, P. (eds) Broadband Communications, Networks, and Systems. BROADNETS 2010. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, vol 66. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-30376-0_3
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-30375-3
Online ISBN: 978-3-642-30376-0