Abstract
Software application providers have always been required to perform load testing before launching new applications. This crucial test phase is expensive in human and hardware terms, and the solutions in common use would benefit from further development. In particular, designing an appropriate load profile to stress an application is difficult and must be done carefully to avoid skewed results. In addition, static testing platforms are exceedingly complex to set up. Cloud computing opens new opportunities to ease load testing. This paper describes a Benchmark-as-a-Service platform based on: (i) intelligent generation of traffic to the benchmarked application without inducing thrashing (avoiding predefined load profiles), and (ii) a virtualized, self-scalable load-injection system. The platform was evaluated on the reference JEE benchmark RUBiS, where it detected bottleneck tiers, and was found to reduce the cost of testing by 50% compared to commonly used solutions.
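The core idea of automatic saturation detection, ramping injected load until additional virtual users no longer improve throughput, can be illustrated with a minimal sketch. This is not the authors' algorithm; the function names, step size, and gain threshold below are all hypothetical, chosen only to show the stopping criterion that avoids pushing the benched application into thrashing.

```python
# Hypothetical sketch: increase the number of virtual users stepwise and
# stop when the marginal throughput gain falls below a threshold,
# i.e. the system under test is considered saturated.

def find_saturation(measure_throughput, step=10, gain_threshold=0.05,
                    max_users=1000):
    """Return (load level before saturation, throughput at that level)."""
    users = step
    prev = measure_throughput(users)
    while users + step <= max_users:
        users += step
        cur = measure_throughput(users)
        gain = (cur - prev) / prev if prev > 0 else 1.0
        if gain < gain_threshold:
            # Adding users no longer pays off: back off one step.
            return users - step, prev
        prev = cur
    return users, prev

# Toy throughput model: linear up to 200 users, then flat at 1000 req/s.
saturating = lambda u: min(u * 5.0, 1000.0)
level, tput = find_saturation(saturating)
```

In a real deployment, `measure_throughput` would drive the virtualized injectors against the benched application and report observed requests per second; the self-scaling part of the platform would add injector VMs as the target load grows.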
© 2013 IFIP International Federation for Information Processing
Tchana, A. et al. (2013). Self-scalable Benchmarking as a Service with Automatic Saturation Detection. In: Eyers, D., Schwan, K. (eds) Middleware 2013. Middleware 2013. Lecture Notes in Computer Science, vol 8275. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-45065-5_20
DOI: https://doi.org/10.1007/978-3-642-45065-5_20
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-45064-8
Online ISBN: 978-3-642-45065-5