Cloud server benchmarks for performance evaluation of new hardware architecture

H Wu, F Liu, RB Lee - arXiv preprint arXiv:1603.01352, 2016 - arxiv.org
Adding new hardware features to a cloud computing server requires testing both the functionality and the performance of the new hardware mechanisms. However, common cloud computing server workloads are not well represented by the SPEC integer and floating-point benchmarks and the PARSEC suite typically used by the computer architecture community. Existing cloud benchmark suites for scale-out or scale-up computing are not representative of the most common cloud usage, and they are very difficult to run on a cycle-accurate simulator that can accurately model new hardware, such as gem5. In this paper, we present PALMScloud, a suite of cloud computing benchmarks for performance evaluation of cloud servers that is ready to run on the gem5 cycle-accurate simulator. As a case study, we demonstrate how our cloud computing benchmarks are used to evaluate the cache performance of a new secure cache called Newcache. We hope that these cloud benchmarks, ready to run on a dual-machine gem5 simulator or on real machines, can be useful to other researchers interested in improving hardware micro-architecture and cloud server performance.