
Adopting Two Supervisors for Efficient Use of Large-Scale Remote Deep Neural Networks - RCR Report

Published: 23 November 2023

Abstract

This is the Replicated Computational Results (RCR) Report for our TOSEM paper “Adopting Two Supervisors for Efficient Use of Large-Scale Remote Deep Neural Networks”, in which we propose a novel client-server architecture that leverages the high accuracy of huge neural networks running on remote servers while reducing the monetary and latency costs typically incurred when using such models. As part of this RCR, we provide a replication package that allows the full replication of all our results and is specifically designed to facilitate reuse.
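The architecture described above can be illustrated with a minimal sketch: a cheap on-device model answers when its supervisor is confident enough, otherwise the input is forwarded to the large remote model, whose own supervisor may still abstain on truly ambiguous inputs. All names, thresholds, and the max-softmax confidence signal here are hypothetical illustrations, not the paper's actual implementation.

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of raw scores."""
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def max_confidence(probs):
    """Simple supervisor signal: the top softmax probability."""
    return max(probs)

def predict(x, local_model, remote_model,
            local_threshold=0.9, remote_threshold=0.6):
    """Two-supervisor dispatch (hypothetical sketch).

    1. Run the cheap on-device model; if its supervisor is confident
       enough, answer locally (no network or server cost).
    2. Otherwise pay for the remote large model; if even its
       supervisor is unconfident, abstain (return None).
    """
    local_probs = softmax(local_model(x))
    if max_confidence(local_probs) >= local_threshold:
        return local_probs.index(max(local_probs)), "local"
    remote_probs = softmax(remote_model(x))
    if max_confidence(remote_probs) >= remote_threshold:
        return remote_probs.index(max(remote_probs)), "remote"
    return None, "abstain"

# Toy stand-in models returning raw logits over three classes.
def tiny_local_model(x):
    return [3.0, 0.0, 0.0] if x == "easy" else [0.4, 0.3, 0.3]

def huge_remote_model(x):
    return [0.1, 0.1, 0.1] if x == "ambiguous" else [0.0, 4.0, 0.0]
```

With these toy models, an "easy" input is resolved locally, a "hard" one is escalated to the remote model, and an "ambiguous" one triggers abstention at both supervisors.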


Cited By

  • (2023) Adopting Two Supervisors for Efficient Use of Large-Scale Remote Deep Neural Networks. ACM Transactions on Software Engineering and Methodology 33, 1 (2023), 1–29. https://doi.org/10.1145/3617593. Online publication date: 23 November 2023.



Published In

ACM Transactions on Software Engineering and Methodology  Volume 33, Issue 1
January 2024
933 pages
EISSN:1557-7392
DOI:10.1145/3613536
Editor: Mauro Pezzè

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 23 November 2023
Online AM: 26 August 2023
Accepted: 23 May 2023
Received: 24 March 2023
Published in TOSEM Volume 33, Issue 1


Author Tags

  1. Neural networks
  2. replication
  3. artifact

Qualifiers

  • Research-article

Funding Sources

  • H2020 project PRECRIME
  • ERC Advanced


Article Metrics

  • Downloads (Last 12 months)86
  • Downloads (Last 6 weeks)7
Reflects downloads up to 03 Oct 2024

