DOI: 10.1145/3652032.3659366
Short Paper
Open Access

Language-Based Deployment Optimization for Random Forests (Invited Paper)

Published: 20 June 2024

Abstract

The rising popularity of resource-efficient machine learning has made random forests and decision trees prominent models in recent years. Naturally, these models are tuned, optimized, and transformed to consume as few resources as possible. One subset of these strategies targets the model structure and model logic, and therefore induces a trade-off between resource efficiency and prediction performance. An orthogonal set of approaches targets hardware-specific optimizations, which can improve performance without changing the behavior of the model. Since such hardware-specific optimizations are usually hardware-dependent and inflexible in their realization, this paper envisions a more general application of such optimization strategies at the level of programming languages. We therefore first discuss a set of suitable optimization strategies in general and then envision their application in LLVM IR, a flexible and hardware-independent ecosystem.


Information

Published In

LCTES 2024: Proceedings of the 25th ACM SIGPLAN/SIGBED International Conference on Languages, Compilers, and Tools for Embedded Systems
June 2024, 182 pages
ISBN: 9798400706165
DOI: 10.1145/3652032
This work is licensed under a Creative Commons Attribution 4.0 International License.

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. LLVM IR
  2. Optimization
  3. Random Forests

Qualifiers

  • Short-paper

Funding Sources

  • German Research Foundation (DFG)
  • BMBF
  • state of North Rhine-Westphalia

Conference

LCTES '24

Acceptance Rates

Overall Acceptance Rate 116 of 438 submissions, 26%

Article Metrics

  • Total Citations: 0
  • Total Downloads: 69
  • Downloads (last 12 months): 69
  • Downloads (last 6 weeks): 29
Reflects downloads up to 12 Sep 2024.
