DOI: 10.1145/3665314.3670814
Research article | Open access

HeTraX: Energy Efficient 3D Heterogeneous Manycore Architecture for Transformer Acceleration

Published: 09 September 2024

Abstract

Transformers have revolutionized deep learning and generative modeling, enabling unprecedented advances in natural language processing tasks and beyond. However, designing hardware accelerators for transformer models is challenging due to the wide variety of computational kernels in the transformer architecture. Existing accelerators are either inadequate for accelerating end-to-end transformer models or suffer from notable thermal limitations. In this paper, we propose a three-dimensional heterogeneous architecture, referred to as HeTraX, specifically optimized to accelerate end-to-end transformer models. HeTraX employs hardware resources aligned with the computational kernels of transformers and optimizes both performance and energy. Experimental results show that HeTraX outperforms the existing state-of-the-art by up to 5.6x in speedup and improves EDP by 14.5x while ensuring thermal feasibility.
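For context on the reported figures: the energy-delay product (EDP) is conventionally the product of total energy consumed and execution time. The paper's exact formulation is not reproduced on this page, so the standard definition is assumed in the sketch below.

\[
\mathrm{EDP} = E_{\mathrm{total}} \times T_{\mathrm{exec}},
\qquad
\frac{\mathrm{EDP}_{\mathrm{baseline}}}{\mathrm{EDP}_{\mathrm{HeTraX}}} \approx 14.5
\]

If the 5.6x speedup and the 14.5x EDP gain refer to the same baseline and workload, together they imply roughly a 14.5/5.6 ≈ 2.6x reduction in energy alone; the paper itself should be consulted for the precise comparison points.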


Published In
    ISLPED '24: Proceedings of the 29th ACM/IEEE International Symposium on Low Power Electronics and Design
    August 2024
    384 pages
    ISBN:9798400706882
    DOI:10.1145/3665314
    This work is licensed under a Creative Commons Attribution International 4.0 License.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 09 September 2024


    Author Tags

    1. transformer
    2. heterogeneity
    3. accelerator
    4. thermal-aware
    5. PIM

    Qualifiers

    • Research-article

    Conference

    ISLPED '24

    Acceptance Rates

    Overall Acceptance Rate 398 of 1,159 submissions, 34%
