DOI: 10.1109/CONECT.2004.1375193
Article

Performance evaluation of InfiniBand with PCI Express

Published: 05 August 2004
Abstract

We present an initial performance evaluation of InfiniBand HCAs (host channel adapters) from Mellanox with PCI Express interfaces. We compare the performance with HCAs using PCI-X interfaces. Our results show that InfiniBand HCAs with PCI Express can achieve significant performance benefits. Compared with HCAs using 64 bit/133 MHz PCI-X interfaces, they can achieve 20%-30% lower latency for small messages. The small message latency achieved with PCI Express is around 3.8 µs, compared with the 5.0 µs with PCI-X. For large messages, HCAs with PCI Express using a single port can deliver unidirectional bandwidth up to 968 MB/s and bidirectional bandwidth up to 1916 MB/s, which are, respectively, 1.24 and 2.02 times the peak bandwidths achieved by HCAs with PCI-X. When both the ports of the HCAs are activated, HCAs with PCI Express can deliver a peak unidirectional bandwidth of 1486 MB/s and aggregate bidirectional bandwidth up to 2729 MB/s, which are 1.93 and 2.88 times the peak bandwidths obtained using HCAs with PCI-X. PCI Express also improves performance at the MPI level. A latency of 4.6 µs with PCI Express is achieved for small messages, and for large messages, unidirectional bandwidth of 1497 MB/s and bidirectional bandwidth of 2724 MB/s are observed.
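The latency and bandwidth results above come from ping-pong style micro-benchmarks at the verbs and MPI levels. The sketch below is not the authors' benchmark code; it is a minimal MPI ping-pong in C that illustrates how such small-message latency and large-message unidirectional bandwidth numbers are typically measured. The message sizes, iteration count, and program name are illustrative assumptions.

/*
 * Minimal MPI ping-pong sketch (illustrative, not the paper's code).
 * Measures small-message latency as half the averaged round-trip time
 * and large-message unidirectional bandwidth as bytes moved over
 * elapsed time. Run with exactly two ranks, e.g.: mpirun -np 2 ./pingpong
 */
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(int argc, char **argv)
{
    int rank, iters = 1000;
    int small = 4, large = 1 << 20;          /* 4 B and 1 MiB messages (assumed sizes) */
    char *buf = malloc((size_t)large);
    memset(buf, 0, (size_t)large);

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* Small-message latency: half of the averaged round-trip time. */
    MPI_Barrier(MPI_COMM_WORLD);
    double t0 = MPI_Wtime();
    for (int i = 0; i < iters; i++) {
        if (rank == 0) {
            MPI_Send(buf, small, MPI_CHAR, 1, 0, MPI_COMM_WORLD);
            MPI_Recv(buf, small, MPI_CHAR, 1, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        } else {
            MPI_Recv(buf, small, MPI_CHAR, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
            MPI_Send(buf, small, MPI_CHAR, 0, 0, MPI_COMM_WORLD);
        }
    }
    double lat_us = (MPI_Wtime() - t0) / iters / 2.0 * 1e6;

    /* Large-message unidirectional bandwidth: rank 0 streams data to rank 1,
     * which acknowledges completion so the timer covers the full transfer. */
    MPI_Barrier(MPI_COMM_WORLD);
    t0 = MPI_Wtime();
    if (rank == 0) {
        for (int i = 0; i < iters; i++)
            MPI_Send(buf, large, MPI_CHAR, 1, 1, MPI_COMM_WORLD);
        MPI_Recv(buf, 1, MPI_CHAR, 1, 2, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
    } else {
        for (int i = 0; i < iters; i++)
            MPI_Recv(buf, large, MPI_CHAR, 0, 1, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        MPI_Send(buf, 1, MPI_CHAR, 0, 2, MPI_COMM_WORLD);
    }
    double bw_mbs = (double)large * iters / (MPI_Wtime() - t0) / 1e6;

    if (rank == 0)
        printf("latency: %.2f us, bandwidth: %.1f MB/s\n", lat_us, bw_mbs);

    free(buf);
    MPI_Finalize();
    return 0;
}

Bidirectional bandwidth is usually measured the same way with both ranks posting transfers concurrently (typically with non-blocking MPI_Isend/MPI_Irecv); absolute numbers depend on the HCA, the host PCI interface (PCI-X vs. PCI Express), and the MPI implementation.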

Cited By

• (2018) MQsim. Proceedings of the 16th USENIX Conference on File and Storage Technologies, pp. 49-65. DOI: 10.5555/3189759.3189765. Online publication date: 12-Feb-2018.
• (2018) Understanding PCIe performance for end host networking. Proceedings of the 2018 Conference of the ACM Special Interest Group on Data Communication, pp. 327-341. DOI: 10.1145/3230543.3230560. Online publication date: 7-Aug-2018.
• (2009) Motivating future interconnects. Proceedings of the 5th ACM/IEEE Symposium on Architectures for Networking and Communications Systems, pp. 94-103. DOI: 10.1145/1882486.1882513. Online publication date: 19-Oct-2009.


      Published In

HOTI '04: Proceedings of the 12th Annual IEEE Symposium on High Performance Interconnects
August 2004
96 pages
ISBN: 0780386868

      Publisher

      IEEE Computer Society

      United States
