DOI: 10.1145/3427921.3450255

ConfProf: White-Box Performance Profiling of Configuration Options

Published: 09 April 2021

Abstract

Modern software systems are highly customizable through configuration options. The sheer size of the configuration space makes it challenging to understand the performance influence of individual configuration options and their interactions under a specific usage scenario. Software with poor performance may lead to low system throughput and long response times. This paper presents ConfProf, a white-box performance profiling technique with a focus on configuration options. ConfProf helps developers understand how configuration options and their interactions influence the performance of a software system. The approach combines dynamic program analysis, machine learning, and feedback-directed configuration sampling to profile the program execution and analyze the performance influence of configuration options. Compared to existing approaches, ConfProf uses a white-box approach combined with machine learning to rank performance-influencing configuration options from execution traces. We evaluate the approach with 13 scenarios of four real-world, highly configurable software systems. The results show that ConfProf ranks performance-influencing configuration options with high accuracy and outperforms a state-of-the-art technique.
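The abstract describes sampling configurations and using machine learning to rank options by their performance influence. As a minimal, hypothetical sketch of that ranking idea only (not ConfProf's actual white-box algorithm; the option names and cost model below are invented for illustration), one can sample random configurations, measure each run, and score every option by how strongly enabling it shifts the mean runtime:

```python
import random

# Hypothetical illustration: rank configuration options by the absolute
# shift in mean measured runtime between runs where the option is on
# versus off. All names and coefficients are invented for this sketch.

OPTIONS = ["cache", "compress", "log_debug", "mmap"]

def measure(config, rng):
    """Stand-in for a profiled run of the configured system."""
    cost = 10.0
    cost += 8.0 * config["compress"]   # dominant option in this toy model
    cost += 5.0 * config["log_debug"]  # second-strongest influence
    cost += 0.3 * config["cache"]      # minor influence
    return cost + rng.gauss(0.0, 0.1)  # measurement noise

def rank_options(samples=200, seed=0):
    rng = random.Random(seed)
    # Randomly sample binary configurations and "profile" each one.
    configs = [{o: rng.randint(0, 1) for o in OPTIONS} for _ in range(samples)]
    times = [measure(c, rng) for c in configs]
    influence = {}
    for o in OPTIONS:
        on = [t for c, t in zip(configs, times) if c[o] == 1]
        off = [t for c, t in zip(configs, times) if c[o] == 0]
        # Influence score: how much enabling the option moves mean runtime.
        influence[o] = abs(sum(on) / len(on) - sum(off) / len(off))
    return sorted(OPTIONS, key=influence.get, reverse=True)

print(rank_options())
```

In this toy model the ranking recovers "compress" and "log_debug" as the most influential options. A real profiler must additionally handle option interactions and non-binary options, which is where the paper's feedback-directed sampling and trace analysis come in.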


Cited By

• (2023) "Input sensitivity on the performance of configurable systems: an empirical study." Journal of Systems and Software, 201:C. DOI: 10.1016/j.jss.2023.111671. Online publication date: 1-Jul-2023.
• (2022) "On debugging the performance of configurable software systems." Proceedings of the 44th International Conference on Software Engineering, pages 1571-1583. DOI: 10.1145/3510003.3510043. Online publication date: 21-May-2022.
• (2022) "Automatic mapping of configuration options in software using static analysis." Journal of King Saud University - Computer and Information Sciences, 34(10):10044-10055. DOI: 10.1016/j.jksuci.2022.10.004. Online publication date: Nov-2022.


      Published In

      ICPE '21: Proceedings of the ACM/SPEC International Conference on Performance Engineering
      April 2021
      301 pages
      ISBN:9781450381949
      DOI:10.1145/3427921
      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

      Publisher

Association for Computing Machinery, New York, NY, United States


      Author Tags

      1. performance profiling
      2. software performance

      Qualifiers

      • Short-paper


      Conference

      ICPE '21

      Acceptance Rates

ICPE '21 Paper Acceptance Rate: 16 of 61 submissions, 26%
Overall Acceptance Rate: 252 of 851 submissions, 30%
