DOI: 10.1145/3183440.3183453
Short paper

Assisted discovery of software vulnerabilities

Published: 27 May 2018

Abstract

As more aspects of our daily lives rely on technology, the software that enables the technology must be secure. Developers rely on practices such as threat modeling, static and dynamic analyses, code review, and fuzz and penetration testing to engineer secure software. These practices, while effective at identifying vulnerabilities in software, are limited in their ability to describe the potential reasons for the existence of vulnerabilities. To overcome this limitation, researchers have proposed empirically validated metrics to identify factors that may have led to the introduction of vulnerabilities in the past. Developers must be made aware of these factors so that they can proactively consider the security implications of each line of code that they contribute. The goal of our research is to assist developers in engineering secure software by providing a technique that generates scientific, interpretable, and actionable feedback on security as the software evolves. In this paper, we provide an overview of our proposed approach to accomplish this research goal through a series of three research studies in which we (1) systematize the knowledge on vulnerability discovery metrics, (2) leverage the metrics to generate feedback on security, and (3) implement a framework for providing automatically generated feedback on security using code reviews as a medium.
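The vulnerability discovery metrics the abstract refers to include code-level measures such as complexity. As a minimal, hypothetical illustration of how such a metric might be computed (this is not the paper's implementation; the function name and the decision-node set are assumptions for the sketch), the following approximates cyclomatic complexity for Python functions by counting decision points in the AST:

```python
import ast

# Node types treated as decision points (branches) for this sketch.
_DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                   ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(source: str) -> dict:
    """Map each top-level-visible function name to an approximate
    cyclomatic complexity: 1 plus the number of decision points."""
    tree = ast.parse(source)
    scores = {}
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            decisions = sum(isinstance(child, _DECISION_NODES)
                            for child in ast.walk(node))
            scores[node.name] = 1 + decisions
    return scores

example = '''
def check(password):
    if len(password) < 8:
        return False
    for ch in password:
        if ch.isspace():
            return False
    return True
'''
print(cyclomatic_complexity(example))  # → {'check': 4}
```

In the research direction described here, a score like this would be one signal among many (churn, coupling, developer activity) that feedback tooling could surface to a developer during code review.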


Cited By

  • (2023) Input Validation Vulnerabilities in Web Applications: Systematic Review, Classification, and Analysis of the Current State-of-the-Art. IEEE Access 11, 40128-40161. DOI: 10.1109/ACCESS.2023.3266385
  • (2021) Characterizing and Understanding Software Developer Networks in Security Development. 2021 IEEE 32nd International Symposium on Software Reliability Engineering (ISSRE), 534-545. DOI: 10.1109/ISSRE52982.2021.00061
  • (2019) Data-driven insights from vulnerability discovery metrics. Proceedings of the Joint 4th International Workshop on Rapid Continuous Software Engineering and 1st International Workshop on Data-Driven Decisions, Experimentation and Evolution, 1-7. DOI: 10.1109/RCoSE/DDrEE.2019.00008


Published In

ICSE '18: Proceedings of the 40th International Conference on Software Engineering: Companion Proceedings
May 2018
231 pages
ISBN: 9781450356633
DOI: 10.1145/3183440
  • Conference Chair: Michel Chaudron
  • General Chair: Ivica Crnkovic
  • Program Chairs: Marsha Chechik, Mark Harman

Publisher

Association for Computing Machinery, New York, NY, United States

Acceptance Rates

Overall acceptance rate: 276 of 1,856 submissions (15%)

