DOI: 10.1145/2786805.2786865

Hidden truths in dead software paths

Published: 30 August 2015

Abstract

Many approaches and techniques for statically finding issues in source code have been developed. A core property of these approaches is that each is usually targeted at one very specific kind of issue, and the effort to develop such an analysis is significant; this strictly limits the number of kinds of issues that can be detected. In this paper, we discuss a generic approach based on the detection of infeasible paths in code that can discover a wide range of code smells, from useless code that hinders comprehension to real bugs. Code issues are identified by computing the difference between the control-flow graph that contains all technically possible edges and the corresponding graph recorded while performing a more precise analysis using abstract interpretation. We have evaluated the approach using the Java Development Kit as well as the Qualitas Corpus (a curated collection of over 100 Java applications) and were able to find thousands of issues across a wide range of categories.
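The graph-difference idea from the abstract can be illustrated with a minimal sketch. This is not the paper's OPAL-based implementation: the CFG encoding and the `branch_eval` oracle (standing in for the abstract-interpretation step that decides some branch conditions statically) are invented for the example.

```python
def infeasible_edges(all_edges, entry, branch_eval):
    """Return CFG edges that a more precise analysis never traverses.

    all_edges:   dict mapping node -> list of (successor, condition-or-None);
                 this is the graph of all technically possible edges.
    branch_eval: oracle that returns True/False for a condition it can
                 decide statically, or None if it cannot (stands in for
                 the abstract-interpretation analysis).
    """
    reached, taken = set(), set()
    worklist = [entry]
    while worklist:
        node = worklist.pop()
        if node in reached:
            continue
        reached.add(node)
        for succ, cond in all_edges.get(node, []):
            verdict = None if cond is None else branch_eval(cond)
            if verdict is False:
                continue  # edge provably never taken under the analysis
            taken.add((node, succ))
            worklist.append(succ)
    # Difference between all technically possible edges and the edges
    # actually recorded during the (more precise) traversal.
    every = {(n, s) for n, outs in all_edges.items() for s, _ in outs}
    return every - taken


# Toy CFG: if the analysis knows x == 5, the else-branch is dead code.
cfg = {
    "entry": [("then", "x > 0"), ("else", "x <= 0")],
    "then": [("exit", None)],
    "else": [("exit", None)],
}
oracle = {"x > 0": True, "x <= 0": False}.get
dead = infeasible_edges(cfg, "entry", oracle)
# dead == {("entry", "else"), ("else", "exit")}
```

In the real analysis the "oracle" is the abstract-interpretation pass itself, and surviving dead edges are post-processed into issue reports; the sketch only shows the set-difference step.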




Published In

ESEC/FSE 2015: Proceedings of the 2015 10th Joint Meeting on Foundations of Software Engineering
August 2015
1068 pages
ISBN:9781450336758
DOI:10.1145/2786805
Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. Abstract Interpretation
  2. Finding Bugs
  3. Infeasible Paths
  4. Java
  5. Scalable Analysis

Qualifiers

  • Research-article

Conference

ESEC/FSE'15

Acceptance Rates

Overall acceptance rate: 112 of 543 submissions (21%)

Article Metrics

  • Downloads (last 12 months): 13
  • Downloads (last 6 weeks): 2
Reflects downloads up to 27 Jan 2025

Cited By

  • (2021) Empirical Investigation of Code Quality Rule Violations in HPC Applications. Proceedings of the 25th International Conference on Evaluation and Assessment in Software Engineering, pages 402–411. DOI: 10.1145/3463274.3463787. Online publication date: 21-Jun-2021.
  • (2021) TaintBench: Automatic real-world malware benchmarking of Android taint analyses. Empirical Software Engineering 27(1). DOI: 10.1007/s10664-021-10013-5. Online publication date: 29-Oct-2021.
  • (2020) TACAI: an intermediate representation based on abstract interpretation. Proceedings of the 9th ACM SIGPLAN International Workshop on the State Of the Art in Program Analysis, pages 2–7. DOI: 10.1145/3394451.3397204. Online publication date: 15-Jun-2020.
  • (2018) Systematic evaluation of the unsoundness of call graph construction algorithms for Java. Companion Proceedings for the ISSTA/ECOOP 2018 Workshops, pages 107–112. DOI: 10.1145/3236454.3236503. Online publication date: 16-Jul-2018.
  • (2018) Addressing problems with replicability and validity of repository mining studies through a smart data platform. Empirical Software Engineering 23(2), pages 1036–1083. DOI: 10.1007/s10664-017-9537-x. Online publication date: 1-Apr-2018.
  • (2017) Hermes: assessment and creation of effective test corpora. Proceedings of the 6th ACM SIGPLAN International Workshop on State Of the Art in Program Analysis, pages 43–48. DOI: 10.1145/3088515.3088523. Online publication date: 18-Jun-2017.
  • (2017) Automated and Scalable Mutation Testing. 2017 IEEE International Conference on Software Testing, Verification and Validation (ICST), pages 559–560. DOI: 10.1109/ICST.2017.74. Online publication date: Mar-2017.
  • (2016) Call graph construction for Java libraries. Proceedings of the 2016 24th ACM SIGSOFT International Symposium on Foundations of Software Engineering, pages 474–486. DOI: 10.1145/2950290.2950312. Online publication date: 1-Nov-2016.
  • (2016) Toward an automated benchmark management system. Proceedings of the 5th ACM SIGPLAN International Workshop on State Of the Art in Program Analysis, pages 13–17. DOI: 10.1145/2931021.2931023. Online publication date: 14-Jun-2016.
