DOI: 10.1145/2414721.2414727

Comparative language fuzz testing: programming languages vs. fat fingers

Published: 21 October 2012

Abstract

We explore how programs written in ten popular programming languages are affected by small changes to their source code. This allows us to analyze the extent to which these languages allow simple errors to be detected at compile time or at run time. Our study is based on a diverse corpus of programs written in several programming languages, systematically perturbed using a mutation-based fuzz generator. The results we obtained show that languages with weak type systems are significantly likelier than languages that enforce strong typing to let fuzzed programs compile and run, and, in the end, produce erroneous results. More importantly, our study also demonstrates the potential of comparative language fuzz testing for evaluating programming language designs.
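The methodology described in the abstract, perturbing a program with a single "fat finger" edit and then checking whether the toolchain rejects it, it fails at run time, or it silently produces wrong output, is easy to prototype. The sketch below is a minimal illustration of that loop in Python; the helper names, the python3 target, and the program.py/mutant.py file names are assumptions made for the example, not the paper's actual fuzz generator or corpus.

```python
# Minimal sketch of a mutation-based source fuzz loop, in the spirit of the
# study described above. Everything here is illustrative: helper names, the
# python3 target, and the file names program.py / mutant.py are assumptions,
# not the authors' tooling.
import random
import string
import subprocess

def mutate_source(src: str, rng: random.Random) -> str:
    """Apply one 'fat finger' edit: delete, replace, or insert a single character."""
    if not src:
        return src
    i = rng.randrange(len(src))
    op = rng.choice(["delete", "replace", "insert"])
    c = rng.choice(string.ascii_letters + string.digits + string.punctuation + " ")
    if op == "delete":
        return src[:i] + src[i + 1:]
    if op == "replace":
        return src[:i] + c + src[i + 1:]
    return src[:i] + c + src[i:]

def run_candidate(path: str, timeout: float = 5.0):
    """Run a (possibly perturbed) program; None signals a hang."""
    try:
        return subprocess.run(["python3", path], capture_output=True,
                              text=True, timeout=timeout)
    except subprocess.TimeoutExpired:
        return None

def classify(result, expected_output: str) -> str:
    """Bucket the outcome: error caught, silently wrong output, or unchanged behaviour."""
    if result is None:
        return "timeout"
    if result.returncode != 0:
        return "error reported"   # syntax or run-time error was detected
    if result.stdout != expected_output:
        return "wrong output"     # ran to completion but produced erroneous results
    return "unchanged behaviour"

if __name__ == "__main__":
    rng = random.Random(0)
    original = open("program.py").read()              # assumed reference program
    expected = run_candidate("program.py").stdout     # its known-good output
    counts: dict[str, int] = {}
    for _ in range(100):                              # 100 single-character mutants
        with open("mutant.py", "w") as f:
            f.write(mutate_source(original, rng))
        outcome = classify(run_candidate("mutant.py"), expected)
        counts[outcome] = counts.get(outcome, 0) + 1
    print(counts)
```

For an interpreted language like Python, "compile time" detection collapses into the syntax errors reported when the file is loaded; for a compiled language, the same loop would invoke the compiler first and tally rejected mutants separately from run-time failures and silent wrong output.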




    Published In

    PLATEAU '12: Proceedings of the ACM 4th annual workshop on Evaluation and usability of programming languages and tools
    October 2012
    46 pages
    ISBN:9781450316316
    DOI:10.1145/2414721


    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Author Tags

    1. comparison
    2. fuzzing
    3. programming languages
    4. rosetta stone

    Qualifiers

    • Research-article

    Conference

    SPLASH '12

    Acceptance Rates

    Overall Acceptance Rate 5 of 8 submissions, 63%

