DOI: 10.1145/1409720.1409730

Representing unit test data for large scale software development

Published: 16 September 2008

Abstract

Large scale software projects rely on routine, automated testing to gauge progress towards their goals. The diversity and quantity of these tests grow as time and project scope increase, a consequence of both accumulated experience and an expanding audience. As the testing suites multiply and diversify, it becomes increasingly difficult to interpret testing results; if interpretation becomes too difficult, the results may be ignored altogether. Visualization has proven to be an effective tool for aiding the interpretation of large amounts of data. We have adapted visualization techniques based on small multiples to communicate the health of a software project across several levels of abstraction; we refer to the collective set of techniques as the SeeTest visualization schema. We applied this visualization technique to the Open MPI test results in order to assist developers in the software release cycle. Through the visualizations, developers found a variety of surprising mismatches between their data and their intuitions. This exploration did not involve collecting any data not already being collected, merely presenting it in a manner that better supported their needs. In this paper, we detail the development of the representation we used and give a more particular analysis of the insights gained by the Open MPI community. The techniques presented in this paper can be applied to other software projects.
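
This page does not carry the paper's implementation details, but the core idea of small multiples (a grid of identically structured miniature charts, one per slice of the data) can be sketched briefly. The following Python/matplotlib sketch is purely illustrative: the platform and suite names and the synthetic pass rates are assumptions for demonstration, not the SeeTest schema itself or the Open MPI project's MTT data.

    # Illustrative sketch only: a small-multiples grid of test pass rates,
    # one panel per (platform, suite) pair, with time on the x-axis.
    # All names and data below are hypothetical, not from the paper.
    import matplotlib.pyplot as plt
    import numpy as np

    platforms = ["linux-x86", "linux-ppc64", "macos"]   # grid rows (assumed)
    suites = ["collectives", "datatypes", "one-sided"]  # grid columns (assumed)
    days = np.arange(30)                                # 30 nightly runs

    rng = np.random.default_rng(0)  # synthetic pass rates stand in for real results
    fig, axes = plt.subplots(len(platforms), len(suites),
                             sharex=True, sharey=True, figsize=(9, 6))

    for i, platform in enumerate(platforms):
        for j, suite in enumerate(suites):
            # One small panel per (platform, suite) pair, drawn identically
            # so the eye can compare health across the whole grid at a glance.
            pass_rate = np.clip(0.9 + 0.1 * rng.standard_normal(days.size), 0, 1)
            ax = axes[i, j]
            ax.plot(days, pass_rate, linewidth=1)
            ax.set_ylim(0, 1.05)
            if i == 0:
                ax.set_title(suite, fontsize=9)
            if i == len(platforms) - 1:
                ax.set_xlabel("nightly run", fontsize=8)
            if j == 0:
                ax.set_ylabel(platform, fontsize=9)

    fig.suptitle("Pass rate by platform and test suite (synthetic data)")
    plt.tight_layout()
    plt.show()

Because every panel shares axes and visual encoding, an unhealthy platform or suite stands out as a visibly different cell; that at-a-glance comparability is the general property small-multiples displays rely on.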

Cited By

  • (2021) Adapting Behavior Driven Development (BDD) for large-scale software systems. Journal of Systems and Software, 110944. DOI: 10.1016/j.jss.2021.110944. Online publication date: March 2021.
  • (2016) Advances in the Characterization of Cognitive Support for Unit Testing: The Bug-Hunting Game and the Visualization Arsenal. 2016 IEEE International Symposium on Software Reliability Engineering Workshops (ISSREW), 213-220. DOI: 10.1109/ISSREW.2016.11. Online publication date: October 2016.

Reviews

Ponmurugarajan Thiyagarajan

The ability to track progress is critical to the success of any software project, and tracking becomes more difficult as programs grow large. Project tracking data is normally available in tabular formats, which are often not easy to interpret; adding visualization to the project data makes decision making simpler and quicker. The authors of this paper discuss the need for visualizing testing data, mainly for large-scale software development projects. The paper is a straightforward account of the background, and of the steps the authors and their team took, to build the collective set of visualization techniques called the SeeTest visualization schema and to integrate it with the message passing interface (MPI) testing tool (MTT). This was implemented for the Open MPI community. The MTT visualization extension addressed various deficiencies in the existing reporting tools and proved effective in comparison with them. The authors explain various issues faced during tool development, including the changes they made to address them, which is commendable. Online Computing Reviews Service

Published In

SoftVis '08: Proceedings of the 4th ACM symposium on Software visualization
September 2008
228 pages
ISBN: 9781605581125
DOI: 10.1145/1409720

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. MPI
  2. project management
  3. testing
  4. visualization

Qualifiers

  • Research-article

Conference

SoftVis '08

Acceptance Rates

Overall acceptance rate: 20 of 65 submissions (31%)

Article Metrics

  • Downloads (last 12 months): 9
  • Downloads (last 6 weeks): 0
Reflects downloads up to 30 Aug 2024
