DOI: 10.1145/1168149.1168156

Methods for the evaluation of an interactive InfoVis tool supporting exploratory reasoning processes

Published: 23 May 2006

Abstract

Developing Information Visualization (InfoVis) techniques for complex knowledge domains makes it necessary to apply alternative evaluation methods. In the evaluation of Gravi++ we combined several methods and studied different user groups. We developed a reporting system that yields data about the insights subjects gained during exploration, providing rich information about their reasoning processes. Log files proved valuable for time-dependent analysis of cognitive strategies, and focus groups offered a complementary view of the process of gaining insights. We expect that our experiences with these methods can also be applied in similar evaluation studies of InfoVis techniques for complex data.



Published In

BELIV '06: Proceedings of the 2006 AVI workshop on BEyond time and errors: novel evaluation methods for information visualization
May 2006
89 pages
ISBN: 1595935622
DOI: 10.1145/1168149

Publisher

Association for Computing Machinery, New York, NY, United States



Conference

AVI06

Acceptance Rates

Overall Acceptance Rate 45 of 64 submissions, 70%


Cited By

  • (2015) Users impressions about visualization techniques in social networks context. Proceedings of the 14th Brazilian Symposium on Human Factors in Computing Systems, 10.1145/3148456.3148515, pp. 1-4. Online publication date: 3-Nov-2015.
  • (2014) Towards analyzing eye tracking data for evaluating interactive visualization systems. Proceedings of the Fifth Workshop on Beyond Time and Errors: Novel Evaluation Methods for Visualization, 10.1145/2669557.2669569, pp. 70-77. Online publication date: 10-Nov-2014.
  • (2014) Understanding implicit and explicit interface tools to perform visual analytics tasks. Proceedings of the 2014 IEEE 15th International Conference on Information Reuse and Integration (IEEE IRI 2014), 10.1109/IRI.2014.7051956, pp. 687-694. Online publication date: Aug-2014.
  • (2013) A longitudinal study of HotMap web search. Online Information Review, 10.1108/OIR-09-2011-0153, 37(2):252-267. Online publication date: 12-Apr-2013.
  • (2012) Is visualization usable for displaying web search results in an exploratory search context? Proceedings of the 2012 international conference on Information Retrieval Meets Information Visualization, 10.1007/978-3-642-36415-0_11, pp. 167-176. Online publication date: 23-Jan-2012.
  • (2009) User Evaluation Methods for Visual Web Search Interfaces. Proceedings of the 2009 13th International Conference Information Visualisation, 10.1109/IV.2009.21, pp. 139-145. Online publication date: 15-Jul-2009.
  • (2008) Distributed usability evaluation of the Pennsylvania Cancer Atlas. International Journal of Health Geographics, 10.1186/1476-072X-7-36, 7(1). Online publication date: 11-Jul-2008.
  • (2008) Understanding and characterizing insights. Proceedings of the 2008 Workshop on BEyond time and errors: novel evaLuation methods for Information Visualization, 10.1145/1377966.1377971, pp. 1-6. Online publication date: 5-Apr-2008.
  • (2008) Evaluating the relationship between user interaction and financial visual analysis. 2008 IEEE Symposium on Visual Analytics Science and Technology, 10.1109/VAST.2008.4677360, pp. 83-90. Online publication date: Oct-2008.
  • (2007) Mixing evaluation methods for assessing the utility of an interactive InfoVis technique. Proceedings of the 12th international conference on Human-computer interaction: interaction design and usability, 10.5555/1772490.1772559, pp. 604-613. Online publication date: 22-Jul-2007.
