Abstract
In the design of a visualization system for exploratory data analysis, a designer faces several issues: (i) recognizing the causes of the excessive latency experienced by end users, who quickly disengage from the exploration when the response time exceeds a desired threshold (e.g., 500 ms); (ii) discovering portions of the visualization system that are poorly explored or may not work as intended; (iii) the lack of precise feedback from end users who, struggling with excessive latency, lose interest in the exploration and report high-level feedback that is too broad and generic for the designer to translate into actionable design changes. To address these issues and better guide visualization system designers, we contribute a general framework to model and assess user interactions in big data visualization systems. It models the interaction space of the visualization system through augmented statecharts, which label interactions with their latency thresholds. The framework is implemented in InterView (the name reflects the collaboration between visualization designers and end users), a system composed of two software components: one that automatically generates the interaction space of a visualization system as a statechart, and one that replays user traces, reproducing each interaction an end user performed in the interaction log. In this paper, we demonstrate the capabilities of InterView by applying it to Falcon, a well-known crossfilter interface, guiding visualization system designers in discovering the root causes of excessive latency together with a complete understanding of the interaction space of their system. In this way, designers can identify the problems of their visualization system with higher granularity and precision, gaining more context for the feedback received from end users.
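The two ideas at the core of the abstract, a statechart whose transitions are labelled with latency thresholds and a replay component that checks a logged trace against those labels, can be sketched as follows. This is a minimal illustration under assumed names (`Statechart`, `Transition`, `replay`, the example states and thresholds); it is not InterView's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class Transition:
    target: str          # state reached after the interaction
    threshold_ms: float  # latency threshold labelled on this transition

@dataclass
class Statechart:
    # (current state, interaction name) -> Transition
    transitions: dict = field(default_factory=dict)

    def add(self, state, interaction, target, threshold_ms):
        self.transitions[(state, interaction)] = Transition(target, threshold_ms)

    def replay(self, start, log):
        """Replay an interaction log of (interaction, observed latency in ms)
        pairs; flag entries whose latency exceeds the transition's threshold."""
        state, violations = start, []
        for interaction, latency_ms in log:
            t = self.transitions[(state, interaction)]
            if latency_ms > t.threshold_ms:
                violations.append((state, interaction, latency_ms))
            state = t.target
        return state, violations

# Hypothetical crossfilter-style interaction space: brushing a histogram
# must answer within 500 ms, resetting the filter within 100 ms.
chart = Statechart()
chart.add("overview", "brush", "filtered", threshold_ms=500)
chart.add("filtered", "reset", "overview", threshold_ms=100)

final_state, slow = chart.replay("overview", [("brush", 620), ("reset", 40)])
# slow now holds the brush interaction, whose 620 ms latency broke its 500 ms budget
```

Replaying a real user trace this way points the designer to the exact state and interaction where the latency budget was violated, rather than leaving only the user's generic complaint.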
Acknowledgements
This project was supported by the MUR PRIN 2022 Project No. 202248FWFS "Discount quality for responsible data science: Human-in-the-Loop for quality data" within the NextGenerationEU Programme - M4C2.1.1.
Copyright information
© 2024 IFIP International Federation for Information Processing
Cite this paper
Filosa, M., Plexousaki, A., Benvenuti, D., Catarci, T., Angelini, M. (2024). InterView: A System to Support Interaction-Driven Visualization Systems Design. In: Lárusdóttir, M.K., Naqvi, B., Bernhaupt, R., Ardito, C., Sauer, S. (eds) Human-Centered Software Engineering. HCSE 2024. Lecture Notes in Computer Science, vol 14793. Springer, Cham. https://doi.org/10.1007/978-3-031-64576-1_23
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-64575-4
Online ISBN: 978-3-031-64576-1