Google Indoor Maps or Google Indoor No Maps? Usability Study of an Adapted Mobile Indoor Wayfinding Aid

Abstract
ARC is an adaptive indoor mobile wayfinding system, developed based on the results of a previous online survey. The ARC system links data from several sources to enable route guidance adapted to the environment: the type of route instruction is adapted to the location of the user. In this study, the usability of the system was tested in a smart building using a mobile eye tracker. Five eye tracking measures were analyzed and compared with the space syntax values of the decision points. The results confirm that video instructions can improve support at complex decision points, while symbols might not be supportive enough at these points in the indoor environment. These findings enhance our understanding of the relationship between the environment and complexity perception during route guidance, which is essential for supportive indoor route guidance.
1 Introduction
Numerous studies have investigated the influence of the environment on people’s indoor wayfinding behavior [1,2,3]. Moreover, the structure and layout of buildings have been identified as major contributing factors to the mental map that is formed during wayfinding [4]. However, technology in the built environment is evolving fast and little is known about the influence of new factors, such as positioning sensors, mobile indoor wayfinding aids and 3D-technology. There is a well-established relationship between the environment and explorative wayfinding behavior, but when people are guided by a system, their perception of the environment might differ. Understanding this perception during route guidance, and the elements that affect it, is crucial for the usability of the wayfinding systems of the future. Therefore, this study aims to link complexity perception during route guidance with building architecture. To this end, an Adaptive Route Communication (ARC) system has been developed, which links data from several sources, such as ultra-wideband (UWB) sensors and the building information model (BIM). A usability study was conducted in which the cognitive load imposed by the ARC system during route guidance was monitored with mobile eye tracking glasses. The main objective of this experiment is to study the link between the cognitive load imposed by a wayfinding system and building architecture, quantified with space syntax.
1.1 Space Syntax
Indoor wayfinding can be very challenging as the indoor environment can be very complex [5]. A well-established theory to quantify building complexity, which is known to correlate with wayfinding performance, is space syntax [6]. Space syntax is a collection of theories and methods to quantify the relation between indoor and outdoor space on the one hand and society on the other. One of these methods is the isovist: the area in space that can be seen simultaneously from a certain viewpoint. The visibility at viewpoints can be compared through the properties of the isovists at these points, such as area and longest line of sight [7]. Another space syntax method that uses visibility as a complexity measure is the visibility graph analysis (VGA). In this approach, a grid is drawn on the floor plan and an isovist is constructed at every point of the grid. Analogous to isovists, several metrics can be derived from a VGA, such as the mean visual depth (MVD): the mean number of visual steps (isovists to cross) needed to reach a certain grid point from every other point [8]. De Cock et al. (2020) studied the relationship between space syntax and complexity perception during route guidance by conducting an online survey [9]. Results showed that this relationship depended strongly on the route instruction given at a decision point: taking turns was perceived as most complex at convex, central spaces, while the reverse held for starting and ending a route and for changing levels. These findings indicate that the link between space syntax and perception might differ during route guidance compared to explorative wayfinding. The present study seeks to test the findings of the online survey by De Cock et al. (2020) in a real-life environment with an adaptive mobile wayfinding system.
1.2 Adapted Mobile Wayfinding Systems
Because the indoor environment can be very complex, users could benefit from route guidance systems that adhere better to the user’s spatial cognition. One way of making route guidance systems cognitively supportive is by adapting the given information to the environment [10]. The ARC system puts this theory into practice by adapting the type of route instruction (e.g. photo, symbol, video) to the decision point. Every decision point is different, so the needs of the users also change at every decision point. Changing the instruction type can support these needs, as every instruction type has specific characteristics and induces a different cognitive load [11]. Symbols impose a low cognitive load because they show abstracted information, as opposed to photos and 3D-simulations; at complex decision points, however, this abstracted information can impose a high cognitive load because translating it to the environment becomes more difficult. The ARC system, developed for this study, adapts the route instruction types according to the results of the online survey carried out by De Cock et al. (2019) [12]: symbols + text to start and end a route, 3D-simulations + text at central decision points, and photos + text at other decision points.
A number of studies have developed adaptive systems for specific user groups, such as people with impairments or tourists [13,14,15,16], but few were developed for everyday life, even though people are often rushed in this context: when you have to attend a meeting, for example, you want to find the meeting room as efficiently as possible. Recent technologies in smart buildings enable the development of wayfinding systems that use linked data. The Find Me! app is an example of such a system, in which sensor data is linked to the BIM of the building to enable accurate route guidance [17]. With these systems on your smartphone, you can be guided indoors, from your own office to the meeting room. So far, however, research on these systems has mostly been limited to their development and installation rather than their usability [18].
1.3 Eye Tracking
The usability of wayfinding systems can be tested with an eye tracker, which provides information on where and for how long the visual attention of users is directed [19]. By using a mobile eye tracker, wayfinding experiments can be conducted in real indoor environments, which enables cross-validation of less immersive, more controlled experiments [20]. A number of studies have used mobile eye tracking to measure the cognitive load induced by wayfinding systems, such as [21,22,23]. Most of them use fixation measures to analyze cognitive load: for example, a longer fixation duration implies a higher cognitive load, while a higher fixation rate implies a lower one. Saccadic duration and rate measure the same effect as fixation duration and rate; additionally, the saccadic amplitude can be calculated, which decreases when cognitive load increases because of a difficult search task or a careful inspection of the stimulus [24].
2 Materials and Methods
2.1 ARC
For this case study, the ARC mobile wayfinding aid has been developed (Fig. 1).
ARC uses the low-cost, open-source UWB hardware platform Wi-PoS for positioning, installed in the smart building iGent (Belgium) [25]. The location of the mobile UWB tag, connected to the smartphone, is calculated through a particle filter and sent to the web platform through the MQTT protocol. For route planning, Dijkstra’s shortest-path algorithm is applied to a graph of the building, extracted from the floor plans. The photos for the photo + text type were taken beforehand in the building, and arrows were placed on top of them to create augmented photographs [18]. For the 3D-simulations, the BIM of the building was imported into Unity and graphics were added. The symbol + text type requires the fewest resources, as only four symbols have to be designed to cover all actions. Depending on the decision point and the required turn, the appropriate route instruction is fetched from the server and combined with a line of text. When the user reaches the decision point with the smartphone and UWB tag, a short sound is played and the route instruction automatically appears on the screen.
2.2 Eye Tracking Experiment
Thirty-three participants were asked to walk three routes in the building (Fig. 2), guided by the ARC system, while wearing the SMI ETG 2.1 mobile eye tracking device (60 Hz/30 FPS). During route guidance they received 12 route instructions: 3 to change levels (Level 1, Level 2 and Level 3), 4 to take turns (Turn 1, Turn 2, Turn 3 and Turn 4), 2 to start a route (Start 1 and Start 2) and 3 to end a route (End 1, End 2 and End 3). The first route starts by taking the stairs at Level 1; from then on, each new route starts at the endpoint of the previous one. The color of each route instruction in Fig. 2 is determined by the space syntax value and visualizes how complex this instruction would be perceived compared to other instructions of the same category, according to the results of De Cock et al. (2020) [9]. For example, the route instruction to take the stairs at Level 2 (red) would be perceived as more complex than the instruction to take the elevator at Level 3 (green), because of its higher mean visual depth value.
2.3 Statistical Analysis
Five eye tracking measures were calculated for every decision point: fixation duration, saccadic duration, fixation rate, saccadic rate and saccadic amplitude. For each of these measures, the difference between route instructions of the same category (Start/End, Level and Turn) was tested with the Mann-Whitney U test. In this way, the relation between perceived complexity, quantified through space syntax by De Cock et al. (2020) [9], and the cognitive load imposed by ARC, quantified through eye tracking, was analyzed.
3 Results
The red colored route instructions in Fig. 2 (high perceived complexity) would be expected to show longer fixation and saccadic durations, lower fixation and saccadic rates and a smaller saccadic amplitude than the green colored route instructions (low perceived complexity). For most comparisons this seems to be the case, confirming the findings of De Cock et al. (2020) [9], except for the following route instructions:

- The video instruction of Turn 3 resulted in smaller saccadic durations, higher saccadic rates and a smaller saccadic amplitude than the photo instruction of Turn 2.
- Turn 3 induced a higher fixation and saccadic rate than Turn 1, indicating a lower cognitive load.
- Level 1 induced a higher saccadic amplitude than Level 3, indicating a lower cognitive load.
- End 1 induced a higher saccadic rate than End 3, indicating a lower cognitive load.
4 Discussion
The first two findings were most likely caused by the experimental setup. In the first, a video type is compared to a photo type, which can render ambiguous results in eye tracking research. When dynamic stimuli such as videos are used, smooth pursuit eye movements can occur, in which the eye follows a moving object. These smooth pursuit movements are not easily detected by standard event detection algorithms, and even less so with mobile eye tracking glasses. As a result, smooth pursuit movements are often misclassified as long fixations or small saccades [26], which might explain the small saccadic duration, high saccadic rate and small saccadic amplitude at Turn 3. The second finding involves two video instructions, which should render more comparable results. However, all participants walked the three routes in the same order, so all of them saw Turn 1 before Turn 3. Both route instructions were very alike, so the lower cognitive load at the second turn instruction might be caused by a learning effect.

The last two findings that contradict De Cock et al. (2020) [9] could not be attributed to the experimental setup. Although Level 1 is a video instruction, a higher saccadic amplitude was registered there than at the photo instruction of Level 3. Both a learning effect and smooth pursuit would have caused a smaller saccadic amplitude, so another factor must be causing the lower cognitive load at the more complex Level 1 decision point. The adaptive ARC system showed a video instruction at Level 1, as De Cock et al. (2019) [12] found that this instruction type was more appreciated at complex decision points. This eye tracking study therefore confirms the positive effect of video instructions at complex decision points: the imposed cognitive load can even be lower than at a less complex decision point with a photo instruction. As for the last finding, symbol instructions were shown by ARC at both End 1 and End 3. According to De Cock et al. (2020) [9], End 3 should be perceived as more complex than End 1, but the findings of the usability study do not support this. Moreover, it seems that the use of the symbol instruction type can have a reversed influence on the complexity perception of decision points: route instructions to end a route in a narrow hallway induce a lower cognitive load than at a convex space when the symbol type is used. This finding confirms that the abstracted information of symbols might not be sufficient for effective wayfinding at convex spaces [27].
5 Conclusions and Future Research
For this usability study, an adaptive mobile wayfinding system (ARC) has been developed, which combines several data sources to provide route guidance adapted to the environment. The cognitive load imposed by the system was determined by measuring the eye movements of participants during a wayfinding experiment. This imposed cognitive load was then compared with the space syntax values of the indoor environment. The results confirm that adapting the route instruction type can have a significant influence on the cognitive load at decision points. More specifically, this study has shown that using a video instruction at complex decision points can decrease cognitive load, while using a symbol instruction at convex spaces can increase it. Understanding the association between space syntax, route instruction types and cognitive load is crucial for efficient route guidance. Therefore, this information should be included in the linked data systems of smart indoor environments, which is to date not yet the case.
References
Li, R., Klippel, A.: Using space syntax to understand knowledge acquisition and wayfinding in indoor environments. In: Proceedings of the 9th IEEE International Conference on Cognitive Informatics, pp. 302–307 (2010). https://doi.org/10.1109/COGINF.2010.5599724
Meilinger, T., Franz, G., Bülthoff, H.H.: From isovists via mental representations to behaviour: first steps toward closing the causal chain. Environ. Plan. B Plan. Des. 39, 48–62 (2012). https://doi.org/10.1068/b34048t
Peponis, J., Zimring, C., Choi, Y.K.: Finding the building in wayfinding. Environ. Behav. 22(5), 555–590 (1990)
O’Neill, M.J.: Evaluation of a conceptual model of architectural legibility. Environ. Behav. 23(3), 259–284 (1991)
Giudice, N.A., Walton, L.A., Worboys, M.: The informatics of indoor and outdoor space: a research agenda. In: Proceedings of the 2nd ACM SIGSPATIAL International Workshop on Indoor Spatial Awareness, pp. 47–53. ACM, New York (2010). https://doi.org/10.1145/1865885.1865897
Montello, D.R.: Spatial cognition and architectural space: research perspectives. Archit. Des. 84(5), 74–79 (2014)
Benedikt, M.L.: To take hold of space: isovists and isovist fields. Environ. Plan. B Plan. Des. 6(1), 47–65 (1979). https://doi.org/10.1068/b060047
Turner, A., Doxa, M., O’Sullivan, D., Penn, A.: From isovists to visibility graphs: a methodology for the analysis of architectural space. Environ. Plan. B Plan. Des. 28(1), 103–121 (2001). https://doi.org/10.1068/b2684
De Cock, L., et al.: Identifying what constitutes complexity perception of decision points during indoor route guidance. Int. J. Geogr. Inf. Sci. 1–19 (2020). https://doi.org/10.1080/13658816.2020.1719109
Reichenbacher, T.: Adaptive methods for mobile cartography. In: Proceedings of the 21st ICC, Durban, 10–16 August 2003
Gartner, G.: Location-based mobile pedestrian navigation services – the role of multimedia cartography. In: ICA UPIMap, Tokyo, Japan, pp. 155–184 (2004)
De Cock, L., Ooms, K., Van de Weghe, N., Vanhaeren, N., De Maeyer, P.: User preferences on route instruction types for mobile indoor route guidance. ISPRS Int. J. Geo-Inf. 8(482), 1–15 (2019). https://doi.org/10.3390/ijgi8110482
Chang, Y.J., Wang, T.Y.: Indoor wayfinding based on wireless sensor networks for individuals with multiple special needs. Cybern. Syst. 41(4), 317–333 (2010). https://doi.org/10.1080/01969721003778584
Abowd, G.D., et al.: Cyberguide: a mobile context-aware tour guide. Wirel. Networks 3(5), 421–433 (1997). https://doi.org/10.1023/A:1019194325861
Cheverst, K., Davies, N., Mitchell, K., Friday, A., Efstratiou, C.: Developing a context-aware electronic tourist guide: some issues and experiences. CHI Lett. 2(1), 17–24 (2000). https://doi.org/10.1145/332040.332047
Malaka, R., Zipf, A.: DEEP MAP challenging IT research in the framework of a tourist information system. In: Fesenmaier, D.R., Klein, S., Buhalis, D. (eds.) Information and Communication Technologies in Tourism 2000, pp. 15–27. Springer, Vienna (2000). https://doi.org/10.1007/978-3-7091-6291-0_2
Ferreira, J.C., Resende, R., Martinho, S.: Beacons and BIM models for indoor guidance and location. Sensors (Switzerland) 18(12) (2018). https://doi.org/10.3390/s18124374
Walther-Franks, B., Malaka, R.: Evaluation of an augmented photograph-based pedestrian navigation system. In: Butz, A., Fisher, B., Krüger, A., Olivier, P., Christie, M. (eds.) SG 2008. LNCS, vol. 5166, pp. 94–105. Springer, Heidelberg (2008). https://doi.org/10.1007/978-3-540-85412-8_9
Kiefer, P., Giannopoulos, I., Raubal, M., Duchowski, A.: Eye tracking for spatial research: cognition, computation, challenges. Spat. Cogn. Comput. 17(1–2), 1–19 (2017). https://doi.org/10.1080/13875868.2016.1254634
Ooms, K.: Cartographic user research in the 21st century: mixing and interacting. In: 6th International Conference on Cartography and GIS Proceedings, pp. 367–377 (2016)
Ohm, C., Müller, M., Ludwig, B.: Evaluating indoor pedestrian navigation interfaces using mobile eye tracking. Spat. Cogn. Comput. 17(1–2), 89–120 (2017). https://doi.org/10.1080/13875868.2016.1219913
Schnitzler, V., Giannopoulos, I., Hölscher, C., Barisic, I.: The interplay of pedestrian navigation, wayfinding devices, and environmental features in indoor settings. In: Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications - ETRA 2016, pp. 85–93 (2016). https://doi.org/10.1145/2857491.2857533
Li, Q.: Use of Maps in Indoor Wayfinding (2017)
Holmqvist, K., et al.: Eye Tracking: A Comprehensive Guide to Methods and Measures. Oxford University Press, New York (2011)
Van Herbruggen, B., et al.: Wi-Pos: a low-cost, open source ultra-wideband (UWB) hardware platform with long range sub-GHZ backbone. Sensors (Switzerland) 19(7), 1–16 (2019). https://doi.org/10.3390/s19071548
Larsson, L., Nyström, M., Andersson, R., Stridh, M.: Detection of fixations and smooth pursuit movements in high-speed eye-tracking data. Biomed. Signal Process. Control 18, 145–152 (2015). https://doi.org/10.1016/j.bspc.2014.12.008
Chittaro, L., Burigat, S.: Augmenting audio messages with visual directions in mobile guides. In: Proceedings of the 7th International Conference on Human Computer Interaction with Mobile Devices & Services, Salzburg, pp. 107–114 (2005). https://doi.org/10.1145/1085777.1085795