DOI: 10.1109/ITSC48978.2021.9564515

Autonomous Vehicles Drive into Shared Spaces: eHMI Design Concept Focusing on Vulnerable Road Users

Published: 19 September 2021

Abstract

In comparison to conventional traffic designs, shared spaces promote a more pleasant urban environment with slower motorized movement, smoother traffic, and less congestion. In the foreseeable future, shared spaces will be populated with a mixture of autonomous vehicles (AVs) and vulnerable road users (VRUs) such as pedestrians and cyclists. However, a driverless AV lacks a way to communicate with VRUs when they have to reach an agreement in a negotiation, which poses new challenges to the safety and smoothness of traffic. To find a feasible way to integrate AVs seamlessly into shared-space traffic, we first identified the issues that existing shared-space designs have not considered regarding the role of AVs. We then used an online questionnaire to ask participants how they would like the driver of a manually driven vehicle to communicate with VRUs in a shared space. We found that when the driver wanted to make suggestions to VRUs during a negotiation, participants considered communication via the driver's body behaviors necessary. Moreover, when the driver conveyed intentions and cautions to VRUs, participants chose different communication methods depending on their transport modes (as a driver, pedestrian, or cyclist). These results suggest that novel eHMIs could support AV-VRU communication when a human driver is no longer present. We therefore propose a potential eHMI design concept tailored to different VRUs to meet their varied expectations. Finally, we discuss the effects of such eHMIs on improving sociality in shared spaces and on autonomous driving systems.


Cited By

  • (2024) Longitudinal Effects of External Communication of Automated Vehicles in the USA and Germany: A Comparative Study in Virtual Reality and Via a Browser. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 8(4), 1–33, 21 Nov 2024. https://doi.org/10.1145/3699778
  • (2023) Designing Emotional Expressions of Autonomous Vehicles for Communication with Pedestrians in Urban Shared Spaces: Use Cases, Modalities, and Considerations. Proceedings of the 35th Australian Computer-Human Interaction Conference, 454–461, 2 Dec 2023. https://doi.org/10.1145/3638380.3638408
  • (2023) My Eyes Speak: Improving Perceived Sociability of Autonomous Vehicles in Shared Spaces Through Emotional Robotic Eyes. Proceedings of the ACM on Human-Computer Interaction, 7(MHCI), 1–30, 13 Sep 2023. https://doi.org/10.1145/3604261
  • (2022) Pedestrian-Vehicle Interaction in Shared Space: Insights for Autonomous Vehicles. Proceedings of the 14th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 330–339, 17 Sep 2022. https://doi.org/10.1145/3543174.3546838
  • (2021) Manually Driven Vehicle Encounters with Autonomous Vehicle in Bottleneck Roads: HMI Design for Communication Issues. Proceedings of the 9th International Conference on Human-Agent Interaction, 382–385, 9 Nov 2021. https://doi.org/10.1145/3472307.3484678


Published In

2021 IEEE International Intelligent Transportation Systems Conference (ITSC)
September 2021, 4060 pages
Publisher: IEEE Press


      Qualifiers

      • Research-article

