DOI: 10.1145/3613904.3642015 | CHI Conference Proceedings | Research Article | Open Access

Augmented Reality at Zoo Exhibits: A Design Framework for Enhancing the Zoo Experience

Published: 11 May 2024

Abstract

Augmented Reality (AR) offers unique opportunities for contributing to zoos’ objectives of public engagement and education about animal and conservation issues. However, the diversity of animal exhibits poses challenges in designing AR applications that are not encountered in more controlled environments, such as museums. To support the design of AR applications that meaningfully engage the public with zoo objectives, we first conducted two scoping reviews to interrogate previous work on AR and broader technology use at zoos. We then conducted a workshop with zoo representatives to understand the challenges and opportunities in using AR to achieve zoo objectives. Additionally, we conducted a field trip to a public zoo to identify exhibit characteristics that impact AR application design. We synthesise the findings from these studies into a framework that enables the design of diverse AR experiences. We illustrate the utility of the framework by presenting two concepts for feasible AR applications.

1 Introduction

Once perceived as places of entertainment and leisure [70], zoos have evolved to position themselves as centres of conservation, learning and research [31, 56, 61]. Good, modern zoos care deeply about their public-facing role, especially educating and engaging visitors on animal, environmental and conservation issues [56, 61]. To this end, zoos are deploying an array of technologies for visitor education and engagement [32, 35, 63]. Increased HCI attention to zoos as important sites for informal education [29] intersects with research into technologies for nature engagement [87], cultural heritage [20, 40, 74, 95], tourism [10], makerspaces [67] and science museums [93]. Recent scholarship illustrates that new ways of relating to animals are opened up by technologies such as citizen science apps [44], cameras for watching backyard wildlife [82], digital enrichment for zoo animals [42, 84], and curated interactions with animals through technology [21]. However, integrating technologies into zoo experiences is not always successful, due to perceived disruption of animals’ natural behaviours [30] and distraction from visitors’ visual connection with the animals [83].
Augmented reality (AR) capabilities for superimposing digital content on the user’s real-world view [9] present opportunities for engaging, interactive experiences that enhance encounters with animals rather than interrupting them [9, 83]. However, zoos present unique design opportunities and challenges that must be addressed for AR to be adopted successfully in this setting. Firstly, AR applications should not draw users’ attention excessively to digital content [6, 76, 77] or create information overload [24, 46]; the animal should remain the focus of the visitor experience. Secondly, ‘naturalistic’ zoo exhibits (also called third-generation zoos), which attempt to replicate animals’ wild habitats [12], are considered highly effective for visitor learning and engagement [56]. But the characteristics of these exhibits, such as high foliage density, distant viewing locations, and opportunities for animals to hide, make it challenging for visitors to observe the animals. While AR might help engage visitors through rich visualizations and interactions at such exhibits, there are substantial challenges to incorporating AR into these complex environments. Thirdly, visitor areas have characteristics that constrain designers: they are sometimes small and crowded, which prevents interactions involving large movements and inhibits the presentation of large-scale content, such as a life-size giraffe.
In this paper, we aim to better understand how to support the design of AR applications for engaging visitors with zoo objectives (or “meaningful zoo experiences” for brevity). We begin with two scoping reviews of the ACM Digital Library, IEEE Xplore, Web of Science, and Design and Applied Arts Index (DAAI) databases. Our first review focused on the design and use of AR applications for meaningful zoo experiences. We found limited research on AR for meaningful zoo experiences, with most AR applications focusing on a narrow range of design elements related to AR’s social, visual, spatial, and interactive capabilities. Our second review took a broader look at interactive technologies used for zoo visitor experiences and identified design themes applicable to AR technology, encompassing design elements related to perspective, visual focus, scope, sociality, interactivity, game elements, and content type. Together, these reviews provide insights into current designs of AR applications for meaningful zoo experiences, and expand the design space by drawing lessons from the use of other technologies in zoos.
We conducted on-site research at a major public zoo, comprising a workshop and a field trip. The workshop entailed focus group discussions and ideation activities with zoo representatives to elicit their perspectives on how AR applications might be designed to further the organisation’s objectives and overcome known challenges. We found that zoo representatives were optimistic about the use of AR to achieve visitor-related zoo objectives, particularly wildlife education and presenting conservation activities. However, participants’ design ideas reproduced AR application elements similar to those found in our first review, and failed to explore the possible designs identified in our second review. Through our field trip we sought to better understand how the characteristics of different types of zoo exhibits could affect the deployment of AR technology. We trialled a state-of-the-art AR device at different exhibits to assess their affordances in relation to AR, and we identified and summarised the exhibit characteristics that affected the use of AR, to inform the future design of AR applications for exhibits with similar characteristics.
We compiled the findings from the scoping reviews, workshop, and field trip into a design framework to guide the design of AR applications for meaningful zoo experiences. The framework consists of three activities: determining the AR application goal, identifying the exhibit affordances in relation to AR, and deliberating on possible and desired design elements. We demonstrate the utility of our framework by presenting two illustrative examples of AR apps for the zoo—one inspired by prior work and the other identified during our workshop. Through presenting and discussing the design framework, we contribute an intuitive and comprehensive tool for future interaction designers and HCI practitioners to create meaningful AR experiences at the zoo, while benefiting from a synthesis of prior academic knowledge and first-hand insights from fieldwork.

2 Scoping Reviews

We conducted two scoping reviews to understand how to design meaningful AR applications that enhance the zoo experience. Scoping reviews [7] enable us to examine the extent of research activity on a particular topic, summarise research findings, and identify gaps in the literature. We reviewed articles from academic databases that focus on Human-Computer Interaction (HCI) [41], namely the ACM Digital Library, IEEE Xplore, Web of Science, and Design and Applied Arts Index (DAAI). Our first review focuses on AR to enhance the zoo visitor experience, while the second takes a broader look at technologies deployed for this purpose, to expand the design space of zoo AR applications. We provide references to all papers in the Supplementary Material.

2.1 First Review - AR Technology for Enhancing Zoo Visitor Experience

2.1.1 Sampling.

The review spanned papers published between 1980 and 2023. The search was conducted using the search string “[Full Text: "augmented reality"] AND [Abstract: zoo]” in the databases listed above.
The initial search returned a total of 25 papers. This dataset was imported into Covidence, an online tool supporting literature reviews. We removed all duplicate instances of a paper and then screened the papers based on their title and abstract. We then conducted a full-text review of the remaining papers. We only included papers that employed, explored, discussed or conceptualised AR technology used by visitors/non-experts as part of the zoo experience.
Our final sample consisted of 11 papers. Figure 1 summarises the scoping review process for our first review as a Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) flowchart [54].
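For illustration, the screening flow can be viewed as a small filtering pipeline. The sketch below renders it in Python; the actual screening was performed manually in Covidence, and both predicates are crude, hypothetical stand-ins for reviewer judgement rather than tooling we used.

```python
# Hypothetical sketch of the screening flow; the real screening was done
# manually in Covidence, and these predicates are crude stand-ins for
# reviewer judgement rather than the tooling actually used.

def title_abstract_relevant(paper: dict) -> bool:
    # Proxy for the title/abstract screen: the paper should plausibly
    # concern AR in a zoo context.
    text = (paper["title"] + " " + paper["abstract"]).lower()
    return "augmented reality" in text and "zoo" in text

def meets_inclusion_criterion(paper: dict) -> bool:
    # Proxy for the full-text review: AR employed, explored, discussed or
    # conceptualised for use by visitors/non-experts at the zoo. In
    # practice this judgement was made by the reviewers, not a flag.
    return paper.get("ar_used_by_zoo_visitors", False)

def screen(papers: list[dict]) -> list[dict]:
    # 1. Remove duplicate instances of the same paper (here, by DOI).
    unique = list({p["doi"]: p for p in papers}.values())
    # 2. Title/abstract screening.
    shortlist = [p for p in unique if title_abstract_relevant(p)]
    # 3. Full-text review against the inclusion criterion.
    return [p for p in shortlist if meets_inclusion_criterion(p)]
```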

2.1.2 Analysis.

We extracted details related to the system description and the intended use of the AR system. We then engaged in open coding to identify distinct design elements of the studied or conceptualized AR system. We followed with axial coding to organize the different codes and form themes [14].

2.1.3 Results.

Figure 1: PRISMA flowcharts
We found that 10 of the 11 included papers employed or discussed the use of AR technology in mobile devices (such as smartphones and PDAs). Only one article [4] used a Head-Mounted Display (HMD), and no articles involved spatially augmented reality technology [68] (where digital content is directly projected onto real objects/surfaces). Seven articles described AR systems used in zoos (in-situ), one article described using AR outside of the zoo environment (ex-situ), and two articles discussed AR for both in-situ and ex-situ usage.
We identified four themes related to design elements as the primary focus of the AR applications:
Location-based: Presenting location-based digital content through AR was a common design element in the reviewed articles (6/11 papers). For example, Srisuphab et al. [75] designed a navigation system that overlays points of interest on the real-world view seen through the smartphone’s rear camera. Similarly, Fahlquist et al. [19] conceptualised the design of an AR system that identified nearby animals and presented relevant content to the user.
Game elements: Six papers incorporated game elements in AR to engage or educate users. For instance, Fahlquist et al. [19] conceptualized a game where users form teams and collect information to solve a challenge; the information is presented when the user points their smartphone at a specific location. The interactions offered in these games primarily use location-based triggers [18, 64] and/or touch input on mobile devices [72, 75, 94]. The exception was Andrade and Shool [4]’s app, in which users controlled 3D models of animals with a Magic Leap controller.
Social: Two papers discussed AR applications that incorporated social elements to enhance the visitor experience and learning [18, 64]. Perry et al.’s [64] AR game harnessed AR’s capability to present information without hindering users’ ability to view and move through their surroundings or to interact socially with those around them. Fahlquist et al. [19], in turn, conceptualized a social media AR game connecting the user to local and remote users.
Content and User Experience: All papers discussed the use of AR to present visual information (as text, images, video or 2D/3D visualizations). Only three papers discussed how AR content can enable novel experiences that enhance the zoo visit [4, 19, 64]. As an example of a game supporting educational goals, Perry et al. [64] designed a location-based AR game where users play as an animal escaping from poachers and must find their way back to their exhibit while avoiding exhibits housing predators. Another example is Andrade and Shool [4]’s system, in which users interacted with 3D models of animals to learn about their anatomy and behaviour.

2.1.4 Discussion.

The review showed that previous work using AR to enable meaningful zoo experiences primarily leverages the social, visual, spatial, and interactive capabilities of AR. However, these works share similarities in their design elements (e.g. the use of location-based triggers [18, 33, 64]). Their designs focus primarily on explicit means of interaction (touch/controllers) [4, 72, 75, 94], with limited exploration of the implicit interactions afforded by today’s AR devices beyond location-based triggers, such as gesture and facial-expression recognition, which could open new avenues for interactive and social solutions for enhancing the zoo experience. This narrow focus suggests challenges in breaking from pre-existing design patterns and fixations when envisioning future experiences.
We also found that zoo objectives strongly influenced AR application design. For example, the use of game elements and location-based triggers in Perry et al. [64]’s AR game aimed to support children’s learning. Another example is Fahlquist et al. [19]’s concept of an AR system to engage remote zoo visitors for marketing purposes. This highlights a need to develop a deeper understanding of how these goals can be incorporated into the design process of such AR applications.

2.2 Second Review - Technology for Enhancing the Zoo Visitor Experience

2.2.1 Sampling.

Given the small sample of papers that used AR to enhance the zoo visitor experience, our second review considered a broader range of technologies. We again restricted our search to papers published between 1980 and 2023, using the search string “[Full Text: technology] AND [Abstract: zoo]”.
The initial search returned 1052 papers, which we imported into Covidence. We removed all duplicate instances of a paper and then screened the papers based on their title and abstract. A substantial number of papers (910) were deemed irrelevant at this stage. The large number of exclusions can be explained by the use of the term ‘zoo’ for reasons other than referring to an establishment that houses animals. For example, our search returned papers related to machine learning (“Radio Galaxy Zoo” [2]), face recognition technology (“FaceX-Zoo” [48]), and networking (“Networked Data Zoo” [65]). Additionally, our search returned papers whose title and abstract clearly indicated that the technology was not used to enhance the zoo visitor experience, for example ‘Using technology to monitor and improve zoo animal welfare’ [89] and ‘Assisted reproductive technologies for endangered species conservation’ [28]. We considered these out of scope for this review. We followed the title and abstract screening with a full review of the remaining papers. We only included papers that employed, explored, discussed or conceptualised technology used by visitors or non-experts to enhance the zoo experience.
Our final sample consisted of 50 papers. Figure 1 summarises the scoping review process for our second review as a PRISMA flowchart [54].

2.2.2 Analysis.

Similar to our analysis of the papers sampled in our first review, we used qualitative methods to create themes classifying the design and use of technology in enhancing the zoo visitor experience. Specifically, we used open coding to identify design elements of the technology described in the papers. We then used axial coding to group related codes to form themes.

2.2.3 Results.

Almost all (49) papers in our sample involved technology that presented visual content (text, images, video, 2D graphics and/or 3D visualizations). The one exception was a paper involving a physical interface (a button or lever) that triggered a mechanical enrichment device within an animal’s enclosure. As no digital technology was used in this paper (both the interaction interface and the response were physical), we excluded it from further analysis.
The open and axial coding revealed seven themes: perspective, visual focus, scope, interactivity, social, game elements and content type. We elaborate on them below.
Perspective. Refers to the design of technology that enables users to experience different points of view. Previous works have used technology to show users how animals see the world [49, 50, 92], or to embody animals [3, 11, 64]. For example, Kasuga et al. [36] created virtual reality (VR) videos of various animals’ eyesight, colour vision and dynamic vision. Other works explored how different perspectives can support users in learning abstract concepts. For example, Allison et al. [3] designed a VR system in which users embodied an adolescent gorilla and learned about the social hierarchies in gorilla tribes through implicit interactions with other gorillas. Other works leveraged technology to better communicate the work of zoo experts. For instance, Whitehouse et al. [88] designed a game on an interactive public display through which users “became” primate researchers to better empathise with what the work entails. Visitors could navigate a stylized map on the display and engage with quiz games related to the researcher’s life in the wild.
Visual Focus. Refers to the design of technology based on the user’s focal point of attention. Technology in previous works focused users’ attention either on the animal and the exhibit [22, 35, 83] or on the technology itself [3, 4, 49, 50, 88]. A key motivator for using AR technologies at zoos is to avoid hindering the visual connection with animals [83]. For example, Fu et al. [22]’s AR app overlaid text or 2D graphics on animals detected via a smartphone without obstructing the view of the animal. However, technology has also been used to present animal visualizations as an alternative when observing a real animal is impossible. For example, Tanaka et al. [78] presented a web-based system that displayed penguins’ anatomy and behaviour as 2D graphics and animations.
Scope. Refers to the area where the technology is used. Prior works deployed technology in a small area of the zoo [32], in a section of the zoo encompassing multiple exhibits [34], or across the entire zoo [75]. For example, Jimenez Pazmino et al. [32]’s system visualised the challenges faced by polar bears due to global warming; it was used in place, so users did not need to move to interact with it. In contrast, other applications encouraged visitors to move around the zoo. Kapoun and Kapounová [34]’s mobile app helped users understand abstract zoological concepts, such as ecological links and food chains: it guided users to locations in the zoo based on these relationships and presented visualizations of them using dynamic semantic networks. Additionally, Kim et al. [39]’s system used RFID to locate users within the zoo and present relevant educational content.
Social. Refers to the design of technology to facilitate or encourage social interactions. Previous works have facilitated social interactions for teaching visitors [64], sharing content with other visitors [58, 59], and playing zoo-related games [19]. Many of these aim to facilitate in-person social interactions. For instance, Perry et al. [64]’s AR app facilitated learning about zoo objectives during a location-based game: different subgroups of the same learning group received information about the game and were encouraged to share it with the other subgroups.
Other works encouraged remote social interactions. For example, Ren et al. [69] designed a mobile application where users in the zoo could interact with remote users to share their experiences. We also found technology used to support interactions between the visitor and docents [32].
Interactivity. Refers to the degree of interaction designed into the experience. We found three kinds of interaction: passive, active stationary, and active mobile interactions. We refer to experiences delivered through technology whose design does not involve user interactions as passive. For example, Perdue et al. [63] explored the effectiveness of video presentations on visitor knowledge about the presented topic. Active stationary interactions involve gestures, touch or buttons but can be accessed from a single location. For instance, in  Chang et al. [13], users actively interacted with a digital zoo using gestures, but they had to stand in front of a Kinect sensor. Finally, active mobile interactions require the user to interact from different locations. For example,  Pishtari et al. [66]’s location-based mobile system required users to move to different areas of the zoo to interact with game content placed by a game creator.
Game Elements. Refers to the use of game elements in the system design. Examples of such game elements include quizzes [78, 79, 80, 88], competitive team-based tasks [19] and location-based game mechanics [64]. These works include both collaborative and individual examples. Perry et al. [64] and Fahlquist et al. [19] describe location-based games designed for collaborative play; both used mobile devices to present information, but differed in that Perry et al.’s [64] users worked towards a common goal, while Fahlquist et al.’s [19] teams competed against each other. Single-player games included Long and Gooch’s [50] educational simulation in which individual users switched between human and bee vision to find relevant game objects.
Content Type. Refers to the type of content delivered through the technology. The types of content explored in previous works can be broadly categorized as visual or auditory. For example, Long et al. [49] used visual content to simulate the cat visual system and show four key differences (colour, luminance, blur and field of view) between the human and feline visual systems. In contrast, Pendse et al. [62] created a novel experience for an accessible aquarium: they mapped different characteristics of fish to unique notes, enabling visitors to ‘hear’ the presence of different fish. For example, smaller fish were associated with higher pitches, while quick movements led to a faster tempo [62].
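As a concrete illustration of such an audio mapping, the sketch below encodes a fish-to-sound correspondence in the spirit of Pendse et al. [62] in Python; the numeric ranges are our own hypothetical choices, not the published system’s parameters.

```python
# Illustrative sonification mapping in the spirit of Pendse et al. [62]:
# smaller fish -> higher pitch, quicker movement -> faster tempo.
# The numeric ranges are hypothetical choices, not the published system's.

def fish_to_note(length_cm: float, speed_cm_s: float) -> dict:
    # Inverse size-to-pitch mapping: a 5 cm fish sits near the top of the
    # pitch range, a 100 cm fish near the bottom.
    clamped_length = min(max(length_cm, 5.0), 100.0)
    pitch_hz = 220.0 + (100.0 - clamped_length) * 8.0
    # Faster swimming raises the tempo, clamped to a musical range.
    tempo_bpm = min(60.0 + speed_cm_s * 2.0, 180.0)
    return {"pitch_hz": pitch_hz, "tempo_bpm": tempo_bpm}

# Example: a small, quick fish sounds high-pitched and fast.
print(fish_to_note(length_cm=8.0, speed_cm_s=30.0))
# {'pitch_hz': 956.0, 'tempo_bpm': 120.0}
```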

2.2.4 Discussion.

By considering a broader range of technologies, this second review revealed a wider design space of potential applications that remains little explored in an AR context (e.g. the use of different perspectives in educational experiences with other technologies [3, 49, 50, 92], despite its feasibility in AR). Similarly, examples of novel interactions in AR applications are scarce compared to other technologies, as seen in the range of interactions implemented with them (implicit [3], multi-gesture [13], etc.). This suggests an opportunity to support the design of novel AR applications by leveraging lessons learnt from other technologies.
Similar to the applications found in our first review (section 2.1), zoo objectives largely influenced the design elements used in the technological implementation. We also found that the design of the technology for meaningful zoo experiences was influenced by the animal(s) that the experience was concerned with. This suggests that, in addition to the objectives of the zoo, there is a need to understand and consider design elements in relation to the animal(s) that the experience is centered on.

3 Workshop

Although the ideas identified within the literature reflect the potential that researchers and technologists see in the use of AR at the zoo, they might not necessarily reflect the views of zoo stakeholders, such as senior managers or those in charge of the visitor experience. These staff members are involved in shaping zoo objectives and are knowledgeable about the challenges zoos face in terms of visitor engagement, education and conservation understanding. The adoption of AR technology in zoos, therefore, largely depends on whether the technology aids in achieving the organisation’s objectives. As such, it is important to equip AR developers and designers with tools that would enable them to better understand and address the concerns and objectives of zoo representatives. Therefore, we conducted a workshop to capture existing challenges faced by zoos, understand opportunities for using AR at the zoo, and gain insights on prospective AR experience design ideas, as seen from the perspective of key stakeholders.

3.1 Method

We conducted our workshop at Melbourne Zoo with seven representatives of its staff, facilitated by a member of the research team and an external expert. The facilitators specialize in Animal-Computer Interaction (ACI) in zoos and in HCI focusing on AR technology. The zoo representatives included the senior manager of digital engagement, one representative from the community conservation campaigns team and one from the schools’ education program, three representatives of the visitor experience team, and a manager in the business applications team.
The workshop lasted two hours in total. Representatives were first informed of the purpose and activities of the workshop, and were asked to provide verbal consent to participate. In advance of the workshop, researchers provided a briefing document which outlined the capabilities of AR technologies, images of AR head-mounted devices and AR overlays, and examples of AR applications including several Snap “Lenses”. Participants tried a head-mounted AR device and discussed its capabilities, such as overlaying or anchoring digital content onto the user’s real-world surroundings. We used the Snap Spectacles 2021 for their portable form factor, resembling a regular pair of glasses. We also chose the Spectacles because they are smaller and lighter than other AR displays, such as the Microsoft HoloLens 2 or the Magic Leap. We then presented the current capabilities of the Snap Spectacles and how these compared to other visions of AR technology (as seen in movies and other cultural media). We used a selection of existing AR applications (called Snap Lenses) to demonstrate the features of the device.
We conducted three workshop activities, the design of which was informed by our previous engagements with zoos, to explore how future AR-based experiences might contribute to the zoo’s mission and the visitor experience, paying particular attention to the social and interactive dimensions of the visit. We first asked participants to discuss current challenges related to visitors’ encounters with animals, interactions with zoo experts (e.g. “keeper talks”) and social interactions within visitor groups. We then asked them to identify possible opportunities to leverage AR to further the zoo’s objectives. Finally, we led an ideation session on different AR application designs to address challenges and seize opportunities. We asked participants to note their ideas on post-it notes, as well as to describe them to the group.
The workshop was audio recorded and a researcher took additional notes during the meeting, gathered post-its, and ensured that all participants’ responses were fully represented in the data collected, with reference to the workshop transcript. After the workshop, we clustered the notes using affinity mapping to develop categories of findings relevant to challenges, opportunities, and design ideas  [27, 51]. Through the affinity mapping process, we identified 5 challenges that AR might address, 11 potential strategies for using AR in zoos, and 18 design ideas relating to the challenges and strategies based on the zoo representatives’ responses. We prepared summaries of key findings, distributed these documents to workshop participants and elicited their feedback, from which we further refined the findings.

3.2 Findings

Participants identified five challenges faced by the Zoo: remote attendance, multilingual accessibility, communication of conservation efforts, way-finding, and engagement with wildlife not on display. The first challenge was to enable visitors who could not attend an exhibit at a specific time to still have access to keeper talks. Keeper talks are live presentations by zoo experts and have been recognized as an important tool that zoos use to educate visitors [63]. The second was to expand multilingual access, especially for local, culturally, and linguistically diverse communities and schools. A third was related to communicating the relationship between the exhibited animals and the broader zoo’s conservation efforts. For example, engaging visitors to the orangutan exhibit with the zoo’s conservation efforts on preventing habitat loss. The fourth was to improve way-finding within the zoo, as its large and complex layout might lead visitors to miss exhibits. The final challenge was to enable visitors to build a connection with and engage in conservation efforts relating to wildlife not being exhibited. For example, a species of concern is the “Golden-rayed blue butterfly”, which cannot be housed at the zoo due to its dependency on a specific food source.
Participants identified ways AR technologies might enhance the zoo visitor experience. These opportunities were either directly related to the challenges or proposed novel ways in which AR could create meaningful experiences. Opportunities to address challenges included an AR guide to lead visitors around the zoo (way-finding); an augmented installation at the butterfly house to see the “Golden-rayed blue butterfly”; a virtual zookeeper that could present keeper talks in different languages to visitors; enabling visitors to explore virtual environments to learn about conservation work that takes place outside the zoo (including release and rehabilitation of endangered species); and enabling visitors to take on the role of zoo conservation professionals in a virtual experience outside the zoo (e.g., wildlife rescue).
The themes observed in opportunities not related to the identified challenges included: a shared view of 3D visualizations of animals; educational games on topics of environmental concern (such as removing virtual plastic pollution from marine exhibits), or creating connections with animals that are not often on exhibit (such as providing enrichment or care to meet specific animal needs); and gesture-based interactions for presenters to aid visitors in seeing and noticing specific elements of exhibits or presentations.
We identified three major themes on design ideas after grouping responses during affinity mapping. These themes were centred on the educational goals of the zoo: animal-focused, environmental, and conservation. Animal-focused concepts include educational AR applications that use 3D visualizations to show animals, their behaviours or physiology. Generated design ideas include visualizing animals at different stages of their life cycle (e.g. the metamorphosis of a butterfly) or presenting visualizations of physiological phenomena (e.g. an elephant’s gestation). Environmental concepts included ideas that showed the impacts of environmental change on animals and the effects of environmental enrichment on animal welfare. Finally, concepts around conservation included ideas for supporting conservation optimism for visitors and enabling visitors to connect with researchers in-situ to learn about conservation actions.

3.3 Discussion

The workshop revealed that zoo representatives saw potential in using AR to provide education and awareness to visitors, engage visitors through novel experiences, and enhance existing experiences with different aspects of wildlife, including their environments and conservation efforts. We observed that the majority of ideas focused on design elements that appear in previous works using AR in the context of zoos (section 2.1). For example, multiple ideas used AR content [73, 94] to educate visitors about wildlife phenomena (elephant calf gestation or the life-cycle of a butterfly), animals that were not housed at the zoo (golden-rayed blue butterfly), or an animal’s natural environment using digital visualizations. These ideas, however, did not explore how AR could enable novel visualization experiences unattainable with cheaper technology, such as a video.
Other ideas leveraged AR for enabling more engaging experiences, such as an AR game where users “remove balloons from seal tanks”. However, these were also similar to previous work, in that the interactions offered were largely explicit [4, 94]. Further, some ideas aimed to enhance the visitors’ experience by assisting users with finding their way around the zoo or by enabling users to access content presented in languages not familiar to them. These ideas were also found in the literature [72, 75].
These similarities seem to suggest that stakeholders struggle to incorporate design elements that are beyond the immediate affordances of AR, i.e., displaying visual content or enabling explicit interaction. As such, there is a need to support the design of broader applications of AR in providing experiences that are relevant to the zoo’s mission.
Despite the similarities observed between the workshop ideas and prior work, a few new opportunities and ideas also surfaced. These proposed exploratory experiences in AR that facilitate better understanding of, and engagement with, conservation efforts. Ideas such as the “visitor island journey” would enable users to explore (rather than just view) virtual environments to learn about conservation, while ideas such as the “mission with a marine rescue unit” aimed to foster conservation optimism by having users virtually experience wildlife rescue. However, these ideas presented concepts more fitting for virtual reality technology, and participants did not specify how the capabilities of AR could be leveraged for these experiences. This further indicates the need to support designers and developers in better understanding the needs and wants of the zoo in order to create AR applications for meaningful zoo experiences.

4 Exhibit Characteristics and AR Application Design

Exhibits in modern zoos vary significantly depending on the needs of the animal they house [56]. For example, exhibits vary in the barriers between the visitor and the animals—from letting visitors into the enclosure to blocking it with fences—and in the amount of natural light, from bright outdoor spaces to completely dark ‘caves’. These characteristics create challenges and opportunities for AR.
Differences in animal behaviour should also be considered in the design of AR applications. For example, exhibits that house less active animals can benefit from interventions to elicit visitors’ interest [92]. To better understand how exhibit characteristics shape the challenges and opportunities for AR applications, we conducted a field trip to Melbourne Zoo, as an example of a third-generation zoo with modern enclosures [12]. Our research team has visited multiple zoos around the world as part of their research [85, 86] and chose Melbourne Zoo based on its capacity and experience in successful digital initiatives, and its inclusion of varied exhibits, animal species and spaces.

4.1 Method

Our method combined a field trip [17] and first-hand experience [37] to gain insights into how exhibit characteristics can influence AR design. Conducting a field trip enables researchers to develop an understanding of a particular setting in a limited period of time [17]. Two members of our research team conducted a field trip to Melbourne Zoo and were accompanied by a specialist in animal welfare science with extensive knowledge of the exhibits and animals housed there. We asked the specialist to guide us through exhibits with distinct characteristics, enabling us to capture observational notes. Following the specialist’s suggestions, we visited more than one exhibit with similar characteristics, to ensure we did not miss any important exhibit characteristics. We spent a total of four hours at the zoo [17] and visited 21 exhibits of varied design, housing animals from diverse habitats. We took multiple photographs of each exhibit and made observational notes on the physical and contextual characteristics of each exhibit.
Additionally, we used first-hand experience to better understand the user’s perspective when using technology in a specific context or setting [37]. During our visit, we used an AR device (Snap Spectacles 2021) to observe how different exhibit characteristics might affect the use of AR applications. We noted observations on the challenges and opportunities for the display and interaction with multimedia AR content. We focused on characteristics that were physical—relating to factors known to affect AR applications (such as lighting)—and contextual—such as visitor density and animal behaviour. Figure 2 shows images from the field trip.

4.2 Analysis

Our visit resulted in 21 pages of notes and 80 exhibit photographs. Our observations consisted of structured notes (e.g. “What type of crowd flow does the exhibit afford? Walk past, walk through, in and out, or other”) and unstructured observations. We began by pooling our data and discussing salient observations. We then employed the general inductive approach [81] to analyse our data. This involved segmenting our data based on the exhibit characteristics that affected AR usage. We used open coding to label each segment to create categories (e.g. sunlight, shadows, dark exhibits, vocalization), and further refined our categories by reducing overlap and creating more generalized categories (e.g., ‘lighting’ was a general categorization for labels ‘sunlight’, ‘shadows’, and ‘dark exhibit’). We then engaged in multiple discussions and whiteboarding sessions to consolidate any inconsistent categorizations and develop a set of key exhibit characteristics that affect the design of AR applications. Each whiteboarding session consisted of one or both of the researchers who visited the zoo along with 1-3 other researchers. All researchers had access to the data and categories that were previously agreed upon. Each session involved grouping related codes into categories, identifying exhibit characteristics related to the categories, and deliberating on the effects of the exhibit characteristics on AR usage. We iterated this process until we reached a shared view of exhibit characteristics that affect the design of AR applications.
Our analysis identified six key characteristics that can influence the design of AR applications for the zoo. We categorized these characteristics as either physical or contextual:
Figure 2: Image showing the different types of exhibits observed at Melbourne Zoo.

4.3 Physical Characteristics

Physical and environmental factors are known to affect the design of AR applications. We had first-hand experience of these issues in our field trip, outlined below.

4.3.1 Lighting.

Lighting is an important consideration when designing AR applications, especially in outdoor settings with natural light sources [23]. Documentation for most state-of-the-art AR devices and APIs likewise devotes extensive attention to lighting. As such, zoo exhibits present substantial challenges for integrating AR technology due to the diverse lighting conditions that cater for the needs of different animal species. For instance, foliage within an animal’s enclosure casts pockets of shade, producing complex lighting profiles that impact the presentation of visual content in AR. We observed four factors that affect the lighting in zoo exhibits: exhibit features, weather, time of day, and foliage density and type.
We observed how exhibit features, such as their location indoors or outdoors, the amount of artificial lighting, and the type of overhead enclosure used at an exhibit (none, overhead net, metal meshes) affect the amount of light that enters the exhibit. Additionally, weather conditions (sunny, cloudy, etc.) and time of day (morning, dusk, etc.) determine the degree of natural light present at the exhibit (though this was only applicable for outdoor exhibits). Finally, we observed how the type of foliage and its density can cast varying degrees of shadows within the exhibit.
We found that the different lighting profiles created through the interaction of the above factors influence the visibility of both the real surroundings (animal/exhibit) and the visual content displayed via AR. For example, presenting visual content in an indoor exhibit with minimal artificial lighting hindered the view of the real animal, even at the lowest brightness settings of the Snap Spectacles. Conversely, the visibility of the real animal and surroundings was not compromised when presenting visual content at the brightest settings in outdoor exhibits with adequate natural light. However, shade cast by foliage or overhead enclosures required us to lower the brightness in order to continue viewing the real animal without difficulty.
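The practical implication is that AR display brightness cannot be fixed per exhibit; it must respond to the exhibit’s lighting profile. Below is a minimal sketch of such an adjustment, assuming the device exposes an ambient light estimate; the lux thresholds are hypothetical and not taken from any particular AR device’s API.

```python
# Minimal sketch of ambient-light-driven brightness adjustment for an
# optical see-through display. The lux thresholds are hypothetical and
# would need tuning per device; `ambient_lux` stands in for whatever
# light estimate a given AR device or API actually exposes.

def display_brightness(ambient_lux: float) -> float:
    """Return a display brightness in [0, 1] intended to keep virtual
    content legible without washing out the view of the real animal."""
    if ambient_lux < 50:        # dark indoor exhibit (e.g. nocturnal house)
        return 0.1              # even low settings can dominate the scene
    if ambient_lux < 1_000:     # shade from foliage or overhead enclosures
        return 0.4
    if ambient_lux < 10_000:    # overcast daylight
        return 0.7
    return 1.0                  # direct sunlight: maximum brightness

# Example: a visitor walking from a dark exhibit into full sun.
for lux in (20, 400, 5_000, 30_000):
    print(f"{lux:>6} lux -> brightness {display_brightness(lux):.1f}")
```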

4.3.2 Viewing area size.

The size of the visitor viewing area can influence both the visualizations and the interactions that can be incorporated into AR. Firstly, the size of the available viewing area affects the size and animation of AR visualizations. This is an important consideration, especially when presenting AR content within the exhibit is not possible (due to barriers or foliage, for example; see figure 2e). Secondly, the size of the viewing area can limit the amount of interaction and movement that an application can require of visitors. For example, we found that AR applications requiring large-scale movement are not meaningful at exhibits with smaller viewing platforms.

4.3.3 Exhibit Barriers.

The type of barrier (none, glass, fence, fence + moat, mesh) separating visitors and animals also shapes the experiences afforded by AR. For example, we observed that we could still scan the floor surface of the animal enclosure through a glass barrier (figure 2c) using the Snap Spectacles 2021, which enabled us to register visual content to it. The same could not be achieved when the exhibit used a mesh barrier (figure 2a). We also observed that mesh barriers limited the view of the animal, which could affect AR functions that rely on animal detection (Snapchat’s animal detection, for example). Additionally, the type of barrier limited the interactions that users could perform safely. For instance, interactions requiring large amounts of movement may not be safe without barriers between the visitors and the animals (e.g., Lemur island, figure 2d).
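These observations can be distilled into a rough capability lookup that a design process might consult. The sketch below is an illustrative Python encoding of our field observations with the Snap Spectacles 2021; the flags are device-dependent and should be re-validated for other hardware.

```python
# Rough capability matrix distilled from our field observations with the
# Snap Spectacles 2021. The flags are device-dependent and illustrative.

BARRIER_AFFORDANCES = {
    # Glass: the enclosure floor could still be scanned through the
    # barrier, so content can be registered inside the exhibit.
    "glass": {"surface_scan": True, "animal_detection": True, "large_movement": True},
    # Mesh: no surface scan into the enclosure, and the obstructed view
    # can also defeat animal detection.
    "mesh": {"surface_scan": False, "animal_detection": False, "large_movement": True},
    # No barrier (e.g. walk-through exhibits): scanning is unobstructed,
    # but large movements near the animals may be unsafe.
    "none": {"surface_scan": True, "animal_detection": True, "large_movement": False},
}

def design_feasible(barrier: str, requirements: list[str]) -> bool:
    """Check whether an AR design's requirements are met at a barrier type."""
    capabilities = BARRIER_AFFORDANCES[barrier]
    return all(capabilities.get(need, False) for need in requirements)

# Example: registering content to the enclosure floor works through glass
# but not through mesh.
print(design_feasible("glass", ["surface_scan"]))  # True
print(design_feasible("mesh", ["surface_scan"]))   # False
```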

4.4 Contextual Characteristics

Contextual factors, such as the popularity of the housed animal or the exhibit’s proximity to other zoo attractions and amenities (such as cafes), also affect the interactions and content that could be meaningfully incorporated into AR. The following are the key contextual characteristics that we found could affect the AR experience at different exhibits.

4.4.1 Crowd Density.

The number of people gathered at exhibit platforms and viewpoints limits the visualizations that can be presented on the AR device. The presence of other visitors hindered the AR device’s surface tracking because the floor and other surfaces were occluded; this in turn disrupted the registration of visual content to surfaces, degrading the overall AR experience. Additionally, as optical see-through AR displays (such as the Snap Spectacles 2021) feature an additive display, visitors moving behind digital content could still be seen, resulting in a less immersive experience. Large numbers of visitors also limit the range of interactions available to the user; for example, large movements create the risk of bumping into others.

4.4.2 Noise.

The noise around an exhibit affects the presentation of audio content in AR. Zoos involve several sources of noise (crowd chatter, animal vocalization, nearby keeper talks, and audio messages played from installations within exhibits). For example, we observed that exhibits housing vocal animals, such as the ‘Black and White Ruffed Lemur’, had sporadically noisy environments that can affect audio content presented by an AR application, especially if that content cannot be replayed. Other exhibits, such as the aviary, were located close to a cafe, resulting in a constant stream of conversational noise from visitors. As such, exhibits close to potentially noisy visitor facilities may be less appropriate for AR applications designed to present audio content.

4.4.3 Animal Behaviour.

The behaviour of the animal, specifically how visible and active it is, can indirectly influence the AR experiences an exhibit affords. Highly active and visible animals, such as the elephants at the elephant paddocks, drew large crowds and created a generally noisy environment, which in turn raises the considerations discussed under the previous two characteristics (crowd density and noise). Additionally, we observed that passive (e.g. tortoises) or hidden animals (e.g. wombats) led to shorter visits. We observed this at the coati exhibit, where the animal was hard to see amidst the dense foliage and complex enclosure design (figure 2f). Such exhibits present opportunities to incorporate AR applications designed to capture the user’s full attention, without concern about drawing attention away from the animal or its behaviour.

4.5 Discussion

Our findings suggest that different exhibit characteristics can influence the viability of AR content presentation and interaction. Exhibit characteristics were also observed to interact with one another, compounding their effects on AR use. For instance, large digital visualizations and exaggerated physical interactions in AR were enabled at exhibits with sufficiently large visitor platforms, but were constrained by high crowd density. As such, assessing only a subset of exhibit characteristics may not suffice to distil appropriate designs for AR applications.
Additionally, we found that exhibit characteristics both highlighted challenges and illuminated opportunities for AR application design. For example, exhibits with mesh barriers present challenges in registering digital content within the animal enclosure, but are well suited to AR methods that conceal the barrier, such as diminished reality [55]. Similarly, exhibits with less active animals (animal behaviour) allow for rich AR visualizations related to the animal, to educate visitors and extend their engagement with the exhibit [92].
Finally, we found that the challenges observed for different exhibit characteristics were largely associated with limitations of AR technology. Certain characteristics, such as crowd density, exhibit barriers, and lighting, exacerbated known limitations of AR technology, such as the additive displays of optical see-through AR devices, difficulties visualizing content in natural light, and surface tracking issues. While such limitations depend heavily on the particular AR technology, assessing the different exhibit characteristics provides valuable insights that can be accounted for in AR application designs.

5 Design Framework for AR Applications at the Zoo

Figure 3: The design framework we propose consists of three activities - determine application goal, identify exhibit affordances and deliberate on design elements.
Our first scoping review (section 2.1) and workshop with zoo personnel together indicate that HCI designers and zoos are addressing only a small proportion of the broad opportunities that AR offers for creating diverse and meaningful visitor experiences (section 2.2). To expand this design space and support the development of diverse and meaningful AR applications for zoos, we present a design framework based on our scoping reviews, workshop and field trip. The framework consists of three stages, each involving two activities; each activity is in turn concerned with three elements relevant to designing AR applications for meaningful zoo experiences. The stages are determining the application goal, identifying exhibit affordances, and deliberating on design elements. We discuss each stage in turn, the activities involved and their role within the framework.

5.1 Stages of the Framework

Application Goal. This stage of the framework involves setting the purpose of the AR application by considering the application’s targets (visitors, subject, and setting) and the underlying zoo objectives (education & awareness, providing novel experiences, and/or enhancing existing experiences). Identifying the target visitor type (individuals, children [64, 78], etc.) and/or group (couples, families, school groups, etc. [22]), zoo subject (specific animals, exhibits, habitats, conservation efforts, information boards/exhibit signage, zoo map, etc.) and audience setting (in-situ or ex-situ) has been shown to influence the design of applications in prior work (section 2), and will help determine the available affordances and inform decisions related to AR application design. For instance, an AR application aimed at engaging visiting (in-situ setting) families (visitor group) with a specific animal habitat (zoo subject) could leverage design elements that foster social interactions within family units and use engaging game elements at locations that include the target habitat.
Aligning the application with specific zoo objectives is an essential activity that ensures the application design is not divorced from the zoo’s mission. Based on the findings of our workshop and scoping reviews, we broadly categorize visitor-related zoo objectives into education & awareness, engagement through novel experiences, and enhancing existing experiences (see section 3). Aligning one or more of these objectives with the application’s goal introduces unique sets of design considerations that interact with the considerations arising from the application’s targets. As an example, Tanaka et al. [78] noted that children (visitor type) visiting the zoo (in-situ) struggle with scientific observations (zoo objective), and designed a web-based system that incorporates quiz-styled game elements to support learning (zoo objective) about the anatomy and behaviour of penguins (zoo subject) through visual content.
Exhibit Affordances. This stage of the framework is concerned with the affordances of the exhibit relevant to AR. It involves assessing the impacts of the characteristics of the selected exhibit(s) on AR. The exhibit(s) where the application will be used influences both the application goal(s) and the feasible design elements. For instance, there may be only one exhibit deemed safe to incorporate AR (which determines/shapes our target zoo subject), due to issues like tripping hazards, animal barriers and so on. As such, the application must be designed to meaningfully integrate with the exhibit and the animal it houses, thus shaping the technology’s affordances.
Our field trip (section 4) revealed two broad categories of exhibit characteristics — physical and contextual — that impact the affordances of AR. Physical characteristics include elements related to lighting (which affects the visibility of visual content and the target zoo subject), viewing area size (affecting available interactions and the size of visual content), and exhibit barriers (affecting presented content, available interactions, and the user’s visual focus). Contextual characteristics include elements related to crowd density (affecting available visualizations and interactions), noise (affecting auditory content presentation), and animal behaviour (informing considerations related to content type, visual focus, and interactivity). The effect of such characteristics can introduce new affordances and non-affordances for the design of AR (detailed in section 4). For instance, exhibits housing animals that are difficult to observe (shy/nocturnal) present opportunities to display engaging visual content in AR without concern of drawing attention away from the real animal. However, exhibits characterised by noisy environments, as a result of either highly vocal animals or large crowd densities, can restrict the use of auditory design elements in AR.
Design Elements. This stage of the framework supports the choice of design elements for the AR experience. We draw on the findings of our second scoping review to help identify desired design elements based on two factors: how the AR content is designed to affect visitor behaviour, and how the information related to the AR content is designed to be perceived by visitors. These two factors are based on the common themes related to design elements found during our second scoping review (section 2.2). Specifically, themes related to visual focus, scope, social, interactivity, and game elements are concerned with user behaviour (including visual behaviour), while themes related to perspective and content type are primarily concerned with how users perceive the presented information. Considering these factors aims to support developers in selecting appropriate design elements based on how they wish users to respond to and parse the presented information.
The framework highlights the need to consider visitor behaviour in relation to visual focus, social behaviour, and physical actions. Understanding how the desired design elements affect user behaviour will shape decisions related to the application goal and exhibit affordances. For instance, selecting design elements that require visitors to be highly interactive (an effect on physical actions) may limit the available exhibits based on their characteristics, and in turn affect the choice of target zoo subjects.
Another important activity in this part of the framework is considering how users perceive the presented information. The content may be designed to present implicit and/or explicit information to the user in different types (audio, text, image, video, 2D/3D visualization, etc.). For example, using real-time audio or visual (type) navigation in AR with a digital twin of an exhibit space (as was achieved by Andreasen et al. [5] and Rosenkvist et al. [71] in a virtual environment) can implicitly present relevant information about how bats perceive their world. Identifying how the user perceives the presented information can impact other stages of the design framework. For instance, presenting information as text (type) may shift visitors’ focus to the content, affecting considerations such as visitor type (in relation to accessibility) or exhibit choice (if lighting conditions hinder visual content presentation).
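As an illustrative aid (not a prescribed implementation), the three stages and their elements can be written down as a simple data model. The sketch below uses hypothetical Python dataclasses whose names are shorthand for the elements described in this section.

```python
# Illustrative data model of the framework's three stages. The type and
# field names are shorthand for the elements described in section 5.1,
# not a prescribed implementation.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ApplicationGoal:
    visitor_target: str           # e.g. "families", "school groups"
    zoo_subject: str              # e.g. a specific animal, habitat, or map
    setting: str                  # "in-situ" or "ex-situ"
    zoo_objectives: List[str]     # education & awareness, novel experiences,
                                  # enhancing existing experiences

@dataclass
class ExhibitAffordances:
    # Physical characteristics (section 4.3)
    lighting: str                 # e.g. "dark indoor", "shaded outdoor"
    viewing_area_size: str        # e.g. "small platform", "open plaza"
    barrier: str                  # none, glass, fence, fence + moat, mesh
    # Contextual characteristics (section 4.4)
    crowd_density: str
    noise: str
    animal_behaviour: str         # e.g. "active and visible", "hidden"

@dataclass
class DesignElements:
    # How the content affects visitor behaviour
    visual_focus: str             # the animal/exhibit vs. the technology
    scope: str                    # single exhibit, zoo section, whole zoo
    social: List[str] = field(default_factory=list)
    interactivity: str = "passive"  # passive / active stationary / active mobile
    game_elements: List[str] = field(default_factory=list)
    # How visitors perceive the presented information
    perspective: str = "human"    # e.g. embodying an animal's viewpoint
    content_types: List[str] = field(default_factory=list)  # audio, text, 3D, ...

@dataclass
class ARConcept:
    goal: ApplicationGoal
    affordances: ExhibitAffordances
    elements: DesignElements
```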

5.2 Guidelines for Using the Framework

Our framework does not prescribe a fixed starting stage. This enables designers to use the framework according to their specific needs and circumstances. For example, designers addressing stakeholders’ need to facilitate visitor learning could start with the application goal stage, whereas the availability of only a few exhibits suitable for hosting AR experiences may lead designers to start with the exhibit affordances stage. This flexibility, however, could make the framework difficult to navigate and use. As such, we offer the following guidelines as a set of steps to support the use of our framework, irrespective of which stage the designer begins at.
1. Determine the stage of the framework whose activities need to be carried out based on current knowledge of application goal, exhibit affordances or design elements. A stage can be selected based on stakeholder or designer preferences, or the selection could be directed by specific requirements and restrictions. For example, the limited availability of suitable exhibits for AR applications could necessitate starting at the exhibit affordances stage of the framework.
2. Conduct the activities associated with the selected stage. The activities aid in exploring, understanding and setting key factors that may influence the design of an AR application. As an example, the activities involved in the application goal stage of the framework support consideration of different zoo objectives (e.g., education, enhancing a zoo experience, and/or providing more engaging experiences), different target visitor types/groups, different target zoo subjects, and considerations related to in-situ and ex-situ settings.
3. Aggregate the insights gained from the activities of the selected stage with insights gained from any previous stages of the framework that have been carried out. The aggregated insights provide an overview of the current considerations necessary for shaping the desired AR application. For example, if the ‘consider visitor behaviour’ activity in the design elements stage leads to a focus on elements that encourage social behaviours between visiting groups, then the ‘assess physical characteristics’ activity in the exhibit affordances stage would be limited to exhibits that have sufficiently large viewing platforms with higher crowd densities.
The use of our design framework, complemented by these guidelines, results in the identification of key challenges and opportunities associated with designing AR applications that enhance the zoo experience. These insights are intended to support stakeholders, designers, and developers in expanding the design space of AR solutions for zoo-related objectives.
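Because the stages can be entered in any order, the guidelines amount to a small iterative loop: pick a stage, run its activities, and merge the resulting insights with what is already known. The hedged sketch below expresses that loop in TypeScript; the Insight shape and the canned activity notes are illustrative assumptions drawn from the examples in this paper, not a tool we provide.

```typescript
// Minimal sketch of the three guideline steps as an insight-aggregation loop.
// Stage names mirror the framework; everything else is an assumption.

type Stage = "applicationGoal" | "exhibitAffordances" | "designElements";

interface Insight {
  stage: Stage;
  note: string; // e.g. "viewing area too small for dynamic interactions"
}

// Step 2: conduct the activities of a stage (stubbed here with canned notes).
function conductStage(stage: Stage): Insight[] {
  const notes: Record<Stage, string[]> = {
    applicationGoal: ["educate visitors about pit organs", "target in-situ visitors"],
    exhibitAffordances: ["dim lighting at rodent exhibit", "glass barrier allows in-enclosure content"],
    designElements: ["minimal interactivity", "audio plus 3D visual content"],
  };
  return notes[stage].map((note) => ({ stage, note }));
}

// Steps 1 and 3: choose stages in any order, then aggregate their insights.
function applyFramework(order: Stage[]): Insight[] {
  const aggregated: Insight[] = [];
  for (const stage of order) {
    aggregated.push(...conductStage(stage)); // merge with prior stages
  }
  return aggregated;
}

// A designer starting from the application goal, as in Example 1 below:
console.log(applyFramework(["applicationGoal", "exhibitAffordances", "designElements"]));
```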

6 The Design Framework in Practice

Figure 4: Illustration of the two distinct design ideas for an AR application to teach users about pit organs, which are present in certain species of snakes. In both ideas, users are envisaged to be wearing an AR headset such as the Microsoft HoloLens or Snap Spectacles. The ideas were generated through activities detailed in our design framework.
The motivation for our framework is rooted in the limited range of AR applications for enhancing zoo experiences observed in prior work (section 2.1) and during our workshop (section 3). To showcase the framework, we present two examples that illustrate how it can be used to arrive at novel AR applications. The first example is inspired by objectives related to educating visitors about animal characteristics that may not be observable during a zoo visit [4, 78, 79, 80]. The second example demonstrates how the framework can be used to address accessibility challenges identified by stakeholders during our workshop (section 3). Using examples to demonstrate the utility of design spaces and frameworks in generating multiple new designs is common practice in HCI [15, 16, 26], and allows for a broader perspective on the framework’s utility when applied to different cases, as opposed to the implementation and evaluation of a single solution generated from the framework [57].

6.1 Example 1: Unobservable Animal Characteristics

Animals possess physiological, behavioural, and social characteristics different from those of humans. Some characteristics, such as the social hierarchy of gorilla troops [3] or the vision of cats [49], are difficult to observe and understand during a zoo visit or from reading text. In this example, we consider one such characteristic that is unique to certain species of snakes: the pit organ. The pit organ is a specialized organ that enables snakes to detect prey by their body heat (infrared sensing). While this is relatively easy to imagine for individuals familiar with thermal imaging, it may be less so for people with no prior experience of such technology. The objective of this example is therefore to provide a way for zoo visitors to learn about pit organs.
To design an AR application that achieves this objective, we employ the design framework and the associated guidelines. We first determine that we start with the application goal stage of the framework, as our current knowledge specifies one of the key zoo objectives, i.e., to educate visitors about pit organs. We then conduct the remaining activities associated with that stage. For this example, we are not concerned with providing a novel experience or enhancing an existing experience through the AR application, but focus on facilitating learning. We identify the target visitor type/group as individuals who are physically visiting the zoo (in-situ target setting). Given that the goal is to educate visitors on the function of pit organs, we can set our target zoo subject either to snake species with pit organs, or to species that are prey to such snakes. Choosing the former (the exhibit housing a snake with pit organs) would enable presenting relevant information about pit organs while the visitor observes the snake. Choosing the latter enables the use of AR to demonstrate how pit organs are used to detect prey. For the purposes of this example, we select our target zoo subject to be a rodent species that is common prey for snakes with pit organs.
Next, we assess the influence of the selected exhibit’s characteristics on the affordances of AR by conducting the activities of the exhibit affordances stage. We have chosen a rodent that is nocturnal and housed in a dimly lit enclosure with a glass barrier (physical characteristics: lighting and exhibit barriers, respectively). The chosen animal is also shy and prefers to hide behind foliage (contextual characteristic: animal behaviour). As there is little opportunity for users to view the real rodent, and because the aim of the AR application is to educate about snakes with pit organs, there is less need to consider how AR diverts visual attention from the rodent. This affords the presentation of rich 3D visualizations in AR that can be highly controlled to demonstrate the utility of pit organs. Additionally, the glass barrier enables AR content to be registered inside the enclosure, or on the animal. Further, as a side effect of the animal being less visible, the rodent’s exhibit is not crowded with visitors and is relatively quiet (contextual characteristics: crowd density and noise), which affords the incorporation of interactive and auditory design elements. However, the visitor viewing location (physical characteristic: viewing area size) is small, and therefore prevents the use of highly dynamic interactions.
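The affordance assessment just described can be made concrete as a simple mapping from exhibit characteristics to what the exhibit affords for AR. The TypeScript sketch below encodes the rodent exhibit as data; the field names and the mapping rules are our own reading of this walkthrough, not a normative part of the framework.

```typescript
// Hedged sketch: deriving AR affordances from the exhibit characteristics
// described above. All names and rules are illustrative assumptions.

interface ExhibitCharacteristics {
  lighting: "dim" | "bright";
  barrier: "glass" | "fence" | "moat";
  animalVisibility: "low" | "high";
  crowdDensity: "low" | "high";
  noise: "low" | "high";
  viewingAreaSize: "small" | "large";
}

interface ARAffordances {
  richVirtualContent: boolean;      // safe to draw attention to digital content
  inEnclosureRegistration: boolean; // content can be anchored inside the exhibit
  auditoryElements: boolean;
  dynamicInteractions: boolean;
}

function assessAffordances(c: ExhibitCharacteristics): ARAffordances {
  return {
    richVirtualContent: c.animalVisibility === "low", // little live viewing to detract from
    inEnclosureRegistration: c.barrier === "glass",
    auditoryElements: c.noise === "low",
    dynamicInteractions: c.viewingAreaSize === "large" && c.crowdDensity === "low",
  };
}

// The rodent exhibit from this example:
const rodentExhibit: ExhibitCharacteristics = {
  lighting: "dim",
  barrier: "glass",
  animalVisibility: "low",
  crowdDensity: "low",
  noise: "low",
  viewingAreaSize: "small",
};

console.log(assessAffordances(rodentExhibit));
// => rich content, in-enclosure registration, and audio are afforded;
//    dynamic interactions are ruled out by the small viewing area.
```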
Lastly, we conduct the activities of the design elements stage while considering the aggregated insights gained through the application goal and exhibit affordances stages. The target zoo subject, target setting, and application goal have determined the exhibit where we will deploy the AR application; in this case, the exhibit of a species that is prey to snakes with pit organs. As the target visitors are individual users, design elements encouraging social behaviour can be used (for example, amongst different individual users) but should be optional. The physical and contextual characteristics of the exhibit housing the rodent require a design that involves few physical actions (viewing area size and crowd density), that can use both auditory and visual content (presentation type), and that may cause users to visually focus on the AR content. Furthermore, we can include implicit (perspective visualization of the rodent from the snake’s point of view) and/or explicit (auditory explanation of the snake’s behaviour with pit organs) information. Given these insights, we illustrate two example AR applications.
In the first instance, we include design elements related to perspective, visual content type, and minimal interactivity (physical actions) to enable users to swap between human vision and infrared thermal vision (which is most akin to what pit organs provide) when viewing a virtual rodent (figure 4a). The envisaged application would enable visitors, while wearing an AR headset, to move around the virtual rodent and view the animal from different angles with both human vision and infrared thermal vision. This would enable users to understand how a snake perceives such prey. Swapping between the vision types can also encourage users to compare, and better understand, the differences between human vision and that of snakes with pit organs. While not designed specifically for sociality, educators could also leverage the experiential nature of the application by having students form their own hypotheses about pit organs, then discuss or correct these hypotheses as they engage with the application.
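The core of this idea is a small piece of application state: which vision mode is active, and how the virtual rodent is shaded as a result. The sketch below shows this in plain TypeScript; the trigger for toggleVision() and the shader names are hypothetical stand-ins for whatever AR runtime is used, not a specific vendor API.

```typescript
// Minimal sketch of the vision-swap interaction in idea (a).
// Shader names and the trigger for toggleVision() are assumptions.

type VisionMode = "human" | "thermal";

class PitOrganViewer {
  private mode: VisionMode = "human";

  // Called when the user performs the swap gesture or taps a UI toggle.
  toggleVision(): VisionMode {
    this.mode = this.mode === "human" ? "thermal" : "human";
    return this.mode;
  }

  // Decide how the virtual rodent should be shaded this frame. In thermal
  // mode, warm body regions map to a heat palette, approximating what a
  // pit organ senses; otherwise realistic textures are used.
  rodentMaterial(): string {
    return this.mode === "thermal" ? "heat-palette-shader" : "realistic-fur-shader";
  }
}

const viewer = new PitOrganViewer();
console.log(viewer.rodentMaterial()); // "realistic-fur-shader"
viewer.toggleVision();
console.log(viewer.rodentMaterial()); // "heat-palette-shader"
```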
Alternatively, including design elements related to perspective, visual content type, minimal interactivity, game elements, social behaviour, and visual focus on the real animal can result in a vastly different experience (figure 4b). In this case, we incorporate animal detection features (made possible by modern AR devices 10) to enable users to scan for real rodents in the exhibit and overlay a thermal filter on the rodent, replicating the function of a pit organ. As we identified the rodent to be shy and nocturnal, it may be challenging for users to detect, but users can be assisted by object recognition and the thermal image overlay. Including game elements, such as enabling users to use pit organs (thermal vision) for a limited amount of time to search for as many rodents as they can while competing against a timer (no social behaviour) or against other visitors (social behaviour), could further engage users and highlight the utility of pit organs.
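The game element in this idea reduces to a timed scoring loop driven by detection events. The following sketch simulates that loop in TypeScript; in a real application, the onRodentDetected() calls would come from the device’s animal-recognition feature, and all names here are illustrative.

```typescript
// Sketch of the timed "pit organ hunt" game element in idea (b).
// Detection events are simulated; names are illustrative assumptions.

class PitOrganHunt {
  private found = 0;
  private remainingMs: number;

  constructor(durationMs: number) {
    this.remainingMs = durationMs;
  }

  // Called when the AR runtime recognises a rodent in the camera view.
  onRodentDetected(): void {
    if (this.remainingMs > 0) this.found += 1;
  }

  // Advance the countdown; thermal vision is only available while time remains.
  tick(elapsedMs: number): void {
    this.remainingMs = Math.max(0, this.remainingMs - elapsedMs);
  }

  score(): { found: number; timeLeftMs: number } {
    return { found: this.found, timeLeftMs: this.remainingMs };
  }
}

const hunt = new PitOrganHunt(60_000); // a one-minute round against the timer
hunt.onRodentDetected();
hunt.tick(10_000);
console.log(hunt.score()); // { found: 1, timeLeftMs: 50000 }
```

The social variant would amount to comparing scores across concurrent rounds played by different visitors.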

6.2 Example 2: Multilingual Access + Digital Keeper Talks

Figure 5: An implemented AR prototype application that enables visitors to access multilingual information about animals by scanning the animal’s picture on zoo signage placed outside the exhibit. From left to right: a) the app detects and classifies animal images from signage; b) after an animal image is detected, the user can make an open-palm gesture to begin viewing AR content; c) the app presents educational information via 3D content, audio, and text; d) the app translates relevant information into a different language using AR captions. The 3D digital animal can be rotated to enable viewing from different angles.
A challenge identified by zoo stakeholders during our workshop concerns enabling multilingual and anytime access to keeper talks (section 3). Stakeholders saw an opportunity to leverage the visual and auditory presentation capabilities of AR to address this challenge by enabling visitors to access digital keeper talks at any time. This challenge of presenting information in zoos is not unique to keeper talks, but extends to the presentation of animal details on the information boards placed outside zoo exhibits (which were primarily in English at the zoo we visited). In this example, we demonstrate how our framework can be used to design a solution for this challenge, starting from the application goal stage of the framework.
First, our current knowledge dictates that the target visitor type is in-situ visitors who may not be able to attend keeper talks at the scheduled times, and/or who are non-native speakers of the language used for keeper talks and zoo signage. The objective of the application is to enhance existing experiences through more accessible zoo content related to education and awareness. For this example, we set the target zoo subjects to animals that are not currently on display, which was another challenge uncovered by our workshop. These insights are drawn from the application goal stage of our framework.
Next, we perform the activities described in the design elements stage. As the target animals are not on display, we can design applications that allow visitors to visually focus on the AR content through explicit presentation of visual content, while providing audio for keeper talks (presentation type). Social elements could be incorporated to enable shared experiences between visitors; however, for this example, we choose to design a solution that visitors engage with individually. Additionally, we choose to include minimal physical actions (hand gestures and touch manipulations) to enable visitors to interact with the AR content presentations.
Finally, we engage with the exhibit affordances stage of our framework and determine that crowd density and noise are low, as a consequence of designing for an exhibit where the animal is not on display. The absence of the housed animal also means that the animal’s behaviour will not influence the design of our application in this example. Physical exhibit characteristics such as lighting depend on the specific exhibits that house animals currently not on display, and as such may vary. As we chose to include both auditory and visual content during the design elements stage, we can adjust the type of content presented based on the lighting conditions at specific exhibits (opting for more auditory content if lighting hinders visual presentations). Viewing area size has less influence on the design of this application, as only minimal physical actions are used. Additionally, we choose to display the AR content in close proximity to the zoo’s signage about the animal. As our decisions so far place the AR content outside of the animal enclosure, exhibit barriers are less of a consideration aside from safety concerns (for example, no physical actions near a low fence and moat exhibit).
We demonstrate the feasibility of the AR application designed in this example with an implemented prototype: a Snapchat-based AR application developed using Snap Lens Studio 11. The application, shown in Figure 5, enables visitors to detect animal images on zoo signage using animal detection features 12. Gesture and touch interactions then enable users to reveal a digital version of the animal, start and stop information presentations, rotate the digital animal to view it from different angles, and display textual translations of the presented information. The prototype is intended to showcase the feasibility of implementing AR applications designed with the help of our framework.
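Read as an interaction flow, the prototype moves through a small number of states that mirror panels (a) to (d) of Figure 5. The sketch below captures that flow as a state machine in TypeScript; the state and event names are our descriptive reading of the prototype’s behaviour, not the actual Lens Studio implementation.

```typescript
// Hedged sketch of the prototype's interaction flow as a state machine.
// State/event names are descriptive assumptions mirroring Figure 5.

type AppState = "scanning" | "detected" | "presenting" | "translated";
type AppEvent = "signDetected" | "openPalm" | "translateTapped" | "reset";

const transitions: Record<AppState, Partial<Record<AppEvent, AppState>>> = {
  scanning:   { signDetected: "detected" },                         // (a) image classified
  detected:   { openPalm: "presenting", reset: "scanning" },        // (b) gesture starts content
  presenting: { translateTapped: "translated", reset: "scanning" }, // (c) 3D + audio + text
  translated: { reset: "scanning" },                                // (d) AR captions shown
};

function step(state: AppState, event: AppEvent): AppState {
  return transitions[state][event] ?? state; // ignore events with no transition
}

// A visitor scanning a sign, opening their palm, then requesting a translation:
let state: AppState = "scanning";
for (const e of ["signDetected", "openPalm", "translateTapped"] as AppEvent[]) {
  state = step(state, e);
}
console.log(state); // "translated"
```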

7 General Discussion

This paper aimed to support the design of diverse and meaningful AR applications to enhance the zoo experience. Our key contribution is a novel design framework that is grounded in an analysis of previous research on technology at the zoo, observations of real-world zoo exhibits, and expert input from zoo personnel. We offer the design framework as a flexible tool for navigating key decision points that are not straightforward. Our framework enables designers to consider the application’s goal, the affordances and constraints of zoo exhibits, and a range of design elements when designing AR for visitors. To demonstrate the utility of our framework, we follow previous work [15, 16, 26] by presenting illustrative examples of using our framework to design AR applications for the zoo.
Our research underpinning the framework offers additional contributions to HCI, both in understanding how AR should be deployed at the zoo and in identifying areas of opportunity. In particular, our scoping reviews revealed a wider range of opportunities for supporting visitor engagement with AR than are currently being exploited. They also highlight that design elements in AR applications for zoos are underexplored compared to other technologies. Furthermore, our workshop with zoo stakeholders revealed additional challenges in the design and adoption of AR for enhancing the zoo visitor experience. We found that key stakeholders, including representatives responsible for visitor experience and digital engagement, struggled to envision AR solutions that meaningfully tackle challenges and capitalize on opportunities present in zoos. Lastly, our field trip highlighted the importance of the physical and contextual characteristics of zoo exhibits in shaping the design of relevant AR applications. Individually, these findings offer valuable insights into the current state of AR applications for zoos, the barriers to adoption and design for zoo stakeholders, and the unique challenges that arise from exhibit characteristics. Our framework consolidates these insights into a flexible tool that enables the design of AR applications that explore viable design elements, target relevant objectives, and consider pertinent exhibit characteristics.
The examples we present in section 6 demonstrate the use of our framework to explore underutilized design elements while addressing the challenges and opportunities identified by stakeholders. For instance, our first example (section 6.1) shows how design elements related to visual perspective can be leveraged in AR solutions to educate visitors about an animal’s physiology and behaviours, and to foster empathy [36]. Similar applications have been explored with virtual reality (VR) [36] and 2D displays [49, 50], but these do not offer AR’s capability of maintaining a visual connection with real animals (when visible) and their surroundings. Our second example (section 6.2) demonstrates how our framework can be used to respond to challenges and opportunities identified by zoo stakeholders. We designed an AR application that enables multilingual access to zoo signage and keeper talks, along with anytime access to keeper talks, and we demonstrated the feasibility of the resulting design via a prototype implementation using smartphone-based AR. These examples show how our framework supports the design of AR applications for the zoo while allowing flexible exploration of multiple ideas around central design elements, objectives, or exhibits.
While our framework was specifically created to support the design of AR applications for zoo settings, we envisage that it will be beneficial for a broader range of application domains. Specifically, our framework can help practitioners respond to the challenges of creating AR applications for real environments that cannot be fully controlled. For instance, previous work on AR in tourism [91], cultural heritage sites [60], and other outdoor settings [47] has reported on the limitations of AR technology in relation to lighting conditions. Such limitations could be identified, and appropriately designed for, by engaging with the ‘assess physical characteristics’ and ‘determine presentation’ activities in our framework prior to implementation. Our framework could also help in identifying issues related to excessive user attention towards AR content (the ‘consider visual focus’ activity), such as increased collision risks with real objects in AR applications for tourism or navigation [91], and may prevent AR that detracts from real-world exhibits and experiences [52, 77]. Similarly, adapting the exhibit affordances activities to assess the physical and contextual characteristics of museums or other informal learning settings could help illuminate new opportunities on “how to make pedagogical use of the landscape” [8] for educational AR applications. These examples suggest that our framework, while not directly transferable to other application domains without adaptation, may highlight considerations and issues that designers might otherwise not anticipate. This indicates the framework’s broader utility, and potential avenues for extending it to serve other application domains.

7.1 Limitations & Future Work

Our framework is designed to be applicable to a broad range of zoos, and so does not prescribe specific application goals, exhibit affordances, or design elements. We conducted our on-site research at a site considered to be a typical modern zoo, but we acknowledge that the framework’s broad applicability should be assessed at other zoos, which may have different objectives and exhibit characteristics. Additionally, we presented two examples, with illustrations and a prototype implementation respectively, to demonstrate the use of our framework in generating multiple design ideas for AR applications at the zoo. This enabled us to showcase the flexibility of our framework in handling various cases [26]. However, the evaluation of implemented designs generated by the framework is an area of future work that lies beyond the scope of this paper.
Our framework aims to reduce barriers to AR adoption in zoos; specifically, it responds to the limited range of AR application designs for zoos. However, limitations of AR technology itself remain barriers to adoption. These relate to surface tracking and content registration [25, 60], small field of view [90], implementation challenges [53], and usability issues such as postural discomfort during interaction [38, 45], depth perception [1], and muscle fatigue [43] (for head-mounted AR systems). While our framework intentionally does not restrict design generation according to the limitations of current AR technology, such limitations are pertinent considerations that practitioners should assess alongside their intended designs.
Finally, as discussed in section 7, our framework can indicate design considerations for AR applications in other domains, including tourism, cultural heritage and informal learning settings. As such, future work could explore elements of our framework that could be adapted to, or provide insights for, other domains of interest to HCI.

8 Conclusion

This paper presented a framework to support the design of AR applications that enhance the zoo experience. The framework builds upon the findings of two scoping reviews, a workshop with zoo representatives, and a field trip. It comprises three stages that aid in determining the application goal, identifying exhibit affordances in relation to AR, and deliberating on possible and desired design elements. We detailed the activities involved in each stage and how they relate to the design of AR applications for enhancing the zoo experience. Finally, we demonstrated, through two examples, how the framework supports the generation of unique and diverse AR applications. The framework addresses an important challenge identified in our scoping reviews and workshop: enabling the exploration of meaningful AR application designs that enhance the zoo experience.

Acknowledgments

This work was supported by Snap Inc. as part of their Snap Creative Challenge. We thank Dr. Ilyena Hirskyj-Douglas and Dr. Mia Cobb for their generous advice and help during the conception of this work. We also thank Samangi Wadinambiarachchi for help with the illustrations presented in this paper. Finally, thank you to all the staff at Melbourne Zoo who took part in our workshop and provided valuable feedback.


References

[1]
Haley Adams, Jeanine Stefanucci, Sarah Creem-Regehr, and Bobby Bodenheimer. 2022. Depth perception in augmented reality: The effects of display, shadow, and position. In 2022 IEEE Conference on Virtual Reality and 3D User Interfaces (VR). IEEE, 792–801.
[2]
Matthew J Alger, Julie K Banfield, Cheng Soon Ong, Lawrence Rudnick, OI Wong, Christian Wolf, Heinz Andernach, Ray P Norris, and Stanislav S Shabala. 2018. Radio Galaxy Zoo: machine learning for radio source host galaxy cross-identification. Monthly Notices of the Royal Astronomical Society 478, 4 (2018), 5547–5563.
[3]
Don Allison, Brian Wills, Larry F Hodges, and Jean Wineman. 1997. Gorillas in the Bits. In Proceedings of IEEE 1997 Annual International Symposium on Virtual Reality. IEEE, 69–76.
[4]
Ricardo Andrade and Samar Shool. 2020. Pixeldust Studios Reptopia Magic Leap Experience. In ACM SIGGRAPH 2020 Immersive Pavilion (Virtual Event, USA) (SIGGRAPH ’20). Association for Computing Machinery, New York, NY, USA, Article 14, 2 pages. https://doi.org/10.1145/3388536.3407892
[5]
Anastassia Andreasen, Jelizaveta Zovnercuka, Kristians Konovalovs, Michele Geronazzo, Razvan Paisa, and Stefania Serafin. 2018. Navigate as a bat. real-time echolocation system in virtual reality. In 15th Sound and Music Computing Conference (SMC 2018). Sound and Music Computing Network, 192–199.
[6]
Riga Anggarendra and Margot Brereton. 2016. Engaging Children with Nature through Environmental HCI. In Proceedings of the 28th Australian Conference on Computer-Human Interaction (Launceston, Tasmania, Australia) (OzCHI ’16). Association for Computing Machinery, New York, NY, USA, 310–315. https://doi.org/10.1145/3010915.3010981
[7]
Hilary Arksey and Lisa O’Malley. 2005. Scoping studies: towards a methodological framework. International journal of social research methodology 8, 1 (2005), 19–32.
[8]
Mattias Arvola, Inger Edforss Fuchs, Ingemar Nyman, and Anders Szczepanski. 2021. Mobile augmented reality and outdoor education. Built Environment 47, 2 (2021), 223–242.
[9]
Ronald T. Azuma. 1997. A Survey of Augmented Reality. Presence: Teleoperators and Virtual Environments 6, 4 (1997), 355–385. https://doi.org/10.1162/pres.1997.6.4.355
[10]
Costas Boletsis and Dimitra Chasanidou. 2018. Audio augmented reality in public transport for exploring tourist sites. In Proceedings of the 10th Nordic Conference on Human-Computer Interaction. 721–725.
[11]
Doug A. Bowman, Larry F. Hodges, Don Allison, and Jean Wineman. 1999. The Educational Value of an Information-Rich Virtual Environment. Presence: Teleoper. Virtual Environ. 8, 3 (jun 1999), 317–331. https://doi.org/10.1162/105474699566251
[12]
Marcus Carter, Sarah Webber, and Sally Sherwen. 2015. Naturalism and ACI: Augmenting Zoo Enclosures with Digital Technology. In Proceedings of the 12th International Conference on Advances in Computer Entertainment Technology (Iskandar, Malaysia) (ACE ’15). Association for Computing Machinery, New York, NY, USA, Article 61, 5 pages. https://doi.org/10.1145/2832932.2837011
[13]
Yi-Hsing Chang, Jhen-Hao Hwang, Rong-Jyue Fang, and You-Te Lu. 2017. A Kinect-and game-based interactive learning system. Eurasia Journal of Mathematics, Science and Technology Education 13, 8 (2017), 4897–4914.
[14]
Juliet Corbin and Anselm Strauss. 2014. Basics of qualitative research: Techniques and procedures for developing grounded theory. Sage publications.
[15]
Kurtis Danyluk, Barrett Ens, Bernhard Jenny, and Wesley Willett. 2021. A design space exploration of worlds in miniature. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. 1–15.
[16]
Kody R Dillman, Terrance Tin Hoi Mok, Anthony Tang, Lora Oehlberg, and Alex Mitchell. 2018. A visual interaction cue framework from video game environments for augmented reality. In Proceedings of the 2018 CHI conference on human factors in computing systems. 1–12.
[17]
Grace Eden, Sumita Sharma, Debjani Roy, Anirudha Joshi, José Abdelnour Nocera, and Nimmi Rangaswamy. 2019. Field Trip as Method: A Rapid Fieldwork Approach. In Proceedings of the 10th Indian Conference on Human-Computer Interaction (Hyderabad, India) (IndiaHCI ’19). Association for Computing Machinery, New York, NY, USA, Article 1, 7 pages. https://doi.org/10.1145/3364183.3364188
[18]
Karin Fahlquist, Johannes Karlsson, Haibo Li, Li Liu, Keni Ren, Shafiq ur Réhman, and Tim Wark. 2010. Human Animal Machine Interaction: Animal Behavior Awareness and Digital Experience. In Proceedings of the 18th ACM International Conference on Multimedia (Firenze, Italy) (MM ’10). Association for Computing Machinery, New York, NY, USA, 1269–1274. https://doi.org/10.1145/1873951.1874201
[19]
Karin Fahlquist, Thomas Mejtoft, and Johannes Karlsson. 2011. Social Media Game Concept within the Digital Zoo: New Ways of Connecting a Tourist Attraction with Its Visitors. In 2011 International Conference on Internet of Things and 4th International Conference on Cyber, Physical and Social Computing. 170–177. https://doi.org/10.1109/iThings/CPSCom.2011.100
[20]
Juliano Franz, Mohammed Alnusayri, Joseph Malloch, Akshay Gahlon, and Derek Reilly. 2019. AR in the Gallery: The Psychogeographer’s Table. In Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems. 1–6.
[21]
Fiona French, Mark Kingston-Jones, David T. Schaller, Sarah Ellen Webber, Heli Väätäjä, and Mark Campbell. 2016. Don’t Cut to the Chase: Hunting Experiences for Zoo Animals and Visitors. In Proceedings of the Third International Conference on Animal-Computer Interaction (Milton Keynes, United Kingdom) (ACI ’16). Association for Computing Machinery, New York, NY, USA, Article 19, 6 pages. https://doi.org/10.1145/2995257.3014066
[22]
Zhiyong Fu, Jia Lin, and Yuyao Zhou. 2018. Design Framework for Informal Learning Based on Mobile Technologies. In Proceedings of the Sixth International Symposium of Chinese CHI (Montreal, QC, Canada) (ChineseCHI ’18). Association for Computing Machinery, New York, NY, USA, 22–30. https://doi.org/10.1145/3202667.3202671
[23]
Joseph L Gabbard, J Edward Swan, and Deborah Hix. 2006. The effects of text drawing styles, background textures, and natural lighting on text legibility in outdoor augmented reality. Presence 15, 1 (2006), 16–32.
[24]
Christoph Gebhardt, Brian Hecox, Bas van Opheusden, Daniel Wigdor, James Hillis, Otmar Hilliges, and Hrvoje Benko. 2019. Learning Cooperative Personalized Policies from Gaze Data. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (New Orleans, LA, USA) (UIST ’19). Association for Computing Machinery, New York, NY, USA, 197–208. https://doi.org/10.1145/3332165.3347933
[25]
Adam Gomes, Keegan Fernandes, and David Wang. 2021. Surface prediction for spatial augmented reality applications. Virtual Reality 25 (2021), 761–771.
[26]
Renate Haeuslschmid, Bastian Pfleging, and Florian Alt. 2016. A design space to support the development of windshield applications for the car. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. 5076–5091.
[27]
Rex Hartson and Pardha S Pyla. 2012. The UX Book: Process and guidelines for ensuring a quality user experience. Elsevier.
[28]
Jason R Herrick. 2019. Assisted reproductive technologies for endangered species conservation: developing sophisticated protocols with limited access to animals with unique reproductive mechanisms. Biology of Reproduction 100, 5 (2019), 1158–1170.
[29]
Ilyena Hirskyj-Douglas, Stuart Gray, and Roosa Piitulainen. 2021. ZooDesign: Methods for Understanding and Facilitating Children’s Education at Zoos. In Interaction Design and Children (Athens, Greece) (IDC ’21). Association for Computing Machinery, New York, NY, USA, 204–215. https://doi.org/10.1145/3459990.3460697
[30]
Ilyena Hirskyj-Douglas and Roosa Piitulainen. 2020. Developing Zoo Technology Requirements for White-Faced Saki Monkeys. In Proceedings of the Seventh International Conference on Animal-Computer Interaction (Milton Keynes, United Kingdom) (ACI’2020). Association for Computing Machinery, New York, NY, USA, Article 3, 12 pages. https://doi.org/10.1145/3446002.3446123
[31]
M Hutchins and B Smith. 2003. Characteristics of a world-class zoo or aquarium in the 21st century. International Zoo Yearbook 38, 1 (2003), 130–141.
[32]
Priscilla F. Jimenez Pazmino, Brenda Lopez Silva, Brian Slattery, and Leilah Lyons. 2013. Teachable Mo[Bil]Ment: Capitalizing on Teachable Moments with Mobile Technology in Zoos. In CHI ’13 Extended Abstracts on Human Factors in Computing Systems (Paris, France) (CHI EA ’13). Association for Computing Machinery, New York, NY, USA, 643–648. https://doi.org/10.1145/2468356.2468470
[33]
RA Kalpana, S Jayashree, and P Kaaviya Darshini. 2022. Intra Zoo Expedition with AR. In 2022 1st International Conference on Computational Science and Technology (ICCST). IEEE, 808–812.
[34]
Pavel Kapoun and Jana Kapounová. 2016. Instruction outside the classroom: Mobile, or ubiquitous learning. In Proceedings of the European Conference on e-Learning, ECEL. 340–349.
[35]
Johannes Karlsson, Shafiq ur Réhman, and Haibo Li. 2010. Augmented Reality to Enhance Visitors Experience in a Digital Zoo. In Proceedings of the 9th International Conference on Mobile and Ubiquitous Multimedia (Limassol, Cyprus) (MUM ’10). Association for Computing Machinery, New York, NY, USA, Article 7, 4 pages. https://doi.org/10.1145/1899475.1899482
[36]
Haruka Kasuga, Machiko Ohashi, Masataka Yamamoto, Yusuke Konishi, Haruna Kitamura, Yuichiro Ikeda, and Takashi Murai. 2020. Exploring the Needs and Ways to Use Virtual Reality to Understand Animals’ Perceptions: A Field Study in a Science Workshop and Exhibition. In Proceedings of the Seventh International Conference on Animal-Computer Interaction (Milton Keynes, United Kingdom) (ACI’2020). Association for Computing Machinery, New York, NY, USA, Article 7, 11 pages. https://doi.org/10.1145/3446002.3446120
[37]
Ryan M. Kelly, Hasan Shahid Ferdous, Niels Wouters, and Frank Vetere. 2019. Can Mobile Augmented Reality Stimulate a Honeypot Effect? Observations from Santa’s Lil Helper. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (Glasgow, Scotland Uk) (CHI ’19). ACM, New York, NY, USA, Article 285, 13 pages. https://doi.org/10.1145/3290605.3300515
[38]
Jeong Ho Kim, Hemateja Ari, Charan Madasu, and Jaejin Hwang. 2020. Evaluation of the biomechanical stress in the neck and shoulders during augmented reality interactions. Applied Ergonomics 88 (2020), 103175.
[39]
Jae-Yong Kim, Young-Gu Lee, Kwang-Hyong Lee, and Moon-Seok Jun. 2011. Realization of Integrated Service for Zoos Using RFID/USN Based Location Tracking Technology. In International Conference on Ubiquitous Computing and Multimedia Applications. Springer, 361–368.
[40]
Kangsoo Kim, Byung-Kuk Seo, Jae-Hyek Han, and Jong-Il Park. 2009. Augmented reality tour system for immersive experience of cultural heritage. In Proceedings of the 8th International Conference on Virtual Reality Continuum and its Applications in Industry. 323–324.
[41]
Alexandra Kitson, Mirjana Prpa, and Bernhard E Riecke. 2018. Immersive interactive technologies for positive change: a scoping review and design considerations. Frontiers in psychology 9 (2018), 1354.
[42]
Rébecca Kleinberger, Anne H. K. Harrington, Lydia Yu, Akito van Troyer, David Su, Janet M. Baker, and Gabriel Miller. 2020. Interspecies Interactions Mediated by Technology: An Avian Case Study at the Zoo. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (Honolulu, HI, USA) (CHI ’20). Association for Computing Machinery, New York, NY, USA, 1–12. https://doi.org/10.1145/3313831.3376858
[43]
James F Knight and Chris Baber. 2007. Effect of head-mounted displays on posture. Human factors 49, 5 (2007), 797–807.
[44]
Jessica L. Oliver, Selen Turkay, Margot Brereton, David M. Watson, and Paul Roe. 2021. Engaging with Nature Sounds & Citizen Science by Designing for Creative & Contextual Audio Encounters. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (Yokohama, Japan) (CHI ’21). Association for Computing Machinery, New York, NY, USA, Article 567, 14 pages. https://doi.org/10.1145/3411764.3445390
[45]
Gun A Lee, Ungyeon Yang, Yongwan Kim, Dongsik Jo, Ki-Hong Kim, Jae Ha Kim, and Jin Sung Choi. 2009. Freeze-Set-Go interaction method for handheld mobile augmented reality environments. In Proceedings of the 16th ACM Symposium on Virtual Reality Software and Technology. 143–146.
[46]
David Lindlbauer, Anna Maria Feit, and Otmar Hilliges. 2019. Context-Aware Online Adaptation of Mixed Reality Interfaces. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (New Orleans, LA, USA) (UIST ’19). Association for Computing Machinery, New York, NY, USA, 147–160. https://doi.org/10.1145/3332165.3347945
[47]
Eran Litvak and Tsvi Kuflik. 2020. Enhancing cultural heritage outdoor experience with augmented-reality smart glasses. Personal and ubiquitous computing 24 (2020), 873–886.
[48]
Yang Liu and Wenbin Zheng. 2021. Masked Face Recognition based on Attention Mechanism and FaceX-Zoo. In 2021 International Conference on Digital Society and Intelligent Systems (DSInS). IEEE, 107–110.
[49]
Jeremy Long, Anthony Estey, David Bartle, Sven Olsen, and Amy A. Gooch. 2010. Catalyst: Seeing through the Eyes of a Cat. In Proceedings of the Fifth International Conference on the Foundations of Digital Games (Monterey, California) (FDG ’10). Association for Computing Machinery, New York, NY, USA, 116–123. https://doi.org/10.1145/1822348.1822364
[50]
Jeremy Long and Amy A Gooch. 2011. Bee prepared: Simulating bee vision in an educational game. In 2011 16th International Conference on Computer Games (CGAMES). IEEE, 262–269.
[51]
Andrés Lucero. 2015. Using affinity diagrams to evaluate interactive prototypes. In IFIP conference on human-computer interaction. Springer, 231–248.
[52]
Diana Marques and Robert Costello. 2018. Concerns and challenges developing mobile augmented reality experiences for museum exhibitions. Curator: The Museum Journal 61, 4 (2018), 541–558.
[53]
Carlos E Mendoza-Ramírez, Juan C Tudon-Martinez, Luis C Félix-Herrán, Jorge de J Lozoya-Santos, and Adriana Vargas-Martínez. 2023. Augmented Reality: Survey. Applied Sciences 13, 18 (2023), 10491.
[54]
David Moher, Alessandro Liberati, Jennifer Tetzlaff, Douglas G Altman, and PRISMA Group*. 2009. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Annals of internal medicine 151, 4 (2009), 264–269.
[55]
Shohei Mori, Sei Ikeda, and Hideo Saito. 2017. A survey of diminished reality: Techniques for visually concealing, eliminating, and seeing through real objects. IPSJ Transactions on Computer Vision and Applications 9, 1 (2017), 1–14.
[56]
Andrew G Moss and Bethany Pavitt. 2019. Assessing the effect of zoo exhibit design on visitor engagement and attitudes towards conservation. Journal of Zoo and Aquarium Research 7, 4 (2019), 186–194.
[57]
Thomas Muender, Michael Bonfert, Anke Verena Reinschluessel, Rainer Malaka, and Tanja Döring. 2022. Haptic fidelity framework: Defining the factors of realistic haptic feedback for virtual reality. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems. 1–17.
[58]
Kenton O’Hara, Tim Kindberg, Maxine Glancy, Luciana Baptista, Byju Sukumaran, Gil Kahana, and Julie Rowbotham. 2007. Social practices in location-based collecting. In Proceedings of the SIGCHI conference on Human factors in computing systems. 1225–1234.
[59]
Yutaro Ohashi, Hideaki Ogawa, and Makoto Arisawa. 2008. Making New Learning Environment in Zoo by Adopting Mobile Devices. In Proceedings of the 10th International Conference on Human Computer Interaction with Mobile Devices and Services (Amsterdam, The Netherlands) (MobileHCI ’08). Association for Computing Machinery, New York, NY, USA, 489–490. https://doi.org/10.1145/1409240.1409323
[60]
Chris Panou, Lemonia Ragia, Despoina Dimelli, and Katerina Mania. 2018. An architecture for mobile outdoors augmented reality for cultural heritage. ISPRS International Journal of Geo-Information 7, 12 (2018), 463.
[61]
Patricia G Patrick, Catherine E Matthews, David Franklin Ayers, and Sue Dale Tunnicliffe. 2007. Conservation and education: Prominent themes in zoo mission statements. The Journal of environmental education 38, 3 (2007), 53–60.
[62]
Anandi Pendse, Michael Pate, and Bruce N Walker. 2008. The accessible aquarium: identifying and evaluating salient creature features for sonification. In Proceedings of the 10th international ACM SIGACCESS conference on Computers and accessibility. 297–298.
[63]
Bonnie M. Perdue, Tara S. Stoinski, and Terry L. Maple. 2012. Using Technology to Educate Zoo Visitors About Conservation. Visitor Studies 15, 1 (2012), 16–27. https://doi.org/10.1080/10645578.2012.660839
[64]
Judy Perry, Eric Klopfer, Marleigh Norton, Dan Sutch, Richard Sandford, and Keri Facer. 2008. AR Gone Wild: Two Approaches to Using Augmented Reality Learning Games in Zoos. In Proceedings of the 8th International Conference on International Conference for the Learning Sciences - Volume 3 (Utrecht, The Netherlands) (ICLS’08). International Society of the Learning Sciences, 322–329.
[65]
Manuel Peuster, Stefan Schneider, and Holger Karl. 2019. The softwarised network data zoo. In 2019 15th International Conference on Network and Service Management (CNSM). IEEE, 1–5.
[66]
Gerti Pishtari, Terje Väljataga, Priit Tammets, Pjotr Savitski, María Jesús Rodríguez-Triana, and Tobias Ley. 2017. SmartZoos: modular open educational resources for location-based games. In European conference on technology enhanced learning. Springer, 513–516.
[67]
Iulian Radu and Bertrand Schneider. 2019. What can we learn from augmented reality (AR)? Benefits and drawbacks of AR for inquiry-based learning of physics. In Proceedings of the 2019 CHI conference on human factors in computing systems. 1–12.
[68]
Ramesh Raskar, Greg Welch, and Henry Fuchs. 1999. Spatially augmented reality. Augmented Reality: Placing Artificial Objects in Real Scenes (1999), 64–71.
[69]
Keni Ren, Johannes Karlsson, and Haibo Li. 2011. Interaction Design for Digital Zoo. In 2011 International Conference on Internet of Things and 4th International Conference on Cyber, Physical and Social Computing. IEEE, 744–747.
[70]
Katie Roe and Andrew McConney. 2015. Do zoo visitors come to learn? An internationally comparative, mixed-methods study. Environmental Education Research 21, 6 (2015), 865–884.
[71]
Amalie Rosenkvist, David Sebastian Eriksen, Jeppe Koehlert, Miicha Valimaa, Mikkel Brogaard Vittrup, Anastasia Andreasen, and George Palamas. 2019. Hearing with eyes in virtual reality. In 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR). IEEE, 1349–1350.
[72]
Hayato Sakamoto and Tomoyuki Ishida. 2018. Proposal of a Zoo Navigation AR Application Using Markerless Image Processing. In International Conference on P2P, Parallel, Grid, Cloud and Internet Computing. Springer, 371–380.
[73]
Carlos Saul Arboleda and Juan Diego Balanta. 2019. VeZoo – Augmented Reality Experience for the Cali’s Zoo. In 2019 International Conference on Virtual Reality and Visualization (ICVRV). 302–303. https://doi.org/10.1109/ICVRV47840.2019.00078
[74]
Jae-Eun Shin and Woontack Woo. 2023. How Space is Told: Linking Trajectory, Narrative, and Intent in Augmented Reality Storytelling for Cultural Heritage Sites. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. 1–14.
[75]
Ananta Srisuphab, Piyanuch Silapachote, Nuttha Sirilertworakul, and Yongchai Utara. 2014. Integrated ZooEduGuide with multimedia and AR from the largest living classrooms to wildlife conservation awareness. In TENCON 2014 - 2014 IEEE Region 10 Conference. 1–4. https://doi.org/10.1109/TENCON.2014.7022304
[76]
Brandon Victor Syiem, Ryan M. Kelly, Jorge Goncalves, Eduardo Velloso, and Tilman Dingler. 2021. Impact of Task on Attentional Tunneling in Handheld Augmented Reality. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (Yokohama, Japan) (CHI ’21). Association for Computing Machinery, New York, NY, USA, Article 193, 14 pages. https://doi.org/10.1145/3411764.3445580
[77]
Brandon Victor Syiem, Ryan M. Kelly, Eduardo Velloso, Jorge Goncalves, and Tilman Dingler. 2020. Enhancing Visitor Experience or Hindering Docent Roles: Attentional Issues in Augmented Reality Supported Installations. In 2020 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). 279–288. https://doi.org/10.1109/ISMAR50242.2020.00053
[78]
Yui Tanaka, Ryohei Egusa, Yuuki Dobashi, Fusako Kusunoki, Etsuji Yamaguchi, Shigenori Inagaki, and Tomoyuki Nogami. 2017. Children’s Evaluations of a System Supporting Observation of Anatomies and Behaviors of Animals in Zoos. In Extended Abstracts Publication of the Annual Symposium on Computer-Human Interaction in Play. 201–206.
[79]
Yui Tanaka, Ryohei Egusa, Yuuki Dobashi, Fusako Kusunoki, Etsuji Yamaguchi, Shigenori Inagaki, and Tomoyuki Nogami. 2017. Preliminary Evaluation of a System for Helping Children Observe the Anatomies and Behaviors of Animals in a Zoo. In CSEDU (2). 305–310.
[80]
Yui Tanaka, Ryohei Egusa, Etsuji Yamaguchi, Shigenori Inagaki, Fusako Kusunoki, Hideto Okuyama, and Tomoyuki Nogami. 2016. Supporting Zoo Visitors’ Scientific Observations with a Mobile Guide. In CSEDU (2). 353–358.
[81]
David R Thomas. 2003. A general inductive approach for qualitative data analysis.
[82]
Kellie Vella, Bernd Ploderer, and Margot Brereton. 2021. Human-Nature Relations in Urban Gardens: Explorations with Camera Traps. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (Yokohama, Japan) (CHI ’21). Association for Computing Machinery, New York, NY, USA, Article 566, 13 pages. https://doi.org/10.1145/3411764.3445438
[83]
Sarah Webber. 2015. Design and Evaluation of Interactive Technology for Human-Animal Encounters at the Zoo. In Proceedings of the 12th International Conference on Advances in Computer Entertainment Technology (Iskandar, Malaysia) (ACE ’15). Association for Computing Machinery, New York, NY, USA, Article 57, 3 pages. https://doi.org/10.1145/2832932.2837009
[84]
Sarah Webber, Marcus Carter, Sally Sherwen, Wally Smith, Zaher Joukhadar, and Frank Vetere. 2017. Kinecting with Orangutans: Zoo Visitors’ Empathetic Responses to Animals’ Use of Interactive Technology. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (Denver, Colorado, USA) (CHI ’17). Association for Computing Machinery, New York, NY, USA, 6075–6088. https://doi.org/10.1145/3025453.3025729
[85]
Sarah Webber, Marcus Carter, Wally Smith, and Frank Vetere. 2017. Interactive technology and human–animal encounters at the zoo. International Journal of Human-Computer Studies 98 (2017), 150–168. https://doi.org/10.1016/j.ijhcs.2016.05.003
[86]
Sarah Webber, Marcus Carter, Wally Smith, and Frank Vetere. 2020. Co-Designing with Orangutans: Enhancing the Design of Enrichment for Animals. In Proceedings of the 2020 ACM Designing Interactive Systems Conference (Eindhoven, Netherlands) (DIS ’20). Association for Computing Machinery, New York, NY, USA, 1713–1725. https://doi.org/10.1145/3357236.3395559
[87]
Sarah Webber, Ryan M. Kelly, Greg Wadley, and Wally Smith. 2023. Engaging with Nature through Technology: A Scoping Review of HCI Research. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (Hamburg, Germany) (CHI ’23). Association for Computing Machinery, New York, NY, USA, Article 521, 18 pages. https://doi.org/10.1145/3544548.3581534
[88]
Jamie Whitehouse, Bridget M Waller, Mathilde Chanvin, Emma K Wallace, Anne M Schel, Kate Peirce, Heidi Mitchell, Alaina Macri, and Katie Slocombe. 2014. Evaluation of public engagement activities to promote science in a zoo environment. PloS one 9, 11 (2014), e113395.
[89]
JC Whitham and LJ Miller. 2016. Using technology to monitor and improve zoo animal welfare. Animal Welfare 25, 4 (2016), 395–409.
[90]
Jianghao Xiong, Guanjun Tan, Tao Zhan, and Shin-Tson Wu. 2020. Breaking the field-of-view limit in augmented reality with a scanning waveguide display. OSA Continuum 3, 10 (2020), 2730–2740.
[91]
Yinan Xu. 2018. Exploring the benefits and challenges of AR in an outdoor tourism experience.
[92]
Victor Yocco, Elizabeth H Danter, Joseph E Heimlich, Betty A Dunckel, and Chris Myers. 2011. Exploring use of new media in environmental education contexts: introducing visitors’ technology use in zoos model. Environmental Education Research 17, 6 (2011), 801–814.
[93]
Susan A Yoon, Karen Elinich, Joyce Wang, Christopher Steinmeier, and Sean Tucker. 2012. Using augmented reality and knowledge-building scaffolds to improve learning in a science museum. International Journal of Computer-Supported Collaborative Learning 7 (2012), 519–541.
[94]
Mario Martínez Zarzuela, Francisco J Díaz Pernas, Leire Barroso Martínez, David González Ortega, and Miriam Antón Rodríguez. 2013. Mobile serious game using augmented reality for supporting children’s learning about animals. Procedia computer science 25 (2013), 375–381.
[95]
Xinzhe Zhang, Huaqun Liu, and Shijie Wang. 2020. Design and Realization of Chinese Traditional Culture & Art Interactive System Based on VR&AR Technologies. In Proceedings of the 3rd International Conference on Information Technologies and Electrical Engineering. 462–467.
