Linköping University | Department of Electrical Engineering

Master’s thesis, 30 ECTS | Datateknik


2023 | LiTH-ISY-EX–23/5548–SE

Mixed Reality Visualization of 3D-CAD Assemblies for Rapid Prototyping
– a description of the current state of Mixed Reality visualization

3D-modeller i Mixed Reality för iterativ produktdesign
(3D models in Mixed Reality for iterative product design)

Arvid Westerlund

Supervisor : Harald Nautsch


Examiner : Ingemar Ragnemalm

External supervisor : Sarath Chandra Damineni, Bosch Thermoteknik

Linköpings universitet
SE–581 83 Linköping
+46 13 28 10 00, www.liu.se
Upphovsrätt (Copyright)
This document is made available on the Internet - or its future replacement - for a period of 25 years from the date of publication, provided that no exceptional circumstances arise.
Access to the document implies permission for anyone to read, download, and print single copies for personal use, and to use it unchanged for non-commercial research and for teaching. Transfer of the copyright at a later date cannot revoke this permission. All other use of the document requires the copyright owner’s consent. Solutions of a technical and administrative nature exist to guarantee authenticity, security, and accessibility.
The author’s moral rights include the right to be mentioned as the author, to the extent required by good practice, when the document is used as described above, as well as protection against the document being altered or presented in a form or context that is offensive to the author’s literary or artistic reputation or character.
For additional information about Linköping University Electronic Press, see the publisher’s website http://www.ep.liu.se/.

Copyright
The publishers will keep this document online on the Internet - or its possible replacement - for a
period of 25 years starting from the date of publication barring exceptional circumstances.
The online availability of the document implies permanent permission for anyone to read, to download, or to print out single copies for his/her own use and to use it unchanged for non-commercial research and educational purposes. Subsequent transfers of copyright cannot revoke this permission.
All other uses of the document are conditional upon the consent of the copyright owner. The publisher
has taken technical and administrative measures to assure authenticity, security and accessibility.
According to intellectual property law the author has the right to be mentioned when his/her work
is accessed as described above and to be protected against infringement.
For additional information about the Linköping University Electronic Press and its procedures
for publication and for assurance of document integrity, please refer to its www home page:
http://www.ep.liu.se/.

© Arvid Westerlund
Abstract

Mixed, virtual, and augmented reality technology blends virtual and real elements, offering broad potential for industries such as gaming, medical science, and industrial manufacturing. This report describes the development process of a system for visualization of and interaction with large and advanced CAD assemblies. The proposed system enables CAD engineers, service technicians, educators, and other relevant parties to engage with current and emerging products for educational and demonstration purposes without needing a physical model. The report presents the problems and solutions encountered when developing such a system and evaluates different approaches. It gives a technical overview of the current state of the mixed reality space concerning the visualization of large and advanced CAD assemblies and of what future innovations would be desirable. The result of this work is a system that imports, converts, optimizes, and adds interaction capabilities to any .STP-format file saved on the local file system, all while the software is running. The system allows dual-handed interaction with any part, with maintained quality and acceptable performance.
Acknowledgments

I want to thank Bosch Thermoteknik for allowing me to write this thesis, especially my ex-
ternal supervisor, Sarath Chandra Damineni, and external managers for their assistance and
support. I would also like to thank my examiner and my supervisor for their help in writing
my thesis.

Contents

Abstract
Acknowledgments
List of Figures
List of Abbreviations

1 Introduction
1.1 The need for Heat Pumps
1.2 Mixed reality in practice
1.3 Aim
1.4 Research questions
1.5 Delimitations

2 Theory
2.1 The virtuality continuum and MR
2.2 Head-mounted displays and Mixed reality devices
2.3 Smartphone
2.4 Stereoscopy
2.5 Cybersickness

3 Method
3.1 Architecture
3.2 Implementation
3.3 Evaluation

4 Results
4.1 Features

5 Discussion
5.1 Results
5.2 Method
5.3 Issues, problems, and notes during development
5.4 The human experience
5.5 The work in a wider context

6 Conclusion
6.1 Summarization
6.2 Future work

Bibliography
List of Figures

2.1 The virtuality continuum.
2.2 An example of reality [8].
2.3 An example of a fully virtual element [9].
2.4 An example of a person using a VR headset [10].
2.5 An example of smartphones used as an AR device [13].
2.6 The process of using active outside-in tracking in motion capture [20].
2.7 The iOS Measure app using AR technology to measure real-world distances [22].
2.8 An assembled Google Cardboard unit without a smartphone [24].
2.9 A color anaglyph 3D image of the planet Mars [28].
2.10 The SegaScope 3-D Glasses using active shutter technology [30].
2.11 Three pairs of polarized stereo glasses [31].
2.12 The function of the parallax barrier and lenticular display technology [34].

3.1 The architecture of the system from CAD file to visualization and interaction.

4.1 A closeup view of the internals of an exploded heat pump.
4.2 Explosion slider interaction.
4.3 A visualization of the part reset function.
4.4 The function of the assembly selector slider.
4.5 Function of the assembly relocator.

5.1 A screenshot of the Unity editor window.
5.2 A comparison of a convex mesh collider and a box collider on a sphere in the Unity editor.
5.3 An example of the pinching maneuver for interaction simulated in the Unity editor.
5.4 A screenshot from a video recorded with the HoloLens 2 showing the use of the pinching feature on the explode slider.
5.5 An example of using the near and far interaction features simulated in the Unity editor.
5.6 An example of the function of anti-aliasing in the Unity editor.
Abbreviations

HMD Head-Mounted Display
FOV Field Of View
FPS Frames Per Second
AR Augmented Reality
VR Virtual Reality
MR Mixed Reality
MRTK Mixed Reality Toolkit
CAD Computer-Aided Design
LOD Levels Of Detail
.STP/.STEP "Standard for the Exchange of Product model data" file format
SDK Software Development Kit

1 Introduction

Mixed reality (MR) has seen significant growth in recent years, and its potential applications
in various industries are constantly being researched. This thesis explores the application of
MR technology in the heat pump design and manufacturing industry.

1.1 The need for Heat Pumps


Heat pumps are an essential technology for combating climate change. They help reduce the
energy used to heat and cool buildings, lowering the emission of carbon dioxide and other
greenhouse gases into the atmosphere compared to alternative residential heating methods
[1].
In addition to the need for advanced heat pump technologies, there is also a need for
heat pumps that are network connected and compatible with smart home systems. This
would allow users to easily monitor and control the temperature of their home from their
smartphone or computer. By connecting heat pumps to the internet and incorporating smart home features, energy savings and efficiency gains can be achieved [2].

1.2 Mixed reality in practice


The motivation for this study stems from the need for efficient and intuitive ways of visual-
izing large CAD assemblies in the industrial setting. As CAD models become increasingly
complex and detailed with hundreds or even thousands of components, it can be challenging
for CAD engineers to fully understand and make sense of the assemblies they are working
on.
MR technology offers a promising solution to this problem by providing a more immer-
sive and interactive way of visualizing and interacting with CAD assemblies. With an MR
headset, CAD engineers can see their assemblies in a life-sized, immersive environment, al-
lowing them to better understand the spatial relationships between components and make
more informed design decisions.
However, the development of MR systems for CAD visualization is challenging. Several technical and user-experience factors must be considered when designing such a system, including performance, ergonomics, and safety. Still, MR technology could provide value in many different areas of a design and production company. Several use cases make this technology an exciting area to research, such as the following:

1.2.1 Rapid Development for CAD-engineers


MR technology offers a unique solution for CAD engineers looking to accelerate the develop-
ment of their CAD assemblies. Using an MR headset, engineers can quickly visualize changes
to a CAD assembly, allowing for quick and easy testing of different design iterations or solu-
tions. This could significantly reduce the time and effort required to develop a CAD design,
as engineers can quickly and easily make changes and see the results in 3D. The MR tech-
nology enables improved collaboration among team members, as it could allow for shared
experiences and accessible communication of design ideas.

1.2.2 Educate On Heat Pump Technology


MR technology may revolutionize how heat pump technology is taught and understood.
Using an MR headset, students can visualize and interact with a life-size virtual heat pump
model, allowing them to see how it works internally. This can provide a more engaging
and immersive learning experience, as students can manipulate the virtual model and see all
components inside. Additionally, MR technology allows interactive educational materials,
such as quizzes and simulations, to enhance student learning and understanding.

1.2.3 Service Technicians Training


Heat pumps are advanced machines that require training for service and repairs. MR could
be used as a replacement for a physical heat pump when educating service technicians. Inter-
active training materials and simulations can enhance the learning experience and improve
the retention of information. This could allow the education session to be performed at the technicians’ locations without transporting a heavy heat pump just for the session. The MR
system could then allow for manipulating all individual heat pump components but only al-
low disassembly in a safe sequence without the risk of causing actual damage to equipment.

1.2.4 Visualize possible failure locations based on error codes


By utilizing a database that links heat pump error codes with possible sources of issues, tech-
nicians can use an MR headset to pinpoint potential locations within the heat pump where
the error may be originating. This can provide valuable information and insights to help tech-
nicians quickly identify and resolve issues. MR technology also allows technicians to view
and interact with a virtual heat pump model, enhancing their ability to diagnose and repair
the problem.

1.2.5 Replay sensor values


Technicians could visualize sensor values streamed from a logger tool installed in a heat
pump in real time, overlaid on top of the physical heat pump in the field. This can provide valuable information that helps technicians quickly identify how the heat pump is behaving. MR technology also allows technicians to work more efficiently and effectively in the field.
They can quickly and easily access and view sensor data without needing additional equip-
ment or tools.

1.2.6 Stakeholders and Prototypes


One potential use of MR is virtual prototyping, which allows stakeholders to interact with
and test a virtual representation of a product design rather than creating and testing physical


prototypes. This can be especially useful for early-stage design review and evaluation, as it
can reduce the time and cost associated with creating and testing physical prototypes [3].
Another way MR can be used is for virtual manufacturing, which involves visualizing and
simulating the manufacturing process in a virtual environment. This can help stakeholders to
optimize the design and production process before physical prototypes are created, resulting
in a more efficient and cost-effective manufacturing process [4].

1.3 Aim
This thesis project explores the potential of MR technology for improving the efficiency and
effectiveness of the CAD engineering process for heat pump design. The project will focus on
developing a system for visualizing large and advanced heat pump CAD assemblies in MR,
allowing CAD engineers to visualize different designs and evaluate changes quickly. The
thesis will discuss the challenges and obstacles that arise when developing such a system
and potential solutions to those problems.

1.4 Research questions


1. How can a system for visualizing CAD assemblies in mixed reality be realized so that
CAD engineers can use it for rapid prototyping?

2. What software limitations and solutions are there for realizing such a system?

1.5 Delimitations
Certain delimitations are required due to time, cost, and availability limitations. These de-
limitations are described below.

1.5.1 Device
The Microsoft HoloLens 2 is used for this thesis, as it was the only device available given the time and scope constraints.

1.5.2 Game Engine


The game engine used for development and visualization is Unity, version 2020.3.36f1.
This game engine is one of the recommended engines for development using the Microsoft
HoloLens 2 device [5].

1.5.3 CAD-file formats


The source CAD assembly format is STEP (ISO 10303-21) with the extensions .stp and .step
[6].

1.5.4 Automatic Pipeline


No system for automatic download and conversion of CAD files will be considered due to the
lack of standardization between organizations. The system considered in this thesis assumes
direct access to CAD files from the local file system.

2 Theory

Technological innovations have blurred the lines between the virtual and real worlds in the last few years. The ability to blend digital assets with the real world allows users to exchange information wherever they are.

2.1 The virtuality continuum and MR


The virtuality continuum is a framework for understanding the range of immersive technolo-
gies, from fully real to entirely virtual environments. It was first proposed by Milgram and
Kishino and has since been widely used in human-computer interaction to classify different
types of immersive technologies, including MR [7]. It can be beneficial for designers and de-
velopers working on MR applications, as it helps to provide a clear taxonomy of the different
types of MR technologies, their capabilities, and their limitations.
MR is a subset of the virtuality continuum and describes technologies that merge the real and virtual worlds. These technologies can be classified according to their location on the
virtuality continuum as shown in Figure 2.1 based on the level of immersion and the extent
to which they integrate real and virtual elements.

Figure 2.1: The virtuality continuum.

2.1.1 Reality
At one extreme of the spectrum are fully real environments that do not involve computer-generated elements. This is the reality we inhabit and interact with in everyday life, as shown in Figure 2.2. This extreme is not included in the term mixed reality.


Figure 2.2: An example of reality [8].

2.1.2 Virtuality
At the other end are fully virtual environments, as shown in Figure 2.3, which are entirely generated by a computer and do not involve any real-world elements [7]. This extreme is also not included in the term mixed reality.

Figure 2.3: An example of a fully virtual element [9].

Virtual reality
Virtual reality (VR) is an immersive technology that creates a computer-generated environ-
ment where users can interact with virtual objects and experiences. VR is typically achieved
using specialized equipment such as a head-mounted display (HMD), headset, gloves, or
other wearable devices that allow users to see, hear, and interact with a virtual environment
naturally and intuitively. An example of this is shown in Figure 2.4. VR is often used for
various applications, including gaming, training, education, design, and entertainment.

Desktop VR
Desktop VR refers to using VR technology on a computer or other device not designed to be
worn on the body. This typically involves using a VR headset connected to a computer via
cables and interacting with the VR environment through controllers, with the user sitting or
standing in front of a desk. Desktop VR typically includes limited or no tracking or motion-sensing technology, so the user cannot move their head or body within the virtual environment [11].

Immersive VR
Immersive VR uses VR technology to immerse the user in a virtual environment fully. This
can involve using VR headsets designed to be worn on the body, such as the HMD setup
shown in Figure 2.4, and other technologies that enable the user to interact with and move within the virtual environment. Some immersive VR systems also include haptic feedback, which allows the user to feel physical sensations in response to events in the virtual environment [12].

Figure 2.4: An example of a person using a VR headset [10].

Augmented reality
Augmented reality (AR) is an immersive technology that adds computer-generated elements
to a real-world environment, allowing users to see and interact with virtual objects in the
context of the physical world [7]. AR is typically achieved using equipment such as a headset
or smartphone, which displays virtual objects and information on top of a user’s view of the
physical world as shown by Figure 2.5.

2.2 Head-mounted displays and Mixed reality devices


HMDs and MR devices are specialized technologies that enable users to experience immer-
sive virtual environments.
Several HMDs and MR devices are available, each with unique features and capabilities.
Some HMDs are designed for fully immersive VR experiences, while others are designed for
AR. Some MR devices are head-mounted, while others are handheld or mounted on a stand.
Milgram and Kishino defined six classes of MR device environments, which are described
below [7].

1. Non-immersive, PC-connected monitors where computer-generated elements are added to the screen and overlaid onto other elements, such as a camera feed of the real world. This could also be described as enhancing real-world video scenes with computer graphics.

2. Same as class 1, but with an immersive HMD.


3. Half-silvered mirrors or transparent displays are used in HMDs to enable a see-through capability, allowing computer-generated graphics to be optically superimposed onto directly viewed real-world scenes. This is the technology used in Microsoft’s HoloLens 2, where Microsoft calls the class "Holographic devices" [14].

4. Same as class 3 but using cameras to show the real world instead of transparent displays
or mirrors. The displayed virtual environment should be accurately aligned with the
real-world environment.

5. Virtual environments without cameras, which may be fully immersive (HMDs) or partially immersive (large screen displays), and which have computer-generated elements added to them to create a sense of "reality". This class corresponds to current VR headsets used in gaming (e.g., HTC Vive) [15].

6. A completely virtual (graphic) environment that is partially immersive where real ob-
jects in the user environment can be interacted with and influence the virtual, computer-
generated scene.

2.2.1 Tracking
In the technical aspects of AR and VR technology, there are several underlying hardware and
software components that make it all possible. For example, AR and VR devices require spe-
cialized sensors, cameras, and displays to track the user’s position and movements, render digital content, and display it in real time. These components work together to create an immersive experience that combines the digital and physical worlds.

Figure 2.5: An example of smartphones used as an AR device [13].
The algorithms and techniques used in AR and VR systems are also crucial to their suc-
cess. Tracking algorithms, for example, are responsible for accurately tracking the user’s
movements and position in real-time, which is essential for creating a seamless and immer-
sive experience.

Inside-out
Inside-out tracking is a type of tracking technology used in VR and AR devices that uses
sensors on the device to track its movement in the real world and is the technology used in
the Microsoft HoloLens 2. The sensors typically include cameras, accelerometers, gyroscopes,
and sometimes magnetometers [16].
With inside-out tracking, the device itself is responsible for tracking its own movement,
which means there is no need for external sensors or base stations. This can make the setup
process easier and more flexible since the user does not need to set up any external sensors
or be limited by the range of the sensors [16]. Additionally, inside-out tracking can allow for
greater freedom of movement since the user is not tethered to any external sensors or base
stations.
It is possible to combine inside-out tracking with a marker system. This technique uses markers, such as QR codes, to track the position and orientation of items in the real world. The device’s camera detects the markers, which are then used to calculate each marker’s position and orientation [17]. Marker-based tracking is commonly used in AR applications where the connection to the physical world is important. Markers can be used with the HoloLens 2 to superimpose a digital model onto a real-world object [18].

Outside-in
The process of outside-in tracking entails utilizing external sensors or cameras to track the
location and orientation of devices. The cameras used in this process are specially configured
to detect infrared light, and through the use of the light’s position from various cameras, the
system is capable of determining the devices’ position and orientation in three-dimensional
space. The system is known as "active" if the infrared light is generated by emitters attached to the device, and as "passive" if the device is outfitted with reflectors and the emitters are attached to the cameras [19].
This technology is common in motion capture for movies or video game design for track-
ing items or limbs. A disadvantage of outside-in tracking is the need for external cameras to


be set up in the room, which can be inconvenient and may require additional setup time. A
common use case of tracking for motion capture can be seen in Figure 2.6.

Figure 2.6: The process of using active outside-in tracking in motion capture [20].

Lighthouse
Lighthouse tracking is a method of tracking the position and orientation of VR equipment
through the use of emitters known as lighthouses. This tracking technology was introduced
by Valve Corporation, a video game development company, as part of their SteamVR plat-
form [19].
The lighthouses are stationary units that emit a sweeping laser beam. The VR equipment is outfitted with sensors that detect the laser beam as it passes over them. The system then calculates the exact position and orientation of the equipment in real time based on the timing and location of the laser beam sweep [19].

2.3 Smartphone
Smartphones have emerged as viable platforms for experiencing VR and AR content. VR and AR on smartphones can be achieved through specialized applications, as well as through cost-effective headset accessories.

2.3.1 Measure in iOS


One example of an AR application used with a smartphone is the iOS Measure app, which
allows users to use their smartphone’s camera to measure real-world objects and distances.
The app shown in Figure 2.7 uses AR technology to superimpose virtual measuring tools
onto the camera view, allowing users to measure objects in the physical world quickly [21].


Figure 2.7: The iOS Measure app using AR technology to measure real world distances [22].

2.3.2 AR Gaming
Another way a smartphone can be used for AR is through AR mobile games, which use the
smartphone’s camera and display to create interactive AR experiences. These games typically
involve overlaying virtual objects or characters onto the real-world environment, allowing
users to interact with the virtual elements naturally and intuitively [23].

2.3.3 Used as a VR headset


One way in which a smartphone can be used for VR experiences is through the use of low-
cost VR headsets such as Google Cardboard [24]. These headsets use the smartphone’s dis-
play and sensors to create VR experiences, allowing users to immerse themselves in virtual
environments and interact with virtual objects and experiences.
The use of low-cost, smartphone-based VR headsets like the Google Cardboard shown in
Figure 2.8 has the potential to significantly increase the accessibility of VR technology to a
broader audience. Google Cardboard and similar products can be used with a wide range of
smartphones and are relatively inexpensive, making them an attractive and accessible option
for those who want to experience VR but may not have the budget or technical knowledge to
use more advanced VR systems [25].

Figure 2.8: An assembled Google Cardboard unit without a smartphone [24].

2.4 Stereoscopy
Stereoscopy is the technique of creating the illusion of three-dimensional depth in a two-dimensional image or display. It has a long history dating back to the early 19th century and has played a vital role in developing many different types of immersive technologies, including VR and AR [26].

2.4.1 Anaglyphic systems


Color anaglyph 3D technology makes it possible to display 3D images in color, using glasses
with colored filters to separate the left and right images. The colors used are chromatically
opposite (commonly red and blue as shown in Figure 2.9) to get the best effect [27].

Figure 2.9: A color anaglyph 3D image of the planet Mars [28].

2.4.2 Interference system


The interference filter system uses two three-band, narrow-band-pass chromatic filters and wavelength multiplexing to present a distinct spectrum of red, green, and blue to each eye, thereby encoding the image displayed to each eye [29].

2.4.3 Active shutter 3D system/Stereo-image alternator


Active shutter systems use glasses with electronic shutters to synchronize the display of left and right images and create 3D effects. The glasses can be synced using an infrared emitter or a cable connected to the monitor, as shown in Figure 2.10; the glasses then block the right eye when the left image is shown on the monitor and vice versa [27].


Figure 2.10: The SegaScope 3-D Glasses using active shutter technology [30].

2.4.4 Polarization system


The polarization system is a type of stereoscopy technology that uses polarizing filters to separate the left and right images and create 3D effects. It works by encoding the left and right images with opposing polarization; the 3D effect is experienced by wearing polarized glasses, similar to those in Figure 2.11, that separate the left and right images [27].

Figure 2.11: Three pairs of polarized stereo glasses [31].

2.4.5 Autostereoscopy
Autostereoscopy is a technique used in 3D display systems that allows the user to view 3D
images without the need for special glasses or headgear.


Parallax barrier
A parallax barrier (as shown in Figure 2.12) is a technology used in autostereoscopic displays: a thin, opaque barrier with a series of precisely placed slits is placed in front of the display screen, allowing each eye to see only certain parts of the image [32].

Lenticular autostereoscopic displays


Lenticular autostereoscopic displays are a type of display technology that uses a series of thin
lenses or “lenticules“ to create 3D effects without the need for glasses. These displays work
by directing the left and right images toward the viewer’s eyes at different angles as shown
in Figure 2.12, creating the illusion of 3D depth [33].

Figure 2.12: The function of the parallax barrier and lenticular display technology [34].

2.5 Cybersickness
Cybersickness, also known as VR sickness, occurs when a person experiences symptoms sim-
ilar to motion sickness while using a VR device [35]. Common symptoms are nausea, dry mouth, disorientation, and vertigo [36].
Several factors can contribute to the likelihood of experiencing cybersickness while using
a VR device. One important factor is the level of immersion in the virtual environment.
The more realistic, spacious, and immersive the environment, the less likely a person is to
experience cybersickness. Other factors that may contribute to the likelihood of experiencing
cybersickness include the intensity of the motion in the virtual environment, duration of use,
and individual differences in susceptibility [35].

3 Method

The MR system for visualizing advanced heat pump CAD assemblies was developed using Microsoft’s HoloLens 2 device. Third-party CAD conversion and importing tools were used to convert the heat pump CAD assemblies into a format compatible with the Unity game engine. Custom scripts and algorithms were implemented to simplify the interaction with the CAD model within the MR environment.

3.1 Architecture
Before implementing the MR system, the required steps and features were carefully planned
and defined after discussion with engineers in the heat pump industry and from user testing
of the HoloLens’ features. The following were identified as the crucial steps that needed to
be incorporated into the system:

1. Conversion from .STP format to a format that Unity can read and import at runtime.

2. Runtime importing of the converted CAD files into the Unity project, allowing visualization and interaction with the CAD assemblies without recompilation.

3. Optimization of the imported CAD assemblies to improve performance and achieve an acceptable frame rate during use.

4. Addition of part colliders and interaction components to the objects within the imported CAD assemblies, allowing the user to manipulate and explore the assemblies naturally and intuitively.

5. Development of a set of quality-of-life features to enhance the usability and interactability of the system, including an explosion view, part position reset, assembly selector, and assembly relocator.

The mixed-reality system was implemented with a clear understanding of the steps and fea-
tures required to create an efficient, effective, and user-friendly tool for visualizing and work-
ing with heat pump CAD assemblies in a mixed-reality environment.


Figure 3.1: The architecture of the system from CAD-file to visualization and interaction.

3.2 Implementation
Implementation was made with Unity version 2021.3.14f1 and began with the initial setup of a "3D Core" project template. "Microsoft Mixed Reality Feature Tool v.1.0.2206.1 Preview" was used to install the necessary components for the system: mainly "Mixed Reality Toolkit Foundation" version 2.8.2, "Mixed Reality Toolkit Standard Assets" version 2.8.2, and "Mixed Reality OpenXR Plugin" version 1.5.0. The project’s build settings were changed to "Universal Windows Platform" with the HoloLens as the target device and ARM64 architecture. Microsoft’s "Introduction to the Mixed Reality Toolkit" guide was followed for configuration [37]. CAD Exchanger SDK was used for importing and converting the CAD files into Unity at runtime [38].

3.3 Evaluation
Problems encountered during implementation as described in Chapter 3.2 are noted, and
possible solutions to these problems are studied and evaluated. The following categories
describe the potential issues relevant to this thesis.

3.3.1 Hardware Performance


These problems are related to performance limitations while running the software, such as a low or irregular frame rate, long response times, etc.

3.3.2 Hardware Features/Capabilities


Issues in this category are related to the hardware’s features or capabilities that do not exist
or are not advanced enough for the desired use case.

3.3.3 Software Shortcomings


These minor issues might refer to the lack of certain features or capabilities in the software or
limitations in the ways that certain features can be used.


3.3.4 Software Fundamentals


These issues refer to a software system’s inherent limitations or constraints that cannot be
overcome through improvements or updates to the software. These limitations may be due
to the design or architecture of the software, the limits of the hardware it runs on, or the
fundamental laws of physics or computation. Examples of fundamental limitations in soft-
ware may include the maximum amount of memory that can be used or the boundaries of a
particular programming language or framework.

3.3.5 Human
Problems in this category refer to limitations or issues that arise from human factors, such as
user error, lack of training or knowledge, or negative impact on the user experience. These
problems may include limitations in the ability of users to understand or operate the system,
VR-induced nausea or headaches as described in Chapter 2.5, or other adverse physical or
psychological effects caused by the technology.

3.3.6 Detailed Implementation Steps


The following is a detailed installation guide performed on a Windows 10 Professional (x64)
Version 21H2 PC running an AMD Ryzen 5 5600X CPU, 32 GB DDR4 RAM, and NVIDIA RTX
3070 GPU. This guide follows the steps required to start HoloLens development through the
Unity game engine.

1. Download and install the Unity Hub software for Windows.

2. Download and install the Mixed Reality Feature Tool for Windows [39].

3. Start the Unity Hub software and install the Unity Editor version 2021.3.14, Microsoft
Visual Studio Community 2019, and Universal Windows Platform Build Support.

4. When installing Microsoft Visual Studio 2022, select the following workloads [5].

a) ".NET desktop development".
b) "Desktop development with C++".
c) "Universal Windows Platform development".
d) "Game development with Unity".

5. Select (or verify pre-selection of) the following components for "Universal Windows Platform development" [5].

a) "Windows 10 SDK (10.0.19041.0)".
b) "USB Device Connectivity".
c) "C++ (v142) Universal Windows Platform tools".

6. Create a new “3D Core“ project from the Unity Hub.

7. In the Unity Editor

a) Select Menu Bar > File > Build Settings > Platform > Universal Windows Platform.
b) Change Architecture to ARM 64-bit.
c) Click the Switch Platform button.
d) Close the Build Settings window when the switch is done.

8. Start the Mixed Reality Feature Tool for Unity and press "Start".


9. Browse to the Unity project location.


10. Press "Discover Features".
11. On the "Discover Features" page, select the following:
a) Mixed Reality Toolkit (0 of 10) > Mixed Reality Toolkit Foundation version 2.8.2.
b) Platform Support (0 of 5) > Mixed Reality OpenXR Plugin version 1.6.0.
c) Click "Get Features".
12. Click "Validate", followed by "Import", followed by "Approve".
13. Close the window.
14. Open the Unity Editor and wait for the importing to finish.
15. A warning appears asking you to enable "backends". Press "Yes".
16. Go to Menu Bar > Mixed Reality > Toolkit > Utilities > Configure Project for MRTK.
17. Increase the size of the window that opens and select "Unity OpenXR plugin" (recommended).
18. Click "Show XR Plug-in Management Settings".
19. On the Project Settings > XR Plug-in Management page, do the following:
a) Click the "Universal Windows Platform settings" tab.
b) Make sure "Initialize XR on Startup" is ticked.
c) Tick the OpenXR box and the first tick box that appears ("Microsoft HoloLens feature group").
d) Close the Project Settings window.
20. Select "Apply Settings" in the "MRTK Project Configurator" window.
21. In the Project Settings > Project Validation window that opens, go to the "Universal Windows Platform" settings tab and click "Fix All"; if issues remain, press "Fix All" again. Ignore issues marked "Scene specific".
22. Close the Project Settings window.
23. Press "Next" in the "MRTK Project Configurator" window, followed by "Apply", then "Next", then "Import TMP Essentials", and finally press "Done".
24. Restart the Unity Editor and go to Menu Bar > Edit > Project Settings > Player > Universal Windows Platform settings > Publishing Settings > Packaging.
25. Change the "Package name" field to a suitable name that has not been used previously, since it is a unique identifier.
26. Close the Project Settings window.
27. Create a new scene with File > New Scene from the menu bar and select the template "Basic (Built-in)", followed by "Create".
28. Add the necessary Mixed Reality component by going to Menu Bar > Mixed Reality > Toolkit > "Add to Scene and Configure...".
29. Save the changes with File > Save As and save to the Assets > Scenes folder with a name for the scene.

The Unity project is now prepared for the development of HoloLens 2 software.


3.3.7 Build and Deploy for the HoloLens 2


1. From the Unity Editor, select Menu Bar > File > "Build Settings...".

2. In the Build Settings window, press "Add Open Scenes".

3. Click "Build".

4. Navigate to the folder where you want the build to be saved and press "Select Folder". Wait for the build process to finish.

5. Navigate to the folder in the file explorer where the build was created.

6. Open the file with the extension .sln in Visual Studio.

7. Change the configuration from "Debug" to "Release" and the architecture from "ARM" to "ARM64".

8. Change the deployment target to "Remote Machine" if deploying via Wi-Fi or "Device" if deploying through USB.

9. In the menu bar, press Project > Properties > Configuration Properties > Debugging and set "Debugger to launch" to "Remote Machine".

10. Set "Machine Name" to the IP address of the HoloLens 2. The IP address can be found in the developer settings on the HoloLens.

11. Change "Authentication Type" to "Universal (Unencrypted Protocol)".

12. In the main Visual Studio window, select Debug > Start Without Debugging to deploy and start the program on the HoloLens 2 automatically.

13. You will be prompted for a PIN when deploying to the HoloLens for the first time. The PIN can be found on the HoloLens through Settings > Update & Security > For developers > Pair.

14. The software will now deploy to the HoloLens and start automatically when deployment is finished.

4 Results

This chapter presents the results of the mixed reality system for the HoloLens 2 device that
has been developed for visualizing large CAD assemblies of heat pumps.

4.1 Features
This section describes the software system features described in Chapter 3.1 after develop-
ment.

4.1.1 Fundamentals
The system developed for visualizing and interacting with large CAD assemblies of heat
pumps using a HoloLens 2 device includes the critical steps described in Chapter 3.1. One
of which is the conversion of CAD files from the .STP format to a format that Unity can read
and import. The system can import the converted CAD files into the software at runtime if
the source .STP file is saved on the PC’s file system. CAD files can not be imported at run
time when running on the HoloLens 2 device but can when running the software on a PC.
Scripts for the optimization of the imported CAD assemblies have been developed. This
improves performance by reducing the number of duplicated components in the assemblies,
making them more manageable for the device to handle. This feature is activated through the
Unity editor in edit time but can be adapted for use in run time.
Finally, the system automatically adds interaction components to the objects within the
imported CAD assemblies. This includes adding mesh- and bounding box collisions and the
near and far interaction components of Microsoft’s Mixed Reality Toolkit (MRTK) to each
assembly part. This allows the user to manipulate and explore the assemblies naturally and
intuitively with both hands, from a near and far distance.
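As an illustration of this automatic setup step, the following Unity C# sketch walks an imported assembly and makes every mesh-carrying part grabbable. It is a minimal sketch rather than the exact thesis code: the class and method names are hypothetical, and it assumes MRTK 2.8, whose NearInteractionGrabbable and ObjectManipulator components provide the near (direct grab) and far (hand ray) interactions mentioned above. Box colliders could be substituted for simpler parts.

    using Microsoft.MixedReality.Toolkit.Input;
    using Microsoft.MixedReality.Toolkit.UI;
    using UnityEngine;

    // Hypothetical helper: adds colliders and MRTK interaction
    // components to every mesh-carrying part of an imported assembly.
    public static class InteractionSetup
    {
        public static void AddInteraction(GameObject assemblyRoot)
        {
            foreach (var filter in assemblyRoot.GetComponentsInChildren<MeshFilter>())
            {
                var part = filter.gameObject;

                // Convex mesh collider so the part can be hit-tested and grabbed.
                var collider = part.AddComponent<MeshCollider>();
                collider.convex = true;

                // Near interaction (articulated-hand grabbing) and far
                // interaction (hand rays); ObjectManipulator supports one-
                // and two-handed manipulation by default.
                part.AddComponent<NearInteractionGrabbable>();
                part.AddComponent<ObjectManipulator>();
            }
        }
    }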

4.1.2 Quality-of-life
Below are the quality-of-life features described in Chapter 3.1 implemented in the software
as Unity scripts and components.


Explosion view

Figure 4.1: A closeup view of the internals of an exploded heat pump.

The explosion view feature shown in Figure 4.1 is a tool that allows a better understanding of a CAD assembly by breaking it down into its components. This feature can be activated by saying the voice command "explode", which toggles a slider that the user can control by pinching with the thumb and index finger and moving the slider cursor to the right on the slider bar.

Figure 4.2: Explosion slider interaction.


When the slider is moved to the right as in Figure 4.2, the assembly will "explode" by moving all the components away from the middle of the bottom of the assembly, with a speed proportional to how far the slider is slid to the right. This allows the user to decide the rate at which the assembly explodes, giving more control over the feature. Once the user releases the slider, the pointer automatically moves back to the left, and the explosion animation stops. The explosion height is limited through the Unity editor to an approximate ceiling height. Components only move on the X and Z axes (width and depth) once any part of the assembly has reached the height limit.
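A minimal Unity C# sketch of how this explosion behaviour could be scripted, assuming the pinch slider exposes a normalized value in [0, 1]; names are illustrative, and the ceiling limit is applied per part here rather than for the assembly as a whole:

    using UnityEngine;

    // Illustrative explosion driver: each frame, push every mesh part away
    // from the bottom-centre of the assembly at a rate set by the slider.
    public class ExplodeAssembly : MonoBehaviour
    {
        public Transform assemblyRoot;
        public float maxSpeed = 0.5f;      // metres per second at full deflection
        public float ceilingHeight = 2.4f; // approximate ceiling height in metres
        public float sliderValue;          // driven by the pinch slider, 0..1

        void Update()
        {
            if (sliderValue <= 0f) return;

            // Explosion origin: the middle of the bottom of the assembly.
            Bounds bounds = ComputeBounds();
            Vector3 origin = new Vector3(bounds.center.x, bounds.min.y, bounds.center.z);

            foreach (var filter in assemblyRoot.GetComponentsInChildren<MeshFilter>())
            {
                Transform part = filter.transform;
                Vector3 direction = (part.position - origin).normalized;

                // Above the height limit, keep spreading sideways only (X/Z).
                if (part.position.y >= ceilingHeight)
                    direction.y = 0f;

                part.position += direction * (maxSpeed * sliderValue * Time.deltaTime);
            }
        }

        Bounds ComputeBounds()
        {
            var renderers = assemblyRoot.GetComponentsInChildren<Renderer>();
            Bounds b = renderers[0].bounds;
            foreach (var r in renderers) b.Encapsulate(r.bounds);
            return b;
        }
    }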

Part reset

Figure 4.3: A visualization of the part reset function.

The part position reset feature shown in Figure 4.3 is a tool that moves all assembly components back to their original place within the assembly when the feature is activated through the voice command "reset". This feature helps reset the location of parts that have been moved or adjusted through manual component manipulation or the explosion view feature.
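A sketch of the reset logic in Unity C#, assuming original poses are captured once at startup; the class name is illustrative, and in the actual system the method would be wired to the "reset" voice command (for example through an MRTK speech input handler):

    using System.Collections.Generic;
    using UnityEngine;

    // Illustrative reset: remember every part's starting pose and restore
    // it when ResetParts() is triggered by the "reset" voice command.
    public class PartReset : MonoBehaviour
    {
        private readonly Dictionary<Transform, Pose> initialPoses =
            new Dictionary<Transform, Pose>();

        void Start()
        {
            foreach (Transform part in GetComponentsInChildren<Transform>())
                initialPoses[part] = new Pose(part.localPosition, part.localRotation);
        }

        public void ResetParts()
        {
            foreach (var entry in initialPoses)
            {
                entry.Key.localPosition = entry.Value.position;
                entry.Key.localRotation = entry.Value.rotation;
            }
        }
    }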

Assembly selector
The assembly selector tool shown in Figure 4.4 enables fast switching between assemblies by activating the voice command "slider", which displays a user-controlled slider similar to the explosion view described in this chapter. The user selects from loaded assemblies by moving
the slider cursor to any tick on the slider bar. The number of ticks adapts to the number of
loaded assemblies. The name of the currently selected assembly is shown under the slider
and is automatically assigned from the assembly name. This selector eliminates the need
for a separate menu or assembly list, making it useful for tasks such as comparing assembly
versions or switching between assemblies for different tasks.

Figure 4.4: The function of the assembly selector slider.

Assembly relocator
The assembly relocator feature shown in Figure 4.5 allows the user to move the entire assembly to a different location within the mixed reality environment by using a cube that can be toggled with the voice command "move". Activating the feature enables the user to grab and move the cube; this relocates the entire assembly. This feature allows the user to reposition the assembly to avoid real-world obstructions. The cube can be rotated only around the Y-axis, so the imported CAD model always keeps a vertical orientation; a sketch of this constraint is shown below.

Figure 4.5: Function of the assembly relocator.
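The vertical-orientation constraint could be realized with a few lines of Unity C#; this illustrative sketch strips any pitch and roll after each manipulation update (MRTK also ships constraint components that can serve the same purpose):

    using UnityEngine;

    // Illustrative constraint: keep only the rotation around the vertical
    // (Y) axis so the relocated CAD model always stays upright.
    public class KeepUpright : MonoBehaviour
    {
        void LateUpdate()
        {
            Vector3 euler = transform.rotation.eulerAngles;
            transform.rotation = Quaternion.Euler(0f, euler.y, 0f);
        }
    }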

New user detection


A script was implemented to allow the software to be quickly reset for new, inexperienced users at student fairs. The script detects when the HoloLens 2 is upside down (by an editable percentage) and then triggers a reset process. First, the location reset script is activated, and all components are reset to their original position. Then the movement cube, selector slider, and explode slider are disabled and hidden from view.
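A sketch of the upside-down check in Unity C#: the main camera follows the HoloLens pose, so the dot product between the camera's up vector and world down approaches 1 when the headset is flipped. Class and field names are illustrative, and the threshold corresponds to the editable percentage mentioned above:

    using UnityEngine;

    // Illustrative new-user detector: trigger a full reset when the
    // headset is held (nearly) upside down.
    public class NewUserDetector : MonoBehaviour
    {
        [Range(0f, 1f)]
        public float threshold = 0.8f;   // 1.0 = exactly upside down

        public PartReset partReset;      // hypothetical reset script (see above)
        public GameObject[] uiToHide;    // movement cube, selector and explode sliders

        void Update()
        {
            float flipped = Vector3.Dot(Camera.main.transform.up, Vector3.down);
            if (flipped >= threshold)
            {
                partReset.ResetParts();
                foreach (var go in uiToHide)
                    go.SetActive(false);
            }
        }
    }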

5 Discussion

The results presented in Chapter 4 show the possibility of creating a system for rapid prototyping in the product design process. The method, issues during development, and results are discussed here. Future work and potential societal impacts are also discussed.

5.1 Results
While runtime importing of CAD files was implemented, the feature only works at runtime on the development PC running the Unity editor. The reason for this is the lack of an accessible file system on the Windows Holographic OS running on the HoloLens 2 [40]. While alternative solutions, such as downloading a file over the air through the web, were considered, I ultimately decided not to implement them due to time constraints. The compiled software includes known-good assets, where duplicated components were removed manually, for demonstration and testing purposes. The features developed were required to be scalable and usable with assemblies imported at runtime.

5.2 Method
One of the risks associated with the implementation steps described in Chapter 3.2 is their potential to become outdated quickly, which can require continual updates to implementation guides to remain relevant and functional. Software changes are particularly relevant for Unity, which is updated regularly. Updates to the CAD Exchanger SDK, which is used for importing and converting CAD files, may also impact its compatibility with the implementation.

5.3 Issues, problems, and notes during development


Issues covered in Chapter 3.3 are described and discussed in this section.

5.3.1 Input CAD-files (.stp)


When developing an automatic pipeline for visualizing CAD files in MR, the source files play a crucial role in the system’s success. For the pipeline to be valuable, the source files must be accurate and up to date.


Reliable data
Ensuring that the latest version of a source file is used is essential for both the accuracy and
the relevance of the assembly visualization. It ensures that the final visualizations are accu-
rate, informative, and up-to-date. This is especially important in cases where visualization
is used for identifying errors, taking measurements, or making design decisions. To avoid
errors, it is recommended to develop a robust system to check if a source file is of the latest
version before importing it into the software. This could be done through version tagging
and comparison with a central system.
CAD assemblies are composed of multiple parts and subassemblies, and the correct con-
figuration of these parts and subassemblies is crucial for accurate, reliable visualization.

Accurate Data
The design of accurate CAD assemblies of a product requires a collaborative effort between
CAD engineers and sub-component suppliers. Both parties play an important role in design
and development. Their cooperation is essential for ensuring that the final product meets the
desired specifications and avoids costly errors, redesigns, and delays in the product develop-
ment process.
The CAD assemblies used for this project are relatively detailed and contain over one
thousand individual components. Access to CAD files from suppliers is not guaranteed,
but they are used when available. For example, CAD files for cable harnesses are currently
unavailable in the assemblies.
In one case, a supplier-provided CAD file for a printed circuit board (PCB) for the heat pump’s network interface had each PCB-mounted component as an individual CAD part. This resulted in about one thousand components no larger than 1 mm² on a PCB smaller than a human hand. These components were far too small for the user to interact with and entirely unnecessary for the system.
CAD engineers are responsible for creating, maintaining, and updating CAD files and
data; they need to ensure that the files are correctly created and maintained and that the data
is accurate and up-to-date. Issues were found where exported CAD assemblies had compo-
nents piercing through other components or simply floating in space away from anything
else. Some of those issues can be attributed to the troubles of exporting assemblies with a
specific configuration, as explained later in this section on duplicated components.

Metadata
Metadata in 3D CAD refers to information that describes the characteristics, properties, and
attributes of a 3D model. This information can include details such as the model’s creator,
date of creation, version number, file format, and any other relevant information that can help
to identify, manage, and track the 3D model. Metadata can also include information about
the model’s geometry, topology, materials, and other information pertinent to the design or
manufacturing process. This information can help automate specific tasks, such as generating
bills of materials, and can also be used for data management and analysis.
One of the most common risks when exporting or converting CAD files is that the meta-
data may not be adequately translated. This can happen if the export or conversion process
is not set up correctly or if the target file format does not support the metadata in the original
file. In such cases, meaningful information may be lost, making it difficult to understand the
file’s context.
Part metadata regarding material and part description was lost when exporting the CAD
assemblies from Siemens NX. Luckily, the lost metadata was not essential for the visualization
of the CAD assembly.
The part hierarchy was one crucial piece of information that remained after exporting the
assembly. This hierarchy was helpful for the removal of duplicated components since each


assembly, subassembly, and part was named with a unique part number. An allowlist could
be implemented based on the part numbers.
The color of the components remained after exporting and conversion to Unity. This meta-
data was helpful in the visualization since it gave a sense of what material a component was
made of (metal, plastic, foam, etc.).

5.3.2 Conversion of CAD files


Converting CAD files from one file format to another can present several issues that need to
be addressed. One of the main issues is that different file formats may have different levels
of support for certain features, such as materials, lighting, or textures, which can result in the
loss of important information.

External Libraries
To address conversion issues, software tools such as Pixyz Studio, Pixyz Plugin, and CAD-
Exchanger SDK can perform the conversion. These tools are specifically designed to convert
CAD files from one file format to another and can help to ensure that the files are properly
converted, and that important information is not lost.

Pixyz Plugin Pixyz Plugin is a software plugin for Unity that automates the CAD file importing and optimization process. It does not convert the CAD file to a format available outside Unity (only to a proprietary file format), and it is easy to use and powerful in its features [41]. This tool worked well when implemented in Unity edit mode. Still, a license limitation disallows the use of Pixyz Plugin at runtime in Unity, making it impossible to use in the final software, where runtime importing is essential [42].

Pixyz Studio and Pixyz Scenario Processor Pixyz Studio and Pixyz Scenario Processor allow converting CAD files to many of the most popular file types for 3D CAD models. Pixyz Studio is a standalone software that allows for the conversion and optimization of complex CAD models through a GUI or Python API [43]. Pixyz Scenario Processor uses optimization scenarios created in Pixyz Studio and allows for automatic execution of the scenarios through a cloud service [44]. The high software cost made it impossible to use for this thesis’s purpose.
Conversion to the glTF format would be needed to import CAD files into Unity at runtime.
This would work in tandem with the glTFast plugin for Unity and allow runtime import [45].
This workflow is recommended by Pixyz for importing at runtime [46].

CAD Exchanger SDK CAD Exchanger SDK provides a complete solution for importing CAD files at runtime [38]. This tool was used in the final software and was integrated by adapting the ready-to-use C# Unity scripts it provides, which worked well. I did not explore the mesh modification features of the tool, since the performance was acceptable.

5.3.3 Assembly Refinement


The imported CAD file might not always be perfect, and some post-processing may be re-
quired.

Duplicated Components
During development, an issue was encountered when exporting CAD assemblies from Siemens NX and Teamcenter. The different assembly configurations were defined through reference sets in Siemens NX, and when exporting to the .STP format, all parts from all reference sets were included. This resulted in duplicated components, which caused issues when trying to interact with the assembly, mainly a reduction of intuitiveness when moving parts. I counted up to nine copies of the same component in the same location.

Allowlist filter The initial approach to addressing this issue was an allowlist filter, where a Unity script read a list of allowed components and deleted components not on the list based on their component IDs. However, this method proved ineffective, as the exported bill of materials for a particular product did not accurately match the CAD model. As a result, this approach was abandoned in favor of other alternatives.
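
As an illustration, a minimal sketch of this abandoned approach is shown below. The class names and the part-number naming convention are assumptions for the example, not the thesis code:

using System.Collections.Generic;
using UnityEngine;

// A minimal sketch of the abandoned allowlist approach. Assumes each
// imported component is named "<partNumber>_<description>".
public class AllowlistFilter : MonoBehaviour
{
    // Part numbers read from the exported bill of materials.
    public List<string> allowedPartNumbers = new List<string>();

    public void Apply(Transform assemblyRoot)
    {
        var allowed = new HashSet<string>(allowedPartNumbers);

        // GetComponentsInChildren returns a copy, so destroying objects
        // while iterating is safe here.
        foreach (Transform part in assemblyRoot.GetComponentsInChildren<Transform>(true))
        {
            if (part == assemblyRoot) continue;

            string partNumber = part.name.Split('_')[0];
            if (!allowed.Contains(partNumber))
                Destroy(part.gameObject);
        }
    }
}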

Mesh-based filtering The second alternative was a filter based on the mesh triangle count of each component. This method worked well for removing large duplicated components, whose triangle counts tend to be unique. However, it did not work well for smaller components, like screws, that all had the same triangle count, resulting in just one arbitrary screw component surviving the filter.

Location-based filtering The third alternative was a filter based on the components' precise world positions. This method worked well for removing duplicates of the same part: two components were counted as duplicates if the absolute world positions of their centers matched exactly. One possible improvement could be to check for other components within a box around the center of each component; this could also find components of different shapes in approximately the same location.
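
A minimal sketch of this location-based filter in Unity is shown below; it is a simplified reconstruction of the approach, not the exact thesis code:

using System.Collections.Generic;
using UnityEngine;

// A sketch of the location-based duplicate filter: components whose
// world-space bounds centers match exactly are treated as duplicates.
public static class DuplicateFilter
{
    public static void RemoveDuplicates(Transform assemblyRoot)
    {
        var seenCenters = new HashSet<Vector3>();

        foreach (var renderer in assemblyRoot.GetComponentsInChildren<Renderer>())
        {
            // Center of the renderer's bounds in absolute world coordinates.
            Vector3 center = renderer.bounds.center;

            // Add returns false if the exact center was already seen.
            // A tolerance box around the center could catch near-duplicates
            // of differently shaped components, as suggested above.
            if (!seenCenters.Add(center))
                Object.Destroy(renderer.gameObject);
        }
    }
}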

5.3.4 Mesh modification


A triangle mesh is the foundation for rendering a part and simulating interactions in a 3D environment. It is essential that the mesh is optimized for real-time performance and meets the requirements for the desired level of detail and accuracy in the mixed-reality experience.

Lower Quality and Triangle Count


To improve the performance of the imported CAD assembly, I attempted to reduce the number of triangles in the model using Pixyz Plugin's decimation feature. Decimation reduces the triangle count by removing triangles that are not necessary for accurately representing the model's shape and form. Different triangle count targets were used, from 100k to 250k.
The method was applied to the imported CAD assembly, and a notable performance improvement was observed. However, this improvement came at the cost of a considerable reduction in model quality. The simplification of the geometry caused a decrease in detail and precision, rendering the model less appropriate for CAD engineers assessing design modifications. In the end, this method was determined not to fit the objective of the thesis, as the emphasis should be on maintaining high-quality models and efficient runtime importing.

Levels of Detail
Levels of Detail (LODs) change the quality of the CAD model based on the distance to the model or how much of the screen area the model occupies. This method would utilize Unity's LOD system, which switches between different levels of detail for a given object. By using LODs, it is possible to reduce the number of triangles in the model at a distance, which helps improve the overall system's performance. This was not attempted for this thesis, since the performance was good enough for the system to be usable and high model quality was of importance.
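
Had it been attempted, configuring Unity's LOD system for an imported part could have looked like the following sketch. The lower-detail renderers are assumed to already exist, for example produced by a separate decimation step, which is not shown:

using UnityEngine;

// A sketch of configuring Unity's LODGroup for an imported part. The
// lower-detail renderers (lod1, lod2) are assumed to come from a
// separate decimation step; this was not implemented in the thesis.
public static class LodSetup
{
    public static void AddLods(GameObject part, Renderer lod0, Renderer lod1, Renderer lod2)
    {
        var lodGroup = part.AddComponent<LODGroup>();

        // Each LOD lists its renderers and the screen-height fraction
        // below which Unity switches to the next (lower) level of detail.
        var lods = new LOD[]
        {
            new LOD(0.60f, new Renderer[] { lod0 }), // full quality up close
            new LOD(0.30f, new Renderer[] { lod1 }), // reduced triangle count
            new LOD(0.10f, new Renderer[] { lod2 }), // coarse mesh at a distance
        };

        lodGroup.SetLODs(lods);
        lodGroup.RecalculateBounds();
    }
}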


5.3.5 Programming
Programming for the HoloLens 2 is done primarily with Unity, a popular and powerful game
development engine that allows developers to create interactive 3D applications and experi-
ences.

Mixed Reality Toolkit


Microsoft’s MRTK is a collection of scripts, prefabs, and other resources designed to help
developers create MR applications for the HoloLens 2 and other Windows MR devices. It
aims to simplify the development process by providing common input and interaction sys-
tems, including gesture recognition, speech input, and hand tracking. MRTK’s scripts and
components have been straightforward, powerful, and adaptable for this project.

Unity Preview
One advantage of using Unity for programming the HoloLens 2 is the Unity Preview feature.
This feature allows developers to quickly test their latest changes, either locally or on the
HoloLens 2, which makes the development process more efficient. The Unity Interface is
easy to use, intuitive, and powerful, which makes it an excellent tool for developers of all
skill levels.

Unity Interface
Unity’s editor interface is shown in Figure 5.1 and is divided into several panels and win-
dows, each with its specific purpose.

• The Scene window is where you can view and edit the objects in your scene. It allows
you to move, rotate, and scale objects and adjust lighting and camera settings.

• The Hierarchy window shows the hierarchical structure of the objects in your scene,
making it easy to organize and manage your objects.

• The Inspector window allows you to view and edit the properties of the selected object,
such as its position, rotation, and scale. It also allows you to add and remove compo-
nents, such as scripts, colliders, and materials.

• The Project window is where you can access and manage all of the assets in your project,
such as models, textures, and audio files.

Colliders
It is necessary to add colliders to each part imported into the Unity project to allow the use of the interaction components. The aim was to have colliders as accurate to the actual part mesh as possible while maintaining good performance. Mesh colliders were determined to be the best choice for good intuitiveness.

Convex Mesh Collider Limitations The near and far interaction components provided by the MRTK require the components to have one of a subset of Unity's available colliders: Box, Sphere, Capsule, or Mesh. When using a mesh collider, it must be a convex mesh collider, and Unity limits convex mesh colliders to 255 triangles. This limited the possibility of using this collider, since most components have a mesh of more than 255 triangles. The solution was to check the number of triangles in each mesh in the imported assembly and use a convex mesh collider if the mesh has fewer than 256 triangles and a box collider otherwise. A comparison of the two collider alternatives is shown in Figure 5.2.
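
A minimal sketch of this selection logic is shown below; it is a simplified reconstruction of the approach, with illustrative names, not the exact thesis code:

using UnityEngine;

// A sketch of the collider-selection rule: a convex mesh collider where
// the mesh fits within Unity's 255-triangle limit, a box collider otherwise.
public static class ColliderSetup
{
    private const int ConvexTriangleLimit = 255;

    public static void AddColliders(Transform assemblyRoot)
    {
        foreach (var meshFilter in assemblyRoot.GetComponentsInChildren<MeshFilter>())
        {
            if (meshFilter.sharedMesh == null) continue;

            // The triangles array holds three vertex indices per triangle.
            int triangleCount = meshFilter.sharedMesh.triangles.Length / 3;

            if (triangleCount <= ConvexTriangleLimit)
            {
                var meshCollider = meshFilter.gameObject.AddComponent<MeshCollider>();
                meshCollider.sharedMesh = meshFilter.sharedMesh;
                meshCollider.convex = true; // required by the MRTK interaction components
            }
            else
            {
                // A BoxCollider fits itself to the mesh bounds when added.
                meshFilter.gameObject.AddComponent<BoxCollider>();
            }
        }
    }
}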


Figure 5.1: A screenshot of the Unity editor window.

Figure 5.2: A comparison of a convex mesh collider and a box collider on a sphere in the
Unity editor.


Pixyz Solutions Pixyz Plugin would have allowed for more complex collider meshes through mesh decimation and decomposition. This solution was not possible because of the license limitation described in Section 5.3.2.

5.4 The human experience


This section discusses the human aspects of using the software and hardware.

5.4.1 Hand Sensing


One of the essential features of the HoloLens 2 is its hand-sensing capability, which allows
users to interact with virtual objects and holograms naturally and intuitively. The device uses
a combination of cameras and sensors to track the movement and position of the user’s hands,
making it possible to perform gestures such as pinching, tapping, and swiping to control
virtual objects. The pinching function is shown in Figure 5.4. The hand detection performance
is good, as the system accurately tracks the fingers and enables the use of sophisticated hand
gestures.

Figure 5.3: An example of the pinching maneuver for interaction simulated in the Unity
editor.

5.4.2 Speech
Microsoft’s HoloLens 2 includes built-in microphones and advanced speech recognition capabilities, which allow users to speak commands and control virtual objects using their voice. This makes it easy to activate features without interrupting the interaction with the CAD models.
Any English word can be defined as a voice command for the HoloLens 2, and connecting a recognized command to a feature activation is easy in the Unity editor. The device is good at only recognizing a voice command when the actual user is talking; during my testing, it was never triggered by a bystander talking.
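
The thesis used MRTK's speech input components for this. As an illustration of the underlying idea, a minimal keyword recognizer using Unity's built-in Windows speech API might look as follows; the keywords and the logged action are illustrative:

using UnityEngine;
using UnityEngine.Windows.Speech;

// A minimal keyword-recognition sketch using Unity's built-in Windows
// speech API. MRTK's speech components wrap similar functionality; the
// keywords and action here are illustrative.
public class VoiceCommands : MonoBehaviour
{
    private KeywordRecognizer recognizer;

    private void Start()
    {
        recognizer = new KeywordRecognizer(new[] { "explode", "reset" });
        recognizer.OnPhraseRecognized += OnPhraseRecognized;
        recognizer.Start();
    }

    private void OnPhraseRecognized(PhraseRecognizedEventArgs args)
    {
        Debug.Log("Voice command recognized: " + args.text);
        // Activate the corresponding feature here, e.g. the explode view.
    }

    private void OnDestroy()
    {
        if (recognizer != null)
        {
            recognizer.Dispose();
        }
    }
}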

5.4.3 Part Interactions


The HoloLens 2 also includes near and far interaction components, which allow users to interact with virtual objects and holograms at different distances from the device. The result of using box colliders, as described in Section 5.3.5, was that a user could sometimes grab a component when pinching the air, or grab a different component than anticipated. The advantage is that the entire component is inside the box collider, so the user always grabs a component when pinching on something visible.

Figure 5.4: A screenshot from a video recorded with the HoloLens 2 showing the use of the pinching feature on the explode slider.

Far Interaction
The far interaction component allows users to interact with virtual objects farther away. It works through a virtual laser pointer projected in the HoloLens 2 environment from the user's hands, as shown in Figure 5.5. Interaction with an object occurs when the laser pointer touches the object and the user pinches their fingers simultaneously.

Near Interaction
The near interaction component allows users to interact with virtual objects close to the device, such as when working with a virtual keyboard or manipulating objects using hand gestures and pinching, as shown in Figure 5.5. I found near interaction with pinching the most intuitive for new users, since it feels like grabbing an actual component.


Figure 5.5: An example of using the near and far interaction features simulated in the Unity
editor.

Pinching Intuitiveness
One of the main ways users interact with the device is through pinching gestures, which are used to select and manipulate virtual objects. However, I found that many new users had difficulty performing the pinch gesture correctly, particularly when asked to “pinch with your thumb and index finger, and make a large pinch that is visible to the device.”
One of the main challenges in performing the pinch gesture is that it requires precise hand-eye coordination and a good understanding of the device's tracking capabilities. To perform the gesture correctly, users must spread their thumb and index finger further than what is intuitive and then pinch in a way that is visible to the device. This can be difficult for users with limited experience with the device or with limited fine motor skills.

5.4.4 Other Features


The software featured several other functions, which are discussed below.

Movement Cube
One limitation of the movement cube described in Section 4.1.2 is that it does not know where the ground level is. It is up to the user to place the assembly at the desired height, rather than the software automatically aligning it onto the ground. This can make it more difficult for the user to position the assembly correctly on the floor, as they need to pay close attention to the placement of the cube. An early attempt to always place the assembly on the highest physical object beneath it resulted in the assembly jumping onto people's heads as they walked past the user. It was then determined that the software should not take the real-world environment into account. Nevertheless, the movement cube is a helpful tool for quickly and easily moving the assembly in the environment, and it can help with ergonomics, since the user can position the model at any height and interact with it without crouching.


Slider interactions
The sliders, as described in Section 4.1.2, were implemented to scale to any number of imported CAD files. Initial versions positioned the slider at an offset from the Unity camera (the HoloLens 2 device), with some smoothing applied. New users initially found it difficult to accurately pinch the slider cursor, especially people with short arms, since the slider was sometimes too far away to reach. Small head movements also made the slider drift slightly, which made it difficult to grab the cursor. Later versions used a different interaction script that froze the slider's position when a hand was near, which made it much more intuitive to use.
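
The improved behavior can be sketched as follows. The hand-proximity check is a placeholder for the MRTK hand-tracking query actually used, and the offsets are illustrative:

using UnityEngine;

// A sketch of the improved slider placement: follow the camera with
// smoothing, but freeze in place while a hand is nearby. IsHandNear()
// is a placeholder for the MRTK hand-tracking query used in practice.
public class FollowUnlessHandNear : MonoBehaviour
{
    public Vector3 cameraOffset = new Vector3(0f, -0.2f, 0.6f);
    public float smoothing = 2f;

    private void Update()
    {
        if (IsHandNear())
            return; // freeze: keep the slider still so it is easy to grab

        // Target position: a fixed offset in front of the HoloLens camera.
        Transform cam = Camera.main.transform;
        Vector3 target = cam.position + cam.rotation * cameraOffset;

        transform.position = Vector3.Lerp(transform.position, target,
                                          smoothing * Time.deltaTime);
    }

    private bool IsHandNear()
    {
        // Placeholder: the real software would query MRTK's hand-tracking
        // service for the distance from the slider to the nearest hand.
        return false;
    }
}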

5.4.5 Visualize In The HoloLens 2


Microsoft’s HoloLens 2 device was utilized in this thesis. It was used internally by stakeholders and colleagues during the software's development, and at a student fair, where students used it to experience AR for the first time. This provided a thorough understanding of the strengths and weaknesses of the device, as many students gave their initial impressions of and feedback on the software and device.

Ease Of Use
The hardware and software must be easy to use and comfortable, especially if used for pro-
longed periods or often during work.

Ergonomics Ergonomics were generally found to be good. The device is easily adaptable to different head sizes through a wheel on the back, and it is designed with glasses-wearers in mind and does not interfere with the user's glasses [40]. There was no noticeable concern about the device sliding or falling off, and most users were confident enough to use both hands for interaction.

Nausea Each user only used the HoloLens 2 for a short duration, and almost no one com-
plained about nausea.

Battery Life The software was running almost continuously during the eight-hour student fair and was connected to a 65W USB Type-C charger when not in use. The battery percentage stayed around 50% for most of the day. Battery life was around one hour when fully charged and running the software, so easy access to a charger is recommended. Microsoft claims 2-3 hours of active usage and specifies a minimum of 15W charge power to maintain the battery level [40].

Performance
The balance of performance to power draw and weight is a continuous struggle with tether-
less, battery-powered devices.

FPS Some users perceived the framerate of the device as low. The software sometimes ran below 30 FPS, and I expected many people to notice and comment on that. I found that the way the HoloLens visualizes the holograms makes it entirely usable, even below the recommended 60 FPS; it is primarily when manipulating the components that judder occurs. This experience reinforced my impression that image quality is more important than high and stable framerates for this type of application.


Responsiveness The HoloLens 2 projects a virtual interpretation of your hands and does an excellent job following your movements. No users complained about a lack of responsiveness when manipulating components. Minor tracking lag is balanced by a smoothing effect that reduces discomfort when movements are not tracked quickly enough.

Alternatives
There are software products designed explicitly for visualizing CAD files in VR, for example Pixyz Review and Siemens NX Virtual Reality [47, 48]. The HoloLens 2 differs from other HMDs in that it is a standalone computer running its own OS. This makes it difficult for software manufacturers to implement support for the HoloLens 2, and both Pixyz Review and Siemens NX Virtual Reality lack support for it.

View
FOV Some users complained about the small FOV of the HoloLens 2. The FOV is improved over the first-generation HoloLens and will hopefully continue to improve in future generations.

Resolution The resolution is good, and I heard no complaints from users. The readability of text on components is quite similar to real life, where you need to move closer to smaller text to read it.

Colors The CAD models used for this thesis were not that colorful, and no analysis of color
accuracy can be made other than that colors appear bright and clear.

Anti-Aliasing Some users complained about aliasing. The edges of components appear jagged, as shown in Figure 5.6, which reduces immersion for the user. Anti-aliasing technologies could reduce the aliasing, but at a performance cost. This was not researched or tested for this thesis.

Presentation Features The HoloLens 2 provides a suite of functions for sharing content with
others. It is a business-focused product with good integration into the Microsoft/Windows
ecosystem.

PC-connection
The HoloLens 2 device can be connected to a PC through a USB Type-C cable or Wi-Fi. It is
possible to charge and use the device at the same time.

Real-time View It is possible to see a real-time view of what a user sees when using the
HoloLens 2 device. The device blends the holographic view with the view of the device’s
cameras and sends it over Wi-Fi to the Microsoft HoloLens companion app or through Mira-
cast. The latency of Miracast is much lower than with the companion app, and the connection
was found to be more stable. It is also possible to cast through other methods, which were not used for this thesis [49].

On Device Recording
The HoloLens 2 supports on-device recording at 1080p and 30 FPS. These recordings can then be sent to another device or uploaded online. The recordings are of better quality than real-time casting, but the impact on performance and framerate is substantial.


Figure 5.6: An example of the function of anti-aliasing in the Unity editor.

5.5 The work in a wider context


An MR system allowing CAD engineers to rapidly interact with their design changes can
have several positive societal impacts. Some of the most notable benefits include:

1. Improved Product Design: By allowing CAD engineers to iterate on their designs rapidly, the system can lead to better and more functional products. This can result in improved quality of life for consumers and reduced product recalls due to design flaws.

2. Increased Production Efficiency: The ability to rapidly make changes to designs can also
lead to increased production efficiency, as engineers can quickly identify and resolve
any issues that arise during the design process. This can result in faster time-to-market
for products and increased competitiveness for companies.

3. Cost Savings: Improved production efficiency can also result in cost savings, as compa-
nies can produce products faster and with fewer mistakes, reducing the need for rework
and retooling.

4. Better Collaboration: The MR system can also facilitate better collaboration between
CAD engineers and other stakeholders, as engineers can easily demonstrate their de-
signs to others and get feedback. This can lead to more informed design decisions and
a better overall design process.

5. Improved Accessibility: By allowing engineers to interact with their designs in a more immersive, intuitive, and natural way, the system can also improve accessibility for those with disabilities or other limitations that might make it difficult for them to use traditional CAD tools.

6. Increased Awareness and Interest: MR can be a valuable educational tool, allowing peo-
ple to learn more about heat pumps and efficient residential heating systems. By expe-
riencing the designs in an MR environment, users can gain a deeper understanding of
the technology and its potential benefits, potentially increasing interest and investment
in heat pumps and other efficient heating systems.

6 Conclusion

In this chapter, we will summarize the key findings and contributions of this thesis, as well
as discuss future research directions and potential applications of the developed system.

6.1 Summarization
This thesis project aimed to investigate the use of MR technology in the CAD engineering
process for heat pump design where large and advanced CAD assemblies are used.

6.1.1 How can a system for visualizing CAD assemblies in mixed reality be
realized so that CAD engineers can use it for rapid prototyping?
An MR system for visualizing CAD assemblies can be realized using an HMD, such as the Mi-
crosoft HoloLens 2, which has high-quality, semi-transparent displays and inside-out track-
ing capabilities and can run custom software. Third-party CAD conversion and importing
tools can convert the CAD assemblies into a format compatible with a graphics engine at
runtime. The implementation should include conversion of CAD files, runtime importing
of converted files, optimization of imported assemblies, and addition of interaction compo-
nents and quality-of-life features to create an efficient, effective, and easy-to-use visualization tool for working with CAD assemblies in an MR environment.

6.1.2 What software limitations and solutions are there for realizing such a
system?
Software limitations must be addressed to realize a system for visualizing advanced heat
pump CAD assemblies in MR. The MRTK’s interaction components require that the CAD
components have either a convex mesh collider or a primitive collider for near or far interac-
tion. However, Unity has a constraint of 255 triangles for convex mesh colliders, which can
hinder the ability to use this collider on all components, especially when many components
have a mesh with more than 255 triangles. Software tools such as Pixyz Studio, Pixyz Plugin, and CAD Exchanger SDK are available to address issues with converting CAD files from one file format to another. Pixyz Plugin is a software plugin for Unity that features advanced optimization tools but has a license limitation that prevents its use at runtime in Unity. Pixyz Studio and Pixyz Scenario Processor allow converting CAD files to many popular file types, but their cost may make them difficult to justify. CAD Exchanger SDK is a complete solution for importing CAD files at runtime, providing ready-to-use C# scripts for Unity.

6.1.3 Societal benefits


The development of an MR system that enables CAD engineers to quickly and easily visualize
their designs and evaluate changes has the potential to bring about several significant benefits
for society. Some of the most significant impacts include improved product design, increased
production efficiency, cost savings, enhanced collaboration, and improved ergonomics for
engineers.

6.2 Future work


Future research in the field of large CAD assembly visualization in MR can include the fol-
lowing areas:

6.2.1 Other Game Engines


While Unity 2020.3.36f1 is a powerful game engine that is one of the recommended engines
to use with the Microsoft HoloLens 2, many other game engines could be used for large CAD
assembly visualization. Future research could explore the system’s compatibility with other
game engines and compare the performance.

6.2.2 Other MR/VR/AR devices


Microsoft’s HoloLens 2 is a powerful and versatile device, but many other available MR de-
vices could be used for large CAD assembly visualization. Future research could explore the
use of the system with other devices and compare the performance of the systems.

6.2.3 Other CAD-file Formats


While the system described in this thesis uses the STEP (ISO 10303-21) format, many other CAD file formats could be used. Future research could explore the system's compatibility with other CAD file formats and perhaps produce file-format-agnostic CAD visualization software.

6.2.4 Real-time Collaboration


Real-time collaboration is an exciting area for future research in CAD assembly visualization
in MR. As MR technology becomes increasingly widespread in industry and design, the abil-
ity to work together in real-time with team members or stakeholders from remote locations
will become increasingly important. This can be especially useful in product development,
where engineers and designers can collaborate on a design in real-time, make changes, and
see the results instantly.

6.2.5 Automatic Exporting and Conversion Pipeline


The system described in this thesis assumes direct access to CAD files. In a real-world scenario, however, an automatic pipeline for downloading and converting CAD files may be more convenient. Future research could explore the development of such a pipeline and evaluate the system's performance when using it.

Bibliography

[1] Iain Staffell, Dan Brett, Nigel Brandon, and Adam Hawkes. “A review of domestic heat pumps”. In: Energy Environ. Sci. 5 (11 2012), pp. 9291–9306. DOI: 10.1039/C2EE22653G. URL: http://dx.doi.org/10.1039/C2EE22653G.
[2] Sophie Nyborg and Inge Røpke. “Energy impacts of the smart home: Conflicting visions”. English. In: Energy Efficiency First: The foundation of a low-carbon society. European Council for an Energy Efficient Economy, 2011, pp. 1849–1860. ISBN: 978-91-633-4455-8.
[3] S.H. Choi and A.M.M. Chan. “A virtual prototyping system for rapid product development”. In: Computer-Aided Design 36.5 (2004), pp. 401–412. ISSN: 0010-4485. DOI: 10.1016/S0010-4485(03)00110-6. URL: https://www.sciencedirect.com/science/article/pii/S0010448503001106.
[4] Chetan Shukla, Michelle Vazquez, and F. Frank Chen. “Virtual manufacturing: An overview”. In: Computers & Industrial Engineering 31.1 (1996). Proceedings of the 19th International Conference on Computers and Industrial Engineering, pp. 79–82. ISSN: 0360-8352. DOI: 10.1016/0360-8352(96)00083-6. URL: https://www.sciencedirect.com/science/article/pii/0360835296000836.
[5] Microsoft. Windows Mixed Reality - Develop - Install the tools. Accessed: 2023-01-13. URL: https://learn.microsoft.com/en-us/windows/mixed-reality/develop/install-the-tools#installation-checklist.
[6] Industrial automation systems and integration — Product data representation and exchange — Part 21: Implementation methods: Clear text encoding of the exchange structure. Standard. Geneva, CH: International Organization for Standardization, Mar. 2016.
[7] Paul Milgram and Fumio Kishino. “A taxonomy of mixed reality visual displays”. In: IEICE TRANSACTIONS on Information and Systems 77.12 (1994), pp. 1321–1329.
[8] Shakespearesmonkey. Dreams are the new Reality. Accessed February 27, 2023, Public Domain Mark 1.0. 2019. URL: https://www.flickr.com/photos/67592622@N00/48355780392.
[9] Alexei Vranich. Virtual model pumapunku. Accessed February 27, 2023, Creative Commons Attribution 4.0 International, available at https://creativecommons.org/licenses/by/4.0/deed.en. 2018. URL: https://heritagesciencejournal.springeropen.com/articles/10.1186/s40494-018-0231-0.
[10] NASA/Chris Gunn. Exploring the Universe in Virtual Reality. Accessed February 27, 2023, Creative Commons Attribution 4.0 International, available at https://creativecommons.org/licenses/by/4.0/deed.en. 2019. URL: https://go.nasa.gov/2RFSOLw.
[11] A. Tait. “Desktop virtual reality”. In: IEE Colloquium on Using Virtual Worlds. 1992, pp. 5/1–5/5.
[12] Frank Biocca and Ben Delaney. “Immersive virtual reality technology”. In: Communication in the age of virtual reality 15.32 (1995), pp. 10–5555.
[13] OyundariZorigtbaatar. Augmented-reality. Accessed February 27, 2023, Creative Commons Attribution-Share Alike 4.0 International, available at https://creativecommons.org/licenses/by-sa/4.0/deed.en. 2016. URL: https://commons.wikimedia.org/wiki/File:Augmented-reality.jpg.
[14] Microsoft. What is mixed reality? [Accessed January 2, 2023]. 2022. URL: https://learn.microsoft.com/en-us/windows/mixed-reality/discover/mixed-reality.
[15] Miguel Borges, Andrew Symington, Brian Coltin, Trey Smith, and Rodrigo Ventura. “HTC Vive: Analysis and Accuracy Improvement”. In: 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). 2018, pp. 2610–2615. DOI: 10.1109/IROS.2018.8593707.
[16] Michael J Gourlay and Robert T Held. “Head-Mounted-Display Tracking for Augmented and Virtual Reality”. In: Information Display 33.1 (2017), pp. 6–10.
[17] Katharina Pentenrieder, Peter Meier, Gudrun Klinker, et al. “Analysis of tracking accuracy for single-camera square-marker-based tracking”. In: Proc. Dritter Workshop Virtuelle und Erweiterte Realität der GI-Fachgruppe VR/AR, Koblenz, Germany. Citeseer. 2006.
[18] Svitlana Alkhimova and Illia Davydovych. “Accuracy assessment of marker recognition using ultra wide angle camera”. In: Technology audit and production reserves 3.2/65 (2022), pp. 6–10.
[19] Soumitra P Sitole, Andrew K LaPre, and Frank C Sup. “Application and evaluation of lighthouse technology for precision motion capture”. In: IEEE Sensors Journal 20.15 (2020), pp. 8576–8585.
[20] Vazquez88. Motion Capture with Chad Phantom. Accessed February 28, 2023, Creative Commons Attribution-Share Alike 3.0 Unported, available at https://creativecommons.org/licenses/by-sa/3.0/deed.en. URL: https://commons.wikimedia.org/wiki/File:Motion_Capture_with_Chad_Phantom.png.
[21] Apple Inc. Use the Measure app on your iPhone, iPad, or iPod touch. [Accessed January 9, 2023]. 2021. URL: https://support.apple.com/en-us/HT208924.
[22] Atomicdragon136. IOS measure app demonstration. Accessed February 27, 2023, Creative Commons Attribution 4.0 International, available at https://creativecommons.org/licenses/by/4.0/deed.en. 2018. URL: https://commons.wikimedia.org/wiki/File:IOS_measure_app_demonstration.jpg.
[23] Clem Bastow. From Pokéstops to Pikachu: everything you need to know about Pokémon Go. [Accessed January 9, 2023]. 2016. URL: https://www.theguardian.com/technology/2016/jul/11/from-pokestops-to-pikachu-everything-you-need-to-know-about-pokemon-go.
[24] Google. Google Cardboard. [Accessed January 9, 2023]. URL: https://arvr.google.com/cardboard/.
[25] Ananda Bibek Ray and Suman Deb. “Smartphone Based Virtual Reality Systems in Classroom Teaching — A Study on the Effects of Learning Outcome”. In: 2016 IEEE Eighth International Conference on Technology for Education (T4E). 2016, pp. 68–71. DOI: 10.1109/T4E.2016.022.
[26] Kevin R. Brooks. “Depth Perception and the History of Three-Dimensional Art: Who Produced the First Stereoscopic Images?” In: i-Perception 8.1 (2017). PMID: 28203349, p. 2041669516680114. DOI: 10.1177/2041669516680114. URL: https://doi.org/10.1177/2041669516680114.
[27] Michael Doneus and Klaus Hanke. “Anaglyph images - still a good way to look at 3D-objects”. In: Proceedings of the 17th CIPA Colloquium: Mapping and Preservation for the New Millenium: 3-6 October 1999; Olinda, Brazil. 1999.
[28] NASA/JPL-Caltech. PIA17948: Martian Landscape With Rock Rows and Mount Sharp (Stereo). Accessed February 27, 2023. 2014. URL: https://photojournal.jpl.nasa.gov/catalog/PIA17948.
[29] A. Simon, M. G. Prager, S. Schwarz, M. Fritz, and H. Jorke. “Interference-filter-based stereoscopic 3D LCD”. In: Journal of Information Display 11.1 (2010), pp. 24–27. DOI: 10.1080/15980316.2010.9652114.
[30] Boffy b. Master system 3d glasses. Accessed February 27, 2023, Creative Commons Attribution 2.5 Generic, available at https://creativecommons.org/licenses/by/2.5/deed.en. 2006. URL: https://commons.wikimedia.org/wiki/File:Master_system_3d_glasses.jpg.
[31] Handige Harrie. Polarised stereo glasses. Accessed February 27, 2023. 2010. URL: https://commons.wikimedia.org/wiki/File:Polarised_stereo_glasses.JPG.
[32] Douglas Lanman, Gordon Wetzstein, Matthew Hirsch, Wolfgang Heidrich, and Ramesh Raskar. “Beyond parallax barriers: applying formal optimization methods to multilayer automultiscopic displays”. In: Stereoscopic Displays and Applications XXIII. Ed. by Andrew J. Woods, Nicolas S. Holliman, and Gregg E. Favalora. Vol. 8288. International Society for Optics and Photonics. SPIE, 2012, 82880A. DOI: 10.1117/12.907146. URL: https://doi.org/10.1117/12.907146.
[33] R. Barry Johnson and Gary A. Jacobsen. “Advances in lenticular lens arrays for visual display”. In: Current Developments in Lens Design and Optical Engineering VI. Ed. by Pantazis Z. Mouroulis, Warren J. Smith, and R. Barry Johnson. Vol. 5874. International Society for Optics and Photonics. SPIE, 2005, p. 587406. DOI: 10.1117/12.618082. URL: https://doi.org/10.1117/12.618082.
[34] Cmglee. Parallax barrier vs lenticular screen. Accessed February 27, 2023, Creative Commons Attribution-Share Alike 3.0 Unported, available at https://creativecommons.org/licenses/by-sa/3.0/deed.en. URL: https://commons.wikimedia.org/wiki/File:Parallax_barrier_vs_lenticular_screen.svg.
[35] Lisa Rebenitsch and Charles Owen. “Estimating cybersickness from virtual reality applications”. In: Virtual Reality 25.1 (2021), pp. 165–174.
[36] Yasin Farmani and Robert J Teather. “Evaluating discrete viewpoint control to reduce cybersickness in virtual reality”. In: Virtual Reality 24.4 (2020), pp. 645–664.
[37] Introduction to the Mixed Reality Toolkit - Set Up Your Project and Use Hand Interaction. URL: https://learn.microsoft.com/en-us/training/modules/learn-mrtk-tutorials/ (visited on 11/14/2022).
[38] Software Libraries to Read, Write and Visualize 3D CAD files. URL: https://cadexchanger.com/products/sdk/ (visited on 11/29/2022).
[39] Mixed Reality Feature Tool. URL: https://aka.ms/MRFeatureTool (visited on 11/30/2022).
[40] Microsoft. About HoloLens 2. [Accessed January 30, 2023]. 2022. URL: https://learn.microsoft.com/en-us/hololens/hololens2-hardware.
[41] Unity Technologies. Pixyz Plugin. [Accessed January 26, 2023]. 2022. URL: https://www.pixyz-software.com/plugin/.
[42] Unity Technologies. Licensing Policy. [Accessed January 26, 2023]. 2022. URL: https://www.pixyz-software.com/documentations/html/2022.1/plugin4unity/LicensingPolicy.html.
[43] Unity Technologies. Pixyz Studio. [Accessed January 26, 2023]. 2022. URL: https://www.pixyz-software.com/studio/.
[44] Unity Technologies. Pixyz Scenario Processor. [Accessed January 26, 2023]. 2022. URL: https://www.pixyz-software.com/scenario-processor/.
[45] Andreas Atteneder. glTFast. [Accessed January 26, 2023]. 2022. URL: https://github.com/atteneder/glTFast.
[46] Unity Technologies. Export Unity Prefab. [Accessed January 26, 2023]. 2022. URL: https://pixyz-software.com/documentations/html/2022.1/scenarioprocessor/Workflowexamples.html.
[47] Unity Technologies. Pixyz Review. [Accessed January 30, 2023]. 2023. URL: https://www.pixyz-software.com/review/.
[48] Siemens. NX Virtual Reality. [Accessed January 30, 2023]. URL: https://www.plm.automation.siemens.com/global/en/products/mechanical-design/nx-virtual-reality.html.
[49] Microsoft. Create mixed reality photos and videos. [Accessed January 30, 2023]. 2022. URL: https://learn.microsoft.com/en-us/hololens/holographic-photos-and-videos.
