MR Research paper
Arvid Westerlund
Linköpings universitet
SE–581 83 Linköping
+46 13 28 10 00, www.liu.se
Copyright
The publishers will keep this document online on the Internet - or its possible replacement - for a
period of 25 years starting from the date of publication barring exceptional circumstances.
The online availability of the document implies permanent permission for anyone to read, to down-
load, or to print out single copies for his/her own use and to use it unchanged for non-commercial
research and educational purpose. Subsequent transfers of copyright cannot revoke this permission.
All other uses of the document are conditional upon the consent of the copyright owner. The publisher
has taken technical and administrative measures to assure authenticity, security and accessibility.
According to intellectual property law the author has the right to be mentioned when his/her work
is accessed as described above and to be protected against infringement.
For additional information about the Linköping University Electronic Press and its procedures
for publication and for assurance of document integrity, please refer to its www home page:
http://www.ep.liu.se/.
© Arvid Westerlund
Abstract
Mixed, virtual, and augmented reality technology blends virtual and real elements,
offering significant potential for industries such as gaming, medical science, and
industrial manufacturing. This report presents the development process of a sys-
tem for visualization of and interaction with large and advanced CAD assemblies. The
proposed system will enable CAD engineers, service technicians, educators, and other rel-
evant parties to engage with current and emerging products for educational and demon-
stration purposes without needing a physical model. The report will show the problems
and solutions faced when developing such a system and evaluate different approaches. It
presents a technical overview of the current state of the mixed reality space concerning the
visualization of large and advanced CAD assemblies and what future innovations would
be desirable. The result of this work is a system that imports, converts, optimizes, and
adds interaction capabilities to any .STP file saved on the local file system, all at runtime.
The system allows for dual-handed interaction with any part, with maintained quality and
acceptable performance.
Acknowledgments
I want to thank Bosch Thermoteknik for allowing me to write this thesis, especially my ex-
ternal supervisor, Sarath Chandra Damineni, and external managers for their assistance and
support. I would also like to thank my examiner and my supervisor for their help in writing
my thesis.
Contents
Abstract
Acknowledgments
Contents
List of Figures
List of Abbreviations
1 Introduction
1.1 The need for Heat Pumps
1.2 Mixed reality in practice
1.3 Aim
1.4 Research questions
1.5 Delimitations
2 Theory
2.1 The virtuality continuum and MR
2.2 Head-mounted displays and Mixed reality devices
2.3 Smartphone
2.4 Stereoscopy
2.5 Cybersickness
3 Method
3.1 Architecture
3.2 Implementation
3.3 Evaluation
4 Results
4.1 Features
5 Discussion
5.1 Results
5.2 Method
5.3 Issues, problems, and notes during development
5.4 The human experience
5.5 The work in a wider context
6 Conclusion
6.1 Summarization
6.2 Future work
Bibliography
List of Figures
3.1 The architecture of the system from CAD-file to visualization and interaction.
Abbreviations
1 Introduction
Mixed reality (MR) has seen significant growth in recent years, and its potential applications
in various industries are constantly being researched. This thesis explores the application of
MR technology in the heat pump design and manufacturing industry.
1.2 Mixed reality in practice
MR technology can provide value in many different areas of a design and production company. Various ideas
have come to mind that make this technology an exciting area to research, such as:
One way MR can be used is for the visualization of virtual prototypes. This can be especially useful for early-stage design review and evaluation, as it
can reduce the time and cost associated with creating and testing physical prototypes [3].
Another way MR can be used is for virtual manufacturing, which involves visualizing and
simulating the manufacturing process in a virtual environment. This can help stakeholders to
optimize the design and production process before physical prototypes are created, resulting
in a more efficient and cost-effective manufacturing process [4].
1.3 Aim
This thesis project explores the potential of MR technology for improving the efficiency and
effectiveness of the CAD engineering process for heat pump design. The project will focus on
developing a system for visualizing large and advanced heat pump CAD assemblies in MR,
allowing CAD engineers to visualize different designs and evaluate changes quickly. The
thesis will discuss the challenges and obstacles that arise when developing such a system
and potential solutions to those problems.
1.4 Research questions
1. How can a system for visualizing CAD assemblies in mixed reality be realized so that CAD engineers can use it for rapid prototyping?
2. What software limitations and solutions are there for realizing such a system?
1.5 Delimitations
Certain delimitations are required due to time, cost, and availability limitations. These de-
limitations are described below.
1.5.1 Device
The Microsoft HoloLens 2 will be utilized for the thesis as it was the only device available
due to time and scope constraints.
2 Theory
Technological innovations have blurred the lines between the virtual and real worlds in the
last few years. The ability to blend digital assets with the real world allows information to be
exchanged wherever you are.
2.1 The virtuality continuum and MR
2.1.1 Reality
At one extreme of the spectrum are fully real environments that do not involve computer-generated elements. This is the reality we walk in and interact with in everyday life, as shown in Figure 2.2. This extreme is not included in the term mixed reality.
2.1.2 Virtuality
At the other end are fully virtual environments, as shown in Figure 2.3, which are entirely
generated by a computer and do not involve any real-world elements [7]. This extreme is also
not included in the term mixed reality.
Virtual reality
Virtual reality (VR) is an immersive technology that creates a computer-generated environ-
ment where users can interact with virtual objects and experiences. VR is typically achieved
using specialized equipment such as a head-mounted display (HMD), headset, gloves, or
other wearable devices that allow users to see, hear, and interact with a virtual environment
naturally and intuitively. An example of this is shown in Figure 2.4. VR is often used for
various applications, including gaming, training, education, design, and entertainment.
Desktop VR
Desktop VR refers to using VR technology on a computer or another device not designed to be
worn on the body. This typically involves using a VR headset connected to a computer via
cables and interacting with the VR environment through controllers, with the user sitting or
standing in front of a desk. Desktop VR typically includes limited to no tracking or motion-sensing
technology, so the user is not able to move their head or body within the virtual
environment [11].
Immersive VR
Immersive VR uses VR technology to immerse the user in a virtual environment fully. This
can involve using VR headsets designed to be worn on the body, such as the HMD setup
shown in Figure 2.4, and other technologies that enable the user to interact with and move
within the virtual environment. Some immersive VR systems also include haptic feedback,
which allows the user to feel physical sensations in response to events in the virtual environ-
ment [12].
Augmented reality
Augmented reality (AR) is an immersive technology that adds computer-generated elements
to a real-world environment, allowing users to see and interact with virtual objects in the
context of the physical world [7]. AR is typically achieved using equipment such as a headset
or smartphone, which displays virtual objects and information on top of a user’s view of the
physical world, as shown in Figure 2.5.
4. Same as class 3 but using cameras to show the real world instead of transparent displays
or mirrors. The displayed virtual environment should be accurately aligned with the
real-world environment.
5. Virtual environments without cameras, which may be fully immersive (HMDs) or par-
tially immersive (large screen displays), and which have computer-generated elements
added to them to create a sense of “reality”. This class would be accurate to current VR
headsets used in gaming (e.g., HTC Vive) [15].
6. A completely virtual (graphic) environment that is partially immersive where real ob-
jects in the user environment can be interacted with and influence the virtual, computer-
generated scene.
2.2.1 Tracking
In the technical aspects of AR and VR technology, there are several underlying hardware and
software components that make it all possible. For example, AR and VR devices require spe-
cialized sensors, cameras, and displays to track the user’s position and movements, render
digital content, and display it in real-time. These components work together to create an
immersive experience that combines the digital and physical worlds.
The algorithms and techniques used in AR and VR systems are also crucial to their suc-
cess. Tracking algorithms, for example, are responsible for accurately tracking the user’s
movements and position in real-time, which is essential for creating a seamless and immer-
sive experience.
Inside-out
Inside-out tracking is a type of tracking technology used in VR and AR devices that uses
sensors on the device to track its movement in the real world and is the technology used in
the Microsoft HoloLens 2. The sensors typically include cameras, accelerometers, gyroscopes,
and sometimes magnetometers [16].
With inside-out tracking, the device itself is responsible for tracking its own movement,
which means there is no need for external sensors or base stations. This can make the setup
process easier and more flexible since the user does not need to set up any external sensors
or be limited by the range of the sensors [16]. Additionally, inside-out tracking can allow for
greater freedom of movement since the user is not tethered to any external sensors or base
stations.
It is possible to use inside-out tracking with a marker system. This technique uses mark-
ers, such as QR codes, to track the position and orientation of items in the real world. The
device’s camera detects the markers, which are then used to calculate the marker’s position
and orientation [17]. Marker-based tracking is commonly used in AR applications where the
connection to the physical world is important. Markers can be used in the HoloLens 2 to
superimpose the digital model onto a real-world object [18].
Outside-in
Outside-in tracking uses external sensors or cameras to track the location and orientation
of devices. The cameras used in this process are specially configured to detect infrared light,
and by combining the light's position from multiple cameras, the system can determine the
devices' position and orientation in three-dimensional space. The system is known as “active”
if the infrared light is generated by emitters attached to the device, and “passive” if the device
is outfitted with reflectors and the emitters are attached to the cameras [19].
This technology is common in motion capture for movies or video game design for track-
ing items or limbs. A disadvantage of outside-in tracking is the need for external cameras to
be set up in the room, which can be inconvenient and may require additional setup time. A
common use case of tracking for motion capture can be seen in Figure 2.6.
Figure 2.6: The process of using active outside-in tracking in motion capture [20].
Lighthouse
Lighthouse tracking is a method of tracking the position and orientation of VR equipment
through the use of emitters known as lighthouses. This tracking technology was introduced
by Valve Corporation, a video game development company, as part of their SteamVR plat-
form [19].
The lighthouses are stationary units that emit a sweeping laser beam. The VR equipment
is outfitted with sensors that detect the laser beam as it passes over them. The sensors then
calculate the exact position and orientation of the equipment in real-time based on the timing
and location of the laser beam sweep [19].
2.3 Smartphone
Smartphones have emerged as viable platforms for experiencing VR and AR content. VR and
AR on smartphones can be achieved through specialized applications as well as through
cost-effective headset accessories.
Figure 2.7: The iOS Measure app using AR technology to measure real world distances [22].
2.3.2 AR Gaming
Another way a smartphone can be used for AR is through AR mobile games, which use the
smartphone’s camera and display to create interactive AR experiences. These games typically
involve overlaying virtual objects or characters onto the real-world environment, allowing
users to interact with the virtual elements naturally and intuitively [23].
2.4 Stereoscopy
Stereoscopy is the technique of creating the illusion of three-dimensional depth in a two-dimensional image or display. It has a long history dating back to the early 19th century and has played a vital role in the development of many different types of immersive technologies, including VR and AR [26].
Figure 2.10: The SegaScope 3-D Glasses using active shutter technology [30].
2.4.5 Autostereoscopy
Autostereoscopy is a technique used in 3D display systems that allows the user to view 3D
images without the need for special glasses or headgear.
Parallax barrier
A parallax barrier (as shown in Figure 2.12) is a display technology used in autostereoscopic
displays. It works by placing a thin, opaque barrier with a series of precisely positioned slits
in front of the display screen, allowing each eye to see only certain parts of the image [32].
Figure 2.12: The function of the parallax barrier and lenticular display technology [34].
2.5 Cybersickness
Cybersickness, also known as VR sickness, occurs when a person experiences symptoms sim-
ilar to motion sickness while using a VR device [35]. Common symptoms are nausea, dry
mouth, disorientation, and vertigo [36].
Several factors can contribute to the likelihood of experiencing cybersickness while using
a VR device. One important factor is the level of immersion in the virtual environment.
The more realistic, spacious, and immersive the environment, the less likely a person is to
experience cybersickness. Other factors that may contribute to the likelihood of experiencing
cybersickness include the intensity of the motion in the virtual environment, duration of use,
and individual differences in susceptibility [35].
3 Method
The MR system for visualizing advanced heat pump CAD assemblies was developed using
Microsoft’s HoloLens 2 device. Third-party CAD conversion and importing tools are used
to convert the heat pump CAD assemblies into a format compatible with the Unity game
engine. Custom scripts and algorithms are implemented to simplify the interaction with the
CAD model within the MR environment.
3.1 Architecture
Before implementing the MR system, the required steps and features were carefully planned
and defined after discussion with engineers in the heat pump industry and from user testing
of the HoloLens’ features. The following were identified as the crucial steps that needed to
be incorporated into the system:
1. Conversion from .STP format to a format that Unity can read and import at runtime.
2. Runtime importing of the converted CAD files into the Unity project, allowing visualization and interaction with the CAD assemblies without recompilation.
3. Optimization of the imported CAD assemblies, reducing the number of duplicated components to make them more manageable for the device to handle.
4. Addition of part colliders and interaction components to the objects within the imported CAD assemblies, allowing the user to manipulate and explore the assemblies naturally and intuitively.
The mixed-reality system was implemented with a clear understanding of the steps and fea-
tures required to create an efficient, effective, and user-friendly tool for visualizing and work-
ing with heat pump CAD assemblies in a mixed-reality environment.
Figure 3.1: The architecture of the system from CAD-file to visualization and interaction.
3.2 Implementation
Implementation was made with Unity version 2021.3.14f1 and began with the initial setup of
a “3D Core” project template. The “Microsoft Mixed Reality Feature Tool v.1.0.2206.1 Preview”
was used to install the necessary components for the system: mainly “Mixed Reality Toolkit
Foundation” version 2.8.2, “Mixed Reality Toolkit Standard Assets” version 2.8.2, and “Mixed
Reality OpenXR Plugin” version 1.5.0. The project's build settings were changed to “Universal
Windows Platform” with the HoloLens as the target device and ARM64 architecture.
Microsoft's “Introduction to the Mixed Reality Toolkit” guide was followed for configuration
[37]. CAD Exchanger SDK was used for importing and converting the CAD files into Unity
at runtime [38].
3.3 Evaluation
Problems encountered during implementation as described in Chapter 3.2 are noted, and
possible solutions to these problems are studied and evaluated. The following categories
describe the potential issues relevant to this thesis.
3.3.5 Human
Problems in this category refer to limitations or issues that arise from human factors, such as
user error, lack of training or knowledge, or negative impact on the user experience. These
problems may include limitations in the ability of users to understand or operate the system,
VR-induced nausea or headaches as described in Chapter 2.5, or other adverse physical or
psychological effects caused by the technology.
2. Download and install the Mixed Reality Feature Tool for Windows [39].
3. Start the Unity Hub software and install the Unity Editor version 2021.3.14, Microsoft
Visual Studio Community 2019, and Universal Windows Platform Build Support.
4. When installing Microsoft Visual Studio 2022, select the following workloads [5].
5. Select (or verify pre-selection of) the following components for “Universal Windows
Platform development” [5].
a) Select Menu Bar > File > Build Settings > Platform > Universal Windows Platform.
b) Change Architecture to ARM 64-bit.
c) Click the Switch Platform button.
d) Close the Build Settings window when the switch is done.
8. Start the Mixed Reality Feature Tool for Unity and press “Start”.
The Unity project is now prepared for the development of HoloLens 2 software.
3. Click “Build”.
4. Navigate to the folder where you want the build to be saved and press “Select Folder”.
Wait for the build process to finish.
5. Navigate to the folder in the file explorer where the build was created.
6. Open the file with the extension .sln with Visual Studio.
7. Change the configuration from “Debug” to “Release” and the architecture from “ARM”
to “ARM64”.
8. Change the deployment target to “Remote Machine” if deploying via Wi-Fi or “Device”
if deploying through USB.
9. In the menu bar, press Project > Properties > Configuration Properties > Debugging and
set Debugger to launch to “Remote Machine”.
10. Set “Machine Name” to the IP address of the HoloLens 2. The IP address can be found
in the developer settings on the HoloLens.
12. In the main Visual Studio window, select Debug > Start Without Debugging to deploy
and start the program on the HoloLens 2 automatically.
13. You will be prompted for a PIN when deploying to the HoloLens for the first time.
That PIN can be found on the HoloLens through Settings > Updates & Security > For
developers > Pair.
14. The software should now deploy to the HoloLens and start automatically when
deployment is finished.
4 Results
This chapter presents the results of the mixed reality system for the HoloLens 2 device that
has been developed for visualizing large CAD assemblies of heat pumps.
4.1 Features
This section describes the software system features described in Chapter 3.1 after develop-
ment.
4.1.1 Fundamentals
The system developed for visualizing and interacting with large CAD assemblies of heat
pumps using a HoloLens 2 device includes the critical steps described in Chapter 3.1. One
of these is the conversion of CAD files from the .STP format to a format that Unity can read
and import. The system can import the converted CAD files into the software at runtime if
the source .STP file is saved on the PC's file system. CAD files cannot be imported at runtime
when running on the HoloLens 2 device, but they can be when running the software on a PC.
Scripts for the optimization of the imported CAD assemblies have been developed. These
improve performance by reducing the number of duplicated components in the assemblies,
making them more manageable for the device to handle. This feature is activated through the
Unity editor at edit time but can be adapted for use at runtime.
Finally, the system automatically adds interaction components to the objects within the
imported CAD assemblies. This includes adding mesh and bounding-box colliders and the
near and far interaction components of Microsoft’s Mixed Reality Toolkit (MRTK) to each
assembly part. This allows the user to manipulate and explore the assemblies naturally and
intuitively with both hands, from a near and far distance.
4.1.2 Quality-of-life
Below are the quality-of-life features described in Chapter 3.1 implemented in the software
as Unity scripts and components.
Explosion view
The explosion view feature shown in Figure 4.1 is a tool that allows a better understanding
of a CAD assembly by breaking it down into its components. This feature can be activated
by saying the voice command “explode”, which toggles a slider that the user can control by
pinching with the thumb and index finger and moving the slider cursor to the right on the
slider bar.
When the slider is moved to the right as in Figure 4.2, the assembly will “explode” by
moving all the components away from the middle of the bottom of the assembly, with speed
proportional to how far the slider is slid to the right. This allows the user to decide the rate
at which the assembly explodes, giving more control over the feature. Once the user releases
the slider, the pointer will automatically move back to the left, and the explosion animation
will stop. The explosion height is limited through the Unity editor to an approximate ceiling
height. The component will only move on the X and Z-axis (width and depth) when any part
of the assembly has reached the height limit.
Part reset
The part position reset feature shown in Figure 4.3 is a tool that moves all assembly
components back to their original place within the assembly when the feature is activated
through the voice command “reset”. This feature can help reset the location of parts
that have been moved or adjusted through manual component manipulation or the explosion
view feature.
Assembly selector
The assembly selector tool shown in Figure 4.4 enables fast switching between assemblies by
activating the voice command “slider”, which displays a user-controlled slider similar to the
explosion view described in this chapter. The user selects from loaded assemblies by moving
the slider cursor to any tick on the slider bar. The number of ticks adapts to the number of
loaded assemblies. The name of the currently selected assembly is shown under the slider
and is automatically assigned from the assembly name. This selector eliminates the need
for a separate menu or assembly list, making it useful for tasks such as comparing assembly
versions or switching between assemblies for different tasks.
Assembly relocator
The assembly relocator feature shown in Figure 4.5 allows the user to move the entire assem-
bly to a different location within the mixed reality environment by using a cube that can be
toggled by the voice command “move”. Activating the feature enables the user to grab and
move the cube; this relocates the entire assembly. This feature allows the user to reposition
the assembly to avoid real-world obstructions. The cube can be rotated only on the Y-axis to
always have the imported CAD model in a vertical orientation.
When activated, all components are reset to their original position, and the movement cube,
selector slider, and explode slider are disabled and hidden from view.
5 Discussion
The results presented in Chapter 4 demonstrate the possibility of creating a system for rapid prototyp-
ing in the product design process. The method, issues during development, and results are
discussed here. Future work and potential societal impacts are also discussed.
5.1 Results
While runtime importing of CAD files was implemented, the feature only works at runtime
on the development PC running the Unity editor. The reason for this is the lack of an accessible
file system on the Windows Holographic OS running on the HoloLens 2 [40]. While
alternative solutions, such as downloading a file over the air through the web, were considered,
I ultimately decided not to implement them due to time constraints. The compiled
software includes known-good assets, where duplicated components were removed manually,
for demonstration and testing purposes. All features were nevertheless required
to be scalable and usable with assemblies imported at runtime.
5.2 Method
One of the risks associated with the implementation steps described in Chapter 3.2 is their
potential to become outdated quickly, which can require continually updating implementation
guides to remain relevant and functional. Software changes are particularly relevant for
Unity, which is updated regularly. Updates to the CAD Exchanger SDK, which is used for
importing and converting CAD files, may impact its compatibility with the implementation.
5.3 Issues, problems, and notes during development
Reliable data
Ensuring that the latest version of a source file is used is essential for both the accuracy and
the relevance of the assembly visualization. It ensures that the final visualizations are accu-
rate, informative, and up-to-date. This is especially important in cases where visualization
is used for identifying errors, taking measurements, or making design decisions. To avoid
errors, it is recommended to develop a robust system to check if a source file is of the latest
version before importing it into the software. This could be done through version tagging
and comparison with a central system.
CAD assemblies are composed of multiple parts and subassemblies, and the correct con-
figuration of these parts and subassemblies is crucial for accurate, reliable visualization.
Accurate Data
The design of accurate CAD assemblies of a product requires a collaborative effort between
CAD engineers and sub-component suppliers. Both parties play an important role in design
and development. Their cooperation is essential for ensuring that the final product meets the
desired specifications and avoids costly errors, redesigns, and delays in the product develop-
ment process.
The CAD assemblies used for this project are relatively detailed and contain over one
thousand individual components. Access to CAD files from suppliers is not guaranteed,
but they are used when available. For example, CAD files for cable harnesses are currently
unavailable in the assemblies.
One case involving supplier CAD files was a printed circuit board (PCB) for the network
interface on the heat pump, where each PCB-mounted component was an individual CAD
part. This resulted in about one thousand components no larger than 1 mm² on a PCB smaller
than a human hand. These components were far too small for the user to interact with
and entirely unnecessary for the system.
CAD engineers are responsible for creating, maintaining, and updating CAD files and
data; they need to ensure that the files are correctly created and maintained and that the data
is accurate and up-to-date. Issues were found where exported CAD assemblies had compo-
nents piercing through other components or simply floating in space away from anything
else. Some of those issues can be attributed to the troubles of exporting assemblies with a
specific configuration, as explained later in this section on duplicated components.
Metadata
Metadata in 3D CAD refers to information that describes the characteristics, properties, and
attributes of a 3D model. This information can include details such as the model’s creator,
date of creation, version number, file format, and any other relevant information that can help
to identify, manage, and track the 3D model. Metadata can also include information about
the model’s geometry, topology, materials, and other information pertinent to the design or
manufacturing process. This information can help automate specific tasks, such as generating
bills of materials, and can also be used for data management and analysis.
One of the most common risks when exporting or converting CAD files is that the meta-
data may not be adequately translated. This can happen if the export or conversion process
is not set up correctly or if the target file format does not support the metadata in the original
file. In such cases, meaningful information may be lost, making it difficult to understand the
file’s context.
Part metadata regarding material and part description was lost when exporting the CAD
assemblies from Siemens NX. Luckily, the lost metadata was not essential for the visualization
of the CAD assembly.
The part hierarchy was one crucial piece of information that remained after exporting the
assembly. This hierarchy was helpful for the removal of duplicated components since each
assembly, subassembly, and part was named with a unique part number. An allowlist could
be implemented based on the part numbers.
The color of the components remained after exporting and conversion to Unity. This meta-
data was helpful in the visualization since it gave a sense of what material a component was
made of (metal, plastic, foam, etc.).
External Libraries
To address conversion issues, software tools such as Pixyz Studio, Pixyz Plugin, and CAD
Exchanger SDK can perform the conversion. These tools are specifically designed to convert
CAD files from one file format to another and can help ensure that the files are properly
converted and that important information is not lost.
Pixyz Plugin Pixyz Plugin is a software plugin for Unity that automates the CAD file im-
porting and optimization process. It does not convert the CAD file to a format usable outside
Unity (only to a proprietary file format), and it is easy to use and powerful in its features [41].
This tool worked well when used in Unity edit mode, but a license limitation disallows the
use of Pixyz Plugin at runtime in Unity, making it impossible to use in the final software,
where runtime importing is essential [42].
Pixyz Studio and Pixyz Scenario Processor Pixyz Studio and Pixyz Scenario Processor al-
low converting CAD files to many of the most popular file types for 3D CAD models. Pixyz
Studio is a standalone software that allows for the conversion and optimization of complex
CAD models through a GUI or Python API [43]. Pixyz Scenario Processor uses optimization
scenarios created in Pixyz Studio and allows for automatic execution of the scenarios through
a cloud service [44]. The high software cost made these tools impossible to use for this thesis.
Conversion to the glTF format would be needed to import CAD files into Unity at runtime.
This would work in tandem with the glTFast plugin for Unity and allow runtime import [45].
This workflow is recommended by Pixyz for importing at runtime [46].
CAD Exchanger SDK CAD Exchanger SDK provides a complete solution for importing
CAD files at runtime [38]. This tool was used in the final software and was implemented by
adapting the provided C# Unity scripts. The software provides ready-to-use C# scripts for
Unity and works well. I did not explore the mesh modification features of the tool since the
performance was acceptable.
Duplicated Components
During development, an issue was encountered when exporting CAD assemblies from
Siemens NX and Teamcenter. The problem was that the different assembly configurations
were defined through reference sets in Siemens NX and when exporting to .STP format, all
parts from all reference sets were included. This resulted in duplicated components which
caused issues when trying to interact with the assembly, mainly a reduction of intuitiveness
when moving parts. I counted up to nine copies of the same component in
the same location.
Allowlist filter The initial approach to addressing this issue was to use an allowlist filter,
where a Unity script read a list of allowed components. Components not on that list were
deleted based on component IDs. However, this method proved ineffective as the exported
bill of materials for a particular product did not accurately match the CAD model. As a result,
this approach was abandoned to focus on other alternatives.
Mesh-based filtering The second alternative was to use a filter based on the mesh triangle
count of each component. This method worked well for removing large duplicated compo-
nents with more unique triangle counts. Still, it did not work well for smaller components,
like screws, that all had the same triangle count, resulting in just one random screw compo-
nent surviving the filter.
Location-based filtering The third alternative was to use a filter based on the component’s
precise world positions. This method worked well for removing duplicates of the same part:
components were counted as duplicates if the absolute world positions of their centers
matched exactly. One possible improvement could be to check for other components within
a box around the center of each component; this could find components of different shapes
in approximately the same location.
Levels of Detail
Levels of Detail (LODs) change the quality of the CAD model in steps based on the distance
to the model or how much of the screen area the model occupies. This method would utilize
Unity’s LOD system, which automatically creates and switches between different levels of
detail for a given object. By using LODs, it’s possible to reduce the number of triangles in
the model at a distance, which helps improve the overall system’s performance. This was not
attempted for this thesis since the performance was good enough for the system to be usable,
and high model quality was of importance.
5.3.5 Programming
Programming for the HoloLens 2 is done primarily with Unity, a popular and powerful game
development engine that allows developers to create interactive 3D applications and experi-
ences.
Unity Preview
One advantage of using Unity for programming the HoloLens 2 is the Unity Preview feature.
This feature allows developers to quickly test their latest changes, either locally or on the
HoloLens 2, which makes the development process more efficient. The Unity Interface is
easy to use, intuitive, and powerful, which makes it an excellent tool for developers of all
skill levels.
Unity Interface
Unity’s editor interface is shown in Figure 5.1 and is divided into several panels and win-
dows, each with its specific purpose.
• The Scene window is where you can view and edit the objects in your scene. It allows
you to move, rotate, and scale objects and adjust lighting and camera settings.
• The Hierarchy window shows the hierarchical structure of the objects in your scene,
making it easy to organize and manage your objects.
• The Inspector window allows you to view and edit the properties of the selected object,
such as its position, rotation, and scale. It also allows you to add and remove compo-
nents, such as scripts, colliders, and materials.
• The Project window is where you can access and manage all of the assets in your project,
such as models, textures, and audio files.
Colliders
It is necessary to add colliders to each part that has been imported into the Unity project to
allow for the use of the interaction components. The aim was to have colliders that are as
accurate to the actual part mesh as possible while maintaining good performance. It was
determined that mesh colliders were the best choice for intuitive interaction.
Convex Mesh Collider Limitations The near and far interaction components provided by
the MRTK require the components to have one of a subset of Unity’s available colliders, ei-
ther Box, Sphere, Capsule, or Mesh. When using a mesh collider, it must be a convex mesh
collider, and Unity's convex mesh colliders have a limit of 255 triangles. This limited the
possibility of using this collider since most components have a mesh of over 255 triangles.
The solution was to check the number of triangles in each mesh in the imported assembly
and use a convex mesh collider if a mesh had fewer than 256 triangles and a box collider
otherwise. A comparison of the two alternatives is shown in Figure 5.2.
Figure 5.2: A comparison of a convex mesh collider and a box collider on a sphere in the
Unity editor.
Pixyz Solutions Pixyz Plugin allowed for more complex mesh structures with mesh deci-
mation and decomposition. This solution was not possible because of the license limitation
described in Chapter 5.3.2.
Figure 5.3: An example of the pinching maneuver for interaction simulated in the Unity
editor.
5.4.2 Speech
Microsoft's HoloLens 2 includes built-in microphones and advanced speech recognition ca-
pabilities, which allow users to speak commands and control virtual objects using their voice.
This makes it easy to perform feature activation tasks without interrupting the interaction
with the CAD models.
It is possible to define any English word to be recognized by the HoloLens 2, and connecting
a recognized word to a feature activation is easy in the Unity editor. The device is good at
recognizing the voice command only when the actual user is talking; during my testing, it has
not yet been triggered by a bystander talking.
Figure 5.4: A screenshot from a video recorded with the HoloLens 2 showing the use of the
pinching feature on the explode slider.
A disadvantage of using box colliders as described in Section 5.3.5 was that a user could
sometimes grab a component when pinching the air, or grab a different component than
anticipated. The advantage is that the entire component is inside the box collider, so the user
would always grab a component when pinching on something visible.
Far Interaction
The far interaction component allows users to interact with virtual objects farther away.
This works through a virtual laser pointer projected in the HoloLens 2 environment from
the user's hands, as shown in Figure 5.5. Interaction with the object occurs when the “laser
pointer” touches an object and the user pinches their fingers simultaneously.
Near Interaction
The near interaction component allows users to interact with virtual objects close to the de-
vice, such as when working with a virtual keyboard or manipulating objects using hand
gestures and pinching as shown in Figure 5.5. I found the near interaction with pinching the
most intuitive for new users since it feels like you are grabbing an actual component.
Figure 5.5: An example of using the near and far interaction features simulated in the Unity
editor.
Pinching Intuitiveness
One of the main ways users interact with the device is through pinching gestures, which
are used to select and manipulate virtual objects. However, I found many new users had
difficulty performing the pinch gesture correctly, particularly when asked to “pinch with
your thumb and index finger, and do a large pinch that is visible to the device.”
One of the main challenges in performing the pinch gesture is that it requires precise hand-
eye coordination and a good understanding of the device’s tracking capabilities. To perform
the pinch gesture correctly, users must be able to spread their thumb and index finger more
than what is intuitive and then perform the pinch gesture in a way that is visible to the device.
This can be difficult for users with limited experience with the device or who have difficulty
with fine motor skills.
Movement Cube
One limitation of the movement cube described in Chapter 4.1.2 is that it does not know
where the ground level is. It is up to the user to place the assembly at the desired height
rather than the software automatically aligning it onto the ground. This can make it more
difficult for the user to correctly position the assembly on the floor, as they need to pay close
attention to the placement of the cube. An early attempt to always have the assembly placed
on the highest physical object under itself resulted in the assembly jumping onto the heads
of people walking past the user. It was then determined that the software should not account
for the real-world environment. Nevertheless, the movement cube function is a helpful tool
for quickly and easily moving the assembly in the environment and can help with ergonomics
since the user can position the model at any height and allow for interaction without crouch-
ing.
Slider interactions
The sliders, as described in Chapter 4.1.2, were implemented to be scalable for any number
of imported CAD files. Initial versions placed the slider at an offset from the Unity camera
(the HoloLens 2 device), with some smoothing applied. New users initially found it difficult
to accurately pinch the slider cursor, especially people with short arms, since the slider was
sometimes too far away to reach. It was also not user-friendly because small head movements
made the slider move slightly, making it difficult to grab the cursor. Later
versions used a different interaction script that froze the slider’s position when a hand was
near. This made it much more intuitive to use.
Ease Of Use
The hardware and software must be easy to use and comfortable, especially if used for pro-
longed periods or often during work.
Ergonomics Ergonomics were generally found to be good. The device is easily adaptable
for different head sizes through a wheel on the back of the device; it is designed with glasses-
wearers in mind and does not interfere with the user's glasses [40]. There was no noticeable
concern about the device sliding or falling off, and most users were confident enough to use
both hands for interaction.
Nausea Each user only used the HoloLens 2 for a short duration, and almost no one com-
plained about nausea.
Battery Life The software was almost always running during the eight-hour student fair
and was connected to a 65W USB Type-C charger when not in use. The battery percentage
was around 50% most of the day. Battery life was around one hour when fully charged and
running the software, so easy access to a charger would be recommended. Microsoft claims
2-3 hours of active usage and a minimum of 15W charge power to maintain battery level [40].
Performance
The balance between performance, power draw, and weight is a continuous struggle with tether-
less, battery-powered devices.
FPS Some users perceived the framerate of the device as low. The software sometimes
ran below 30 FPS, and I expected many people to notice and comment on that. I found
that the way the HoloLens visualizes the holograms makes it entirely usable, even below the
recommended 60 FPS. It is primarily when manipulating the components that judder occurs.
This experience reinforced my thought that image quality is more important than high and
stable framerates for this type of application.
Responsiveness The HoloLens 2 projects a virtual interpretation of your hands and does an
excellent job following your movements. No users complained about a lack of responsiveness
when manipulating components. Any lack of responsiveness is masked by a smoothing
effect that reduces discomfort when your movements are not tracked quickly enough.
Alternatives
There are software products designed explicitly for visualizing CAD files in VR. For example,
Pixyz Review and Siemens NX Virtual Reality [47, 48]. The HoloLens 2 is different from other
HMDs because it is a computer running a separate OS. This makes it difficult for software
manufacturers to implement support for the HoloLens 2, and both Pixyz Review and Siemens
NX Virtual Reality lack support for the HoloLens 2.
View
FOV Some users complained about a small FOV when using the HoloLens 2. The FOV
is improved over the first-generation HoloLens and will hopefully improve further in future
generations.
Resolution The resolution is good, and I heard no complaints from users. The readability
of text on components is quite similar to what is expected in real life, where you need to
approach smaller text to increase readability.
Colors The CAD models used for this thesis were not that colorful, and no analysis of color
accuracy can be made other than that colors appear bright and clear.
Anti-Aliasing Some users complained about aliasing. The edges of components appear
jagged, as shown in Figure 5.6, which reduces immersion for the user. Anti-aliasing technolo-
gies could reduce the aliasing but at the cost of performance. This was not researched or
tested for the thesis.
Presentation Features The HoloLens 2 provides a suite of functions for sharing content with
others. It is a business-focused product with good integration into the Microsoft/Windows
ecosystem.
PC-connection
The HoloLens 2 device can be connected to a PC through a USB Type-C cable or Wi-Fi. It is
possible to charge and use the device at the same time.
Real-time View It is possible to see a real-time view of what a user sees when using the
HoloLens 2 device. The device blends the holographic view with the view of the device’s
cameras and sends it over Wi-Fi to the Microsoft HoloLens companion app or through Mira-
cast. The latency of Miracast is much lower than with the companion app, and the connection
was found to be more stable. It is also possible to cast through other methods that have not
been used for this thesis [49].
On Device Recording
The HoloLens 2 supports on-device recording at 1080p and 30 FPS. These recordings can then
be sent to another device or uploaded online. The recordings are of better quality than
real-time casting, but the performance and framerate impact is substantial.
5.5 The work in a wider context
2. Increased Production Efficiency: The ability to rapidly make changes to designs can also
lead to increased production efficiency, as engineers can quickly identify and resolve
any issues that arise during the design process. This can result in faster time-to-market
for products and increased competitiveness for companies.
3. Cost Savings: Improved production efficiency can also result in cost savings, as compa-
nies can produce products faster and with fewer mistakes, reducing the need for rework
and retooling.
4. Better Collaboration: The MR system can also facilitate better collaboration between
CAD engineers and other stakeholders, as engineers can easily demonstrate their de-
signs to others and get feedback. This can lead to more informed design decisions and
a better overall design process.
6. Increased Awareness and Interest: MR can be a valuable educational tool, allowing peo-
ple to learn more about heat pumps and efficient residential heating systems. By expe-
riencing the designs in an MR environment, users can gain a deeper understanding of
the technology and its potential benefits, potentially increasing interest and investment
in heat pumps and other efficient heating systems.
6 Conclusion
In this chapter, we will summarize the key findings and contributions of this thesis, as well
as discuss future research directions and potential applications of the developed system.
6.1 Summarization
This thesis project aimed to investigate the use of MR technology in the CAD engineering
process for heat pump design, where large and advanced CAD assemblies are used.
6.1.1 How can a system for visualizing CAD assemblies in mixed reality be
realized so that CAD engineers can use it for rapid prototyping?
An MR system for visualizing CAD assemblies can be realized using an HMD, such as the Mi-
crosoft HoloLens 2, which has high-quality, semi-transparent displays and inside-out track-
ing capabilities and can run custom software. Third-party CAD conversion and importing
tools can convert the CAD assemblies into a format compatible with a graphics engine at
runtime. The implementation should include conversion of CAD files, runtime importing
of converted files, optimization of imported assemblies, and addition of interaction compo-
nents and quality-of-life features to create an efficient, effective, and easy-to-use visualization tool for
working with CAD assemblies in an MR environment.
6.1.2 What software limitations and solutions are there for realizing such a
system?
Software limitations must be addressed to realize a system for visualizing advanced heat
pump CAD assemblies in MR. The MRTK’s interaction components require that the CAD
components have either a convex mesh collider or a primitive collider for near or far interac-
tion. However, Unity has a constraint of 255 triangles for convex mesh colliders, which can
hinder the ability to use this collider on all components, especially when many components
have a mesh with more than 255 triangles. Software tools such as Pixyz Studio, Pixyz Plugin,
and CAD Exchanger SDK are available to address issues with converting CAD files from one
file format to another. Pixyz Plugin is a software plugin for Unity that features advanced
optimization tools but has a license limitation that prevents its use at runtime in Unity. Pixyz
Studio and Pixyz Scenario Processor allow converting CAD files to many popular file types,
but their cost may be prohibitive. CAD Exchanger SDK is a complete
solution for importing CAD files at runtime, providing ready-to-use C# scripts for Unity.
Bibliography
[1] Iain Staffell, Dan Brett, Nigel Brandon, and Adam Hawkes. “A review of domestic heat pumps”. In: Energy Environ. Sci. 5 (11 2012), pp. 9291–9306. DOI: 10.1039/C2EE22653G. URL: http://dx.doi.org/10.1039/C2EE22653G.
[2] Sophie Nyborg and Inge Røpke. “Energy impacts of the smart home: Conflicting visions”. English. In: Energy Efficiency First: The foundation of a low-carbon society. European Council for an Energy Efficient Economy, 2011, pp. 1849–1860. ISBN: 978-91-633-4455-8.
[3] S.H. Choi and A.M.M. Chan. “A virtual prototyping system for rapid product development”. In: Computer-Aided Design 36.5 (2004), pp. 401–412. ISSN: 0010-4485. DOI: https://doi.org/10.1016/S0010-4485(03)00110-6. URL: https://www.sciencedirect.com/science/article/pii/S0010448503001106.
[4] Chetan Shukla, Michelle Vazquez, and F. Frank Chen. “Virtual manufacturing: An overview”. In: Computers & Industrial Engineering 31.1 (1996). Proceedings of the 19th International Conference on Computers and Industrial Engineering, pp. 79–82. ISSN: 0360-8352. DOI: https://doi.org/10.1016/0360-8352(96)00083-6. URL: https://www.sciencedirect.com/science/article/pii/0360835296000836.
[5] Microsoft. Windows Mixed Reality - Develop - Install the tools. Accessed 2023-01-13. URL: https://learn.microsoft.com/en-us/windows/mixed-reality/develop/install-the-tools#installation-checklist.
[6] Industrial automation systems and integration — Product data representation and exchange — Part 21: Implementation methods: Clear text encoding of the exchange structure. Standard. Geneva, CH: International Organization for Standardization, Mar. 2016.
[7] Paul Milgram and Fumio Kishino. “A taxonomy of mixed reality visual displays”. In: IEICE TRANSACTIONS on Information and Systems 77.12 (1994), pp. 1321–1329.
[8] Shakespearesmonkey. Dreams are the new Reality. Accessed February 27, 2023, Public Domain Mark 1.0. 2019. URL: https://www.flickr.com/photos/67592622@N00/48355780392.
[9] Alexei Vranich. Virtual model pumapunku. Accessed February 27, 2023, Creative Commons Attribution 4.0 International available at https://creativecommons.org/licenses/by/4.0/deed.en. 2018. URL: https://heritagesciencejournal.springeropen.com/articles/10.1186/s40494-018-0231-0.
[10] NASA/Chris Gunn. Exploring the Universe in Virtual Reality. Accessed February 27, 2023, Creative Commons Attribution 4.0 International available at https://creativecommons.org/licenses/by/4.0/deed.en. 2019. URL: https://go.nasa.gov/2RFSOLw.
[11] A. Tait. “Desktop virtual reality”. In: IEE Colloquium on Using Virtual Worlds. 1992, pp. 5/1–5/5.
[12] Frank Biocca and Ben Delaney. “Immersive virtual reality technology”. In: Communication in the age of virtual reality 15.32 (1995), pp. 10–5555.
[13] OyundariZorigtbaatar. Augmented-reality. Accessed February 27, 2023, Creative Commons Attribution-Share Alike 4.0 International available at https://creativecommons.org/licenses/by-sa/4.0/deed.en. 2016. URL: https://commons.wikimedia.org/wiki/File:Augmented-reality.jpg.
[14] Microsoft. What is mixed reality? Accessed January 2, 2023. 2022. URL: https://learn.microsoft.com/en-us/windows/mixed-reality/discover/mixed-reality.
[15] Miguel Borges, Andrew Symington, Brian Coltin, Trey Smith, and Rodrigo Ventura. “HTC Vive: Analysis and Accuracy Improvement”. In: 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). 2018, pp. 2610–2615. DOI: 10.1109/IROS.2018.8593707.
[16] Michael J Gourlay and Robert T Held. “Head-Mounted-Display Tracking for Augmented and Virtual Reality”. In: Information Display 33.1 (2017), pp. 6–10.
[17] Katharina Pentenrieder, Peter Meier, Gudrun Klinker, et al. “Analysis of tracking accuracy for single-camera square-marker-based tracking”. In: Proc. Dritter Workshop Virtuelle und Erweiterte Realität der GI-Fachgruppe VR/AR, Koblenz, Germany. Citeseer. 2006.
[18] Svitlana Alkhimova and Illia Davydovych. “Accuracy assessment of marker recognition using ultra wide angle camera”. In: Technology audit and production reserves 3.2/65 (2022), pp. 6–10.
[19] Soumitra P Sitole, Andrew K LaPre, and Frank C Sup. “Application and evaluation of lighthouse technology for precision motion capture”. In: IEEE Sensors Journal 20.15 (2020), pp. 8576–8585.
[20] Vazquez88. Motion Capture with Chad Phantom. Accessed February 28, 2023, Creative Commons Attribution-Share Alike 3.0 Unported available at https://creativecommons.org/licenses/by-sa/3.0/deed.en. URL: https://commons.wikimedia.org/wiki/File:Motion_Capture_with_Chad_Phantom.png.
[21] Apple Inc. Use the Measure app on your iPhone, iPad, or iPod touch. Accessed January 9, 2023. 2021. URL: https://support.apple.com/en-us/HT208924.
[22] Atomicdragon136. IOS measure app demonstration. Accessed February 27, 2023, Creative Commons Attribution 4.0 International available at https://creativecommons.org/licenses/by/4.0/deed.en. 2018. URL: https://commons.wikimedia.org/wiki/File:IOS_measure_app_demonstration.jpg.
[23] Clem Bastow. From Pokéstops to Pikachu: everything you need to know about Pokémon Go. Accessed January 9, 2023. 2016. URL: https://www.theguardian.com/technology/2016/jul/11/from-pokestops-to-pikachu-everything-you-need-to-know-about-pokemon-go.
[24] Google. Google Cardboard. Accessed January 9, 2023. URL: https://arvr.google.com/cardboard/.
[25] Ananda Bibek Ray and Suman Deb. “Smartphone Based Virtual Reality Systems in
Classroom Teaching — A Study on the Effects of Learning Outcome”. In: 2016 IEEE
Eighth International Conference on Technology for Education (T4E). 2016, pp. 68–71. DOI:
10.1109/T4E.2016.022.
[26] Kevin R. Brooks. “Depth Perception and the History of Three-Dimensional Art: Who
Produced the First Stereoscopic Images?” In: i-Perception 8.1 (2017). PMID: 28203349,
p. 2041669516680114. DOI: 10.1177/2041669516680114. URL:
https://doi.org/10.1177/2041669516680114.
[27] Michael Doneus and Klaus Hanke. “Anaglyph images-still a good way to look at
3D-objects”. In: Proceedings of the 17th CIPA Colloquium: Mapping and Preservation for
the New Millenium: 3-6 October 1999; Olinda, Brazil. 1999.
[28] NASA/JPL-Caltech. PIA17948: Martian Landscape With Rock Rows and Mount Sharp
(Stereo). [Accessed February 27, 2023]. 2014. URL:
https://photojournal.jpl.nasa.gov/catalog/PIA17948.
[29] A. Simon, M. G. Prager, S. Schwarz, M. Fritz, and H. Jorke. “Interference-filter-based
stereoscopic 3D LCD”. In: Journal of Information Display 11.1 (2010), pp. 24–27. DOI:
10.1080/15980316.2010.9652114.
[30] Boffy b. Master system 3d glasses. [Accessed February 27, 2023]. Creative Commons
Attribution 2.5 Generic, https://creativecommons.org/licenses/by/2.5/deed.en. 2006.
URL: https://commons.wikimedia.org/wiki/File:Master_system_3d_glasses.jpg.
[31] Handige Harrie. Polarised stereo glasses. [Accessed February 27, 2023]. 2010. URL:
https://commons.wikimedia.org/wiki/File:Polarised_stereo_glasses.JPG.
[32] Douglas Lanman, Gordon Wetzstein, Matthew Hirsch, Wolfgang Heidrich, and Ramesh
Raskar. “Beyond parallax barriers: applying formal optimization methods to multilayer
automultiscopic displays”. In: Stereoscopic Displays and Applications XXIII. Ed. by
Andrew J. Woods, Nicolas S. Holliman, and Gregg E. Favalora. Vol. 8288. International
Society for Optics and Photonics. SPIE, 2012, 82880A. DOI: 10.1117/12.907146. URL:
https://doi.org/10.1117/12.907146.
[33] R. Barry Johnson and Gary A. Jacobsen. “Advances in lenticular lens arrays for visual
display”. In: Current Developments in Lens Design and Optical Engineering VI. Ed. by
Pantazis Z. Mouroulis, Warren J. Smith, and R. Barry Johnson. Vol. 5874. International
Society for Optics and Photonics. SPIE, 2005, p. 587406. DOI: 10.1117/12.618082. URL:
https://doi.org/10.1117/12.618082.
[34] Cmglee. Parallax barrier vs lenticular screen. [Accessed February 27, 2023]. Creative
Commons Attribution-Share Alike 3.0 Unported, https://creativecommons.org/licenses/by-sa/3.0/deed.en.
URL: https://commons.wikimedia.org/wiki/File:Parallax_barrier_vs_lenticular_screen.svg.
[35] Lisa Rebenitsch and Charles Owen. “Estimating cybersickness from virtual reality
applications”. In: Virtual Reality 25.1 (2021), pp. 165–174.
[36] Yasin Farmani and Robert J Teather. “Evaluating discrete viewpoint control to reduce
cybersickness in virtual reality”. In: Virtual Reality 24.4 (2020), pp. 645–664.
[37] Introduction to the Mixed Reality Toolkit-Set Up Your Project and Use Hand Interaction.
[Accessed November 14, 2022]. URL:
https://learn.microsoft.com/en-us/training/modules/learn-mrtk-tutorials/.
[38] Software Libraries to Read, Write and Visualize 3D CAD files. [Accessed November 29,
2022]. URL: https://cadexchanger.com/products/sdk/.
[39] Mixed Reality Feature Tool. [Accessed November 30, 2022]. URL: https://aka.ms/MRFeatureTool.
[40] Microsoft. About HoloLens 2. [Accessed January 30, 2023]. 2022. URL:
https://learn.microsoft.com/en-us/hololens/hololens2-hardware.
[41] Unity Technologies. Pixyz Plugin. [Accessed January 26, 2023]. 2022. URL:
https://www.pixyz-software.com/plugin/.
[42] Unity Technologies. Licensing Policy. [Accessed January 26, 2023]. 2022. URL:
https://www.pixyz-software.com/documentations/html/2022.1/plugin4unity/LicensingPolicy.html.
[43] Unity Technologies. Pixyz Studio. [Accessed January 26, 2023]. 2022. URL:
https://www.pixyz-software.com/studio/.
[44] Unity Technologies. Pixyz Scenario Processor. [Accessed January 26, 2023]. 2022. URL:
https://www.pixyz-software.com/scenario-processor/.
[45] Andreas Atteneder. glTFast. [Accessed January 26, 2023]. 2022. URL:
https://github.com/atteneder/glTFast.
[46] Unity Technologies. Export Unity Prefab. [Accessed January 26, 2023]. 2022. URL:
https://pixyz-software.com/documentations/html/2022.1/scenarioprocessor/Workflowexamples.html.
[47] Unity Technologies. Pixyz Review. [Accessed January 30, 2023]. 2023. URL:
https://www.pixyz-software.com/review/.
[48] Siemens. NX Virtual Reality. [Accessed January 30, 2023]. URL:
https://www.plm.automation.siemens.com/global/en/products/mechanical-design/nx-virtual-reality.html.
[49] Microsoft. Create mixed reality photos and videos. [Accessed January 30, 2023]. 2022. URL:
https://learn.microsoft.com/en-us/hololens/holographic-photos-and-videos.