# Measuring Quality of Experience using Augmented Reality in Training Systems
Electronic Systems Design and Innovation, Specialization Project
This repository contains the specialization project carried out by Peter Remøy Paulsen in the fall of 2020 as part of his master's degree in electronic engineering.
We took a deeper look at realism in AR and investigated how the level of occlusion affects the Quality of Experience (QoE) for the user of an augmented reality system.
To test whether occlusion makes a difference, we built an augmented reality app using Unity and ARCore.
In the app, each participant was exposed to either environment 1 or environment 2:

| Environment | Description |
|---|---|
| Environment 1 | Occlusion component from the ARCore API enabled, so the app captures depth data and virtual objects are occluded by real-world objects. |
| Environment 2 | Occlusion component disabled. No depth data is captured, so virtual objects float on top of real objects. |
*(Side-by-side screenshots of Environment 1 and Environment 2.)*
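As a rough sketch of how the two environments differ in code (assuming the project uses AR Foundation's `AROcclusionManager` on top of ARCore; the actual component name and settings in this project may differ), toggling occlusion looks roughly like this:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hypothetical helper: switches between the two test environments by
// enabling or disabling environment-depth occlusion at runtime.
public class OcclusionToggle : MonoBehaviour
{
    // The AROcclusionManager normally sits on the AR camera object.
    [SerializeField] private AROcclusionManager occlusionManager;

    // Environment 1: depth data is captured and virtual objects
    // are occluded by real-world geometry.
    public void EnableOcclusion()
    {
        occlusionManager.requestedEnvironmentDepthMode = EnvironmentDepthMode.Best;
    }

    // Environment 2: no depth data, so virtual objects render on top
    // of everything in the camera image.
    public void DisableOcclusion()
    {
        occlusionManager.requestedEnvironmentDepthMode = EnvironmentDepthMode.Disabled;
    }
}
```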
The pre-compiled APK can be found in the releases.
## Compile on your own
You can compile the project on your own in Unity. Download the project and import it into your Unity installation:

```shell
git clone https://github.com/petrepa/TFE4580.git
```

or, over SSH:

```shell
git clone git@github.com:petrepa/TFE4580.git
```
Further documentation can be found in the Wiki.
- Researcher and Developer: Peter Remøy Paulsen 👨‍🎓
- Supervisor: Andrew Perkis 👨‍🏫
- Co-supervisor: Shafaq Irshad 👩‍🏫
Further documentation is needed for:

- Experimental Application Overview
  - With GIFs demonstrating the app
- TL;DR of the report
  - Should also link the entire report
- Data analysis
  - The Jupyter Notebook
- Casual notes
- The wiki