SALTE (Unity part)
This is the VR listening test interface of the SALTE framework.
Built with Unity 2019.2.3f1.
This part of the framework is still under development.
We are aiming to publish version 1.0 early next year (please “Watch” our project for the latest updates).
We would like to hear your feedback and welcome your contributions to this project.
1. Building VR listening tests without prior knowledge of VR game engines (e.g. Unity or Unreal)
- No prior knowledge of Unity or C# required
- Drag and drop interface
- Well documented with templates and tutorial available
2. Open Source
- Know the back-end
- Flexible (easy to modify)
- Everyone is welcome to contribute
3. Standardise the workflow
- Standardised data
- Stable and easy to deploy
- Repeatable tests = good science
- Comparable results
- Easy to expand the scale of the test
4. Share tests and results
- One person’s noise is another person’s data
- No need to build or repeat similar tests
- Share tests easily (with a single .JSON file)
- Big data era for machine learning
5. Easy to use and ergonomic interface
- Intuitive design (reduces training time)
- Improved test speed
- Better interface = longer tests possible
Make sure you have set up your VR headset.
(The interface has been tested with the Oculus Rift and Oculus Rift S.)
Open demo scene
- Download and start the SALTE renderer
- Open Unity (make sure it is on Unity 2019.2.3f1)
- Load the SALTE-VR-interface/Listening_TESTS_VR/Assets/Scenes/NYC.unity scene
- Press the Begin button in the renderer
- Have fun
- You will see a ray cast from the right-hand controller
- Press the index trigger to “press” any button
- Move the joystick up or down to adjust a slider
- More control options will be added later
- Creates a .JSON file that consolidates all the data for the listening test
- For people who are not familiar with coding or editing .JSON files
- Drag and drop interface
- More flexible (allows researchers to change the background, interface and control method)
- A safer way to create listening tests
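For illustration, a shared test definition could look roughly like the following. Every field name below is hypothetical; the actual .JSON schema is defined by the SALTE test creator and may differ:

```json
{
  "testName": "Example listening test",
  "scene": "NYC",
  "trials": [
    {
      "reference": "stimuli/reference.wav",
      "conditions": ["stimuli/conditionA.wav", "stimuli/conditionB.wav"]
    }
  ]
}
```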
1. Video tutorials
2. Localisation tests
3. 360 videos playback
4. 6 degrees of freedom (6 DoF)
5. More options for the VR interface and control method
Features 1 and 2 will be included in version 1.0 (to be published early next year).
Sneak peek: Localisation test
- Demoed at the 2019 AES International Conference on Immersive and Interactive Audio (March 27-29, 2019)
Localisation test with head pointing
- Game-like interface (easy to embed into games in the future)
- Competitive gaming experience
- Externalisation response given by changing the distance of the red dot (optional)
- Researchers can collect participants’ localisation and externalisation responses along with head movement data
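To give a sense of the data involved, the sketch below converts a pointing direction vector into azimuth and elevation angles. It assumes Unity's left-handed convention (+x right, +y up, +z forward) and is an illustration, not the framework's own code:

```python
import math

def direction_to_angles(x, y, z):
    """Convert a head/hand pointing direction vector to (azimuth, elevation)
    in degrees.

    Assumes Unity's convention: +x right, +y up, +z forward.
    Azimuth is measured clockwise from straight ahead (+z);
    elevation is positive upwards.
    """
    norm = math.sqrt(x * x + y * y + z * z)
    azimuth = math.degrees(math.atan2(x, z))
    elevation = math.degrees(math.asin(y / norm))
    return azimuth, elevation

# Pointing straight ahead gives (0.0, 0.0):
print(direction_to_angles(0.0, 0.0, 1.0))
```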
Head pointing localisation test demo (Demoed in AES IIA 2019)
- The right-hand side is the participant’s view in VR.
- The left-hand side is for researchers to observe the participant’s real-time response; the purple dot is the target sound source location, which is hidden in VR.
MATLAB app for post-processing localisation test data
- Plot multiple head movement traces and responses
- Extract and plot data based on target angle, response angle, error, HRTF set, etc.
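As a minimal sketch of the kind of error metric such post-processing computes (the MATLAB app's exact method is not shown here), a signed azimuth error can be wrapped into (-180°, 180°] so that errors across the 0°/360° boundary are handled correctly:

```python
def angular_error(target_deg, response_deg):
    """Signed azimuth error (response - target) wrapped to (-180, 180] degrees."""
    err = (response_deg - target_deg) % 360.0
    if err > 180.0:
        err -= 360.0
    return err

print(angular_error(350.0, 10.0))   # 20.0 (wraps across 0 degrees)
print(angular_error(10.0, 350.0))   # -20.0
```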
Hand-pointing method with head-tracked grid (New)
Our E-Brief and poster presented at the AES New York Convention 2019