Solid Ar Unity

by giova984

Adding Occlusion Masks to Unity applications for HoloLens

SolidAR framework: Occlusion masks for Unity

The current trend in AR is to adopt optical see-through displays to combine the virtual world with the real world. The Microsoft HoloLens, the first fully self-contained AR system, is currently available, and other companies plan to release their own devices soon. In optical see-through (OST) HMDs, virtual images are combined with the real-world view by means of half-transparent mirrors or light-additive transparent displays. A shortcoming of the current generation of OST AR headsets is their inability to make virtual elements appear solid rather than transparent: the adopted displays are either not capable of real-world occlusion at all, or only partially capable at the cost of bulkier optics and a limited field of view. As a result, the augmented content is affected by real-world lighting and its dark elements appear transparent, so virtual objects are often perceived as translucent ghosts, which limits the usability of these devices in many situations.

Our approach exploits projector-based lighting control to add real-world occlusion capabilities to the Microsoft HoloLens headset. The idea is to mask the real-world surfaces that lie behind the augmented content, improving the AR imagery by projecting occlusion masks on top of them. To compute the occlusion mask, the geometries of both the real and the virtual environments must be known. The real environment can be modelled with CAD software or acquired by means of 3D scanners or RGB-D cameras. If the real environment is static, an initial acquisition of its geometry is sufficient; conversely, for dynamic scenes, the geometry needs to be continuously updated. The framework currently supports only static meshes imported in Unity; it is up to the developer to generate the 3D model of the environment.
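As a rough sketch of the idea (not the actual SolidAR implementation), the occlusion mask can be thought of as an image rendered from the projector's point of view in which the silhouettes of the virtual objects are black, so the projector does not illuminate the real surfaces behind the augmentation, while the rest of the real environment stays white and is lit normally. The script below illustrates this with plain Unity materials; the class name and its fields are hypothetical, and the mask polarity is our assumption of how the masking works.

```csharp
using UnityEngine;

// Conceptual sketch only (not the SolidAR code): build a projector-side
// occlusion mask by drawing the virtual content black and the real-environment
// mesh white, as seen from a camera that matches the projector's pose.
public class OcclusionMaskSketch : MonoBehaviour
{
    public Camera projectorCamera;      // camera aligned with the projector
    public Renderer[] virtualContent;   // augmented objects (the occluders)
    public Renderer[] realEnvironment;  // static mesh of the real scene

    void Start()
    {
        var unlit = Shader.Find("Unlit/Color");
        var black = new Material(unlit) { color = Color.black }; // no projector light behind virtual content
        var white = new Material(unlit) { color = Color.white }; // normally lit real surfaces

        foreach (var r in virtualContent) r.sharedMaterial = black;
        foreach (var r in realEnvironment) r.sharedMaterial = white;

        projectorCamera.clearFlags = CameraClearFlags.SolidColor;
        projectorCamera.backgroundColor = Color.white; // areas without geometry stay lit
    }
}
```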

The framework, which we call SolidAR, relies only on commodity hardware that developers and researchers can easily buy. Because the environment must be instrumented, the system is suitable only for laboratory setups, not for commercial installations, so our purpose is to release it freely to anyone interested in conducting studies in which this capability matters.

Minimum Hardware Requirements

  • a Microsoft HoloLens
  • a stereo projector (Optoma GT750 and GT1080 tested)
  • a pair of shutter glasses (with the sync signal connected to the graphics card or the projector) to be fixed in front of the HoloLens display (Optoma 3D-RF and ZF2300 tested)
  • a workstation with a graphics card that can drive stereo projectors (AMD FirePro and NVIDIA GTX 980M tested)

Software Requirements

  • Windows 10 OS installed on the workstation
  • Visual Studio 2015 or above
  • Unity 5.5.0f3 (latest version tested)
  • Configure the workstation to run stereo applications on the projector
  • Configure the workstation to develop Unity-based HoloLens applications (you can follow this guide)

Integrating occlusion masks in a Unity Application

The steps a developer needs to follow to integrate the lighting control into an AR application are the following:

  1. Import the SolidAR package into the Unity project.
  2. Add a HoloCamera prefab to the scene.
  3. Add one or more ProjectorCamera prefabs to the scene according to the actual system configuration, and configure each projector's intrinsic calibration.
  4. Configure the HoloSender script (attached to the HoloCamera) with the parameters of the network in use. This component streams the HoloLens tracking data.
  5. Configure the HoloReceiver script (attached to each ProjectorCamera) with the parameters of the network in use. This component receives the HoloLens tracking data and enables the computation of the occlusion masks.
  6. Assign the virtual objects to the RemoteScene layer (using the Unity layer mechanism); see the sketch after this list.
  7. Assign all the real objects to the LocalScene layer (using the Unity layer mechanism).
  8. Deploy the application to the HoloLens (set the UWP target platform).
  9. Deploy the same application to each workstation connected to one or more projectors.
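Steps 6 and 7 can be done entirely in the editor through the Layer dropdown of each GameObject; the sketch below shows an equivalent scripted setup. The SceneLayerSetup class and its fields are hypothetical helpers and not part of the SolidAR package; only the RemoteScene and LocalScene layer names come from the steps above, and both layers must already exist in the project's layer settings.

```csharp
using UnityEngine;

// Hypothetical helper (not part of SolidAR) illustrating steps 6 and 7:
// virtual content goes to the RemoteScene layer, real geometry to LocalScene.
public class SceneLayerSetup : MonoBehaviour
{
    public GameObject[] virtualObjects;  // augmented content shown on the HoloLens
    public GameObject[] realObjects;     // static meshes of the real environment

    void Awake()
    {
        int remote = LayerMask.NameToLayer("RemoteScene");
        int local  = LayerMask.NameToLayer("LocalScene");

        foreach (var go in virtualObjects) SetLayerRecursively(go, remote);
        foreach (var go in realObjects)    SetLayerRecursively(go, local);
    }

    static void SetLayerRecursively(GameObject go, int layer)
    {
        go.layer = layer;
        foreach (Transform child in go.transform)
            SetLayerRecursively(child.gameObject, layer);
    }
}
```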

Calibration

The projector's intrinsics can be computed with any of several methods (this is left to the user), and the resulting parameters can be entered in the CameraFrustum script attached to the ProjectorCamera.
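As an illustration of what such intrinsic parameters typically look like, the sketch below converts a pinhole calibration (focal lengths and principal point in pixels) into an OpenGL-style projection matrix assigned to a Unity camera. The script, its field names, and the example values are assumptions for illustration; the actual fields expected by the CameraFrustum script may differ, and the sign conventions depend on where your calibration places the image origin.

```csharp
using UnityEngine;

// Hedged sketch: turn pinhole intrinsics into a Unity projection matrix.
// Not the CameraFrustum implementation; conventions assume an OpenGL-style
// matrix with the image origin handled by the calibration tool.
[RequireComponent(typeof(Camera))]
public class ProjectorIntrinsicsSketch : MonoBehaviour
{
    public float fx = 1500f, fy = 1500f;     // focal lengths in pixels (example values)
    public float cx = 960f, cy = 540f;       // principal point in pixels (example values)
    public int width = 1920, height = 1080;  // projector resolution (example values)
    public float near = 0.1f, far = 100f;    // clipping planes

    void Start()
    {
        var p = Matrix4x4.zero;
        p[0, 0] = 2f * fx / width;
        p[1, 1] = 2f * fy / height;
        p[0, 2] = 1f - 2f * cx / width;
        p[1, 2] = 2f * cy / height - 1f;
        p[2, 2] = -(far + near) / (far - near);
        p[2, 3] = -2f * far * near / (far - near);
        p[3, 2] = -1f;
        GetComponent<Camera>().projectionMatrix = p;
    }
}
```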

Running the system

Run the applications on the HoloLens and on the PC, both connected to the same configured network. The HoloLens application listens for speech commands, while the PC application responds to keyboard commands; a sketch of how the speech commands could be wired up follows the speech command list below.

  • "Reference Calibration": start a manual root reference system procedure. A virtual object is shown on the HoloLens and must be aligned with the real counterpart. The calibration ends on air tap gesture or with the "Stop Calibration" speech command.

  • "Projector [1...8] Calibration": start a manual calibration of the chosen projector. A virtual projector object is shown on the HoloLens and must be aligned with the real counterpart. The calibration ends on air tap gesture or with the "Stop Calibration" speech command.

The calibrations are stored persistently on the HoloLens; however, they sometimes need to be performed again if the HoloLens tracking drifts between runs.

  • "Clear Calibration": reset the whole calibration.
  • "Show Calibration" show the calibration objects.
  • "Hide Calibration": hide the calibration objects.
  • "Quit Application": quit the HoloLens application.

  • key C: toggle visualization of calibration objects.

  • key O: toggle occlusion masks.
  • key S: toggle virtual shadows (shadows of virtual objects on the real scene).
  • key 0: activate manual refinement of root calibration.
  • key 1[...8]: activate manual refinement of the projector [1...8] calibration.
  • key 9: disable manual calibration refinement.
  • keys [Up, Down, Left, Right, PgUp, PgDown]: refine the active calibration by translating the selected calibration object in 3D space. The refined calibration is saved persistently on the HoloLens.
  • keys Shift + [Up, Down, Left, Right, PgUp, PgDown]: refine the active calibration by rotating the selected calibration object in 3D space. The refined calibration is saved persistently on the HoloLens (the sketch after this list shows the general idea).
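The refinement keys boil down to per-frame keyboard handling that nudges the currently selected calibration object. The sketch below shows the general idea only; the real SolidAR scripts, step sizes, and the way the refined pose is persisted on the HoloLens are not reproduced here.

```csharp
using UnityEngine;

// Hedged sketch of keyboard-driven calibration refinement on the PC application:
// arrows/PgUp/PgDn translate the active calibration object, Shift switches to rotation.
public class CalibrationRefinementSketch : MonoBehaviour
{
    public Transform activeCalibrationObject; // object selected with keys 0-8
    public float moveStep = 0.005f;           // metres per key press (assumed value)
    public float rotStep = 0.5f;              // degrees per key press (assumed value)

    void Update()
    {
        if (activeCalibrationObject == null) return;

        Vector3 dir = Vector3.zero;
        if (Input.GetKeyDown(KeyCode.UpArrow))    dir = Vector3.forward;
        if (Input.GetKeyDown(KeyCode.DownArrow))  dir = Vector3.back;
        if (Input.GetKeyDown(KeyCode.LeftArrow))  dir = Vector3.left;
        if (Input.GetKeyDown(KeyCode.RightArrow)) dir = Vector3.right;
        if (Input.GetKeyDown(KeyCode.PageUp))     dir = Vector3.up;
        if (Input.GetKeyDown(KeyCode.PageDown))   dir = Vector3.down;
        if (dir == Vector3.zero) return;

        bool rotate = Input.GetKey(KeyCode.LeftShift) || Input.GetKey(KeyCode.RightShift);
        if (rotate)
            activeCalibrationObject.Rotate(dir * rotStep, Space.World);
        else
            activeCalibrationObject.Translate(dir * moveStep, Space.World);

        // At this point the refined pose would be sent back and saved persistently.
    }
}
```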