'Injustice' trailer, edited by Yeongmin Won
The Unity editor viewport offers far more space for interface elements than the VR headset's field of view allows.
Because of this limited field of view, I had to put on the headset to verify every UI change I made.
1. Most users felt they had enough time to make choices.
2&3. They also understood the audio and visual feedback almost immediately, with no learning curve required.
4. Interestingly, most users liked both the eye tracking and the voice recognition, so we decided to use both interactions in our final experience.
Our experience uses two types of user interface:
1. A gaze-interaction UI that appears when users look at certain objects.
2. A voice-recognition UI that displays dialog trees whose options users choose by speaking.
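Gaze interaction of the kind described above is often implemented with a dwell timer: a selection fires once the user's gaze has rested on one object for a threshold duration. The sketch below is a minimal, engine-agnostic illustration of that idea; the class name and the dwell threshold are assumptions for this example, not the project's actual Unity code.

```python
class GazeDwellSelector:
    """Fires a selection after the gaze rests on a single target long enough."""

    def __init__(self, dwell_seconds=1.0):
        # Assumed threshold; in practice this is tuned through user testing.
        self.dwell_seconds = dwell_seconds
        self._target = None
        self._elapsed = 0.0

    def update(self, gazed_target, dt):
        """Call once per frame with the currently gazed object (or None).

        Returns the target when the dwell threshold is reached, else None.
        """
        if gazed_target != self._target:
            # Gaze moved to a new object (or away): restart the timer.
            self._target = gazed_target
            self._elapsed = 0.0
            return None
        if gazed_target is None:
            return None
        self._elapsed += dt
        if self._elapsed >= self.dwell_seconds:
            # Reset so the selection does not re-fire on every following frame.
            self._elapsed = 0.0
            return gazed_target
        return None
```

In a game loop running at 60 fps, `update` would be called each frame with `dt = 1/60`; looking away before the threshold resets the timer, which is what gives the user the "enough time to make choices" feel noted in the test results.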