As part of our virtual reality story, IAN, we needed a way to display information that helps the player learn about who they are. This takes the form of journals and log entries, as well as memories IAN has of conversations between researchers. We wanted this information available in both text and audio form so the player can absorb it more easily.
For this, we had to design a UI that could be interacted with in 3D space, ideally controlled using the player's hands. As such, I created the initial design for the UI below.
This initial UI was created in InkScape, a vector drawing program. Using a vector program allowed me to easily adjust the different elements later if required, including resizing, repositioning, and changing curves and lines. The final image was exported at a resolution of 1280×720.
For the design of the UI, I wanted a holographic feel, as it was meant to be projected above the player's hand from a device. As the lab we were designing had a very clean look, the UI was designed to contrast against it so it is more easily readable (in this case, the dark grey background of the UI stands out against the player's white surroundings). A solid black background was not used because it can feel overpowering, and we are generally more used to perceiving near-black colours.
The blue outlines of the UI sections help break the information up into relevant areas via proximity, grouping related items together. Each section also has a header so the player can understand what it is for.
I designed the buttons to be larger than standard button sizes to account for the fact that they are pressed with fingers, which are much larger than a mouse cursor. The style of the buttons fits the overall theme of the UI, with rounded edges everywhere except the left side of the play button, which has sharp corners that follow the outline.
The buttons change colour once they have been pushed, indicating to the player that they have been successfully pressed. They currently don't play a sound; for the play/stop buttons, however, the feedback is that the audio log associated with the entry starts or stops playing. The orange colour contrasts with the blue and, being bright, indicates that the button is active.
The text elements of the UI are all capitals, which is more indicative of a title or header font. The same font was used for the buttons and the section headers, as a large font was needed to be legible in the Rift, whose low screen resolution makes text harder to read. The text on each section sits at the top to indicate that the content below relates to it, and the button text is centred inside the button, which is the standard convention for buttons.
Testing the design
Before we started doing playability and usability testing, we implemented the UI in the game ourselves and attempted to use it, to see if there were any glaring issues we could spot and fix before testing. We came across a number of usability issues relating to hand tracking, as well as the placement of some elements of the UI.
The first and most noticeable issue was that when your hands overlapped, the Leap Motion had a hard time tracking them, often causing one hand to disappear or be placed in the wrong position. This caused the hand the UI is mounted on to vanish, taking the UI with it. In some cases it made the UI buttons impossible to click: although both hands were tracked, the hand attempting to push a button was tracked so poorly that it never reached the button, or pushed the wrong one.
Our solution was to mount the UI to the left of the player's hand instead of above it, as we had originally planned. This has the advantage that the two hands never need to overlap, preventing the hand-tracking failure.
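In code, this amounts to offsetting the UI's anchor along the palm's right vector rather than its up vector. A minimal Python sketch of the idea (the function and vector names are illustrative, not the Leap Motion API):

```python
def ui_anchor(palm_position, palm_right, offset=0.25):
    """Place the UI a fixed distance to the player's left of the palm.

    palm_position: (x, y, z) of the tracked palm
    palm_right: unit vector pointing to the palm's right
    offset: lateral distance (in metres, hypothetical value)
    """
    # Subtracting along the palm's right vector shifts the UI to the left,
    # so the free hand never has to cross over the mounting hand.
    return tuple(p - offset * r for p, r in zip(palm_position, palm_right))
```

Mounting above the hand would instead offset along the palm's up vector, which is exactly what forced the hands to overlap.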
However, we also found that having separate start and stop buttons was an unusual design: while audio was playing, you could still click the play button and it would do nothing, and the same was true for the stop button, which felt like an inefficient use of the space. The two buttons' close proximity also meant you could still accidentally click the wrong one, and your hand could block the player's view of the UI.
To fix this, the two buttons were combined into one larger button labelled "audio" that toggles playback of the log file. The larger design makes the button easier to click whilst keeping its purpose obvious.
The updated design is much easier to interact with thanks to the new positioning beside the hand. The single large button is also much easier to push than the two separate ones from before.
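The toggle behaviour of the combined button can be sketched roughly as follows (a hypothetical Python outline, not our actual Unity code; `player` is assumed to expose `play()` and `stop()`):

```python
class AudioToggleButton:
    """Single "audio" button that toggles playback, replacing the
    separate play/stop buttons from the first design."""

    def __init__(self, player):
        self.player = player   # anything with .play() / .stop()
        self.active = False    # also drives the orange "pressed" colour

    def press(self):
        # One button, two meanings: every press does something useful,
        # unlike a play button pressed while audio is already playing.
        if self.active:
            self.player.stop()
        else:
            self.player.play()
        self.active = not self.active
        return self.active
```

Because the button's state mirrors the playback state, the colour change and the audio starting or stopping always agree, which is the feedback the player relies on.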
Now that I had a functioning UI that I felt worked well enough to be used in the story, it needed to be properly tested in a usability test. I will cover the in-depth details and feedback from that test in another blog post; for now, I will give a quick overview of the feedback directly relating to the UI, as well as any changes that were made.
Participants liked the UI, commenting that they enjoyed its style and interacting with it. However, there were mentions that the UI wasn't persistent enough: a hand-tracking failure could make it disappear with no way to bring it back. I addressed this by changing how failed hand tracking is handled; the UI now has a longer re-track period (how long it keeps looking for the hand), though it still needs improvement. Currently there is no way to bring up the UI without touching another interactable object, but this could be solved with gestures.
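The longer re-track period effectively works as a grace window: the UI only hides once the hand has been untracked for longer than a threshold, rather than the instant tracking drops. A rough Python sketch of the idea (the class, times, and names are illustrative, not our implementation):

```python
class UIVisibility:
    """Keep the UI visible for a grace period after hand tracking is
    lost, instead of hiding it the moment the Leap loses the hand."""

    def __init__(self, grace_period=2.0):
        self.grace_period = grace_period  # seconds (hypothetical value)
        self.time_lost = None             # when tracking last dropped out

    def update(self, hand_tracked, now):
        """Call once per frame; returns True while the UI should show."""
        if hand_tracked:
            self.time_lost = None
            return True
        if self.time_lost is None:
            self.time_lost = now
        # Still within the re-track window: keep the UI up and keep
        # looking for the hand.
        return (now - self.time_lost) < self.grace_period
```

Brief tracking dropouts then cause no visible flicker at all; only a sustained loss hides the UI, which is when a recall gesture would come in.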
Unfortunately, most of the issues people had interacting with the UI were software issues, which greatly reduced the experience (the test was conducted on the 2.3 SDK, where hand tracking is less accurate than in the Orion SDK). Sometimes the hands would disappear at random or act erratically, causing invalid button presses or making the log scroll. The main solution is to require the latest version of the SDK, which greatly improves hand tracking.
Another issue related to the DK2: half of the testers could not read the text. This was partly caused by the DK2 not being properly calibrated to the user's eyes (the lenses were set at the furthest-back position), as well as by the low resolution of the screen itself. It can be mitigated by making the text larger, which I did as a result of the feedback.
Overall, the reaction to the final UI was very positive, despite issues mostly relating to hardware or drivers. The UI went through several iterations based on our own internal testing as well as the feedback from the usability test. I believe the UI makes good use of the CRAP principles, with stark contrast between button states, a clear layout, and related areas grouped together. It is fully functional and, although it could do with a little polish, I am very happy with the results.