Equipment’s Impact on Experience

From our observations, we noticed that the participants who used the Google Cardboard moved their heads more often, and with more pronounced movements, than the participants who used the Google Daydream headset. Since our experience placed interactive targets throughout the entire 360° space, all of the participants were fairly active while locating elements they could interact with.

The three participants who used the Google Cardboard headset had to move their heads a bit more to align the focus reticle on the target. For example, they moved their heads to close the pop-up displays. In fact, one participant thought that the large close X within the pop-up displays was too high and caused him to move his head to look upward. The participants who used the Google Daydream didn’t need to change their head position; instead, they each moved the hand that was holding the controller. Since the controller had alignment issues, one participant commented that he thought it was easier to just focus on the element with the Google Cardboard headset.

Equipment: Headset

The headsets are designed differently: the Google Cardboard doesn’t have any way of attaching to the user’s head, so the user needs to hold it up to their face in order to use it. The Google Daydream, on the other hand, has a single strap that holds the headset in place. We were curious whether participants would need to adjust the headsets while they were using them.

We observed that the participants using the Google Cardboard didn’t need to make as many adjustments to how the headset was positioned on their faces. Instead, they switched the hand that was holding the headset. Only one of the Google Cardboard users had a negative observation about holding the headset:

“I felt like my view with the [Google] Cardboard, I had to keep squeezing at different sides of my head. It's harder to keep the same field of view.”

A participant using the Google Daydream headset commented about the comfort of the headset:

“The Daydream did not seem significantly different from the actual quality of using my phone and Cardboard. But the feel was much more cushy. That felt good.”

We observed that most of the adjustments that the Google Daydream users made happened at the outset, taking anywhere from a few seconds to close to a minute. One participant, despite being given the option to adjust the strap, chose to hold onto the Google Daydream headset for nearly the entire interview.

Though users of the Google Cardboard needed to hold the headset up to their faces, this did not have a negative impact on their enjoyment. One participant commented:

“I wouldn’t need one of those headband situations to hold this [Google Cardboard headset]. I’m OK with holding it on my face.”

We observed that the three participants who used the Google Cardboard started off holding the headset with both hands, but as time passed they would hold it with only one hand and then switch it from hand to hand. One participant switched the hand holding the headset 17 times during the 30 minutes he spent viewing Version A.

All of our users who wore glasses during the session had to do some readjusting. They noted that the glasses didn’t impact comfort; both the Google Cardboard and the Google Daydream seemed to fit around them comfortably. Two users of the Daydream headset noted readability issues with the text. In each case, an adjustment to the Daydream strap made the text more readable for them.

One of our participants wore trifocals; the prescription in these glasses changes from the top to the bottom of the lens to accommodate viewing things at a distance and close up. The participant accommodated this variation by tilting her head to adjust which part of the lens she looked through. VR experiences are designed for distance viewing, so participants who wear bifocals, trifocals, or glasses for nearsightedness may get better focus by removing their glasses.

Another participant commented that the photos were out of focus when he peered over the top of his glasses. To compensate, he needed to tilt his head more to view the experience through his glasses. Glasses did not seem to limit enjoyment for the participants who needed to wear them, but those participants needed extra time to get the glasses comfortable under the headset and had to tilt their heads more to keep the experience in focus.

Equipment: Hand controller

We did not ask directly what the participants thought of the Google Daydream hand controller; instead we listened to their comments to learn how a hand controller influenced their experiences. Six of our participants used the Google Daydream headset and controller. Three commented about the hand controller losing alignment with their position over time.

"It seemed like the remote was a little — not 100% accurate. Delay or it felt like I was pointing somewhere, but it wasn't always exactly where I was wanting."

It should be noted that there is a two-step process to initiate the Google Daydream controller for WebVR. When participants first put the Pixel phone into the headset, they were required to initiate the controller to establish a connection. They did this by pointing the controller directly in front of them and pressing the touchpad area of the controller, and they needed to repeat this each time they put the Pixel into the headset. For our WebVR experience built with A-Frame, there was an additional initiation step: once participants loaded the experience in the browser, they needed to click the touchpad area before the cursor would be displayed. There was no instruction screen letting participants know that they would need to complete this second step.

When creating the usability study, we made sure to include instructions for the two-step process to help participants avoid accidentally clicking on an interactive item in the experience. In spite of the instructions, two participants accidentally clicked on an interactive target and were surprised when the experience changed. As long as this second step is needed, we recommend starting WebVR experiences with an instruction screen; the click that initializes the controller can then take users into the experience.
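
To make that recommendation concrete, here is a minimal sketch of an A-Frame page, not the Four Kitchens build: the component name, entity IDs, copy, and A-Frame version are our own illustrative choices, and it assumes the touchpad press reaches the page as A-Frame’s trackpaddown event. The scene opens on an instruction panel and dismisses it on the first touchpad press, so the click that wakes the cursor cannot also trigger a target.

```html
<!doctype html>
<html>
  <head>
    <!-- A-Frame version is illustrative. -->
    <script src="https://aframe.io/releases/0.8.0/aframe.min.js"></script>
    <script>
      // Hypothetical component: hide the instruction panel on the first
      // touchpad press so that press only "wakes" the cursor.
      AFRAME.registerComponent('start-gate', {
        init: function () {
          var panel = document.querySelector('#instructions');
          this.el.addEventListener('trackpaddown', function onStart () {
            panel.setAttribute('visible', false);
            this.removeEventListener('trackpaddown', onStart);
          });
        }
      });
    </script>
  </head>
  <body>
    <a-scene>
      <a-sky color="#445566"></a-sky>

      <!-- Daydream controller with a laser cursor; start-gate watches its touchpad. -->
      <a-entity laser-controls="hand: right" start-gate></a-entity>

      <!-- Instruction screen shown before the 360° content and its targets. -->
      <a-entity id="instructions" position="0 1.6 -2">
        <a-plane color="#111" width="2.2" height="1"></a-plane>
        <a-text value="Press the controller touchpad once to show your cursor"
                align="center" position="0 0 0.01" width="2"></a-text>
      </a-entity>
    </a-scene>
  </body>
</html>
```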

Hand controller interactions

The VR experience that Four Kitchens built had limited interactions that used only the touchpad area of the controller, with two options (both are sketched after the list):

  • Fuse with a link to move to a new location.

  • Fuse with a link to display a pop-up window, then close it by fusing on an “X”.
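
A minimal sketch of those two interactions might look like the markup below. The IDs, image filenames, fuse delay, and the simple “swap the sky image” approach to changing locations are placeholders, not the production markup, and it assumes a page that already loads A-Frame as in the earlier sketch.

```html
<a-scene>
  <!-- Placeholder 360° photo; the real experience loads its own assets. -->
  <a-sky id="photo" src="room-a.jpg"></a-sky>

  <!-- Gaze cursor that "fuses" (auto-clicks) after resting on a target. -->
  <a-camera>
    <a-cursor fuse="true" fuse-timeout="1500"></a-cursor>
  </a-camera>

  <!-- 1. Link that moves the visitor to a new location. -->
  <a-circle id="go-elsewhere" position="-1 1.6 -3" radius="0.2" color="#4CC3D9"></a-circle>

  <!-- 2. Link that opens a pop-up; a large "X" target closes it. -->
  <a-circle id="open-popup" position="1 1.6 -3" radius="0.2" color="#EF2D5E"></a-circle>
  <a-entity id="popup" visible="false" position="0 1.8 -2.5">
    <a-plane color="#FFF" width="1.6" height="0.9"></a-plane>
    <a-text value="Pop-up details go here" color="#000" align="center"
            position="0 0 0.01" width="1.4"></a-text>
    <a-plane id="close-popup" color="#DDD" width="0.25" height="0.25"
             position="0.65 0.3 0.02">
      <a-text value="X" color="#000" align="center" position="0 0 0.01"></a-text>
    </a-plane>
  </a-entity>
</a-scene>

<script>
  // Wire up the three fuse targets; a production build would more likely use
  // registered components than ad-hoc listeners.
  document.querySelector('#go-elsewhere').addEventListener('click', function () {
    document.querySelector('#photo').setAttribute('src', 'room-b.jpg');
  });
  document.querySelector('#open-popup').addEventListener('click', function () {
    document.querySelector('#popup').setAttribute('visible', true);
  });
  document.querySelector('#close-popup').addEventListener('click', function () {
    document.querySelector('#popup').setAttribute('visible', false);
  });
</script>
```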

With these limited interactions, it was easy for users of both the Google Cardboard and the Google Daydream to navigate the experience. We did not observe any noticeable difference in experience between the participants who used the Google Cardboard and those who used the Google Daydream. One participant, who had previously used a cardboard headset but used the Google Daydream for our session, commented that he would have preferred the Google Cardboard for this experience. He thought it would be easier to just look at what he wanted to interact with than to use the controller.

Since the interactions were minimal, the experience was built to use only the touchpad area of the controller. One participant accidentally clicked the middle button, the “App” button. Because this button was not mapped to anything in our experience, it took her out of the browser window running the VR experience, which surprised her. We recommend deactivating or reprogramming the “App” button if you don’t plan on using it, so that accidental clicks do not disrupt the experience.

In a later section of this study, we discuss using a home screen for a menu of navigation options. One participant suggested that instead of using a home screen for the location options, a menu toward the bottom of the 360° photo should be used. During the analysis, the UX strategists, designers, and developers discussed this idea. We were concerned about where to place the menu so that it would be natural for users to find but not in the way of the experience. Future projects should consider whether there is a location that would work universally for 360° photos.

When we talked with Patrick Coffey, the lead developer for the WebVR experience, about the participant who clicked the middle button, he mentioned that he could assign functionality to it. For example, it is possible to make a menu of location options appear when the “App” button is pressed. We recommend a comparative study between using a home screen for navigation and using the “App” button to display a navigation menu, to see whether users prefer one option or find it easier than the other.
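
As a rough sketch of that idea only: it assumes the App button press actually reaches the page (A-Frame’s daydream-controls component maps that button as “menu”, so the “menudown” event name below is our assumption, as are the entity IDs and the placeholder location names). It is not the production setup.

```html
<script>
  // Hypothetical component: toggle the location menu when the App button
  // ("menu" in A-Frame's daydream-controls mapping) is pressed.
  AFRAME.registerComponent('menu-on-app-button', {
    init: function () {
      var menu = document.querySelector('#location-menu');
      this.el.addEventListener('menudown', function () {
        menu.setAttribute('visible', !menu.getAttribute('visible'));
      });
    }
  });
</script>

<a-entity daydream-controls menu-on-app-button></a-entity>

<!-- Panel of location links, hidden until the App button toggles it on. -->
<a-entity id="location-menu" visible="false" position="0 1.2 -2">
  <a-plane color="#222" width="1.8" height="0.8"></a-plane>
  <a-text value="Game room  |  Library  |  Kitchen" align="center"
          position="0 0 0.01" width="1.6"></a-text>
</a-entity>
```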

As participants explored the experience, they also attempted other actions with the controller. Though our experience did not support any additional functionality, the following would be interesting topics to explore in additional research (a rough sketch of both ideas follows the list):

  • Hover to display additional information on the navigational links.

"I wish it would say when I'm hovering over it, I'm pointing-I'm just circling within the footsteps icon in the game room now, and I kind of wish it would say where that was leading."

  • Toggle on/off functionality, especially for audio elements. For another WebVR experience, Successful Farming, Four Kitchens added the ability to toggle audio on and off.

"I tried to click it again, the same icon. Then I tried to click away from it. I clicked somewhere else. Now it-as far as I can tell, I don't know how to turn it off, if I wanted to-if I decided I didn't want to listen to it."
