Best Practices and Virtual Reality Design Principles

Designing virtual reality (VR) experiences differs from designing other types of applications. The immersive nature of virtual reality presents a whole new set of challenges. Keep the following points and best practices in mind when designing for VR.
Virtual Reality Design: Giving the User Control
One of the basic principles of virtual reality is to give users control over their surroundings. In real life, users are in complete control of how they move and perceive the world around them. When users "lose control" in VR, their movements and their perception of the world around them are no longer aligned. The resulting feeling can be similar to being dizzy or drunk, and is commonly referred to as simulator sickness.
Warning
Simulator sickness should be avoided at all costs. Users hate it, and it will drive them away from your VR product. You want to make sure that your users always feel in control. Their movements should always be mirrored by movement within the virtual environment. In addition, you should not take control away from the user: don't move the user without their actions triggering that movement.
Also, do not rotate or change the position of the user's view of the virtual environment for them. If repositioning is required, it is recommended that the view fade to black for a moment, then fade back up to the repositioned environment. Although not ideal, fading to black (triggered by a user action, of course) and back can be a way to reposition the user's environment without the user feeling as if they have given up control.
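The fade-and-reposition pattern above can be sketched as a tiny state machine. This is a minimal, engine-agnostic Python sketch; the class name, the timings, and the `reposition` callback are all illustrative assumptions, not any particular SDK's API.

```python
class FadeReposition:
    """Fade the view to black, reposition the user, then fade back in."""

    def __init__(self, fade_out=0.3, fade_in=0.3):
        self.fade_out, self.fade_in = fade_out, fade_in  # seconds (assumed)
        self.t = 0.0
        self.repositioned = False

    def update(self, dt, reposition):
        """Advance by dt seconds; call reposition() exactly once, at full
        black. Returns the overlay alpha (0.0 = clear, 1.0 = black)."""
        self.t += dt
        if self.t < self.fade_out:
            return self.t / self.fade_out          # fading out
        if not self.repositioned:
            reposition()                           # move camera while black
            self.repositioned = True
        if self.t < self.fade_out + self.fade_in:
            return 1.0 - (self.t - self.fade_out) / self.fade_in  # fading in
        return 0.0                                 # transition complete
```

Each frame you would call `update(dt, reposition)` and draw a black overlay at the returned alpha; the camera move happens exactly once, while the screen is fully black, so the user never sees the world slide under them.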
Understanding movement in virtual reality experiences
Motion in VR is a problem that has not yet been fully solved. One of the strengths of VR is the ability to create compelling environments that users want to explore. But it doesn't matter how attractive the environment is if the user can't move around to explore it.

If your experience is anything more than a single static scene, you need to enable users to navigate your space. You can let the user move forward with a standard non-VR method, such as pushing a joystick, but that kind of movement can cause nausea: it tends to create a feeling of acceleration, which in turn can trigger simulator sickness.
Hint
"When adding motion to a VR app, ask yourself how the motion improves the user's experience. Unnecessary motion can be overwhelming for users. Focusing on the value that motion adds to the experience can help enhance a VR app."
Many applications find ways to place users on some kind of vehicle or platform, and then move the platform itself instead of the user. This can help mitigate some potential simulator-sickness issues, especially if the user remains seated.
For room-scale virtual reality experiences, "teleportation" is one of the current standards for seamlessly moving users over great distances in virtual worlds. The user aims at the place they want to go, presses a button or trigger of some kind to select the target destination, and the application then teleports them there.
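A simple way to pick the teleport destination described above is to intersect the controller's aim ray with the floor. A minimal sketch, assuming a flat ground plane at a fixed height; the function and parameter names are illustrative, not any SDK's API:

```python
def teleport_target(origin, direction, ground_y=0.0):
    """Intersect the controller's aim ray with a flat ground plane at
    y == ground_y. origin/direction are (x, y, z) tuples. Returns the
    teleport destination, or None if the user aims at or above the horizon."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dy >= 0:
        return None                    # pointing level or upward: no floor hit
    t = (ground_y - oy) / dy           # ray parameter where y reaches the floor
    return (ox + t * dx, ground_y, oz + t * dz)
```

A real implementation would raycast against actual scene geometry and validate that the destination is standable, but the core aim-then-select flow is the same.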
Locomotion is very much a cutting-edge area of virtual reality practice, and it requires a lot of exploration to find what works best for your app. Application developers implement and refine the teleportation mechanic in several ways.
Robo Recall, an Oculus Rift game, enables the user to specify which direction they will face when they arrive at their teleportation destination, rather than simply teleporting them to the location facing whatever direction they are currently looking. Budget Cuts, a game by Neat Corp, gives the user the ability to peek into their destination before teleporting there, eliminating the disorientation that can often occur when the user instantly switches to a new location.
Teleportation is not the only way to get around. Many apps offer users a standard "walking" motion. Smooth locomotion, gliding through virtual environments without jerky acceleration, can retain some of the immersion of a conventional way of moving while reducing some potential simulator-sickness triggers.
Other solutions for mobility within a limited space are also being explored. Saccade-driven redirected walking is a way to steer users away from real-world obstacles, allowing them to traverse large virtual scenes within a small physical space. In saccade redirection, the virtual scene is rotated slightly in a way that is imperceptible to the user, causing the user to adjust their gait slightly in response to the scene changes. Using this method, the user might believe they are walking in a straight line in the digital world while, in the physical world, they are actually being guided along a more circular path.
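The effect of an injected rotation gain can be illustrated numerically. The sketch below is a toy 2D simulation, not an actual redirected-walking implementation: it assumes the user compensates for a small per-step scene rotation (applied during saccades) by turning the same amount, which bends their real-world path into an arc. All numbers are illustrative.

```python
import math

def physical_path(steps, step_len=0.7, gain_deg=1.0):
    """Simulate a user who walks 'straight ahead' in the virtual world while
    a small rotation (gain_deg per step) is injected into the scene. The
    user's compensating turns curve their real-world (x, z) path."""
    x, z, heading = 0.0, 0.0, 0.0
    path = [(x, z)]
    for _ in range(steps):
        heading += math.radians(gain_deg)   # user's unconscious correction
        x += step_len * math.sin(heading)
        z += step_len * math.cos(heading)
        path.append((x, z))
    return path
```

With these toy numbers (1 degree per 0.7 m step), the physical path closes into a circle of roughly 40 m radius, which also illustrates why imperceptible gains alone still require a fairly large physical space.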
Large-scale movement in virtual reality is a mechanic that has not yet been fully solved. Teleportation is often used, but it is only one of many possible solutions. If your app requires movement, look at other apps and the ways they move users, and consider what makes sense for your own. You may be the one to create a new standard of motion for virtual reality experiences!
VR design: Provide user feedback
In the real world, a person's actions are usually met with some kind of feedback, visual or otherwise. Even with your eyes closed, touching a hot stove provides the haptic feedback of a burning sensation. Catch a thrown ball and you feel the slap of the ball in your palm and its weight in your hand. Even something as simple as turning a doorknob or tapping a key on a computer keyboard provides haptic feedback to your nervous system.
VR doesn't yet have a way to fully replicate haptic feedback, but you can still find ways to provide feedback to the user. If it's available on the VR device you're targeting, haptic feedback (via controller vibrations or the like) can help improve an immersive user experience. Audio can also help notify the user that actions have occurred (when the user clicks a button, for example). Providing these audio and tactile cues alongside the visuals can make your VR environments feel more convincing and helps notify the user when actions occur.
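Wiring these cues together can be as simple as firing every available channel from a single interaction handler. A hypothetical sketch; the `haptics` and `audio` callables stand in for whatever your VR SDK actually provides, and the pulse parameters are assumptions to tune:

```python
def on_button_pressed(haptics=None, audio=None):
    """Fire multimodal feedback for a UI button press. Each channel is
    optional, so the handler degrades gracefully on devices without it.
    Returns the list of channels that fired (useful for testing/logging)."""
    fired = []
    if haptics is not None:
        haptics(amplitude=0.5, duration_ms=20)  # short, subtle pulse
        fired.append("haptic")
    if audio is not None:
        audio("ui_click")                       # audible press confirmation
        fired.append("audio")
    return fired
```

Keeping feedback behind one handler like this also makes it easy to keep cues consistent across every interactive element in the app.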
Follow the user's gaze in VR design
Knowing where the user's gaze is focused is a necessary part of VR interactions, particularly on current head-mounted displays (HMDs) that do not provide eye tracking. Many virtual reality applications rely on the user's gaze to make selections. To take advantage of gaze, you may want to provide a visual aid, such as a reticle, to help the user target objects. Reticles are usually visually distinct from the rest of the environment in order to stand out, but small and unobtrusive enough not to draw the user's attention away from the rest of the application. The reticle should also give the user some kind of indication of which elements within the environment are interactive.
Hint
Depending on your VR implementation, you can also choose to display the reticle only when the user is close to objects they can interact with. That way, the reticle's extra visual information doesn't bother the user when they are focusing on things they cannot currently interact with.
Not every VR app needs a reticle. When motion controllers are used to point at or interact with objects outside of the user's reach, the reticle is usually discarded in favor of a laser pointer and a selection cursor. You could display only the cursor, but it is often better to display a combination of a default controller model, the laser beam, and the cursor together. Doing so helps users locate the motion controller and cursor, helps communicate the angle of the laser beam, and provides real-time feedback and an intuitive feel for how the motion controller's orientation affects the beam and cursor input.
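Under the hood, a laser pointer is just a raycast from the controller each frame. The Python sketch below illustrates the idea with ray-sphere intersection against hypothetical interactive targets; a real engine would raycast against scene colliders instead, and all names here are illustrative.

```python
import math

def pointer_hit(origin, direction, targets, max_dist=10.0):
    """Return (target_id, hit_point) for the nearest interactive sphere the
    controller ray hits within max_dist, or None. `direction` is assumed
    normalized; targets are (id, center, radius) tuples."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    best = None
    for tid, (cx, cy, cz), radius in targets:
        lx, ly, lz = cx - ox, cy - oy, cz - oz           # origin -> centre
        t_ca = lx * dx + ly * dy + lz * dz               # projection on ray
        d2 = lx * lx + ly * ly + lz * lz - t_ca * t_ca   # ray-centre dist^2
        if t_ca < 0 or d2 > radius * radius:
            continue                                     # behind us, or miss
        t_hit = t_ca - math.sqrt(radius * radius - d2)   # near intersection
        if 0 <= t_hit <= max_dist and (best is None or t_hit < best[0]):
            best = (t_hit, tid,
                    (ox + t_hit * dx, oy + t_hit * dy, oz + t_hit * dz))
    return None if best is None else (best[1], best[2])
```

The returned hit point is where you would draw the selection cursor, with the beam rendered from the controller to that point.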
Avoid simulation sickness in VR design
Simulator sickness is the feeling of nausea caused by a mismatch between the user's visual and physical motion cues. At its simplest, your eyes might tell you that you're moving, but your body disagrees. Nothing will make a user leave your app faster than feeling simulator sick.
There are a number of ways to avoid simulator sickness.
Maintain the application's frame rate. Sixty frames per second (fps) is generally considered the minimum frame rate at which VR applications must run to prevent simulator sickness in users. If your app drops below 60 fps, you need to find ways to get back to at least 60 fps. Maintaining this frame rate is probably the most important tip to follow, even if it means cutting other parts of your app.
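Instrumenting this is straightforward: compare each frame's render time against the 60 fps budget. A small illustrative sketch (the function name and report shape are assumptions, not any profiler's API):

```python
FRAME_BUDGET = 1.0 / 60.0   # about 16.7 ms per frame at 60 fps

def fps_report(frame_times):
    """Summarize per-frame render times (in seconds): returns the average
    fps and how many individual frames blew the 60 fps budget."""
    avg_fps = len(frame_times) / sum(frame_times)
    dropped = sum(1 for t in frame_times if t > FRAME_BUDGET)
    return avg_fps, dropped
```

Note that the average alone can hide problems: a run of fast frames plus one slow frame can still average well above 60 fps, so counting individual over-budget frames matters.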
Maintain constant head tracking. Head tracking in virtual reality refers to the application constantly tracking the movement of the user's head and reflecting those movements in the virtual environment. Keeping your app's virtual positioning aligned with the user's real head movements is vital to avoiding simulator sickness. Even a slight pause in tracking a user's movements can trigger motion sickness.
Avoid acceleration. In the real world, our bodies notice acceleration far more than they notice movement at a constant speed. While riding in a car traveling at 65 mph on a highway, you might not feel much different than you would sitting on a park bench. However, your body definitely feels the acceleration from zero to 65 mph.

Real-world acceleration or deceleration provides a visual change as well as a sense of motion. Virtual reality, however, provides only the visual update. This missing sense of movement can lead to simulator sickness. Avoid accelerating or decelerating the user in VR. If movement within the space is required, try to keep users moving at a constant speed.
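Constant-speed movement means updating position by a fixed velocity each frame, with no easing or acceleration ramp. A minimal sketch with the vector math written out; the function and parameter names are illustrative:

```python
def move_constant(pos, target, speed, dt):
    """Advance pos toward target at a fixed speed (units/second) over a
    timestep of dt seconds, with no acceleration ramp. Snaps to the target
    when it is within one step, so the user never overshoots."""
    px, py, pz = pos
    tx, ty, tz = target
    dx, dy, dz = tx - px, ty - py, tz - pz
    dist = (dx * dx + dy * dy + dz * dz) ** 0.5
    step = speed * dt
    if dist <= step:
        return target                      # arrived this frame
    s = step / dist                        # fraction of remaining distance
    return (px + dx * s, py + dy * s, pz + dz * s)
```

Because the per-frame step is always `speed * dt`, the perceived velocity stays constant regardless of frame timing, which is exactly the behavior this tip calls for.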
Avoid fixed-view items. Any visual that "fixes" itself to the user's point of view can make the user feel sick. In general, keep elements as 3D objects within the virtual world rather than pinning them to the user's 2D screen space.
More VR Best Practices to Consider
Here are some helpful best practices for color, sounds, and text usage, all of which can impact VR user experiences:
Bright colors and environments: Imagine the feeling of leaving a dark theater and stepping out on a bright sunny day. You find yourself shielding your eyes from the glare of the sun, squinting and waiting for your eyes to adjust. In virtual reality, the same feeling can be induced by quickly changing from any dark scene to a bright one.
An instant brightness change from dark to light can annoy and disorient users, and unlike stepping out into bright sunlight, a headset user has no way to shield their eyes from the glare. Avoid harsh or rapid changes between darker and lighter scenes or items.

Very bright colors and scenes can be difficult to look at for an extended period of time and can cause eye strain for users. Make sure to keep scene color palettes and items in mind when building your experiences.
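One way to soften dark-to-light transitions is to clamp how fast scene exposure (or a brightness overlay) may change per second. An illustrative sketch; the function name and the rate limit are assumptions to tune per experience:

```python
def adjust_exposure(current, target, dt, max_rate=0.5):
    """Move scene exposure toward target by at most max_rate units per
    second, so a hard dark-to-bright cut becomes a gradual ramp. Call once
    per frame with the frame's dt in seconds."""
    delta = target - current
    step = max_rate * dt
    if abs(delta) <= step:
        return target                          # close enough: settle
    return current + step if delta > 0 else current - step
```

With `max_rate=0.5`, a full 0-to-1 brightness jump is spread over two seconds, giving the user's eyes (and attention) time to adjust.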
Background sound: VR apps should be immersive. In the real world, sound plays a huge role in helping you identify your environment. From the loud noise of a busy street, to the white noise and background chatter of an office, to the muffled silence of a cave, audio cues alone are often enough to describe an environment. Be sure to consider not only event-based audio (such as a sound played when the user interacts with an item) but also the role background audio will play in your experiences.
Text input and output: When in VR, users are surrounded with visual information from the environment. Adding large blocks of text to this environment can overload the user with input. Where possible, avoid using large blocks of small-font text. Short text excerpts rendered in large print are typically preferred.
Similarly, it can be difficult for a user in VR to input a large amount of text. Text input in VR has yet to be completely solved. If text input is a requirement of your application, consider carefully how this can occur in the VR space.