Designing Augmented Reality Apps: Interacting with Objects

When you set out to design your own augmented reality app, you will need to think about how the user will interact with objects. Most virtual reality (VR) interaction happens via a motion controller, but most headset-based augmented reality (AR) devices use a combination of gaze and hand tracking. AR headsets typically use gaze-based navigation, tracking where the user is looking to target items within the environment. When an item is targeted, the user interacts with it via hand gestures.
As such, you need to design your AR experience to keep the user's hands within the headset's gesture-recognition area and to work with the specific set of gestures each headset supports. Educating the user about the boundaries of that recognition area, and notifying them when their gestures approach its limits, can help create a more successful user experience.
Since this method of interaction is new to almost everyone, it is important to keep interactions as simple as possible. Most of your users will already face a learning curve to interact in AR and to discover the gestures specific to their device (a universal AR gesture set has not yet emerged). Most AR headsets that use hand tracking ship with a standard set of basic gestures. Try to stick with these prepackaged gestures and avoid confusing users by introducing new gestures specific to your app.
Grabbing an object in the real world gives the user feedback such as the feel of the object, its weight in the hand, and so on. Hand gestures made to select virtual holograms provide none of this haptic feedback. It is therefore important to inform the user about the status of digital holograms in the environment in other ways.
Provide cues to the user regarding the state of an object or environment, especially when the user attempts to place or interact with digital holograms. For example, if a user is supposed to place a 3D object in 3D space, a visual indicator can communicate where the object will be placed. If the user can interact with an object in your scene, indicate that visually on the object itself, perhaps using proximity to alert the user that they are approaching an object they can interact with. If the user is trying to select one object among many, highlight the one currently targeted and provide audio cues for its actions.
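One common way to provide a placement cue on mobile is to raycast from the center of the screen onto detected surfaces and pin a reticle to the hit point. Here is a minimal sketch using ARKit and SceneKit; the PlacementIndicator class and its torus reticle are illustrative choices, not a platform requirement:

```swift
import ARKit
import SceneKit

// A sketch of a placement indicator for an ARSCNView-based app.
// Add `reticle` as a child of sceneView.scene.rootNode at setup.
class PlacementIndicator {
    let reticle: SCNNode

    init() {
        // A flat ring that previews where the hologram will land.
        let ring = SCNTorus(ringRadius: 0.05, pipeRadius: 0.005)
        ring.firstMaterial?.diffuse.contents = UIColor.cyan
        reticle = SCNNode(geometry: ring)
        reticle.isHidden = true
    }

    // Call once per frame (e.g. from renderer(_:updateAtTime:)) to keep
    // the reticle pinned to the surface under the screen center.
    func update(in sceneView: ARSCNView) {
        let center = CGPoint(x: sceneView.bounds.midX, y: sceneView.bounds.midY)
        guard let query = sceneView.raycastQuery(from: center,
                                                 allowing: .estimatedPlane,
                                                 alignment: .horizontal),
              let hit = sceneView.session.raycast(query).first else {
            reticle.isHidden = true   // no surface found: hide the cue
            return
        }
        reticle.isHidden = false
        reticle.simdTransform = hit.worldTransform
    }
}
```

Hiding the reticle when no surface is found is itself a cue: the user learns at a glance that placement isn't possible from the current angle.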
Mobile device interaction in AR apps
Many augmented reality design principles apply to both headset and mobile experiences. However, there is a significant difference between the interactive functionality of AR headsets and mobile AR experiences. Because of the different form factors, interaction on each follows a few different rules.
Keeping interactions simple and providing feedback when placing or interacting with an object are rules that apply to both headset and mobile AR experiences. But most user interaction on mobile devices will occur through gestures on the device's touch screen rather than through users interacting directly with 3D objects or making hand gestures in 3D space.
A number of libraries, such as ManoMotion, can provide 3D hand-gesture tracking and gesture recognition to control holograms in mobile AR experiences. These libraries may be worth exploring depending on your application's requirements. Just remember that the user will likely hold the device in one hand while using your app, which can make it hard to also get their other hand in front of the rear camera.
Users will likely already understand mobile device gestures such as the one-finger tap, swipe, two-finger pinch, rotate, and so on. However, most users understand these interactions in relation to the two-dimensional world of a screen rather than the three-dimensional real world.
After the user places a hologram in space, consider allowing it to move in only two dimensions, essentially letting it slide across the surface it was placed on. Similarly, consider limiting the object's rotation to a single axis. Allowing movement or rotation on all three axes can be very confusing for the end user and can result in unintended consequences, such as misplaced or lost holograms.
If you allow rotating an object, consider permitting rotation around the y axis only. Locking down these motions prevents the user from inadvertently moving objects in unexpected ways. You may also want to provide a way to "undo" any unintended movement, as placing holograms precisely in real space can be a challenge for users. The sketch below shows one way to constrain these gestures.
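Here is a minimal sketch in Swift using ARKit and SceneKit, assuming a `selectedNode` the user has already tapped; the class and property names are illustrative, not a platform API. Dragging is constrained to the horizontal surface, rotation to the y axis, and each gesture snapshots the transform for undo:

```swift
import ARKit
import SceneKit
import UIKit

class HologramGestures {
    var selectedNode: SCNNode?                      // hologram the user tapped
    private var undoStack: [simd_float4x4] = []     // transforms to restore

    // Pan: slide the object across horizontal surfaces by raycasting the
    // finger position back into the scene. Y stays fixed, so the hologram
    // can't float up or sink through the floor.
    @objc func handlePan(_ gesture: UIPanGestureRecognizer) {
        guard let sceneView = gesture.view as? ARSCNView,
              let node = selectedNode else { return }
        if gesture.state == .began {
            undoStack.append(node.simdTransform)    // snapshot for undo
        }
        let point = gesture.location(in: sceneView)
        if let query = sceneView.raycastQuery(from: point,
                                              allowing: .estimatedPlane,
                                              alignment: .horizontal),
           let hit = sceneView.session.raycast(query).first {
            let t = hit.worldTransform.columns.3
            node.simdPosition = simd_float3(t.x, node.simdPosition.y, t.z)
        }
    }

    // Rotate: spin around the y axis only, ignoring the other two axes.
    @objc func handleRotation(_ gesture: UIRotationGestureRecognizer) {
        guard let node = selectedNode else { return }
        if gesture.state == .began { undoStack.append(node.simdTransform) }
        node.eulerAngles.y -= Float(gesture.rotation)
        gesture.rotation = 0    // consume the delta each callback
    }

    // Undo: restore the transform captured when the last gesture began.
    func undo() {
        guard let node = selectedNode,
              let previous = undoStack.popLast() else { return }
        node.simdTransform = previous
    }
}
```

Because each gesture records the starting transform rather than every incremental change, a single undo cleanly reverses an entire drag or rotation.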
Most mobile devices support a "pinch" interaction on the screen to either zoom into an area or scale an object. Because the user's viewpoint is at a fixed point in space in both the real world and the hologram world, you probably won't want to use this gesture for zooming in augmented reality.

Similarly, consider excluding the user's ability to scale an object in augmented reality. The two-finger pinch-to-scale gesture is a standard interaction for mobile users, but in augmented reality it often doesn't make sense. AR models are typically 3D models of a definite size, and the apparent size of a model depends on its distance from the AR device. Having the user scale an object in place to make it appear closer to the camera really just makes the object larger in place, which is often not what the user intended. Pinch-to-scale can still be used in augmented reality, but its use should be considered carefully.
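If you do keep pinch-to-scale, clamping the result to a sane range keeps an accidental pinch from silently blowing the model up or shrinking it away. A small sketch of that idea follows; the 0.5x to 2.0x limits are arbitrary placeholders, and `selectedNode` is again an illustrative name:

```swift
import SceneKit
import UIKit
import simd

final class ScaleHandler {
    var selectedNode: SCNNode?   // the hologram the user tapped (illustrative)

    @objc func handlePinch(_ gesture: UIPinchGestureRecognizer) {
        guard let node = selectedNode else { return }
        // Apply the incremental pinch factor, but never let the object
        // shrink below half size or grow beyond double size.
        let proposed = node.simdScale * Float(gesture.scale)
        node.simdScale = simd_clamp(proposed,
                                    simd_float3(repeating: 0.5),
                                    simd_float3(repeating: 2.0))
        gesture.scale = 1   // consume the incremental factor each callback
    }
}
```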
Voice interaction in AR apps
Some AR devices also support voice interaction. While the interaction for most AR headsets is mainly gaze and gestures, for headsets with audio capabilities you need to think about how to take advantage of all the interaction methods and how to make them work well together. Voice controls can be a very convenient way to control your app. With processing power growing exponentially, expect voice control to be introduced and further improved on AR headsets.

Here are some things to keep in mind when developing voice commands for AR devices that support this feature:
Use simple commands. Keeping your commands simple helps you avoid potential problems for users who speak different languages or dialects. It also reduces the learning curve for your application. For example, "read more" is probably a better option than "give more information about the selected item."
Make sure that voice commands can be undone. Voice interactions can sometimes be triggered unintentionally, such as by picking up the voice of someone nearby. Make sure any voice command can be undone if an accidental interaction is triggered.
Eliminate similar-sounding interactions. To prevent the user from triggering incorrect actions, get rid of any spoken commands that sound alike but perform different actions. If "read more" performs a certain action in your application (such as revealing more text), it should always perform that same interaction throughout your application. Similar-sounding commands should also be avoided; for example, "Open Reference" and "Open Preferences" are very likely to be confused with each other.
Avoid system commands. Make sure that your app does not override voice commands the system has already reserved. If a command like "home screen" is reserved by the AR device, do not repurpose that command to perform a different function within your app.
Provide feedback. Voice interactions should provide the same level of feedback to the user as standard interaction methods do. If the user issues a voice command, indicate that your app has heard and understood it. One way to do this is to display on-screen text for commands as the system interprets them. This gives the user feedback on how the system understood their commands and lets them adjust if necessary. A minimal sketch of this pattern follows this list.
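The sketch below pulls several of these guidelines together in plain Swift, independent of any particular speech engine: feed it the transcription your recognizer produces. The command names and the `showCaption` hook are hypothetical examples, not a platform API:

```swift
import Foundation

// Maps short, distinct spoken phrases to actions and echoes every
// transcription on screen so the user sees what the system understood.
final class VoiceCommandDispatcher {
    private var actions: [String: () -> Void] = [:]
    var showCaption: (String) -> Void = { print("Heard: \($0)") }

    // Register simple phrases; keep them short and avoid near-duplicates
    // ("open reference" vs. "open preferences") that engines confuse.
    func register(_ phrase: String, action: @escaping () -> Void) {
        actions[phrase.lowercased()] = action
    }

    // Show the caption first, then run the matching action if any exists,
    // so the user gets feedback even when a command isn't recognized.
    func handle(transcription: String) {
        let phrase = transcription.lowercased()
            .trimmingCharacters(in: .whitespaces)
        showCaption(phrase)
        actions[phrase]?()
    }
}

// Usage: simple commands, with "undo" registered alongside the rest.
let dispatcher = VoiceCommandDispatcher()
dispatcher.register("read more") { /* reveal the detail panel */ }
dispatcher.register("undo")      { /* revert the last action */ }
dispatcher.handle(transcription: "Read more")  // caption shown, action fires
```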
