Error Prevention
Because users have never before needed to understand that the rotation of their head or the tip of their pointer finger is being tracked, it is important to onboard them to the specific interaction paradigms we have created. Control systems for interactions, regardless of the input chosen, need to be explicitly explained to the user.
AR design often avoids large, fast-moving objects in the physical world because of the limitations of the technology, but small, slow-moving objects are easy to lose track of in the real world even when those objects have physical matter.
Particle effects can ground digital objects in the physical environment. They help integrate digital objects into the space around them by creating an atmospheric presence of digital content, especially when imitating real-life phenomena such as fog or plankton. Breadcrumb trails are also a good way of visually indicating the trajectory of moving objects, or even the location of stationary ones.
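One way to think about a breadcrumb trail is as a short, fading history of an object's position that a renderer can draw behind it. The sketch below is a minimal, engine-agnostic illustration; the `Vec3` and `BreadcrumbTrail` names, the sampling interval, and the fade time are all assumptions, not part of any specific toolkit.

```typescript
// Minimal breadcrumb-trail sketch: sample a moving object's position at a
// fixed interval and keep a short, fading history for a renderer to draw.
// Names, intervals, and fade times are illustrative assumptions.

type Vec3 = { x: number; y: number; z: number };

class BreadcrumbTrail {
  private samples: { position: Vec3; age: number }[] = [];
  private sinceLastSample = 0;

  constructor(
    private readonly sampleInterval = 0.25, // seconds between breadcrumbs
    private readonly maxAge = 3.0           // seconds before a crumb fades out
  ) {}

  /** Call once per frame with the tracked object's current position. */
  update(position: Vec3, dt: number): void {
    this.sinceLastSample += dt;
    if (this.sinceLastSample >= this.sampleInterval) {
      this.samples.push({ position: { ...position }, age: 0 });
      this.sinceLastSample = 0;
    }
    // Age existing crumbs and drop the ones that have fully faded.
    for (const s of this.samples) s.age += dt;
    this.samples = this.samples.filter((s) => s.age < this.maxAge);
  }

  /** Positions plus an opacity (1 = fresh, 0 = about to disappear) for rendering. */
  crumbs(): { position: Vec3; opacity: number }[] {
    return this.samples.map((s) => ({
      position: s.position,
      opacity: 1 - s.age / this.maxAge,
    }));
  }
}
```

Fading older crumbs rather than drawing a full path keeps the trail readable without cluttering the scene around stationary content.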
Peripheral cues are often used to guide a user toward looking in a specific direction. Reticles, glints of light, or arrows placed at the periphery give the user a visual indication of where to look. For important interactable content, or even exposition from non-player characters (NPCs), it is advisable to provide a peripheral cue whenever players need help looking in the correct direction.
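The underlying check is geometric: compare the direction to the target with the direction the head is facing, and when the target falls outside a comfortable viewing cone, point an arrow or glint toward it along the edge of the view. The sketch below is a hedged, engine-agnostic version; the pose fields, the `peripheralCue` function, and the 30-degree half-angle are assumptions for illustration only.

```typescript
// Hedged sketch of a peripheral-cue check: decide whether a target is inside
// a comfortable field of view and, if not, return a 2D edge direction for an
// arrow or glint. All names and thresholds are illustrative.

type Vec3 = { x: number; y: number; z: number };

const sub = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
const dot = (a: Vec3, b: Vec3): number => a.x * b.x + a.y * b.y + a.z * b.z;
const normalize = (a: Vec3): Vec3 => {
  const l = Math.sqrt(dot(a, a)) || 1;
  return { x: a.x / l, y: a.y / l, z: a.z / l };
};

interface HeadPose {
  position: Vec3;
  forward: Vec3; // unit vector the head is facing
  right: Vec3;   // unit vector to the head's right
  up: Vec3;      // unit vector above the head
}

interface PeripheralCue {
  targetVisible: boolean;
  /** Normalized 2D direction (x = right, y = up) for an edge arrow when hidden. */
  edgeDirection?: { x: number; y: number };
}

function peripheralCue(head: HeadPose, target: Vec3, halfFovDeg = 30): PeripheralCue {
  const toTarget = normalize(sub(target, head.position));
  const angle = Math.acos(Math.min(1, Math.max(-1, dot(head.forward, toTarget))));
  if (angle <= (halfFovDeg * Math.PI) / 180) {
    return { targetVisible: true };
  }
  // Project the target direction onto the view plane to get a 2D arrow direction.
  const x = dot(head.right, toTarget);
  const y = dot(head.up, toTarget);
  const mag = Math.hypot(x, y) || 1;
  return { targetVisible: false, edgeDirection: { x: x / mag, y: y / mag } };
}
```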
As in traditional 2D gaming, hover effects help point a user to a specific piece of interactable content. This technique can be helpful if your content tends to roam, although it will break presence if used too frequently.
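In practice a hover effect usually reduces to a pointer-ray test against nearby objects, toggling a highlight on whichever one the ray is currently over. The sketch below is one simple way to do that, assuming a normalized pointer ray and objects approximated by bounding spheres; the `Hoverable` shape and `updateHover` function are hypothetical.

```typescript
// Simple hover check: find which bounding sphere the pointer ray passes
// through and mark it highlighted. Assumes rayDir is normalized.

type Vec3 = { x: number; y: number; z: number };

interface Hoverable {
  id: string;
  center: Vec3;
  radius: number;
  highlighted: boolean;
}

function updateHover(rayOrigin: Vec3, rayDir: Vec3, objects: Hoverable[]): void {
  for (const obj of objects) {
    // Closest point on the ray to the sphere's center.
    const toCenter = {
      x: obj.center.x - rayOrigin.x,
      y: obj.center.y - rayOrigin.y,
      z: obj.center.z - rayOrigin.z,
    };
    const t = Math.max(
      0,
      toCenter.x * rayDir.x + toCenter.y * rayDir.y + toCenter.z * rayDir.z
    );
    const dx = obj.center.x - (rayOrigin.x + rayDir.x * t);
    const dy = obj.center.y - (rayOrigin.y + rayDir.y * t);
    const dz = obj.center.z - (rayOrigin.z + rayDir.z * t);
    obj.highlighted = dx * dx + dy * dy + dz * dz <= obj.radius * obj.radius;
  }
}
```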
Audio and haptics are easily among the most overlooked and underutilized tools, yet they are among the most valuable because they maintain immersion while giving digital objects substance. Spatialized sound and haptic cues help users understand a digital object's position without having to maintain line of sight to visual content. Use audio cues to remind users when content is outside the field of view, to notify them when content is rendered behind them, and to help them track characters that are constantly moving around. Use haptic cues to notify users when they're hovering on a digital object, when a digital object makes an impact, or when an important interaction is about to take place.
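A common way to wire this up is a small routing step that plays a spatialized ping from the object's world position the moment it leaves the field of view, and fires short haptic pulses for hover and impact events. The sketch below is an assumption-laden outline: the `CueOutputs` interface stands in for whatever audio and haptic API the target platform actually provides, and the event names and pulse values are illustrative.

```typescript
// Illustrative routing of non-visual cues. CueOutputs is a placeholder for a
// platform's real audio/haptic API; event names and values are assumptions.

type Vec3 = { x: number; y: number; z: number };

interface CueOutputs {
  playSpatialSound(sound: string, at: Vec3): void;
  pulseHaptic(intensityZeroToOne: number, durationMs: number): void;
}

interface TrackedObject {
  id: string;
  position: Vec3;
  wasVisible: boolean;
}

function routeCues(
  obj: TrackedObject,
  isVisibleNow: boolean,
  event: "none" | "hover" | "impact",
  out: CueOutputs
): void {
  // Remind the user where content went the moment it leaves the field of view.
  if (obj.wasVisible && !isVisibleNow) {
    out.playSpatialSound("offscreen-ping", obj.position);
  }
  obj.wasVisible = isVisibleNow;

  // Light haptic for hover, stronger pulse for impacts.
  if (event === "hover") out.pulseHaptic(0.3, 40);
  if (event === "impact") out.pulseHaptic(0.8, 120);
}
```

Keeping the cue decisions in one place like this makes it easier to tune how often non-visual reminders fire, so they support immersion instead of becoming noise.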