AR Mobile Design
Last updated
Mobile design is a necessary part of the field of Augmented Reality today. Many designers have to integrate spatially complex actions into a very tight screen space. I'm not going to go into too much depth here, because I believe a lot of the content covered in the previous section is still applicable (as is the entire existing paradigm of mobile design standards).
The first major obstacle is organizing the interface. Most AR mobile content is experienced in portrait mode rather than landscape, so keep this in mind when creating content to fit a very narrow space. We can divide the areas of mobile design into three main stages and a content zone.
Upstage: The location for generally low-priority or potentially destructive UI (like exit or return) that is still necessary for the AR content. Keep in mind that users are likely to ignore this area for the most part, and it will often be covered by notifications.
Centerstage: The optimal location for viewing content or digesting information. You can easily place in-game notifications or popup dialogue here if necessary.
Downstage: The location closest to the thumbs and the most ergonomic for input and controls. The most frequently used UI should go here, including inventory systems if applicable.
Optimal Placement Range: The location where content should generally be spawned in environments. This is especially useful for automatic spawns or user-generated content population.
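The three stages above can be sketched as a simple vertical partition of a portrait screen. A minimal sketch follows; the boundary values (0.25 and 0.75) are illustrative assumptions, not established standards, and a real layout would tune them per device and ergonomics.

```python
# Sketch: classify a touch by its vertical screen position into the three stages.
# Boundary values are illustrative assumptions, not platform standards.

def classify_touch(y_norm: float) -> str:
    """y_norm is 0.0 at the top of a portrait screen, 1.0 at the bottom."""
    if y_norm < 0.25:
        return "upstage"      # low-priority UI: exit, return
    elif y_norm < 0.75:
        return "centerstage"  # primary content, dialogs, notifications
    else:
        return "downstage"    # thumb-reach controls, inventory

print(classify_touch(0.1))  # upstage
print(classify_touch(0.5))  # centerstage
print(classify_touch(0.9))  # downstage
```

Keeping this mapping in one place makes it easy to adjust the zone boundaries per device without touching the UI code that consumes them.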
This might be a less established design paradigm overall, but I find that the most successful prompt for getting a user to spatially map an object, room, or environment is not to show the user performing the scanning action, but to show the object or person within a room. This generally gives players a better idea of how much they'll be moving around and how much space the AR requires.
This interaction has a static graphic overlay fixed to the glass (screen) at all times. This design convention is useful for permanent elements that need to be within the user's reach at all times, such as a menu or return prompt.
Although these elements are locked in space, they can have a dynamic feature where they always face the user. This design convention is useful for labels and material that needs to accompany an object or marker in space.
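This "always face the user" behavior is commonly called billboarding. A minimal sketch of the underlying math, assuming a y-up coordinate system and rotation applied about the vertical axis (the function name and angle convention here are illustrative, not from any particular engine):

```python
import math

# Sketch: compute the yaw (rotation about the vertical axis) that turns a
# world-anchored label toward the user's camera. Positions are (x, y, z)
# tuples with y up; the sign convention is an illustrative assumption.

def billboard_yaw(label_pos, camera_pos):
    dx = camera_pos[0] - label_pos[0]
    dz = camera_pos[2] - label_pos[2]
    return math.atan2(dx, dz)  # radians; apply as the label's y rotation

# A label at the origin with the camera directly in front (positive z):
yaw = billboard_yaw((0, 0, 0), (0, 0, 2))
print(math.degrees(yaw))  # 0.0 — the label already faces the camera
```

Recomputing this yaw each frame keeps the label readable from any angle while its anchor point stays fixed in space; most AR engines expose an equivalent built-in constraint.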
In this case, static content becomes a dynamic content type. This convention works for allowing users to position assets in custom or specific areas, which is helpful for target-based or drag-and-drop elements.
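Drag-and-drop placement typically works by un-projecting the touch point into a ray and intersecting that ray with a detected surface. A minimal sketch of the ray-plane step, assuming a horizontal plane at a known height (the function and parameter names are hypothetical; real AR frameworks expose this as a hit test or raycast):

```python
# Sketch: drop a dragged asset onto a horizontal plane by intersecting a
# camera ray with it. The ray is assumed to come from un-projecting the
# touch point; that step is framework-specific and omitted here.

def place_on_plane(ray_origin, ray_dir, plane_y=0.0):
    oy, dy = ray_origin[1], ray_dir[1]
    if abs(dy) < 1e-6:        # ray parallel to the plane: no hit
        return None
    t = (plane_y - oy) / dy
    if t < 0:                 # plane is behind the camera
        return None
    return tuple(o + t * d for o, d in zip(ray_origin, ray_dir))

# Camera 1.5 m up, looking forward and down at 45 degrees:
print(place_on_plane((0, 1.5, 0), (0, -1, 1)))  # (0.0, 0.0, 1.5)
```

Returning None for misses lets the UI keep the asset attached to the finger until a valid surface is under the touch point, which is the behavior users expect from drag-and-drop placement.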
This is a great way to engage with a 3D model and understand its components. It is most commonly used for educational purposes and for breaking an object down into its parts.
It can be really tempting to create AR content with multiple input types beyond the touch screen. I've seen some interesting integrations with other peripherals like keyboards, as well as the use of hands on the back-facing camera. While I don't discourage any type of exploration, understand that any novel use of an input schema on a platform that already has an established standard is a risky move.
I want to specifically call out the work of Bushra Mahmood, who has an amazing amount of cataloged work in the AR mobile space and beyond. Please go check out their work!