This article is translated from the iOS 11 Human Interface Guidelines, published on the DPUX public account; translation by C7210.

Your app can take advantage of Apple’s augmented reality technology, ARKit, to provide users with engaging interactive experiences in which virtual content blends seamlessly with the real world. In an AR app, the device’s camera presents a live view of the real world on screen, and virtual 3D objects are overlaid on it, creating the illusion that the virtual and the real coexist. Users can move the device to view virtual objects from different angles, and, if the app provides the right experience model, interact with them through gestures and movement.
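For orientation, here is a minimal sketch of how such an experience is typically started with ARKit: an ARSCNView renders the camera feed plus SceneKit content, and a world-tracking configuration drives it. The view controller shown is illustrative, not part of any particular app.

```swift
import UIKit
import ARKit

// A minimal AR view controller: the camera feed plus SceneKit content,
// driven by ARKit's world-tracking configuration.
class ARViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking overlays virtual 3D content on the real world.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```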

Design engaging AR experiences

Use the full screen to enhance engagement. Devote as much screen real estate as possible to the camera image and virtual objects, and don’t let the immersive experience be spoiled by cluttered controls and information.

Create lifelike virtual objects. Not all AR apps require lifelike virtual content, but for those that do, virtual objects must be made as convincing as possible so they blend in with the real world around them. To achieve the best visual result, design detailed, realistic 3D assets, use the data ARKit provides to place virtual objects on real-world surfaces, make sure objects are correctly scaled, have them reflect the lighting of the real environment and cast shadows onto surfaces, and update the visuals dynamically as the camera position changes.
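As one hedged example, the sketch below reads ARKit’s ambient light estimate and applies it to a SceneKit light so virtual objects roughly match the real scene’s lighting. The `ambientLight` parameter is an assumption about how your scene is set up; ARSCNView’s `automaticallyUpdatesLighting` option can handle much of this for you.

```swift
import ARKit
import SceneKit

// Apply ARKit's estimate of the real-world ambient light to a SceneKit light,
// so virtual objects roughly match the lighting of their surroundings.
// Call this from a per-frame callback such as renderer(_:updateAtTime:).
func updateLighting(for sceneView: ARSCNView, ambientLight: SCNLight) {
    guard let estimate = sceneView.session.currentFrame?.lightEstimate else { return }
    // ambientIntensity is in lumens; 1000 corresponds to a neutrally lit scene.
    ambientLight.intensity = estimate.ambientIntensity
    ambientLight.temperature = estimate.ambientColorTemperature
}
```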

Consider physical constraints. Keep in mind that people may try to use your app in environments that are not well suited to AR, such as cramped spaces that are hard to move around in, or places that lack flat surfaces. Anticipate these potentially problematic scenarios, be clear about the kind of environment the product needs in order to work, and consider offering different functionality for different environments.

Consider the user’s comfort. Holding a device at a certain distance and angle becomes tiring after a while. Keep in mind the posture people have to maintain while using your app, and avoid making them uncomfortable in the course of providing an enjoyable experience. For example, place virtual objects at a sensible default distance so people don’t have to move them closer manually. For games, keep levels short and build in brief breaks.

If your app encourages people to interact through physical movement, introduce it gradually. In a game, for example, don’t require users to perform complex body movements to interact with virtual objects as soon as they start. Give people time to get familiar with the AR experience first, then gradually increase the complexity of the interaction.

Pay attention to user safety. If other people or objects are nearby while someone is using an AR app, large body movements can be dangerous. Consider how to keep the experience safe to operate; in a game, for example, avoid requiring large or sudden movements as a means of interaction.

Enhance immersion with sound and haptic feedback. When a virtual object lands on a solid surface or collides with another virtual element, sound effects and haptic feedback are very effective ways to reinforce the sense of contact. In immersive games, background music also helps sustain the virtual experience.
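A minimal sketch of that kind of feedback, combining a light haptic tap with a positional sound effect attached to the object; the `objectDidLand(on:)` function and the "thud.wav" asset are placeholders, not part of any real API:

```swift
import UIKit
import SceneKit

// Play a short sound and a light haptic tap when a virtual object settles on a surface.
func objectDidLand(on node: SCNNode) {
    // Haptic feedback reinforces the sense of physical contact.
    let feedback = UIImpactFeedbackGenerator(style: .light)
    feedback.impactOccurred()

    // Spatialized sound effect attached to the object itself.
    if let source = SCNAudioSource(fileNamed: "thud.wav") {
        source.isPositional = true
        node.runAction(SCNAction.playAudio(source, waitForCompletion: false))
    }
}
```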

Provide contextual hints. A 3D rotation indicator that surrounds a virtual object communicates how to manipulate it more intuitively than written instructions. In some situations, though, such as while the system is still detecting a surface, or when the user isn’t responding to contextual hints, a floating text prompt may be more effective.


If you must use text prompts, make sure they are easy to understand. Some users may be intimidated by advanced technology concepts like AR. To keep the text approachable, avoid technical terms such as “ARKit,” “environment detection,” and “tracking,” and instead use friendly, conversational language that most users will understand. Here are some comparisons:

  • Appropriate: “Unable to find a suitable surface. Try moving your phone to the side or adjusting its position.”
  • Inappropriate: “Unable to detect horizontal surface. Adjust tracking.”
  • Appropriate: “Tap somewhere to place [object name].”
  • Inappropriate: “Tap horizontal surface to anchor object.”
  • Appropriate: “Try turning up the lights and moving around a little.”
  • Inappropriate: “Insufficient features.”
  • Appropriate: “Try moving your phone more slowly.”
  • Inappropriate: “Excessive motion detected.”

Avoid unnecessary interruptions to the experience. Every time the user exits and re-enters AR mode, environment analysis and surface detection run again; because the position of the phone and camera has usually changed, previously placed virtual objects are repositioned, and sometimes they even appear detached from real-world surfaces. One way to avoid this disruption is to let people do as much as possible without leaving AR mode. For example, in a home furnishing AR app, a user who has just placed a chair they like in their living room will probably want to add more furniture to preview; make sure they can do so without leaving the current AR session.

Entering the AR environment

Clearly indicate initialization status and guide the user through it. Every time your app enters AR mode, an initialization process runs to detect and evaluate the environment, and it can take several seconds. To reduce confusion and speed things up, explicitly tell users what the system is doing, and encourage them to explore their surroundings with the camera and actively look for flat surfaces.
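A minimal sketch of surfacing that status, assuming you route ARKit’s tracking-state changes into a label you own; the prompt wording is illustrative:

```swift
import ARKit
import UIKit

// Turn ARKit's tracking-state changes into friendly, user-facing prompts.
class SessionStatusController: NSObject, ARSessionDelegate {
    let statusLabel = UILabel()

    func session(_ session: ARSession, cameraDidChangeTrackingState camera: ARCamera) {
        switch camera.trackingState {
        case .limited(.initializing):
            statusLabel.text = "Move your phone around slowly to explore your surroundings."
        case .normal:
            statusLabel.text = nil          // Tracking is ready; get out of the way.
        default:
            statusLabel.text = "Keep moving your phone slowly to improve tracking."
        }
    }
}
```

Assign an instance of this object as the session’s delegate (for example, `sceneView.session.delegate = statusController`) so ARKit delivers these callbacks.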

Placing virtual objects

Help users understand when a surface is being located and when virtual objects can be placed. A visual indicator is an effective way to show that surface detection is in progress. For example, when users see a trapezoidal reticle in the center of the screen, they know it’s time to look for a flat area. Once surface detection is complete, the indicator changes appearance to show that objects can now be placed. Design the indicator so its style is consistent with the rest of your app.

Surface detection indicator

Object placement indicator

A custom indicator in a specific app
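A minimal sketch of driving those two indicator states: hit-test from the screen center each frame, and treat a hit against a detected plane as “ready to place.” The `updateIndicator(isPlacementReady:)` helper is hypothetical.

```swift
import ARKit

// Called each frame (for example from the SCNSceneRendererDelegate update callback).
func refreshPlacementIndicator(in sceneView: ARSCNView) {
    let center = CGPoint(x: sceneView.bounds.midX, y: sceneView.bounds.midY)

    // A hit against a detected plane means surface detection has finished here.
    let planeHits = sceneView.hitTest(center, types: .existingPlaneUsingExtent)
    updateIndicator(isPlacementReady: !planeHits.isEmpty)
}

// Hypothetical helper that swaps the indicator between its "searching" and "ready" styles.
func updateIndicator(isPlacementReady: Bool) {
    // Switch artwork or animation here.
}
```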

Respond appropriately when the user places a virtual object. Surface detection takes time, albeit a short one. If the user tries to place an object while detection is still in progress, use the data that is already available to put the object on screen immediately; once the surface has been fully detected, use the final data to refine the object’s position. If the user initially drops the object outside the detected surface, gently pull it back onto the plane.
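A minimal sketch of that two-step placement under the iOS 11 hit-testing API: fall back to an estimated horizontal plane so placement is immediate, then refine once detected planes are available. The node and the refinement step are assumptions about your scene setup.

```swift
import ARKit
import SceneKit

// Place the object right away, preferring a fully detected plane but falling back
// to ARKit's estimated plane so the user never has to wait.
func place(_ object: SCNNode, at screenPoint: CGPoint, in sceneView: ARSCNView) {
    let types: ARHitTestResult.ResultType = [.existingPlaneUsingExtent, .estimatedHorizontalPlane]
    guard let hit = sceneView.hitTest(screenPoint, types: types).first else { return }

    let transform = hit.worldTransform
    object.position = SCNVector3(transform.columns.3.x,
                                 transform.columns.3.y,
                                 transform.columns.3.z)
    if object.parent == nil {
        sceneView.scene.rootNode.addChildNode(object)
    }
}

// Later, once plane detection finishes near the object, repeat the hit test with
// .existingPlaneUsingExtent only and nudge the object onto the refined plane.
```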

Avoid placing objects at the edges of a detected surface. In AR, the detected extent of a surface may change as the user moves around.

Interacting with virtual objects

Favor direct manipulation over interface controls. Letting users interact with the object itself creates a more immersive AR experience than working through interface controls that are separate from the virtual object. Keep in mind, however, that direct manipulation can make interaction harder, and even frustrating, if it requires the user to move around a lot.


Support direct manipulation through standard, familiar gestures. For example, let users move an object with a one-finger drag and rotate it with a two-finger twist gesture.

Keep interactions simple. Touchscreen gestures are inherently 2D, while an AR experience involves the 3D real world. Consider simplifying how users interact with virtual objects to bridge that gap, for example by limiting an object’s movement to a 2D plane or its rotation to a single axis.

Limit the movement of an object to a 2D plane

Limit the rotation of an object to a single axis
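A minimal sketch of both constraints, assuming pan and rotation gesture recognizers are attached to the ARSCNView and that `selectedNode` is the object being manipulated (both are assumptions):

```swift
import ARKit
import SceneKit
import UIKit

class ObjectGestureHandler: NSObject {
    weak var sceneView: ARSCNView?
    var selectedNode: SCNNode?

    // One-finger drag: the object follows the finger but stays on the detected
    // horizontal plane, so its movement is effectively 2D.
    @objc func handlePan(_ gesture: UIPanGestureRecognizer) {
        guard let sceneView = sceneView, let node = selectedNode else { return }
        let point = gesture.location(in: sceneView)
        if let hit = sceneView.hitTest(point, types: .existingPlaneUsingExtent).first {
            let t = hit.worldTransform.columns.3
            node.position = SCNVector3(t.x, t.y, t.z)
        }
    }

    // Two-finger twist: rotation is restricted to the vertical (y) axis.
    @objc func handleRotation(_ gesture: UIRotationGestureRecognizer) {
        guard let node = selectedNode else { return }
        node.eulerAngles.y -= Float(gesture.rotation)
        gesture.rotation = 0   // Apply the rotation incrementally.
    }
}
```

Attach a UIPanGestureRecognizer and a UIRotationGestureRecognizer targeting these methods to the ARSCNView.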

Expand the response area for gestures. On a touchscreen it’s hard to tap a small object precisely, or to hit an exact point on it. When a gesture is detected near an interactive object, it’s usually safe to assume the user intends to manipulate that object.
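One way to widen that response area, sketched here as an assumption rather than a canonical technique, is to hit-test a small ring of points around the touch location:

```swift
import ARKit
import SceneKit

// Return the virtual object under (or near) a touch, sampling a few points
// around the touch location to make small objects easier to grab.
func objectNear(_ point: CGPoint, in sceneView: ARSCNView, tolerance: CGFloat = 22) -> SCNNode? {
    let offsets: [CGPoint] = [
        .zero,
        CGPoint(x: tolerance, y: 0), CGPoint(x: -tolerance, y: 0),
        CGPoint(x: 0, y: tolerance), CGPoint(x: 0, y: -tolerance)
    ]
    for offset in offsets {
        let sample = CGPoint(x: point.x + offset.x, y: point.y + offset.y)
        if let hit = sceneView.hitTest(sample, options: nil).first {
            return hit.node
        }
    }
    return nil
}
```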

Consider whether user-placed objects need a scaling capability. For virtual objects that have no inherent real-world size, such as toys or game characters, users will likely want to adjust their size to fit the environment, so manual scaling is appropriate. For objects that do have a fixed real-world scale, such as furniture, manual scaling is unnecessary as long as accurate preset sizes are provided. Also, don’t use scaling as a way to adjust an object’s distance from the user: enlarging an object only changes the size of the object itself, not its actual distance, even though nearer objects appear larger.
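A minimal sketch of optional pinch-to-scale; `selectedNode` and the fixed-size flag are assumptions about your app’s state:

```swift
import SceneKit
import UIKit

class ScaleGestureHandler: NSObject {
    var selectedNode: SCNNode?
    var objectHasFixedRealWorldSize = false   // e.g. furniture: leave manual scaling off.

    // Pinch to scale, only for objects without an inherent real-world size.
    @objc func handlePinch(_ gesture: UIPinchGestureRecognizer) {
        guard !objectHasFixedRealWorldSize, let node = selectedNode else { return }
        let factor = Float(gesture.scale)
        node.scale = SCNVector3(node.scale.x * factor,
                                node.scale.y * factor,
                                node.scale.z * factor)
        gesture.scale = 1   // Apply the change incrementally.
    }
}
```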

Watch out for gesture conflicts. For example, a two-finger pinch gesture can be very similar to a two-finger twist gesture. If you need to support both, test them thoroughly and be sure to explain them to users.

Make sure virtual objects move smoothly. When the user scales, rotates, or moves an object, the motion should be smooth and continuous rather than jumping.
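One way to avoid jumps, sketched under the assumption that you position objects directly on SceneKit nodes, is to wrap updates in a short SCNTransaction animation:

```swift
import SceneKit

// Move a node to a new position with a brief implicit animation instead of an instant jump.
func smoothlyMove(_ node: SCNNode, to position: SCNVector3) {
    SCNTransaction.begin()
    SCNTransaction.animationDuration = 0.1
    node.position = position
    SCNTransaction.commit()
}
```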

Explore more engaging ways to interact. Gestures aren’t the only way people can interact with virtual objects. You can use position and movement to create more interesting interactions; for example, a game character might turn its head to look at the user as they approach.
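A minimal sketch of that behavior using SceneKit’s look-at constraint aimed at the camera’s point of view; `headNode` is an assumed node in your character model:

```swift
import ARKit
import SceneKit

// Make the character's head track the device camera as the user moves around.
func makeCharacterFaceUser(headNode: SCNNode, in sceneView: ARSCNView) {
    guard let cameraNode = sceneView.pointOfView else { return }
    let lookAt = SCNLookAtConstraint(target: cameraNode)
    lookAt.isGimbalLockEnabled = true   // Keep the head upright while turning.
    headNode.constraints = [lookAt]
}
```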

Handling problems

Allow users to reset the scene when the experience falls short of expectations. Don’t force them to wait for conditions to improve or to live with a badly placed object. Always give users a way to reset the scene and start over.
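A minimal sketch of such a reset: re-run the session with options that clear tracking data and existing anchors. Removing your own virtual nodes is app-specific and only hinted at in the comment.

```swift
import ARKit

// Restart the AR experience: remove placed anchors and start tracking from scratch.
func resetScene(in sceneView: ARSCNView) {
    // App-specific: also remove any virtual nodes you added to the scene graph here.
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = .horizontal
    sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```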

Offer troubleshooting suggestions when something goes wrong. Environment analysis and surface detection can fail for many reasons: the environment may be too dark, the surface may lack detail or be too reflective, or the camera may be moving too much. If your app detects these conditions, or if surface detection is taking too long, suggest how to resolve them (a sketch of this mapping follows the list below).

  • Problem: Insufficient features.
  • Suggestion: “Try turning up the lights and moving around a little.”
  • Problem: Excessive motion detected.
  • Suggestion: “Try moving your phone more slowly.”
  • Problem: Surface detection is taking too long.
  • Suggestion: “Try moving around a little, turning up the lights, and pointing the camera at a flat surface.”
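A minimal sketch of mapping ARKit’s reported tracking limitations to the suggestions above; the wording is illustrative, and detecting “surface detection is taking too long” is app logic you would time yourself:

```swift
import ARKit

// Translate ARKit's tracking limitations into the user-facing suggestions above.
// "Surface detection is taking too long" is not a tracking state; time it yourself
// and show that suggestion when your own timeout elapses.
func suggestion(for trackingState: ARCamera.TrackingState) -> String? {
    switch trackingState {
    case .limited(.insufficientFeatures):
        return "Try turning up the lights and moving around a little."
    case .limited(.excessiveMotion):
        return "Try moving your phone more slowly."
    case .normal:
        return nil            // Tracking is fine; no prompt needed.
    default:
        return "Try moving around a little and pointing the camera at a flat surface."
    }
}
```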

Offer AR features only on compatible devices. If AR is central to your app, restrict availability to devices that support ARKit when you set the device compatibility requirements. If AR is only a secondary feature, for example a home furnishing app that mainly shows a picture catalog but also offers AR previews for some furniture, then on devices that don’t support ARKit the AR-related features should be hidden rather than accessible only to produce error messages.
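When AR is a secondary feature, a minimal runtime check might look like the sketch below; when AR is core to the app, declare the `arkit` key under UIRequiredDeviceCapabilities in Info.plist instead so unsupported devices are excluded. The button is an assumed UI element.

```swift
import ARKit
import UIKit

// Show the AR entry point only on devices that support world tracking.
func configureARButton(_ arButton: UIButton) {
    arButton.isHidden = !ARWorldTrackingConfiguration.isSupported
}
```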

AR icon

You can use the standard AR icon in your app to trigger ARKit-based functionality. Visit the Resources section to download the graphic resources.


Use the AR icon only as intended. The icon may only be used to launch ARKit-based AR mode. Don’t alter the icon beyond adjusting its size and color, don’t use it for any other purpose, and don’t use it in AR experiences that aren’t built with ARKit.

Maintain the minimum clear space. Leave clear space around the AR icon equal to at least 10% of the icon’s height. Other interface elements must not intrude on this area or crowd the icon in any way.

AR logo

For apps that present product listings or similar content, you can use the AR logo to indicate that a specific item can be viewed in AR mode. For example, a home furnishing app can use the AR logo to invite users to preview how a piece of furniture would look in their home, helping them make a purchase decision.

Use the AR logo only as intended. You can visit the Resources section to download the graphic resources. The AR logo comes in standard and simplified styles and may only be used to indicate that a product or piece of content can be viewed in ARKit-based AR mode. Don’t alter the logo’s style or color scheme, don’t use it for other purposes, and don’t use it in AR experiences that aren’t built with ARKit.

Standard AR logo

Simplified AR logo, which keeps only the glyph

Prefer the standard AR logo. In general, use the simplified logo only when the interface is too tight to accommodate the standard one. At their default sizes, both logos remain recognizable.

Use the AR logo only when some content supports AR mode and some does not. If everything can be viewed in AR, the logo is redundant.

Keep the logo’s placement consistent and clearly visible. In general, a corner of the content thumbnail works best. Keep the logo’s position consistent throughout the app and make sure it’s large enough to be legible, but not so large that it obscures the thumbnail’s details.

Maintain the minimum clear space. Leave clear space around the AR logo equal to at least 10% of the logo’s height. Other interface elements must not intrude on this area or crowd the logo in any way.