Augmented reality (AR) is slowly yet surely becoming mainstream as a wide range of companies incorporate AR features into their websites and apps. Once you’ve decided that an AR tool would be useful, you must thoughtfully design it to ensure that it can be used successfully.
To uncover usability issues surrounding augmented-reality shopping features, we conducted a mobile remote moderated study with 10 participants. For this study, we looked specifically at ecommerce AR features geared toward informing purchase decisions. The study included a variety of mobile websites and apps, including virtual try-on AR tools that augment the user’s appearance (typically using the device’s forward-facing camera or a webcam) and ‘view in room’ AR tools that augment the user’s surroundings to place items within their environment. While these features are slightly different use cases, the majority of our findings apply to both types of AR tools.
AR Discoverability
Because AR is a relatively new technology (and was initially used in gaming), most users don’t think to actively look for it when they browse products online. Thus, if an AR tool exists, it must be very discoverable for users to notice it; it should also clearly communicate what it is and why users should interact with it.
Where and How to Promote AR
When first landing on the homepage of a website or app, many users scanned the page to check for promotions and featured new items or, if it was their first visit, to get a better sense of what the site had to offer. Most websites didn’t promote AR features on the homepage. Warby Parker’s website was one of the exceptions and displayed a floating banner at the bottom of the screen promoting the ability to virtually try on frames using its app. This banner appeared on all pages until dismissed by the user.
Including a link to an AR tool from the homepage or main landing screen of an app works best only if users can browse multiple products without leaving the tool. For instance, the Ulta mobile app promoted several virtual try-on tools on the main page, all of which allowed users to browse and try on multiple products without leaving the AR experience. Furthermore, the call to action to launch the tool should be descriptive enough that users aren’t taken by surprise when they are subsequently asked for permission to access their device’s camera. Images of people’s faces and the words try on were usually sufficiently strong cues. In contrast, higher on the same page of the Ulta app, the link to the foundation shade matcher showed only a graphic of color swatches on a phone, and participants were surprised when the app asked for camera permissions.
Only one website in our study, Kendra Scott, promoted a jewelry try-on AR feature on product-listing pages. Because it was displayed as a banner above the product listings, we fully expected participants to overlook it due to banner blindness. However, we were pleasantly surprised to find that it was noticed and excited users. One participant saw the banner immediately and stated, “New virtual try-on, that’s actually interesting. I’ve never seen a virtual try-on with earrings.” Sadly, the AR feature wasn’t available for many items, which quickly led to disappointment. “This website says try them on, but I picked the third most popular earrings on the whole web page [and it wasn’t available]. So, I get it if they’re not for every single item, but you would think their top 20 or 50 items, you would be able to have that option for.” So, use this tactic only if your AR tool works with many of your items — or, at least, with the most popular ones.
The primary thing participants wanted to see on product-listing pages — once they realized the AR option existed — was some visual indicator for which items were possible to view in AR. Participants were quick to notice that not all the items were available in AR, and yet there was no way for them to know that without visiting each product-detail page. For example, one participant on the Etsy mobile app complained, “It said items with this icon — you can view it virtually, but you have to click each item to know if it does that. It doesn’t tell you when you’re scrolling through the list. That’s kind of annoying.”
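From an implementation standpoint, such an indicator only requires the listing data to carry an availability flag for each item, so the badge can be rendered without visiting each product-detail page. A minimal TypeScript sketch, assuming a hypothetical `arModelUrl` field that marks items with a 3D asset:

```typescript
// Hypothetical product shape for a listing page; `arModelUrl` is present
// only when a 3D asset exists for the item.
interface ProductSummary {
  id: string;
  name: string;
  arModelUrl?: string;
}

// Annotate each listed item with whether it should show an "AR" badge,
// so users can tell at a glance which items support the feature.
function withArBadge(
  products: ProductSummary[]
): Array<ProductSummary & { showArBadge: boolean }> {
  return products.map((p) => ({ ...p, showArBadge: Boolean(p.arModelUrl) }));
}

// Example: only the lamp has a 3D asset, so only it gets the badge.
const annotated = withArBadge([
  { id: "1", name: "Lamp", arModelUrl: "https://example.com/lamp.glb" },
  { id: "2", name: "Rug" },
]);
```

The key point is that availability is decided at listing time, not discovered item by item; the rendering layer then simply shows or hides the icon per item.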
On the product-detail pages themselves, AR features were most discoverable when they appeared near the product image and used clear text labels such as See it in AR, Preview in your space, View in Room, or Live Try-On. While the AR 3D box icon was recognized by some participants as being associated with using the camera to view items in their surroundings, a camera icon seemed to work just as well. Note that several participants didn’t recognize the AR cube icon at all, so it is definitely not a universal icon! When asked what that icon was, one participant said, “I assume it looks like a cube, like a 3D cube … but … if I hadn’t just used it, I wouldn’t have known.” As always, icons worked best when paired with a meaningful text label.
Prominent placement of the button and a meaningful label are more important than the choice of icon. For example, the text-only Try It On button on product-detail pages of the Ray-Ban mobile website worked well and was noticed and understood by users. Do not create a completely custom icon, as these are likely to be confusing and misunderstood by users (and don’t contribute toward the adoption of a standard AR icon). Unfortunately, after the AR feature had been used, the Ray-Ban website replaced its straightforward text button with an unclear toggle. Participants in the study weren’t sure what the unlabeled icons represented and had to toggle back and forth a few times to work out their meaning.
Presenting a hint or instructional overlay promoting the AR feature was common when users reached a product-detail page for the first time on a mobile app. While popups are often problematic, because AR is a new and unexpected feature, participants were receptive to the interruption and willing to learn about it. Additionally, waiting until the product-detail page to present the hint was contextually appropriate, because users could then immediately try out the tool.
Keep in mind, however, that users may be quick to dismiss modals without reading the text inside; if they do so, there is no easy way for them to access that information again. For example, a participant on Etsy’s mobile app was disappointed that the hint only appeared once and that the icon alone was easy to overlook because it blended into the background. She commented, “They should prompt you more because people like me that think they know everything, they’re going to miss out. … I think that’s a really cool feature, I just wouldn’t have known to find it or look for it.” An alternative to relying on a hint popup is to make the button itself more prominent (as already discussed) or to consider a purposeful animation if the feature is worth the attention hijacking. For instance, Warby Parker’s mobile app displayed a brief animation promoting the Virtual Try-On feature when the user reached a product-detail page for the first time, signaling that the page’s content could be pulled down to access the AR tool.
Seamless Transitions Between Channels
Incorporating AR features into the primary browsing website or app made these tools readily available during the shopping process, increasing the likelihood that a user would discover and use them. Perhaps due to technical limitations, only three of the websites in our study allowed users to use AR features directly from the mobile website — all others merely advertised that AR was available but required users to download a mobile app. However, this process of switching channels was far from seamless — only one website correctly navigated users from a product-detail page on the mobile website through downloading and installing the app to the same product page on the app.
In contrast, some mobile websites only presented a general link telling users to download the mobile app to be able to use the AR feature. Upon launch, the app merely opened the main landing screen. Additionally, generic app promotions led several users to skip the link altogether and go directly to the app store to locate the app, thus eliminating the possibility of seamlessly navigating to a specific product. (Though, perhaps these general links were less misleading than those that appeared to be specific to a product, but weren’t.) For example, product-detail pages on Interior Define’s mobile website presented a promotion — missed entirely by some participants because it was located very low on the page — to visualize this piece in your home using the AR app; the promotion was misleading because it forced people to relocate the product once within the app.
To add insult to injury, several participants were then frustrated to find that the sofa they had been viewing on the website wasn’t even available in the app! After browsing the app, revisiting the mobile website to locate the item’s name, and returning to the app to search for it directly, one participant gave up and said, “So apparently, I’m not able to find the one I was interested in. That’s a bit disappointing.” This issue unfortunately was not limited to Interior Define, but seems to be common among dedicated AR apps — apps whose primary function is to view items in situ — as opposed to apps that incorporate AR features into the primary shopping app.
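One way to support a seamless handoff is to encode the product’s identity in the link that sends users to the app, so the app (or the store’s deferred deep linking) can open the same product page instead of a generic landing screen. A minimal TypeScript sketch, using a hypothetical URL scheme and parameter names; a real implementation would depend on the platform’s universal-link or app-link setup:

```typescript
// Build a link that survives the web-to-app handoff: the product ID travels
// with the URL, so the app can land the user on the same product-detail
// screen they were viewing on the mobile website.
function buildArDeepLink(baseUrl: string, productId: string): string {
  const url = new URL("/product", baseUrl);
  url.searchParams.set("id", productId); // which product to open
  url.searchParams.set("launch", "ar");  // open straight into the AR view
  return url.toString();
}

// Example: a link for a specific sofa.
const link = buildArDeepLink("https://example.com", "sofa-123");
// → "https://example.com/product?id=sofa-123&launch=ar"
```

The generic “download our app” promotions in the study failed precisely because they dropped this context, forcing users to relocate the product manually.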
Using the AR Tool
Getting people to the AR tool is only half the battle. Once users realize it exists and are motivated to try it, they need to be able to use it to accomplish their goal of envisioning the item in its prospective environment.
Calibration Is Unfamiliar and Difficult
The first hurdle to overcome when launching an AR tool is calibration. This process tended to be difficult for users, partially because it is so new and unfamiliar (and partially because the technology needs improvement). People learn by repetition, so because they don’t have many past experiences to draw on, they don’t know how calibration should work and how to best complete it. Thus, calibration requires helpful, clear instructions to teach people what to do and guide them through the process.
Step-by-step instructions contextually appropriate to the user’s environment worked best for guiding users through calibration. Etsy’s mobile-app AR feature provided clear help content when calibrating, instructing the user to “scan the entire wall, including where it meets the floor and ceiling.” Specifying what to include in the camera view was immensely helpful, as users would otherwise not know what to show. Once calibration was successful, the instructions progressed to telling users to tap to make the item appear.
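One way to keep instructions this specific is to tie each hint to the state the calibration process is currently in, so the hint always tells the user what to do next. A minimal TypeScript sketch with hypothetical state names, loosely modeled on Etsy’s wording (not Etsy’s actual implementation):

```typescript
// Calibration states a wall-placement AR tool might pass through.
type CalibrationState = "searching" | "surfaceFound" | "placed";

// Map each state to one specific, contextual instruction, so guidance
// progresses with the user instead of repeating a generic message.
function calibrationHint(state: CalibrationState): string {
  switch (state) {
    case "searching":
      return "Scan the entire wall, including where it meets the floor and ceiling.";
    case "surfaceFound":
      return "Tap the wall to place the item.";
    case "placed":
      return "Drag to reposition; pinch to resize.";
  }
}
```

Because the hint is a function of the detected state, it can never lag behind or contradict what the camera is actually seeing, which is exactly the failure mode described next.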
General help instructions that don’t reflect the user’s current issue aren’t helpful and can make a bad situation even worse. Several participants struggled to calibrate the Best Buy AR tool and gave up before being prompted to continue. The primary issue was a prominent instruction to Move Closer, which led some to eventually touch the wall with their phone in an effort to follow this direction.
“It’s taking so much [more] time than I expected. So, I don’t know what I have to do. It says move closer. I am doing that, but, okay. That was actually a little confusing.”
“That was a very unsatisfying experience. I’m not sure what it was needing me to do. It kept saying move closer, move closer, to the point where I was almost touching the wall. In which case, you can’t really see how it looks in the space if you’re that close.”
The instruction to Move Closer also directly contradicted a later tip suggesting that the camera view needed to include a focal point on the wall (which is likely easier to get from further away). Additionally, some participants interpreted this tip as meaning that they needed to hold something up in the camera’s view, which was difficult to juggle and ineffective.
Virtual try-on tools were the exception to these calibration woes. Not only was the calibration technology more accurate (the contours of a face are more easily detectable than a blank wall), but the interaction was also more familiar to users, thanks to the selfie lenses and filters on social-media platforms such as Snapchat and Instagram. One participant was impressed by the speed of calibration on Warby Parker’s app, stating, “Oh wow, that was easy to do. It automatically found my eyes. And I was expecting to need to position it, but it automatically found where my eyes and nose bridge were, which is, wow, very easy and helpful.”
Requires Focused Attention
Interacting with the AR tool was a complex task, which required users’ focused attention. People spent most of their attentional resources looking at the AR view and at any chrome presented at the bottom of the screen. Because this bottom area is where people expected to find details about the product they were viewing, interact with UI elements to change the item’s color or size, select a new product, or find other related tools, users tended to selectively focus on it and missed additional functions presented at the top of the AR view. (In general, interacting with something complex makes users more likely to miss potentially helpful tools in the periphery of the screen — a finding supported also by our studies on store locators with interactive maps.)
For example, a participant on Warby Parker’s mobile app took a screenshot using her camera buttons rather than the photo function present in the top-right corner of the AR tool when she was asked how she might get a second opinion on a pair of glasses she was interested in. She later explained, “I didn’t realize it had that; it already had a screenshot function. I just didn’t pay attention. It was on the upper right-hand corner. … I guess I’m looking more down than up and those things were on the top and maybe I just didn’t notice it. … Because the first thing I would do is put the frames on and then I would look down, I’m looking at the description.”
Similarly, a participant using the Ulta mobile app’s virtual try-on tool had been looking for a way to remove a type of makeup entirely, rather than selecting a different color. She had almost given up when she finally found a listing function presented at the top of the screen that allowed her to view all the products she had selected, plus the ability to remove specific items. She said, “Oh you know what? I just noticed up at the top … all this time I never even looked at the top here. So now I see there’s more at the top. Now Wearing, okay. … the fact that you can play with the bottom part of the screen draws the eyes so much that I didn’t really notice the icons at the top so much.”
Sometimes, this focus on the bottom chrome of the AR tool meant that participants overlooked changes within the main camera view as well. For example, several participants shopping for a television on Best Buy’s mobile app noticed the ruler icon within the AR tool and clicked it expecting to view measurements, but were disappointed that it seemed to only add pricing information to the screen — they didn’t look up at the AR view to see that the measurements were shown there! One participant stated, “So that’s not what I was expecting … I expected that ruler to come up. … Some sort of dimensions to show up, like length and width, and instead just the price came up underneath the picture.”
In this case, a better solution may have been to only add the measurements to the main AR view (along with the visual feedback showing that the button was selected), so users would continue to look at the screen to find what had changed. Adding the unrelated information of price and an Add to Cart button distracted users and made them believe that was the main function of that button.
Conclusion
Augmented reality is an exciting new technology that has the potential to become a useful tool for ecommerce. As more sites and apps begin to take advantage of it, it will slowly become more expected and users will learn how to use it. Until then, great care must be taken to ensure that an AR feature is highly discoverable, clearly explained, simple to calibrate, and easy to interact with.