Augmented Reality Tools: Usability

Streem

Summary

Streem creates remote augmented reality experiences. Picture a specialist at Best Buy seeing your home through your phone, measuring spaces in your kitchen, and dropping in an AR dishwasher to see what you think of it. Streem works with a number of enterprise customers, including Lowe’s and Best Buy. Customers have Streem’d over three million minutes.

I led user research to improve the usability of Streem's AR tools, resulting in enhanced selected states, a redesigned call control interface, and the cancellation of unnecessary features that saved development time.

The Challenge

Were our users getting the best experience with our tools? If there were multiple objects in the room, could they adjust one, and did they know it was even selected? This is what I wanted to find out.

The Process

I figured a great place to start was by asking the people who had designed and spent the most time with the tools: our team. I spoke with 19 Streem employees, from the CEO to the Customer Success team member who talks to our customers about the tools every day.

Several top takeaways from this study:

  1. Editing existing measurements is very hard. Most people delete and try again instead.

  2. The Laser and Marker tools are redundant. Why isn’t the marker a 3D AR element?

  3. The arrow might not be useful.

  4. The label tool could be useful if it persisted into later Streem sessions.

These were findings I hoped to confirm or disprove through user testing.

User Testing

I ran a study with 7 users, asking them to play the expert and take a number of actions in my space: position 3D arrows to point out a feature, label items and then edit the labels, take measurements, and so on. Finally, I asked users to end the call; nobody thought to tap “the pill” (the timestamp in the top right of the screen that ends the call).

Five of the top insights were:

  1. Snapping to 3D “feature points” led to unexpected measurements

  2. People delete rather than edit

  3. Context-switching worked

  4. Selected states are not clear

  5. People were not initially able to exit the call with our current UI

Feature points are points in 3D space that the product recognizes. We’re addressing the issue of snapping to unexpected feature points in a future project that will include just-in-time onboarding instructions to help users get better scans.

The fact that people delete rather than edit meant that one of the main outcomes of the tool research was actually NOT to build a feature.

  • We had designed and put on the roadmap an “undo” feature for each of the AR tools. After seeing that users tended to just delete an element and try again, we realized that this would be wasted dev time.

  • The findings around context-switching also led us not to build a feature. We had planned to put tools in modes so the user could focus on the tool they were working with. In fact, people had no problem making edits in one tool and switching to the next.

Left: Measurement floating in space. Middle three: Which is selected? Right: Call UI “Pill” (in top right of screen) contains call controls, but nobody could find them.

Survey Results

I found that there was confusion around which element on the screen was selected. I ran a survey of existing and proposed selected states for the various tools to see which designs were most effective. I had 32 respondents and the answers were pretty clear: Adding a glowing animation to the selected element was the most effective way of showing it was selected.

Updating the tools’ selected states to the glowing version has been added to the backlog.

IA and the Drawer

We’ve known for a long time that our solution for the call controls (Speaker/Mute/End Call) is flawed. The user needs to tap on the timestamp in the top right of the screen to open the call controls, and there are no signifiers to help; users figure it out only by process of elimination.

The tricky thing about Streem’s situation is that the user might be using an AR tool (like measure) at the time, so we can’t have controls that appear when you tap the screen, as in FaceTime: that tap would start or end a measurement in the space.

I decided that a drawer was the best solution. It had the added (huge) benefit of giving us somewhere to put all of the call details (name, address, map, Streemshots, serial numbers, etc.). I planned to test a number of drawer designs with users; the design below and to the right was my favorite, but I left Streem before getting a chance to test them.

Results

It’s interesting that so much research and testing can lead one to cancel features that were assumed to be helpful. The tools themselves were so well designed from the start that they needed only incremental improvement: updates to their selected states. Instead, we could turn our attention to the information architecture: how to add all of the call details in an accessible and intuitive way.
