By now, many have heard of HoloLens, Microsoft’s hands-free, fully untethered augmented reality computing device, since it has been available in the early market for some time. The device lets people interact with their environment in ways that have never been seen before, outside of science fiction. With these new paradigms come new challenges in designing a well-polished and compelling user experience on HoloLens. I have found that nailing down a few key areas early in design helps the overall application build go smoothly.
The experience and its environment
Given any use case for HoloLens, it is very important to consider where the application is meant to be used and which of the platform’s capabilities are needed. Early in design there are many questions that need to be asked, such as:
Where is the end user going to be using the device? Is the space small and confined, or large and open?
What kind of lighting is in the space?
Will there be a lot of activity occurring in the space while the application is running?
What sort of materials and obstacles are in the target space, and could they interfere with the device’s sensors or cameras?
Is this a single-user experience, or is it meant to be shared?
Is it necessary to take advantage of the device’s capabilities such as spatial mapping or spatial audio?
Many of these questions seem obvious, but they can be overlooked during design, and the answers greatly affect how features are implemented on the technical side, which can have a significant impact on overall project cost. Capabilities like spatial audio, for instance, can be prototyped very early; a rough sketch follows.
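As one illustration, the snippet below is a minimal sketch of giving a hologram spatialized sound in a Unity-based HoloLens project. It assumes a spatializer plugin (such as the MS HRTF Spatializer) has been enabled in the project’s audio settings; the HologramSound class and hologramHum clip are just placeholder names.

```csharp
using UnityEngine;

// Minimal sketch: attach to a hologram so its sound appears to come from
// the hologram's position in the room. Assumes a spatializer plugin
// (e.g. the MS HRTF Spatializer) is selected in the Unity audio settings.
public class HologramSound : MonoBehaviour
{
    public AudioClip hologramHum;   // placeholder clip

    void Start()
    {
        AudioSource source = gameObject.AddComponent<AudioSource>();
        source.clip = hologramHum;
        source.loop = true;
        source.spatialize = true;      // route through the spatializer plugin
        source.spatialBlend = 1.0f;    // fully 3D: volume and direction depend on position
        source.Play();
    }
}
```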
User interaction
We are used to interacting with computers and smartphones every day through a mouse and keyboard, touch screens, and screens in general. But those conventions belong to a 2D world of computing, and the well-established UX patterns we are familiar with often don’t apply to a 3D holographic application. The interaction methods on HoloLens primarily involve gaze, gesture, and voice.
For example, consider a simple scenario: a user on HoloLens is looking at a floating menu pane with several buttons to select.
There are several ways the user can interact with these buttons. We are used to pointing and clicking, so the user could gaze at the option they wish to select and perform an “air-tap” gesture to “click” the button. Or the user could look at the menu pane and simply speak the button’s text to select it. The user could even hold their gaze on an option for a moment to select it. The final design could use one of these interaction methods or a combination of them, whichever makes the most sense for the overall experience. Taking the time to think through user interaction is key to making a 3D application both user friendly and functional.
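To make this concrete, here is a minimal sketch of how these three inputs might be wired together in a Unity-based HoloLens project. It assumes the UnityEngine.XR.WSA.Input and UnityEngine.Windows.Speech APIs that shipped with Unity for the first-generation HoloLens; the MenuButton tag, the button names, and the OnButtonSelected method are hypothetical placeholders for your own menu logic.

```csharp
using UnityEngine;
using UnityEngine.Windows.Speech;   // KeywordRecognizer for voice commands
using UnityEngine.XR.WSA.Input;     // GestureRecognizer for the air-tap gesture

// Minimal sketch: select a menu button via gaze + air-tap, or via voice.
// The "MenuButton" tag, button names, and OnButtonSelected() are placeholders.
public class MenuInput : MonoBehaviour
{
    private GestureRecognizer gestureRecognizer;
    private KeywordRecognizer keywordRecognizer;
    private GameObject gazedButton;   // button currently under the user's gaze

    void Start()
    {
        // Air-tap: treat a tap as a "click" on whatever the user is gazing at.
        gestureRecognizer = new GestureRecognizer();
        gestureRecognizer.SetRecognizableGestures(GestureSettings.Tap);
        gestureRecognizer.Tapped += args =>
        {
            if (gazedButton != null)
            {
                OnButtonSelected(gazedButton);
            }
        };
        gestureRecognizer.StartCapturingGestures();

        // Voice: speaking a button's label selects it directly.
        keywordRecognizer = new KeywordRecognizer(new[] { "start", "settings", "exit" });
        keywordRecognizer.OnPhraseRecognized += args =>
        {
            // Assumes each button GameObject is named after its spoken label.
            GameObject button = GameObject.Find(args.text);
            if (button != null)
            {
                OnButtonSelected(button);
            }
        };
        keywordRecognizer.Start();
    }

    void Update()
    {
        // Gaze: raycast straight out from the user's head to find the button they are looking at.
        RaycastHit hit;
        if (Physics.Raycast(Camera.main.transform.position, Camera.main.transform.forward, out hit)
            && hit.collider.CompareTag("MenuButton"))
        {
            gazedButton = hit.collider.gameObject;
        }
        else
        {
            gazedButton = null;
        }
    }

    private void OnButtonSelected(GameObject button)
    {
        Debug.Log("Selected: " + button.name);  // replace with real menu behavior
    }

    void OnDestroy()
    {
        gestureRecognizer.Dispose();
        keywordRecognizer.Dispose();
    }
}
```

The point is not this specific wiring, but that gaze, gesture, and voice each map to a small, separate piece of code, so supporting a combination of them is cheap once the design calls for it.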
Body-storming
Body-what?! I asked the same question when I first heard about this concept, but, as the name implies, body-storming is sort of like brainstorming with your body. Applied to holographic application design, it is simply this: literally acting out the experience in physical space with low-tech props (boxes, pipe cleaners, paper, Styrofoam shapes, etc.) standing in for holograms, menus, and other effects. It is also where someone walks through the scenario from the user’s (or users’) perspective to make sure the overall experience makes sense and “works”.
Some may be thinking, “How can this help when designing a holographic app?! It sounds like a complete waste of time,” especially since it involves no code whatsoever. I was a bit skeptical of body-storming at first, but after trying it out and watching others do so, I have become a firm believer in the process and have found that it really does help in creating a superior user experience.
For example, consider the floating menu pane from before. How does the menu appear? Does the user say a keyword to summon it, or does it appear when the application starts? Is it always floating in front of the user, or do they dock it on a nearby wall? Maybe the menu follows the user in a “tag-along” manner, or it appears near or on a hologram placed in the room. What happens when the user walks around the room? Does the menu move and always stay facing the user?
Acting out the scene (with the help of a few people) can answer these questions quickly, and you will soon figure out what works and what doesn’t. Notice that these questions about the floating menu pane didn’t even address how the user selects options; those interactions, too, can be acted out via body-storming to figure out what feels natural and what makes the most sense for the scenario.
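Once body-storming settles on a behavior, such as a menu that tags along with the user and keeps facing them, that behavior usually translates into a small piece of code. The following is a minimal sketch assuming a Unity-based project; the distance and follow speed values are arbitrary starting points meant to be tuned by acting out and testing the experience.

```csharp
using UnityEngine;

// Minimal "tag-along" sketch: the menu drifts toward a point in front of
// the user and keeps itself facing them. The distance and speed values are
// arbitrary starting points to refine through testing.
public class TagAlongMenu : MonoBehaviour
{
    public float distance = 2.0f;     // meters in front of the user
    public float followSpeed = 2.0f;  // how quickly the menu catches up

    void LateUpdate()
    {
        Transform head = Camera.main.transform;

        // Target position: a fixed distance along the user's gaze direction.
        Vector3 target = head.position + head.forward * distance;
        transform.position = Vector3.Lerp(transform.position, target, followSpeed * Time.deltaTime);

        // Billboard: rotate so the menu always faces the user.
        transform.rotation = Quaternion.LookRotation(transform.position - head.position);
    }
}
```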
As you begin thinking through ideas for HoloLens, it becomes more and more obvious how important it is to think through user experience and design before a single line of code is written. Asking these questions, and making yourself a little uncomfortable with body-storming, will end up saving hours of development time and make the finished product far more likely to succeed.