One of the most important insights from the last few years has been that replicating the UI and UX of Web sites or mobile apps on in-store devices really doesn’t work. Although there are similarities between them, they are fundamentally quite different.
In the early days of Grid, we used to offer a Web Kiosk app – basically, your Web site on a kiosk in-store. It was a quick, easy, and cheap solution. All it did was display your existing site on a touchscreen, in a custom browser with an on-screen keyboard, or even a physical keyboard. Instantly, customers using a kiosk in your store could do anything they could do on their laptop or mobile.
Except... it didn’t work, and we withdrew it from sale.
Technically, it worked just fine. That wasn’t the problem.
The issue was more fundamental. The way customers use an in-store device isn’t the same as the way they use a Web site. As with any design issue, there are two major considerations: form, and function.
Matching the UI to the physical device
Many years ago, designers realized that replicating the desktop UI on a mobile device wasn’t ideal. Apps needed to be designed around a touch-based interface and gestures, rather than keyboard and mouse. Screen layouts were redesigned so that the most common features were easily accessible to the user’s thumbs – in other words, putting them at the bottom of the screen, not the top. The content of each screen was optimized for a small display, almost always in portrait mode. And, as mobile hardware and software developed, apps were redesigned to take advantage of cameras, fingerprint scanners, motion, location, and so on.
We’re going through the same learning curve with in-store devices. The devices we work with include large touch screens, vertically mounted smart mirrors and digital signage, and a plethora of specialized devices that incorporate RFID readers, cameras, voice input, and barcode readers. Using these devices is nothing like using a phone or a laptop, even if they’re displaying the same underlying content and offering similar functionality. They may look like giant iPads, but they’re not.
Consider, for example, how far you’re asking your customers to reach in order to touch a specific area of the screen. An icon at the top of a large vertically mounted screen could require them to stretch up to head height – or above. For shorter customers or young children, this could be a major problem. And if a customer in a wheelchair can’t reach it, this could be a violation of accessibility legislation.
The screen layout needs to be designed around each individual device, and the right layout differs with the size and orientation of the screen. For example, if you’re asking users to scan a QR code on a horizontal screen, you need to ensure that it’s close to the user, which means placing it near the bottom (front) edge. On a vertical screen, on the other hand, it’s probably better to place it in the middle of the screen.
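These placement rules can be expressed as simple layout logic. The sketch below is purely illustrative – the function names, thresholds, and screen model are our assumptions, not part of any real kiosk SDK. The reach limit is based on the common accessibility guideline of roughly 48 inches (~1220 mm) for an unobstructed reach, but you should verify the figure against the legislation that applies to you.

```typescript
type Orientation = "horizontal" | "vertical";

// Assumed accessible maximum: ~48 in (1220 mm) above the floor,
// in line with common unobstructed-reach accessibility guidance.
const MAX_ACCESSIBLE_REACH_MM = 1220;

interface KioskScreen {
  orientation: Orientation;
  bottomEdgeHeightMm: number; // height of the screen's bottom edge above the floor
  heightMm: number;           // physical height of the display area
}

// Returns the vertical position (0 = bottom edge, 1 = top edge) at which
// to center an interactive element such as a QR code.
function qrVerticalPosition(screen: KioskScreen): number {
  if (screen.orientation === "horizontal") {
    // Table-style screen: keep the code close to the user,
    // i.e. near the bottom (front) edge.
    return 0.1;
  }
  // Vertical screen: aim for the middle of the display...
  const middleMm = screen.bottomEdgeHeightMm + screen.heightMm / 2;
  if (middleMm <= MAX_ACCESSIBLE_REACH_MM) {
    return 0.5;
  }
  // ...but never above the accessible reach limit. Clamp the position
  // so the element sits at or below that height.
  const reachable =
    (MAX_ACCESSIBLE_REACH_MM - screen.bottomEdgeHeightMm) / screen.heightMm;
  return Math.max(0, Math.min(0.5, reachable));
}
```

The same pattern extends naturally to other interactive elements: compute the physical position each layout choice implies, then clamp it against the reach constraints of your least-mobile users.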
What customers do in-store
One of the most important things we’ve come to realize is that in-store devices need to offer subtly different functionality. There are two main reasons for this: purpose and context.
In-store customers have different needs from mobile or desktop users. A typical online customer spends 90 minutes researching, reading reviews and recommendations, and comparing different products before making a major purchase. They may do this in several stages, perhaps over a period of several days or weeks.
However, an in-store customer probably doesn’t want to spend an hour or more standing at a kiosk doing research. They’re already in the store, so their main aim is usually to find the products they’re looking for, look at them or feel them, and then pay for them. They may want to know whether the item they’re looking for is in stock right now, and where to find it, or they may want to call someone for help or advice. They may want to pull up product information by using a barcode scanner or simply holding something up to the screen. And when it comes to payment, they don’t want to type in their payment details, or even log in to their account: they want to pull out their phone or credit card, tap, and go.
Something else to consider is that using a shared device in a public space is a very different experience from using a personal device. Other customers may be able to see what you’re doing, which some people find uncomfortable. If there’s a line behind you, you may feel pressured to complete your transaction quickly, rather than spending time browsing or researching. That alone changes the way you need to think about the UX.
Another aspect that’s become apparent to us is something that’s blindingly obvious once someone points it out. A customer using an in-store device is already surrounded by your branding, inventory, and customer experience. If they’re standing in ACME Megastore, you don’t need to use valuable screen real estate to tell them they’re using an ACME Megastore kiosk. Almost every client asks for their logo to be visible on the screen at all times, usually at the top left, like on their app or website. But it’s simply not necessary. It adds to the screen clutter and the cognitive load, and usually fulfills no useful function for the customer.
As a result, the in-store UX needs to be optimized to reflect the needs of in-store customers. Some functionality or design elements that are useful on the Web or in the app may need to be removed, and other store-specific functionality needs to be added.
Making it intuitive
Part of the challenge is that these devices have to be completely intuitive. Customers expect them to work like an iPad, so they’re easily confused if the UI isn’t what they expect. This requires some subtle – and not-so-subtle – design that tells customers what to do.
The first thing we learned is that cluttered or multi-functional interfaces really don’t work well on in-store devices. Simplicity is much more effective. The functionality needs to be simple, linear, and crystal clear. Don’t overload the screen with icons or content, and don’t be afraid to put instructions on the screen.
For example, Find Your Style is a Grid-based guided selling tool. Users indicate by swiping left or right whether they like various looks, and an AI then suggests clothing ranges and outfits for them. In its original version, the screen showed a carousel of stacked images that disappeared off to the left or right, simulating swiping behavior.
What we discovered is that most customers didn’t realize the screens were interactive. They assumed they were just digital signage and walked right by.
We had to find a better way to attract their attention and get customers to stop and engage. We changed the on-screen text to “Tap to find your style,” but that still wasn’t enough.
What we eventually did was to add an animated hand that performs the swipe. This hints to users that the screen is interactive and also shows them exactly what to do. Everything else stayed the same. The effect of this simple change was astonishing. The day after we introduced the new UX, the number of people using the device increased by 80%, and the number of interactions increased almost 600%.
A successful in-store UX needs to take account of hundreds of tiny details like this. Although it’s tempting to reuse the UX from Web or mobile, doing so is likely to result in a poor user experience. In-store UX is its own thing, and needs to reflect both the physical form of the devices and the needs of the users.