
Everywhere and nowhere: the future of interfaces

A glimpse into the future of interfaces and what will—and won’t—change for UX designers.

Tim Letscher | October 21, 2016

A song comes on the radio as I drive home, and it instantly unlocks memories of a trip I took to Ireland with my wife. Swiping two fingers up my arm, I add the song to a Spotify playlist and tap three fingers on my heart to text my wife that I love her. With temperatures rising earlier today and knowing I’ll be the first one home tonight, I rub my hand down the upper part of my arm two times, notifying my Nest thermostat to drop the A/C by 2° before anyone gets home.
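To make that scenario concrete, here is a rough, purely illustrative sketch of the kind of gesture-to-action mapping a connected sleeve might drive. Every name in it is hypothetical; none of it reflects a real Jacquard, Spotify, or Nest API.

    # Illustrative only: hypothetical handlers standing in for real services.
    def add_song_to_playlist(song_id):
        print(f"Adding {song_id} to the Ireland road-trip playlist")

    def send_text(recipient, message):
        print(f"Texting {recipient}: {message}")

    def lower_thermostat(degrees):
        print(f"Dropping the A/C setpoint by {degrees} degrees")

    # Each recognized fabric gesture maps to one action.
    GESTURE_ACTIONS = {
        "two_finger_swipe_up": lambda: add_song_to_playlist("current song"),
        "three_finger_tap_chest": lambda: send_text("spouse", "I love you"),
        "double_rub_upper_arm": lambda: lower_thermostat(2),
    }

    def on_gesture(gesture):
        """Dispatch a recognized gesture to its action; ignore anything unknown."""
        action = GESTURE_ACTIONS.get(gesture)
        if action:
            action()

    on_gesture("double_rub_upper_arm")  # drops the A/C by 2 degrees

The point isn't the code. It's that the hard questions remain design questions: which gestures map to which intents, and how people learn and trust them.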

Sounds like the distant future, right? It’s closer than you think.

Since spring 2015, Google has been working in its labs on Project Jacquard—connected clothing that communicates with our surroundings. When this type of tech becomes widely distributed and adopted, what's the interface? Better yet, where is the interface?

Project Jacquard may sit in the niche space of fashion tech, but everywhere you look, emerging technologies are doing away with the need for a keyboard, mouse, or even a touchscreen.

Case in point: pizzas ordered with a tweet. Laundry detergent purchased and shipped with the touch of an Amazon Dash button by your washing machine. Voice commands that change TV channels. These examples break the confines of the traditional interface, and they’re fantastic. But where does this new world leave the user experience designer?

The widely accepted definition of the interface, as UX knows it, is arguably only a decade or two old. It's pixel dimensions, Retina displays, and X-Y coordinates. User experience and user interface designers have spent years honing their craft and perfecting the digital experience. Just when mouse interactions were approaching the sublime (remember Don't Click It, circa 2005?), along came touch interfaces, forcing an expanded vocabulary and library of gestures to describe the intended experience. Even the world of touch interactions is turning upside down. Just as designers are mastering gestures like pinching and swiping, advances like Apple's Force Touch and 3D Touch are making them consider what happens when users press lightly or firmly on a glass surface or trackpad.

Is it time to toss that learning out the window?

Hardly. While new mechanics and technologies need to be incorporated into our processes, these core UX principles will always apply:

  1. Serve a need. Always design with empathy for the people you'll serve. Do only what's necessary, not everything that's possible.

  2. Design for interaction feedback. Without progress bars or spinning beach balls, how will you let someone know their waving hand worked? What feedback do you provide to a person's voice command? (See the sketch after this list.)

  3. Design for intent. Think about intended behavior and subsequent outcome. It’s less about information architecture and more about choice architecture.

  4. Don’t get in the way. For every interaction you design, ask, “Does this slow someone down?” While Steve Krug’s first edition of Don’t Make Me Think was written through the lens of designing for screens, the book title’s core tenet still applies.
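As a sketch of principle #2, here is what acknowledging a screenless command might look like. The haptic and voice channels below are placeholders for illustration, not a real device API.

    # Illustrative sketch: confirm every command attempt, even a failed one.
    def haptic_pulse(count=1):
        print(f"[haptic] {count} short pulse(s)")

    def speak(message):
        print(f"[voice] {message}")

    def handle_command(command, confidence):
        """Acknowledge the attempt whether or not recognition succeeded."""
        if confidence < 0.6:
            haptic_pulse(2)                 # "heard you, didn't understand"
            speak("Sorry, I didn't catch that.")
            return
        haptic_pulse(1)                     # "command registered"
        speak(f"Okay, {command}.")          # confirm what's about to happen

    handle_command("lower the temperature two degrees", confidence=0.92)
    handle_command("mumbled request", confidence=0.30)

Even without a screen, the person still needs to know that the system heard them, understood them, and is about to act.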

As a UX designer, keep the user as the primary concern. Start with empathy for their needs and motivations rather than losing sleep chasing down the latest changes in technology. In the end, the most important thing is to design a relevant and delightful experience for people. And when you focus on your users, you'll be doing just that.

Tim Letscher originally published this post on LinkedIn Pulse.


Tim Letscher is a solution principal in Slalom Minneapolis. He oversees the execution strategy of programs that guide client interactions online and across the broader digital experience, including how the digital world is constantly bleeding into the physical. Follow him on Twitter: @let5ch.