I recently had the pleasure of interviewing Stephen P. Anderson for The Informed Life. (Episode coming soon!) Among many other things, we discussed a concept from his new book with Karl Fast, Figure it Out: a cockpit as a key component of a pilot’s cognitive apparatus. As Stephen and Karl put it in the book, “An airplane cockpit is an environment loaded with external representations that make flying easier and safer.”
I won’t spoil the fun of our conversation here. (If you’re curious, I recommend you read the book, which is great.) I only mention it now because yesterday I saw a video that explains in detail the user interface of an F-15 jet fighter:
Among the video's many insights: a sense of the highly tactile nature of the aircraft's physical controls, such as the various buttons and knobs on the control stick — including the "castle" switch and the "pickle" switch. (Yes, pickle. The fighter pilot who takes us through the cockpit explains the name's origin.) The cockpit seems like an environment designed to minimize the distance between the pilot's reflexes and the jet's actuators.
I learned a lot from this video, and was left with high expectations — it’s labeled as the first of a series called Human Interface. Subscribed.
In other words, the app is ditching labels in buttons, and going only with icons.
As noted above, the objective is to make the experience more accessible globally. I’m taking this to mean that interfaces with few or no words on them are easier to translate into different languages than those that include words. But that’s a gain for designers and developers, not necessarily for users.
This label-less approach to buttons works best in a domain that already has rich iconography. Spotify operates in such a domain — i.e., music players. If you had to design an interface with buttons to play, pause, or go backwards or forwards, you'd have a clear starting point; we've had devices with buttons for these actions for a long time. My preferred music player (Apple Music) also has label-less buttons for those actions:
Modern music-playing apps must also accommodate other actions that may not have well-known icons. For example, the video above highlights the user downloading a track to their phone. In the new Spotify UI, the button for this action shows a downward-pointing arrow inside a circle. Downloading tracks from the internet to local devices hasn’t been around as long as play/pause/rewind has. I’d bet more folks would get tripped up by such a button in the absence of labels.
Note in the video that some buttons in the new Spotify interface don't have icons at all; they consist solely of labels. The "Follow" button, for example, is simply the word "Follow." When pressed, the label changes to "Following" to indicate the object's changed state. This is a rich interaction that would be difficult to communicate clearly using only icons.
From a report in The Verge about the aftermath of the collision of the USS John S. McCain, which killed ten sailors and injured many more:
The US Navy will replace the touchscreen throttle and helm controls currently installed in its destroyers with mechanical ones starting in 2020, says USNI News. The move comes after the National Transportation Safety Board released an accident report from a 2017 collision, which cites the design of the ship’s controls as a factor in the accident.
Not the only factor, to be sure; fatigue and lack of training also played a role. Still:
Specifically, the board points to the touchscreens on the bridge, noting that mechanical throttles are generally preferred because "they provide both immediate and tactile feedback to the operator." The report notes that had mechanical controls been present, the helmsmen would have likely been alerted that there was an issue early on, and recommends that the Navy adhere to better design standards.
There are systems in which humans are expected to play key control roles. A destroyer is one such system; as far as I know, fully autonomous vessels of this size don’t yet exist. User interfaces aren’t only means for users to send commands to systems; UIs also provide feedback to users. Some of this feedback is visual and auditory. But humans are creatures with more than two senses, and traditional mechanical controls can provide richer interactions than touchscreens, most of which rely primarily on sight.
When I was first getting started with computers, in the late 1970s, user interfaces looked like this:
Getting the computer to do anything required learning arcane incantations and typing them into a command line. While some — such as LOAD and LIST — were common English words, others weren’t. Moving to a new computer often meant learning new sets of incantations.
As with all software, operating systems and applications based on command line interfaces implement conceptual models. However, these conceptual models are not obvious; the user must learn them either by trial and error — entering arbitrary (and potentially destructive) commands — or by reading the manual. (An activity so despised it's acquired a vulgar acronym: RTFM.) Studying manuals is time-consuming, cognitively taxing, and somewhat scary. This held back the potential of computers for a long time.