From a report in The Verge about the aftermath of the collision of the USS John S. McCain, which killed ten sailors and injured many more:
The US Navy will replace the touchscreen throttle and helm controls currently installed in its destroyers with mechanical ones starting in 2020, says USNI News. The move comes after the National Transportation Safety Board released an accident report from a 2017 collision, which cites the design of the ship’s controls as a factor in the accident.
Not the only factor, to be sure; fatigue and lack of training also played a role. Still:
Specifically, the board points to the touchscreens on the bridge, noting that mechanical throttles are generally preferred because “they provide both immediate and tactile feedback to the operator.” The report notes that had mechanical controls been present, the helmsmen would have likely been alerted that there was an issue early on, and recommends that the Navy adhere to better design standards.
There are systems in which humans are expected to play key control roles. A destroyer is one such system; as far as I know, fully autonomous vessels of this size don’t yet exist. User interfaces aren’t just a means for users to send commands to systems; they also provide feedback to users. Some of this feedback is visual and auditory. But humans are creatures with more than two senses, and traditional mechanical controls can provide richer interactions than touchscreens, which rely primarily on sight.
Reading about the USS John S. McCain accident made me recall one of my favorite digital cameras, the Canon PowerShot G10. The G10 is over a decade old. It takes beautiful, highly detailed photographs — under some conditions. But these days, most of the pictures it makes don’t compare well to those that come out of my iPhone 8 Plus, which is smaller and more versatile to boot. (It does so many more things besides taking pictures!)
Still, I can coax great shots out of the G10, especially under extreme conditions. That’s in no small part because the G10 has something the iPhone doesn’t: dedicated manual controls for key photographic settings. Because of these tactile controls, I have finer control over the operation of the G10 than I do over the iPhone. On the G10, I can change settings very quickly with my fingers, using only the sense of touch. After many years of using it, I’ve developed the muscle memory to adjust ISO, exposure compensation, aperture, etc. in a flash. (Pardon the pun.) This is much faster than having to shift my gaze from the thing I’m trying to photograph to a control-laden touchscreen.
As a result, I have a much more intimate relationship with the G10 than I have with the iPhone’s camera. With the G10, I’m part of the system that takes the photo. With the iPhone, my role is diminished to two primary functions: framing and snapping the shot. (I use the verb “snap,” but this doesn’t really capture the action of determining the decisive moment on a touchscreen-based camera, which happens silently and with no tactile feedback at all.)
The iPhone’s touchscreen-based camera abstracts out a lot of complexity. For a time- and attention-critical activity (such as photograph-making), that complexity is where a lot of the nuance lives. Nuanced control is especially important when dealing with extreme conditions such as too much light, too little light, fast-moving subjects, subjects that are very close to the lens, etc. When dealing with average conditions, a mostly automated system with few tactile controls, like the iPhone’s camera, can do a better job for most people. But when dealing with extreme conditions, expert users will produce better results with a system that affords richer control mechanisms. I suspect the same is true for other systems. (Another one that immediately comes to mind is vehicles with standard shift gearboxes.)
As software eats more of the world, we’re abstracting more controls to panes of glass. Touchscreen-based user interfaces are more flexible and cheaper to design, develop, deploy, and iterate than dedicated physical controls. But touchscreens come at the expense of nuanced control in critical situations, where tighter feedback loops between the system and its operators can make all the difference.