A version of this post first appeared in my newsletter. Subscribe to receive posts like this in your inbox every other Sunday.
Last week, Apple held its Worldwide Developers Conference (or WWDC). At this annual event, the company announces significant changes coming to its software platforms (including iOS, the iPhone’s operating system). As with many other such gatherings this year, WWDC 2020 was held online. But that’s not what interests me most about this year’s event. Instead, I was intrigued by the steps Apple is taking to bring coherence to its ecosystem.
Apple is in a unique position in this regard. The company makes operating systems for a variety of computing device form factors: phones (iOS), tablets (iPadOS), watches (watchOS), TVs (tvOS), and personal computers (macOS). Apple competes with other companies in each of these segments, but no other company has strong contenders in all of them.
All of these Apple platforms have “family resemblances.” A first-time Apple Watch user will find details and interactions that are reminiscent of what she’s experienced on her iPhone. These resemblances make each new device more learnable and pleasant to use.
However, each platform has evolved to serve its particular niche in the ecosystem. (Some platforms, such as the Mac, have evolved over a very long time.) As a result, each provides different ways for the user to interact with the system. For example, iOS is built around multitouch (multi-finger gestures) as its primary interaction mechanism. iPadOS is iOS writ large, so it affords different gestures suited to larger screens. At the opposite end, watchOS also accommodates touch-based interactions, but on tiny screens. macOS and tvOS don’t support touch at all and depend instead on indirect manipulation: macOS through a pointer, and tvOS through a remote.
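For the developers in the audience, here’s what those divergent interaction models look like in practice: Apple’s tooling lets one codebase target all of these platforms, branching at compile time where they differ. This is a minimal SwiftUI sketch (the view name and copy are my own, purely illustrative), not a canonical pattern:

```swift
import SwiftUI

// One view definition shared across Apple's platforms. Conditional
// compilation keeps a single codebase while letting each platform
// present content suited to its interaction model and screen size.
struct GreetingView: View {
    var body: some View {
        #if os(iOS)
        // iPhone and iPad: roomy type sized for direct touch.
        Text("Hello from iOS and iPadOS")
            .font(.title)
        #elseif os(watchOS)
        // Watch: compact type for a tiny touch screen.
        Text("Hello")
            .font(.footnote)
        #elseif os(tvOS)
        // TV: navigated indirectly, via the remote and focus engine.
        Text("Hello from tvOS")
            .font(.title)
        #else
        // Mac: the pointer-driven WIMP environment.
        Text("Hello from macOS")
        #endif
    }
}
```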
The outlier in this ecosystem is macOS; its WIMP (windows, icons, menus, pointer) user interface paradigm has evolved over almost four decades. As an interaction mechanism, WIMP is very different from multitouch. Macs are also the only computers in Apple’s ecosystem that don’t run on ARM-based chips, a fact that has set them apart from the rest of the ecosystem.
But at this year’s WWDC, Apple announced its intention to transition Macs to the same chip architecture that powers iPhones and iPads. This opens up an intriguing possibility: “Apple silicon”-based Macs will run software designed for iOS and iPadOS. With this move, macOS will gain access to a huge software library – albeit one designed for a different interaction paradigm.
As a Mac user, I find the possibility exciting. There are several apps I use on my phone that currently have no good equivalent on my computer. But I also expect this crossing of boundaries to set up weird situations. macOS isn’t designed to accommodate touch-based interactions. For one thing, its controls are sized for a pointer: they’re too small and too close together for our (relatively) fat fingers. Macs also lack capabilities that iPhones and iPads have, such as GPS and rear-facing cameras.
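Developers will have ways to smooth over some of these situations. The iOS 14 SDK lets an app detect when it’s running, unmodified, on an Apple silicon Mac via ProcessInfo. Here’s a minimal sketch of adapting spacing for pointer versus touch; the button wrapper and padding values are my own illustrative assumptions, and it presumes an iOS 14 deployment target:

```swift
import Foundation
import SwiftUI

// A sketch of a button that loosens or tightens its hit target
// depending on whether the app is running as an iOS app on a Mac.
// The padding values are illustrative, not Apple's recommendations.
struct ActionButton: View {
    let title: String
    let action: () -> Void

    private var hitPadding: CGFloat {
        // isiOSAppOnMac (iOS 14 SDK) is true for an unmodified
        // iOS/iPadOS app running on an Apple silicon Mac.
        if ProcessInfo.processInfo.isiOSAppOnMac {
            return 8   // a pointer is precise; targets can sit closer
        } else {
            return 16  // fingers need larger, well-spaced targets
        }
    }

    var body: some View {
        Button(title, action: action)
            .padding(hitPadding)
    }
}
```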
The next version of macOS is getting a significant makeover to bring it more in line with iOS and iPadOS. It’s reasonable to expect touch-enabled Macs in the future. But will Macs also gain the array of sensors we have in our mobile devices? I wouldn’t expect so; these are different platforms that serve different needs. Macs will fundamentally remain personal computers, iPhones smartphones, and iPads tablets. That said, I expect we’ll see the lines between these platforms continue to blur.
It’s a fascinating design problem: how do you bring coherence and parity to a diverse ecosystem, while also allowing each platform to shine at what it does best? As a designer and an Apple customer, I’m excited to see how the company moves to address this challenge.