How to Compromise a Product Vision

Great products start with a vision. Somebody — perhaps a small group of people — has an idea to change how something works in the world. On its way to becoming a real thing, the team tweaks and adjusts the idea; they make small compromises to accommodate the laws of physics, market demands, manufacturing constraints, user feedback, and so on. In the process, the idea goes from a “perfect” imagining of the vision to a pretty good embodiment that can be used by people in the real world.

At least that’s the ideal. Sometimes, however, a product changes so much that its original vision becomes compromised. One of the best examples I’ve seen of this happened to an attraction in the Magic Kingdom theme park at Walt Disney World: Walt Disney’s Carousel of Progress. This is one of the few Disney attractions that bear Walt’s name, and there’s a good reason for that. The Carousel was the highest expression of his particular genius: using new technologies to convey big ideas to the masses in ways they could connect with emotionally. Some people say it was his favorite attraction.


In Praise of Email, Which I Want Less Of

For many people today, most work happens in information environments. Much of it consists of collaborating on and coordinating activities. In other words, it requires communicating with other people. There are various ways for people to communicate in information environments. Email is one of the oldest. It’s also one of the best.

You can place communication channels on a continuum based on latency. On one end of this continuum, you have face-to-face communication, which is very low-latency. When somebody says something to you in your presence, you get what they’re saying almost as soon as the words leave their mouth. In many cases, there’s also a social expectation that you will reply right then and there. Long pauses can be awkward; you must respond, even if only to say, “I need a minute.” Face-to-face oral communication is what we call a synchronous channel — the back-and-forth between participants happens in “real time.”

On the other end of the continuum are communication channels that don’t carry such expectations, mostly for technical reasons. For example, if you send a question on a (physical) postcard to your friend halfway around the world, you know that the piece of cardboard that conveys your question will take some time to get to where your friend is. Your friend might not read your message for several days. If she chooses to respond with a postcard of her own, it too will take several days to reach you. Considerable time will elapse between when you send your question and when you get a reply. Thus, postal mail is what we call an asynchronous communication channel.
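One way to make this continuum concrete is to sort channels by how long a message takes to reach its recipient, treating near-instant delivery as carrying the expectation of an immediate reply. The Python sketch below is purely illustrative: the latency figures and the one-second cutoff are arbitrary assumptions, chosen only to illustrate the idea.

    # Illustrative sketch: classify channels as synchronous or asynchronous
    # based on how long a message takes to reach its recipient.
    from dataclasses import dataclass

    @dataclass
    class Channel:
        name: str
        latency_seconds: float  # time for a message to reach the recipient

        @property
        def is_synchronous(self) -> bool:
            # Rough heuristic: near-instant delivery tends to carry the
            # social expectation of a "real time" back-and-forth.
            return self.latency_seconds < 1.0

    channels = [
        Channel("face-to-face", 0.1),
        Channel("phone call", 0.3),
        Channel("email", 60.0),             # read whenever the recipient chooses
        Channel("postcard", 5 * 86_400.0),  # several days in transit
    ]

    for channel in sorted(channels, key=lambda c: c.latency_seconds):
        mode = "synchronous" if channel.is_synchronous else "asynchronous"
        print(f"{channel.name:>12}: {mode}")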


Think Better, Fast

The quality of your thinking is the factor that will most impact your life. Thinking well is essential to getting anything done, and this is as true for teams as it is for individuals. The better your thinking, the better the decisions you’ll make. This, in turn, will make it more likely you’ll achieve your goals. What would it be worth to you if you could think better, both individually and collectively? The dividends would be manifold.

The first step to thinking better is understanding how you think. Many people believe thinking is something that only happens in the brain, which they see as some kind of meat computer. This is a misunderstanding. Cognition is more complicated than that. The brain is only one part of a complex system that extends outside of the body. As I write these words, I see them appear on the display of my MacBook Pro. My fingers move over the keyboard, and characters appear on the screen. The sentences I write don’t emerge fully formed from my brain. Instead, the computer holds them for me in a temporary buffer where I can see and reflect on them. No, that’s not the right word; let’s try another one. I delete the word, type a new one. Over and over again. Little by little, the product of my thinking emerges from this brain-senses-fingers-keyboard-display system. The computer is part of my thinking apparatus, not superficially but deeply. It would be more difficult for me if I had to craft the sentences exclusively in my brain and then transcribe them in “finished” form.

Of course, the extension of your brain doesn’t need to be a computer. You can also think with a paper-based notebook, a marker on a whiteboard, a stick on wet sand, etc. When you sketch or take notes in a journal, the notebook becomes a part of your thinking system. When you use a pile of stones to work through an arithmetic problem, the stones and the ground they’re lying on become part of your thinking system. You work through ideas by seeing them “out in the world.” There, you can explore relationships between elements in greater detail than you could if you had to hold everything in your mind. You change things, move them around, try variations, iterate, refine — much as I’m doing with the sentences I write here.


Steve Jobs: The Lost Interview

Yesterday, on a cross-country flight, I had the opportunity to watch STEVE JOBS: THE LOST INTERVIEW, a documentary built around an interview recorded in 1995 and released to theaters shortly after Jobs’s death in 2011. As its name implies, the film consists of an interview Robert X. Cringely conducted with Jobs for THE TRIUMPH OF THE NERDS, a PBS documentary about the development of the personal computer. Footage from the interview was lost for a while but resurfaced after Jobs’s death.

The film shows Jobs at an interesting time in his life. This was before his triumphant return to Apple, which was then at its nadir. At this point, the company Jobs founded after leaving Apple (NeXT) had already transitioned from making computers to making software. It’s fascinating to see him frame this development; when talking about NeXT, he doesn’t mention the company’s computers at all. Instead, he talks about object-oriented programming as one of three major advances he witnessed in a visit to Xerox PARC in the late 1970s, the other two being Ethernet networking and the graphical user interface. The latter, of course, is what led to the development of the Mac. In this way, Jobs ties his past success to his (then) current endeavor. Jobs is very clear on the lineage of these technologies; he doesn’t claim to have invented any of them. (At one point he even cites Picasso’s famous quote, “good artists copy; great artists steal.”)


OneNote

You’re likely to run across lots of information during your day. Much of it is disposable, but some of it you’ll probably need to refer to in the future. A lot of it might be useful someday; you just don’t know right now. Given how easy it is to search digital information, and how cheap storage is these days, you may as well keep it. I’ve long experimented with “digital junk drawer” applications for this purpose. I’ve tried Evernote, Yojimbo, Google Keep, Apple Notes, and Org Mode for Emacs, but my favorite thus far is Microsoft’s OneNote.

I keep a lot of stuff in OneNote: clips from web pages, quotes from famous people, impressions from books I’ve read, ideas for future presentations, meeting minutes, half-formed thoughts, etc. OneNote provides easy means to clip snippets of information from web pages and other apps whether I’m on my Mac, iPhone, or iPad. This makes it possible for me to keep a central repository of things I’m learning as I go about my day. It all syncs through Microsoft’s cloud, so all three devices have the latest information on them.

But OneNote is more than just a scrapbook for me: It’s also where I keep my projects organized. Whenever I start a new project, I create a new notebook in OneNote devoted exclusively to it. OneNote notebooks can have “sections” in them. Most of my projects have at least two sections: “Notes” (random notes, including scribbles to myself) and “Meetings,” where I record meeting minutes. Some notebooks also have other sections, such as “Admin” and “Research.” I aim for consistency in the naming and color schemes I use to differentiate these sections. This allows me to quickly make sense of what I’m looking at when I switch projects.
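Incidentally, this setup lends itself to scripting. The sketch below shows how such a project notebook might be created through the Microsoft Graph OneNote API; it’s illustrative only. It assumes you’ve already obtained an access token with the Notes.ReadWrite permission, and the helper name and default sections are my own inventions.

    # Illustrative sketch: create a project notebook with standard sections
    # via the Microsoft Graph OneNote API. Assumes ACCESS_TOKEN holds a
    # valid token with the Notes.ReadWrite scope.
    import requests

    GRAPH = "https://graph.microsoft.com/v1.0/me/onenote"
    ACCESS_TOKEN = "..."  # obtain via your OAuth flow of choice
    HEADERS = {
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Content-Type": "application/json",
    }

    def create_project_notebook(project_name, sections=("Notes", "Meetings")):
        """Create a notebook for a new project with a standard set of sections."""
        resp = requests.post(f"{GRAPH}/notebooks", headers=HEADERS,
                             json={"displayName": project_name})
        resp.raise_for_status()
        notebook_id = resp.json()["id"]

        for section_name in sections:
            requests.post(f"{GRAPH}/notebooks/{notebook_id}/sections",
                          headers=HEADERS,
                          json={"displayName": section_name}).raise_for_status()
        return notebook_id

    # For example:
    # create_project_notebook("Client redesign", ("Notes", "Meetings", "Research"))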


Goodbye Google+

In a blog post published yesterday, Google announced plans to shut down the consumer version of its social network, Google+, over the next ten months. I’ve been quietly posting to Google+ over the past few years — mostly out of habit, since there’s so little activity there. Given this announcement, I’ve now stopped posting there.

I’m saddened to see Google+ go; the network explored many interesting structural ideas during its existence. I also found some Google+ communities to be more thoughtful and useful than those on other social networks. Still, I always got the sense that Google+ wasn’t as diverse a place as Facebook or Twitter; that it never transcended its initial audience of early adopters. I’m not surprised by Google’s decision to discontinue it. (Again, as a consumer offering; Google+ will live on as an enterprise service.)

If you followed me on Google+, the best ways to keep up with my posts from now on are:

How Do You Know What’s True?

How do you know what’s true? I mean this in the most prosaic way you can imagine. As in: how do you know which facts about the “real” world should inform your decisions? Your answer to this question will have a big impact on your life.

Imagine you’re walking in the dark and hit a wall. The pain you feel is a good sign that the wall is “true.” It’s unarguably there; you just ran into it. The bump proves it. There’s a baseline here: what your senses are reporting back to you. The pain is information. If you have any wits, you’ll avoid doing that again.

Not everything is as obviously true. Let’s take another example. Say you’re trying to manage your weight, so you step onto your bathroom scale. The number you read on the scale’s display is also information. By itself, though, it’s less obviously meaningful than the pain you feel from the wall. To begin with, it’s more abstract than the bump. To make sense of the number, you must know what it is and what it means. Is it higher or lower than “normal”? Well, that requires that you know what “normal” is. If this is your first time weighing yourself, you may not have anything to compare it to. So you may have data, but not necessarily information.
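Here’s a toy illustration of the difference (the numbers are made up): a reading only becomes information once there’s a baseline to compare it against.

    # Illustrative only: a raw reading is data; comparing it against a
    # baseline is what turns it into information.
    past_readings = [180.4, 180.1, 179.8, 180.0, 179.6]  # prior weigh-ins, pounds
    baseline = sum(past_readings) / len(past_readings)

    today = 181.2  # this morning's reading: data
    delta = today - baseline

    # Against a baseline, the number starts to mean something.
    print(f"Baseline: {baseline:.1f} lb; today: {today:.1f} lb ({delta:+.1f} lb)")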


It’s Not Complicated

When he introduced the iPhone 7 in 2016, Apple executive Phil Schiller described the company’s decision to remove the phone’s headphone jack as “courageous.” While some people mocked this assertion, Schiller’s point is valid: Apple often makes bold decisions and sticks by them even when they’re unpopular (as with the headphone jack).

This courage doesn’t just come across in the design of Apple’s products and their features; it’s also sometimes evident in the language the company uses to describe them. Remember when the iPad was first announced? This was a time when Apple still had a product in their lineup called iPod; the name iPad lent itself to confusion. I remember stumbling at first when trying to talk about the device. Now the name iPad feels natural. Some people call all tablets iPads, even the ones produced by other manufacturers. It’s become the name of the form factor itself, not just the product. Apple pulled off a coup with that label, a testament to the power of their marketing.

More recently, the company has made another bold naming choice. I’m talking about the word they’ve chosen to describe how users add functionality to watch faces on the Apple Watch. You can’t just say “the space in the watch face where you can see the temperature.” Too clunky. At some point, a team at Apple had to discuss giving these things a name. The word they chose? Complications.
