“The increasing velocity of knowledge is widely accepted as sure evidence of human mastery and progress. But many, if not most, of the ecological, economic, social, and psychological ailments that beset contemporary society can be attributed directly or indirectly to knowledge acquired and applied before we had time to think it through carefully.”

— David W. Orr

A few years ago I was in a meeting with smart folks working in the tech industry. We were talking about the possibilities for “Internet of Things” technologies in everyday life: “smart” locks, “smart” cars, “smart” dog collars, and so on. The conversation centered on the capabilities of the technologies; ethical considerations were almost entirely absent. What if a tenant fell behind on his rent and his landlord used the “smart” lock to shut him out of his apartment? Too damned bad. I was dismayed. How could our love for our technical capabilities so blind us to their social implications?

Facebook used to encourage its developers to “move fast and break things,” a statement that captures the essence of a “lean” evolutionary approach that is pervasive in the tech world. You can’t sit still lest your competitors gain an advantage. Teams should constantly be innovating, experimenting, and testing. Good ideas will emerge and evolve. Collateral damage is not a problem: further iterations will take care of it. Fast wins!

While this approach is great at exploring and exploiting the capabilities of new technologies, it doesn’t leave much room to ask what these things are ultimately in service of. Admittedly, many life-improving innovations have emerged from the competitive pressures of the market. But some things are too precious to risk. What if the thing we’re breaking is the very socioeconomic system that fostered the technium in the first place? How can we improve societies whose ability to hold civil discourse has eroded? What would this loss do to our ability to iterate?

Technologies can make us smarter, but they can’t make us wiser. That’s entirely up to us. Wisdom calls for self-reflection and self-control, and our current mechanisms lack both. We have an effective means to produce new technologies, fast. But what if we also had the means to consciously choose whether we want to live in the world they will bring about? Isn’t that what design is ultimately for?