Yesterday, as I was winding down from a busy week, I learned of the death of someone who influenced me greatly: Rush drummer and lyricist Neil Peart. As with so many other nerds of my vintage, Rush’s songs — and especially their lyrics, most of which were written by Mr. Peart — are key to the soundscape of my formative years. I never met Mr. Peart, although I did have the privilege of seeing him play live. Nevertheless, I consider him a model of integrity and mastery, someone to emulate.
It’s clichéd to highlight the geeky teen appeal of Rush’s (early) sci-fi themes. Instead, what drew me to their songs was their advocacy of self-agency. For someone brought up as Catholic, this stance — exemplified by the song Freewill — was shocking and refreshing:
Last weekend I did something I’d never done before: I reupholstered a chair. Here’s a photo of the final result:
Unfortunately, I don’t have a “before” photo to share. But take my word for it: my efforts improved this chair’s condition significantly. Before this weekend, it was unpresentable. Little fingers love to tug on tiny tears in vinyl until they become large, unsightly tears. Alas, it’s cheaper to buy new reproductions such as this one than to have them professionally reupholstered. But my conscience doesn’t let me throw away an otherwise good piece of furniture because of a fixable imperfection.
I’m sharing my weekend project here not to seek your approbation. Instead, I want to highlight that we live in a time when we can learn almost any skill on the internet. I learned to reupholster in a couple of hours through a combination of websites and YouTube videos. I researched and ordered the required materials on Amazon. It took some effort on my part, but it was worth it. I’m surprised at how well the chair worked out, given it was my first time.
As we head into a new year, I keep seeing pundits on Twitter claiming “big tech” is ruining everything. Of course, the real world isn’t as simple as these folks render it. Sure, there are negative aspects to our current large tech platforms — but there are positive ones too. The ability to pick up new knowledge and skills anytime at our own pace very cheaply is among the positives.
In case you haven’t seen it, The New York Times has a new-ish section called Smarter Living that offers pointers on how to be effective in our hybrid physical-digital world. A recent article by Victoria Turk is representative; it highlights the importance of good manners online:
As more of our lives moves online, good digital etiquette is critical. Just as we judge people by their behavior IRL — in real life — so we take note when a person’s manners in the digital sphere leave something to be desired.
The article addresses some of the challenges of operating in contexts made of (written) language:
Both the content of your message and its tone will live or die based on what you type on your keyboard, so the gap between, say, landing a joke and causing mortal offense can be perilously fine.
It goes on to suggest ways in which you can be mindful about your online etiquette; all good reminders.
I’m glad to see major publications like the Times acknowledging the contextual nature of our digital environments. Being effective in today’s world requires that we become adept at operating in places made of text. Minding our manners in these places is perhaps more important than in physical environments since written language is so easy to misinterpret. It also sticks around: spoken words are evanescent, but your online posts will be there for a long time.
While the NYT article doesn’t mention it, for me, an important part of minding my manners online is reminding myself that I’m dealing with other people, not just collections of pixels and metadata. These people — different though their positions may be from mine — also experience joy and suffering and all the tribulations of being human. I’m often reminded of this beautiful admonition from Kurt Vonnegut:
Hello, babies. Welcome to Earth. It’s hot in the summer and cold in the winter. It’s round and wet and crowded. At the outside, babies, you’ve got about a hundred years here. There’s only one rule that I know of, babies — ‘God damn it, you’ve got to be kind.’
Yesterday was Black Friday, the busiest shopping day of the year in the U.S. Thanksgiving — a holiday centered on gratitude and sharing with family — always falls on a Thursday, and many people also have the following day off. Given the nearness of the year-end gift-giving season, this “free” Friday is the perfect day for many to go shopping. Having long recognized the opportunity, retailers offer attractive discounts to spur buyers.
The result is a potent mix of three things I dislike: short-term thinking, mindless consumption, and crowds. So I avoid stores on Black Friday. But this year, my family and I went one better: we took something we’d set aside but hadn’t yet discarded and brought it back to life. We did it as a group and had a lot of fun. I’d love for this to become a new tradition for us, so I’m giving it a name: Back Friday.
There was a time, many years ago, when I used only one computer for my day-to-day work. It was a laptop, and it was with me most of the time, at least during the workday. I accessed my digital information exclusively on this device: email, files, etc. I kept my calendar on a (paper-based) Franklin Planner. For mobile communications, I used a beeper. I told you it was a long time ago — a simpler time.
Then a new device came on the market, the Palm Pilot:
It was like the paper planner, only digital: it could store your calendar, address book, to-dos, and such. You’d write into it using a gesture alphabet called Graffiti, which you had to learn so you could use the device. But most importantly, you could also sync it with your computer’s calendar, address book, etc. You did this by placing it in a cradle that came with the device and pushing a button. You connected the cradle to the computer using a serial cable and installed an app on your computer to manage communications between the devices. It was crude and complex, and I loved it. The prospect of having my personal information in digital format with me anywhere was very compelling.
The Velvet Underground co-founder John Cale is one of the most influential musicians of the last fifty years. In an interview from 2016, he tells stories from his life and career:
Towards the end of the interview, Mr. Cale recounts a childhood marked by a difficult relationship with his father and sexual abuse from a teacher:
My grandmother made some rules that there was no English to be spoken in the house. So it was very quiet, and any communication with my father was limited because I didn’t speak English and he didn’t speak any Welsh. But there’s this overwhelming feeling that you are inadequate. Between that and the incident with the organ teacher and the abuse there, it made me a victim. And really, one way or another I figured out that being a victim really has repercussions all through your life. And you really do not want to be in your own mind, or in anybody’s mind, as a victim.
His mother helped him change how he thought about himself:
She said, look, always find someone good in somebody. Because everybody does have a good side. You’re a complete person, and you’re not a victim. That’s very important.
I’ve been fortunate to not have to deal with anything as traumatic in my life. Still, I’ve occasionally lapsed into a victim mindset. It’s disempowering. Like Mr. Cale, I’ve found it possible to transcend this mindset, with indispensable help from my wife, by reframing how I think about myself.
“What were some of the mindsets, habits of thinking you had to unlearn transitioning from [architecture] to [information architecture]?”
The answer that comes immediately to mind is: “not that many!” I consider architecture a perfect training ground for information architecture. There are many more architectural mindsets that apply to information architecture than mindsets that require unlearning. That said, as I’ve thought about it, I’ve realized there is, in fact, a mindset I picked up from architecture that I’ve had to unlearn over time: the idea of the architect as the central element of the design process.
Architecture is rife with what are referred to as starchitects — practitioners whose work and style are known around the world, and whose fame influences the work. Clients seek out Frank Gehry because they want a Frank Gehry building. Gehry’s office is bound to produce buildings in the Gehry style regardless of what contextual conditions call for.
When I was a student, most of the works we looked at were produced by starchitects. The implication was that that’s what we ought to aspire to. The first few years of my career, I labored under the delusion that I was at the center of the work. Over time, I came to realize that effective designers (in any field!) primarily serve not themselves or their architectural ideologies, but the work. I came to suspect the idea of having a “house style” — something I longed for at first.
To put it bluntly, I left architecture school with an inflated ego. The main mindset I had to unlearn as I transitioned to information architecture was the centrality of my own ideas, desires, and “style” in the design process. Instead, the core of what I aspire to now is form-context fit. This calls for understanding through collaboration; it calls for research and open-mindedness. Experience is primarily in service to the process, not the other way around. Getting my ego out of the way — embracing beginner’s mind — took many years of practice.
Like many other people, I have a morning routine. Journaling is an important part of this routine. Every morning, I set aside a few minutes to reflect on the previous day and what today holds.
I’ve written before about the structure of this journal. Recently, I’ve added a new section: What did I learn? Specifically, what did I learn the day before? (And by implication: How do I need to change my behavior?) I sit with it for a little while, replaying the previous day. These are some other questions that help with this exploration:
What did I try that didn’t play out as I expected?
What expectations did I have that weren’t met?
What expectations were exceeded?
What information would help reduce the gap between these expectations and actual outcomes in the future?
How can I procure this information?
What patterns have I noticed?
What happened unexpectedly/serendipitously?
What resonances/synchronicities did I notice?
This last question may seem weird; it requires unpacking. Sometimes I’ll be thinking about something, and the next day (or even the same day) I’ll come across the same idea in a podcast, book, article, etc. An echo of sorts. Sometimes these resonances are very peculiar ideas, to the point where I’m startled by the coincidence.
I don’t think there’s anything supernatural at work. The fact that the concept stood out merely suggests that I’m paying attention to it on some deep level. It’s like when you’re thinking of buying a particular model of car and suddenly you see the same model everywhere. Those cars were always out there, but now your mind is primed to pay attention. These could be important signals.
Note these are questions about things that are under my control: my attitudes, expectations, plans, etc. I don’t bother to document things I learned due to events happening in the outside world, out of my control. (E.g., stuff in the news.) Rather, I’m trying to establish a feedback loop that allows me to become more effective over time. Growth calls for introspection; What did I learn? is a useful trigger.
Over a decade and a half ago, I was at an office party. (This was during the brief part of my career when I was in-house at a large corporation.) Among other amenities, the party featured a cartoonist — the type of artist you see drawing quick, exaggerated portraits at fairs. The artist was hired to draw each team member, highlighting striking things about us: quirky hobbies, particular styles of dress or grooming, tics, etc. I don’t remember if it was at my suggestion or that of my co-workers, but my cartoon showed me surrounded by electronic gadgets: mobile phones, MP3 players (a new thing at the time), notebook computers, cameras, etc. That’s how I saw myself and how my colleagues thought of me: tech was a primary part of my identity.
I’ve long been representative of the “early adopter” demographic. Being alive (and privileged enough to have some discretionary income) during the Moore’s Law years has meant seeing (and benefiting from) tremendous advances in many areas of life. Consider the way we listen to music. In the span of a few years, my entire music collection went from heavy boxes filled with clunky cassette tapes to a few light(er) CD cases to a tiny device not much bigger than a single cassette tape. Portable MP3 players represented a tangible improvement to that part of my life. The same thing has happened with photography, movies, reading, etc. It’s been exciting for me to stay up-to-date with technology.
That said, as I’ve grown older, I’ve become more aware of the costs of new things. I’m not just talking about the money needed to acquire them; every new thing that comes into my life adds some cognitive cost. For example, there’s the question of what to do with the thing(s) it replaces. (I still have cases full of plastic discs in my attic. I’m unsure what to do with them, considering the amount of money I’ve already sunk into them.)