Over a decade and a half ago, I was at an office party. (This was during the brief part of my career when I was in-house at a large corporation.) Among other amenities, the party featured a cartoonist: the type of artist you see drawing quick, exaggerated portraits at fairs. The artist was hired to draw each team member, highlighting striking things about us: quirky hobbies, particular styles of dress or grooming, tics, etc. I don’t remember if it was at my suggestion or that of my co-workers, but my cartoon showed me surrounded by electronic gadgets: mobile phones, MP3 players (a new thing at the time), notebook computers, cameras, etc. That’s how I saw myself and how my colleagues thought of me; tech was a primary part of my identity.

I’ve long been representative of the “early adopter” demographic. Being alive (and privileged enough to have some discretionary income) during the Moore’s Law years has meant seeing (and benefiting from) tremendous advances in many areas of life. Consider the way we listen to music. In the span of a few years, my entire music collection went from heavy boxes filled with clunky cassette tapes to a few light(er) CD cases to a tiny device not much bigger than a single cassette tape. Portable MP3 players represented a tangible improvement to that part of my life. The same thing has happened with photography, movies, reading, etc. It’s been exciting for me to stay up to date with technology.

That said, as I’ve grown older, I’ve become more aware of the costs of new things. I’m not just talking about the money needed to acquire them; every new thing that comes into my life adds some cognitive cost. For example, there’s the question of what to do with the thing(s) it replaces. (I still have cases full of plastic discs in my attic. I’m unsure what to do with them, considering the amount of money I’ve already sunk into them.)

There’s also the cost of having to learn new ways of doing things. While I love trying out new user interfaces, this often requires establishing new workflows. That can take a lot of time. My first portable MP3 player had what was, in retrospect, a very clunky UI. Getting music onto it was a pain. (This was pre-iPod.) I put up with learning new ways of managing my music because the convenience of having lots of random-access songs with me at all times was a compelling incentive. That said, I sometimes felt like I was spending as much time dealing with the system as I was using it to listen to music.

Then there’s all the collateral stuff that comes with new technology: the cables, cases, batteries to keep charged, etc. All of it must be accounted for, and a lot of it is clutter. I have drawers full of USB-A cables that are quickly becoming obsolete. Yes, I can donate them or otherwise get rid of them. But these are things I currently “own.” (And as we know from Fight Club, this means they own me.) There’s an entry somewhere in my meat computer for each one of those cables, taking up background processing cycles.

All of this stuff accrues. I’ve tried several approaches to reducing the load over time. For example, I’ve started reselling major devices (those that have preserved some degree of value) whenever I buy new ones. That at least keeps the number of devices in the household stable. Now that my kids are getting older, I’m also passing my older devices down. (Although that doesn’t do much to ease my mind, since I must still account for anything under my roof, it at least prolongs their useful life.)

Most significantly, I’ve also become more demanding about what constitutes new functionality. That is: I’m looking for ways to cut down on the amount of stuff I buy. A new gadget must give me significant new abilities before I’m willing to consider replacing an older model. For example, while I was an early adopter of the Apple Watch, I skipped two generations before upgrading to a new model. There just wasn’t enough new stuff to make it worth my while.

But these are all reactive measures. I’m now at a stage where I’m considering becoming more proactive about simplifying my life. By this, I mean getting rid of things altogether, even if it means losing marginal capabilities or conveniences. I’ve written before about my ambivalence toward AirPods. If I’m honest with myself, AirPods don’t do things radically better than my previous headset. In some ways, they do worse. I’m still using them, but mostly because I’m long past the return window. (And I’ve dropped and scuffed the case, so they’ll be worth less if I try to resell them at this point.) But I am considering passing them on regardless.

Smartwatches, AirPods, cameras, tablets — they’re all incredible at doing particular things, but few open up previously impossible capabilities or otherwise improve my quality of life in radical ways. (Not even close to what I experienced when changing from cassette tapes to MP3 files.) My (old, tatty) Kindle was one such improvement; carrying a part of my library in my backpack was a big deal. But newer versions of the device only refine the experience. (E.g., they add niceties like higher screen resolutions and built-in lighting.) Is it worth it? And why do I need a Kindle anyway when I also have an iPad and an iPhone, both of which are perfectly capable of rendering books?

I’m increasingly drawn to simplifying, paring down, or at least sticking with the older version of things for a bit longer. The time it’d take me to deal with a new device can be better spent learning, making, listening, watching, etc.