A New Normal

Tom Warren writing in The Verge:

Microsoft is revealing more about how people are using its Teams app, and it predicts the novel coronavirus pandemic will be a turning point that will change how we work and learn forever. Demand for Microsoft Teams surged worldwide last month, jumping from 32 million daily active users to 44 million in just a week. While usage continues to rise, Microsoft is releasing a new remote work trend report to highlight how work habits are changing.

The article offers some details about the increase in usage of Microsoft’s Teams and Stream remote collaboration tools during the pandemic. No surprise there; we’ve seen similar reports from Slack and Zoom. But more interestingly, the article also speculates about our technology landscape after the crisis has passed:

“It’s clear to me there will be a new normal,” explains [Microsoft 365 head Jared] Spataro. “If you look at what’s happening in China and what’s happening in Singapore, you essentially are in a time machine. We don’t see people going back to work and having it be all the same. There are different restrictions to society, there are new patterns in the way people work. There are societies that are thinking of A days and B days of who gets to go into the office and who works remote.”

As a result, widespread adoption of remote collaboration technologies will become a more permanent feature of the workplace, with less of the stigma they had before the pandemic:

Microsoft is also seeing cases where remote workers can no longer be an afterthought in meetings, and how chat can influence video calls. “The simplest example is how important chat becomes as part of a meeting,” says Spataro. “We’re not seeing it as being incidental anymore, we’re actually seeing it be a new modality for people to contribute to the meeting.” This could involve people chatting alongside video meetings, and coworkers upvoting suggestions and real-time feedback.

I don’t have firsthand experience with Teams. However, I’ve used competing systems — including Zoom, Webex, GoToMeeting, Slack, and Skype — for a long time. Many offer the ability to chat alongside video calls. Invariably, the chat feature feels tacked on, with little thought given to how it integrates with the video stream.

Mixing synchronous and asynchronous communications is a tough challenge, but it also has lots of potential. For example, I’ve been in remote meetings where team members collaborate to write meeting minutes using Google Docs in real time. With enough people taking notes, the result is a powerful augmentation of everyone’s cognitive abilities. Still, the tool isn’t designed to do this. There is no connection between the words in the document and what’s said in the call.

Such new work modalities have been around for a while, but the pandemic is accelerating their adoption. We’re likely to see innovations that will be with us long after the crisis subsides.

Microsoft Thinks Coronavirus Will Forever Change the Way We Work and Learn

Early 20th Century Car Navigation Systems

From the Nothing New Under the Sun dept.: Did you know there were devices that offered turn-by-turn automobile directions in the early 20th century? An article in Ars Technica highlights several, including the “Live Map”:

The Live Map was a glass-enclosed brass dial attached to the outer edge of the driver’s side of the car and linked via a cable to a car’s odometer. Before leaving on your drive, you would purchase one of the company’s 8-inch paper discs with a trip’s directions, put together by The Touring Club of America. Each disc contained a trip’s mileage on the edge of the disc, with each tick mark symbolizing one mile, and supplementary tick marks for every fifth of a mile. Directions were printed alongside key mileage points like spokes on a wheel, describing road surfaces (paved or dirt), intersections, and rail crossings.

The disc was placed on the dial’s turntable. The driver would put the disc in the machine at the trip’s starting point. As the driver progressed, the disc rotated proportionally to your car’s speed, telling you what to do, what to look for, and where to turn. Each disc covered about 100 miles, at which point you pulled over and stopped to replace the disc with the next one. Of course, the device wasn’t accurate if drivers didn’t precisely follow the centerline of the road. So Joseph’s brother Ernest, to keep the mileage exact, introduced an improvement for 1913 that reduced the amount of rotation transmitted to the device if the driver was steering erratically.
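The Live Map’s core idea translates surprisingly directly into modern terms: a route is a list of instructions indexed by cumulative mileage, and the “turntable” simply advances with the odometer. Here’s a minimal sketch of that model; the route entries and function names are my own illustrations, not data from an actual 1910s disc:

```python
# Conceptual model of a Live Map disc: directions keyed by cumulative
# mileage, looked up by how far the odometer says you've driven.
# The entries below are hypothetical, for illustration only.
LIVE_MAP_DISC = [
    (0.0,  "Start: head north on Main St (paved)"),
    (4.2,  "Bear right at the fork (dirt road)"),
    (12.8, "Cross railroad tracks; continue straight"),
    (23.6, "Turn left past the covered bridge"),
]

DISC_RANGE_MILES = 100  # each paper disc covered roughly 100 miles


def current_instruction(odometer_miles, disc=LIVE_MAP_DISC):
    """Return the most recent instruction for the distance driven,
    mimicking how the rotating disc lined up with the pointer."""
    if odometer_miles >= DISC_RANGE_MILES:
        return "Stop and load the next disc"
    latest = disc[0][1]
    for mileage, instruction in disc:
        if odometer_miles >= mileage:
            latest = instruction  # the disc has rotated past this mark
        else:
            break
    return latest


print(current_instruction(5.0))    # -> "Bear right at the fork (dirt road)"
print(current_instruction(120.0))  # -> "Stop and load the next disc"
```

The sketch also makes the device’s failure mode obvious: the only input is distance traveled, so any deviation from the prescribed route silently desynchronizes the directions — exactly the problem the 1913 improvement tried to mitigate.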

We tend to think of information systems as digital, but that’s only because digital systems are the norm today. People have been thinking for a long time about how to make information access more contextually appropriate and convenient. Digital has made this easier, among other ways by reducing the amount of stuff (literally, as in physical matter) required to get the right information to the right person at the right place and time.

Turn-by-turntables: How drivers got from point A to point B in the early 1900s

The Golden Age of Learning Anything

Last weekend I did something I’d never done before: I reupholstered a chair. Here’s a photo of the final result:

Unfortunately, I don’t have a “before” photo to share. But take my word for it: my efforts improved this chair’s condition significantly. Before this weekend, it was unpresentable. Little fingers love to tug on tiny tears in vinyl until they become large and unsightly. Alas, it’s cheaper to buy new reproductions such as this one than to have them professionally reupholstered. But my conscience doesn’t let me throw away an otherwise good piece of furniture because of a fixable imperfection.

I’m sharing my weekend project here not to seek your approbation. Instead, I want to highlight that we live in a time when we can learn almost any skill on the internet. I learned to reupholster in a couple of hours through a combination of websites and YouTube videos. I researched and ordered the required materials on Amazon. It took some effort on my part, but it was worth it. I’m surprised at how well the chair worked out, given it was my first time.

As we head into a new year, I keep seeing pundits on Twitter claiming “big tech” is ruining everything. Of course, the real world isn’t as simple as these folks render it. Sure, there are negative aspects to our current large tech platforms — but there are positive ones too. The ability to pick up new knowledge and skills anytime, at our own pace, and at little cost is among the positives.

That Syncing Feeling

There was a time, many years ago, when I used only one computer for my day-to-day work. It was a laptop, and it was with me most of the time, at least during the workday. I accessed my digital information exclusively on this device: email, files, etc. I kept my calendar on a (paper-based) Franklin Planner. For mobile communications, I used a beeper. I told you it was a long time ago — a simpler time.

Then a new device came on the market, the Palm Pilot:

Image: Wikimedia. (https://en.wikipedia.org/wiki/PalmPilot#/media/File:Palm-IMG_7025.jpg)

It was like the paper planner, only digital: it could store your calendar, address book, to-dos, and such. You’d write on it using a gesture alphabet called Graffiti, which you had to learn to use the device. But most importantly, you could also sync it with your computer’s calendar, address book, etc. You did this by setting it in a cradle that came with the device and pushing a button. The cradle connected to the computer via a serial cable, and an app on your computer managed communications between the devices. It was crude and complex, and I loved it. The prospect of having my personal information with me anywhere, in digital form, was very compelling.
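At its heart, that cradle-and-button ritual was a merge of two record stores. Here’s a toy model of the idea — a hypothetical last-write-wins merge keyed by record id, not the actual HotSync protocol, with made-up sample data:

```python
# Toy model of handheld/desktop sync: merge two record stores keyed by
# id; the copy with the later modification timestamp wins, and both
# sides would then be replaced with the merged result.
def sync(handheld, desktop):
    merged = {}
    for key in set(handheld) | set(desktop):
        a, b = handheld.get(key), desktop.get(key)
        if a is None:
            merged[key] = b          # record exists only on the desktop
        elif b is None:
            merged[key] = a          # record exists only on the handheld
        else:
            # both sides have it: keep the more recently modified copy
            merged[key] = a if a["modified"] >= b["modified"] else b
    return merged


palm = {"appt1": {"text": "Dentist 3pm", "modified": 5}}
pc = {
    "appt1": {"text": "Dentist 4pm", "modified": 9},
    "todo1": {"text": "Buy serial cable", "modified": 2},
}
result = sync(palm, pc)
print(result["appt1"]["text"])  # the desktop edit is newer, so it wins
```

Even this simple sketch hints at why syncing felt fragile: timestamps drift, records get edited on both sides between syncs, and every conflict needs a policy.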

Continue reading

Being Open to Unsettling Changes

My post about watching movies at double-speed elicited strong reactions. Some folks seem convinced that giving people the ability to watch movies faster will diminish the viewing experience, and not just for them — for everyone. Why? Because such changes inevitably influence the medium itself.

Consider the way sound changed movies. “Talking” pictures did away with title cards. That was a significant change to the medium, wrought by advances in technology. Once it was possible to synchronize sound and pictures, irreversible changes to the medium were inevitable.

Are movies with sound better or worse than what came before? That’s a judgment call. It depends on your point of view. You and I grew up in a world of talking pictures; the silent ones with their title cards seem old and clunky. But they have merits too. Silent films had literary value. Many featured live musical performances, which made them more of an event than pre-recorded movies. I can imagine somebody who grew up with silent movies could become attached to the way they were.

Continue reading

Quantum Supremacy

Earlier this week, Google researchers announced a major computing breakthrough in the journal Nature:

Our Sycamore processor takes about 200 seconds to sample one instance of a quantum circuit a million times—our benchmarks currently indicate that the equivalent task for a state-of-the-art classical supercomputer would take approximately 10,000 years. This dramatic increase in speed compared to all known classical algorithms is an experimental realization of quantum supremacy for this specific computational task, heralding a much-anticipated computing paradigm.

Quantum supremacy heralds an era not merely of faster computing, but one in which computers can solve new types of problems. There was a time when I’d expect such breakthroughs from “information technology” companies such as IBM. But Google’s tech is ultimately in service to another business: advertising.

Perspective on “Digital”

Twenty-five years ago, I left my career in architecture. I’d been working in the building-design trade for about a year. Then something happened that led me to abandon the profession I’d trained for and which I’d pined for only a few years earlier: I saw Mosaic, the first widely available web browser.

I’d been aware of the Internet while I was a university student. I thought of it as a command-line system, one that was mostly useful for email. But the web was different. It was visual. It was easy to use, both for end-users and content authors. Anyone could publish at global scale, cheaply. It was clear to me that making the world’s information available through the web would change our reality in significant ways. I cast my lot with “digital” and haven’t looked back.

A quarter of a century later, I still love the web. I love what it’s become – even with its (many) flaws. But more importantly, I love what it can be — its latent potential to improve the human condition. There’s so much work to be done.

But there’s a lot of negativity centered on digital technology these days. Here’s a sample of headlines from major publications from the last few months:

These stories are representative of a melancholic tone that’s become all too common these days. Our pervasive digital technologies have wrought significant changes in old ways of being in the world. Some people miss the old ways; others are perplexed or alarmed. It’s understandable, but it couldn’t be otherwise. The internet and the constellation of services it enables are profoundly disruptive.

Social structures don’t remain unchanged after encountering such forces. The introduction of the printing press led to social, political, and scientific revolutions — including the Reformation. These weren’t small, incremental changes to the social fabric; they shattered then-current reality and re-configured it in new and surprising ways. The process wasn’t smooth or easy.

Digital is more radically transformative than the printing press. It’s folly to expect long-established social structures will stand as before. The question isn’t whether our societies will change, but whether the change will be for better or worse. That’s still an open question. I’m driven by the idea that those of us working in digital have the opportunity to help steer the outcome towards humane ends.

Table Stakes

Yesterday I was running an errand with my daughter. Our conversation drifted towards Mel Blanc. I explained how Mr. Blanc voiced most of the Looney Tunes characters and how I’d seen a hilarious interview years before in which he went through various voices. A “you had to be there” experience.

Then something amazing happened. Rather than (inevitably) mangle the retelling of Mr. Blanc’s amazing abilities, we pulled out my iPhone. Within seconds she was looking at the interview, which is available — along with so much else — on YouTube. She chuckled along. Our conversation continued. When, she wondered, was Mel Blanc alive? I said I thought he’d died in the early 90s, but that we may as well check. I long-pressed the phone’s home button to invoke Siri. I said, “When did Mel Blanc die?” The reply came almost immediately: “Mel Blanc died July 10, 1989 at age 81 in Los Angeles.”

One of my favorite quotes is from Charles Eames:

Eventually everything connects — people, ideas, objects. The quality of the connections is the key to quality per se.

I’ve been using an iPhone for over a decade. Even so, I’m still astonished at the quality of connections I can make from this device I carry in my pocket. And what’s more, having such a device isn’t a luxury afforded to a fraction of the population. Almost everybody has similar access.

Alas, the ubiquity of the experience has made it table stakes; we take it for granted. Of course you shot 4K video of the birthday party. Of course you cleared your inbox while riding public transport. Of course you know how to get there. (What with all the maps of the world and a GPS receiver in your pocket!) Everybody does.

How do we account for everyone having instant access to any piece of information anywhere at any time? Surely not with measures established in and for the world that existed before the small glass rectangles.

Striving for Simplicity

Over a decade and a half ago, I was at an office party. (This was during the brief part of my career when I was in-house at a large corporation.) Among other amenities, the party featured a cartoonist; the type of artist you see drawing quick, exaggerated portraits at fairs. The artist was hired to draw each team member, highlighting striking things about us: quirky hobbies, particular styles of dress or grooming, tics, etc. I don’t remember if it was at my suggestion or that of my co-workers, but my cartoon showed me surrounded by electronic gadgets: mobile phones, MP3 players (a new thing at the time), notebook computers, cameras, etc. That’s how I saw myself and how my colleagues thought of me; tech was a primary part of my identity.

I’ve long been representative of the “early adopter” demographic. Being alive (and privileged enough to have some discretionary income) during the Moore’s Law years has meant seeing (and benefiting from) tremendous advances in many areas of life. Consider the way we listen to music. In the span of a few years, my entire music collection went from heavy boxes filled with clunky cassette tapes to a few light(er) CD cases to a tiny device not much bigger than a single cassette tape. Portable MP3 players represented a tangible improvement to that part of my life. The same thing has happened with photography, movies, reading, etc. It’s been exciting for me to stay up-to-date with technology.

That said, as I’ve grown older, I’ve become more aware of the costs of new things. I’m not just talking about the money needed to acquire them; every new thing that comes into my life adds some cognitive cost. For example, there’s the question of what to do with the thing(s) it replaces. (I still have cases full of plastic discs in my attic. I’m unsure what to do with them, considering the amount of money I’ve already sunk into them.)

Continue reading