In an age of ubiquitous digital devices, it is easy to forget that information about the state of the world can diverge from the actual world it represents. While the traffic light may be telling you it's okay to cross the street, it still behooves you to check for oncoming cars.

I'm writing this high above the Atlantic, en route to Europe. I shouldn't be on this airplane. My flight from San Francisco to Newark, where I was to connect for the ocean crossing, left almost an hour late. When I landed in Newark, I got a text message from the airline saying my next flight had already departed and suggesting I contact an airline representative to reschedule. Odd, I thought: the next flight isn't supposed to leave for another thirty minutes. I decided to run for it. When I got to the next flight's gate, the jetway doors were already closed. The gate agent told me they'd already crossed me off the passenger list, but said she could fix that. She issued me a new boarding pass and allowed me to board.

The text message I'd gotten from the airline was incorrect; the flight hadn't departed. Somehow, an algorithm had determined it was impossible for me to make the connection and initiated the sequence of actions prescribed for that scenario. How sophisticated was this algorithm? Did it know the distance between the gate at which I was arriving and the one from which I was to depart? Did it know the two gates were in different terminals? Did it know there was a makeshift transportation system ferrying passengers between terminals in Newark? Did it know it was raining? Whatever it knew, it didn't have a good enough model of the world. Had I heeded its advice, I'd be trying to find accommodations in a crowded airport and rescheduling travel instead of enjoying a glass of wine and typing these words.