The Role of Structure in Digital Design

Andy Fitzgerald, in A List Apart:

design efforts that focus on creating visually effective pages are no longer sufficient to ensure the integrity or accuracy of content published on the web. Rather, by focusing on providing access to information in a structured, systematic way that is legible to both humans and machines, content publishers can ensure that their content is both accessible and accurate in these new contexts, whether or not they’re producing chatbots or tapping into AI directly.

Digital designers have long considered user interfaces to be the primary artifacts of their work. For many, the structures that inform these interfaces have been relegated to a secondary role — that is, if they’ve been considered at all.

Thanks to the revolution sparked by the iPhone, today we experience information environments through a variety of device form factors. Thus far, these interactions have mostly happened on screen-based devices, but that’s changing too. And to top things off, digital experiences are becoming ever more central to our social fabric.
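
To make this concrete, here’s a minimal sketch (in TypeScript) of what structured content buys you across form factors. It’s purely illustrative: the Recipe model and function names are my own assumptions, not from Fitzgerald’s article.

```typescript
// A minimal, illustrative content model: meaning lives in the structure,
// not in any particular presentation.
interface Recipe {
  name: string;
  summary: string; // a short, speakable description
  prepMinutes: number;
  ingredients: string[];
}

// A screen-based interface derives full visual markup from the structure...
function renderHtml(r: Recipe): string {
  const items = r.ingredients.map((i) => `<li>${i}</li>`).join("");
  return `<article><h1>${r.name}</h1><p>${r.summary}</p><ul>${items}</ul></article>`;
}

// ...while a voice assistant derives a concise spoken answer from the same
// fields, since it has no visual layout to fall back on.
function renderSpoken(r: Recipe): string {
  return `${r.name}. ${r.summary} It takes about ${r.prepMinutes} minutes and uses ${r.ingredients.length} ingredients.`;
}
```

Either rendering can be regenerated when an interface changes; the structure, not the page, is the durable artifact.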

Designing an information environment in 2019 without considering its underlying structures — and how they evolve — is a form of malpractice.

Conversations with Robots: Voice, Smart Agents & the Case for Structured Content

A Bold Example of Semantic Pollution

Sometimes language changes slowly and inadvertently: the meaning of words shifts as language evolves. That’s how many semantic environments become polluted: little by little. But sometimes change happens abruptly and purposefully. This past weekend, AT&T gave us an excellent example of how to pollute a semantic environment in a single stroke.

Today’s mobile phone networks work on what’s known as 4G technology. It’s a standard that’s widely adopted by the mobile communications industry. When your smartphone connects to a 4G network, you see a little icon on your phone’s screen that says either 4G or LTE. These 4G networks are plenty fast for most uses today.

However, the industry is working on the next-generation network technology called — you guessed it — 5G. The first 5G devices are already appearing on the market. That said, widespread rollout won’t be immediate: the new technology requires new hardware on phones, upgrades to cell towers, and a host of other infrastructure work. It’ll likely be a couple of years before the new standard becomes mainstream.

Despite these technical hurdles, last weekend AT&T started issuing updates to some Android phones on its network that change the network label to “5G E.” Nothing else is different about these devices: their hardware is still the same, and they still connect using the same network technology. So what’s the reason for the change? AT&T has decided to label some advanced current-generation technologies “5G E.” When the real 5G comes around, they’ll call that “5G+.”

This seems like an effort to make the AT&T network look more advanced than those of its competitors. The result, of course, is that this change confuses what 5G means. It erodes the usefulness of the term; afterward, it’ll be harder for nontechnical AT&T customers to know what technology they’re using. It’s a bold example of how to co-opt language at the expense of clarity and understanding.

AT&T decides 4G is now “5G,” starts issuing icon-changing software updates

Book Notes: “The Evolution of Everything”

The Evolution of Everything: How New Ideas Emerge
By Matt Ridley
HarperCollins, 2015

Designers are called to tackle increasingly complex problems. This requires that we understand how systems function and how they came to have the configurations we experience. I put it this way because complex systems (at least those that stand the test of time) don’t come into the world fully formed. Instead, they evolve step by step from earlier, simpler systems. (See Gall’s Law.) Because of this, it’s essential that we understand the distinction between top-down and bottom-up structuring processes.

That distinction is what drew me to The Evolution of Everything. While not written specifically for designers, the book addresses this subject directly. Per its jacket, the book aims to “definitively [dispel] a dangerous, widespread myth: that we can command and control our world.” It pits “the forces of evolution” against top-down forces in the definition of systems. What sorts of systems? Any and all of them: the universe, morality, life, genes, culture, the economy, technology, the mind, personality, education, population, leadership, government, religion, money, the internet.

There’s a chapter devoted to how top-down vs. bottom-up approaches have played out for each of these complex subjects. Mr. Ridley aims to demonstrate that advances in all of them have been the result of evolutionary forces, and the hindrances the result of intentional, planned actions. I don’t think I’m doing the author a disservice by describing it in such binary terms. In the book’s epilogue, Mr. Ridley states his thesis in its “boldest and most surprising form”:

Bad news is man-made, top-down, purposed stuff, imposed on history. Good news is accidental, unplanned, emergent stuff that gradually evolves. The things that go well are largely unintended, the things that go badly are largely intended.

Examples given of the former include the Russian Revolution, the Nazi regime, and the 2008 financial crisis, while examples of the latter include the eradication of infectious diseases, the green revolution, and the internet.

While the whole is engaging and erudite, the earlier chapters, which deal with the evolution of natural systems, are stronger than the later ones, which deal with the evolution of social systems. The book’s political agenda becomes increasingly transparent in these later chapters, often at the expense of the primary top-down vs. bottom-up thesis.

If you already buy into this agenda, you may come away convinced. I wasn’t. Sometimes bottom-up forces enable command-and-control structures and vice versa. But you’ll find no such nuance here; the book offers its subject as an either-or proposition. This leads to some weak arguments. (E.g., “While we should honour individuals for their contributions, we should not really think that they make something come into existence that would not have otherwise.”)

Understanding the difference between top-down vs. bottom-up structuring is essential for today’s designers. The Evolution of Everything doesn’t entirely dispel the myth that we can command-and-control the world, but it does provide good examples of bottom-up emergence — especially in its earlier chapters. Still, I’d like a more nuanced take on this critical subject.

Buy it on Amazon.com

A Community In Search of a Home

Earlier this year, Google announced plans to shutter Google+, its failed social network. While this decision won’t affect many of us, some folks consider Google+ home. A post on Medium by Steven T. Wright highlights the plight of the communities that will be displaced when Google+ shuts down in April 2019. Can they find a new place to meet?

The story profiles developer John Lewis, who built a group on Google+ devoted to finding a replacement information environment for these folks. Some are finding that their trust in large companies as stewards of their information environments has eroded as a result of the way Google (mis)managed G+:

“Some are going to platforms similar to Facebook like MeWe, some are going to open-source sites like different Diaspora pods,” he says. “I think people are a bit wary of the big companies, after seeing what the rest of Google did to Google+. With their divided attention, Facebook was able to take all of their cool features and cannibalize them. I think we want something that will last for a while, that won’t be shut down by some exec.”

People invest real time and energy in these places. While on one level they are “products” to be “managed,” they’re also infrastructure where people build out important parts of their lives.

The Death of Google+ Is Tearing Its Die Hard Communities Apart

Facebook’s Reverse Halo

Last month I pondered whether it’s time to leave Facebook. Things have only gotten worse for the social network since then. It seems every week now we learn of new ways in which the company has mishandled the personal information we’ve entrusted to it. A couple of days ago, The New York Times published a report alleging that the company shared its users’ private information with various other large tech companies. Just yesterday, the District of Columbia sued Facebook over the Cambridge Analytica incident. Wired has a running list of all the scandals thus far this year.

Still, results from a new study from Tufts University suggest it would take a thousand dollars for a typical Facebook user to leave. That sounds high for the way I use Facebook. Many other information environments have more value to me. Since I wrote my “is it time to leave?” post, I’ve significantly reduced my interactions in Facebook; I haven’t missed being there as much as I thought I would.

Facebook owns other properties that I’d find much harder to give up. One in particular — WhatsApp — would be worth a lot more than a thousand dollars to me, since that’s where I stay in touch with my family and friends in Panama. (I’d love to get them all to chat with me on Messages.app, but that isn’t likely to happen since many of them are Android users.)

The stuff I share in WhatsApp is much more sensitive to me than anything I’ve ever shared through Facebook. When I read stories like the one in the NY Times, I worry. I wonder how much sway Facebook (the company) has over the way WhatsApp handles personal data. It would be a real loss to me if I had to leave this environment where I meet my loved ones. That said, the constant stream of news regarding Facebook’s cavalier attitude to privacy is eroding my trust in WhatsApp as well.

The Role of Advertisers in Shaping Digital Ecosystems

Marc Pritchard, Procter & Gamble’s long-serving Chief Brand Officer, on how P&G helped create the modern digital media ecosystem:

when we first worked with them they were platforms for communication with people. They had no advertising business. We essentially worked with Facebook to figure out how to place media, how to do reach and frequency, within Facebook.

YouTube came along, and we thought this could be interesting. Not sure where it’s going to go, [but we] ended up monetizing it. What’s interesting about that is that [the founders of Facebook and YouTube] didn’t build these platforms for advertising. Some of the challenges that they’ve had recently, I think, have been because they were built for another purpose. Whereas other media companies, the TV and the radio, they started off and they built advertising in.

I’m not sure that the advertising business model is inherent to either TV or radio, but overall Mr. Pritchard is correct: digital information environments such as Facebook, Twitter, and YouTube weren’t designed primarily to persuade. The advertising-based business model they’ve latched on to (in part driven by demand from clients such as P&G) has created incentives that have harmed society as a whole.

That said, P&G recognizes that it has great power as one of the world’s leading advertisers, and has been taking steps to wield that power responsibly:

10 years ago [P&G] went down a purpose path. When I first started my job we had purpose-inspired, benefit-driven brands. What was interesting about that though, is that it was too disconnected from our business. Over the course of the last few years what we’ve done is we’ve gone back in at it, we’ve really become a citizenship platform. We’re building social responsibility into the business. It includes gender equality, community impact and environmental sustainability based on a foundation of ethics and responsibility.

Laudable! But what happens if/when tough choices are called for? If it came to it, would the company be willing to sacrifice growth and profit for sustainability?

The Biggest Voice In Advertising Finds Its Purpose

Digital Nutrition

Digital information environments are still a relatively new thing for our species. We’re still looking for ways of understanding what these things are and how they affect us. Inevitably, we talk about them through the lenses of what’s come before—the things we’re familiar with. If (like me) your background is in architecture, you may think of them as types of places. But of course, that’s not the only way to think about them.

In an essay on Medium, Michael Philip Moskowitz and Hans Ringertz draw an analogy with food. They identify five pillars (diet, exercise, sleep hygiene, vocation, and interpersonal relationships) that are important for good mental health. (Poor mental health, they note, is the leading cause of disability worldwide.) To these they add a possible sixth:

What we, the authors of this essay, have come to believe through our separate paths of enquiry over the past decade is that a possible sixth pillar of health has emerged in the era of smartphone addiction and ubiquitous computing. We call this element “digital nutrition,” and in our view, healthy digital habits urgently warrant adoption.

What is digital nutrition?

We define digital nutrition as two distinct but complementary behaviors. The first is the healthful consumption of digital assets, or any positive, purposeful content designed to alleviate emotional distress or maximize human potential, health, and happiness. The second behavior is smarter decision-making, aided by greater transparency around the composition and behavioral consequences of specific types of digital content.

The way I’m reading it: you are what you eat—and also what you read online. Your digital media diet may be making you sick.

It’s an interesting analogy, but not one that I’m fully on board with. Information (what the authors call “digital assets”) is very different from food. We can identify which foods nurture or harm us, but even with our advanced science this knowledge is still imperfect. (Think of the continually shifting opinions about the benefits of foods like red wine and coffee.) Still, it’s possible to envision nutritional science getting more accurate over time. How would the equivalent work for information, digital or otherwise? Information is too contextually dependent—too personal—for us to define universal guidelines for what constitutes a good digital diet at anything but the highest levels.

To the point: the authors anticipate the emergence of a “new labeling system for digital content — one that goes further than today’s film and TV ratings and more closely resembles nutritional information on food packaging.” This seems like dangerous ground. Who would do the labeling? To what ends?

It’s Time to Embrace Digital Nutrition

Kranzberg’s Laws of Technology

Michael Sacasas explains Kranzberg’s Six Laws of Technology, “a series of truisms deriving from a longtime immersion in the study of the development of technology and its interactions with sociocultural change”:

  1. Technology is neither good nor bad; nor is it neutral.
  2. Invention is the mother of necessity.
  3. Technology comes in packages, big and small.
  4. Although technology might be a prime element in many public issues, nontechnical factors take precedence in technology-policy decisions.
  5. All history is relevant, but the history of technology is the most relevant.
  6. Technology is a very human activity—and so is the history of technology.

A nuanced take on technology’s role in shaping our lives and societies.

Kranzberg’s Six Laws of Technology, a Metaphor, and a Story

Is It Time To Leave Facebook?

Disclosure: My former employer did some work for Facebook and I’ve been a (paid) speaker at an internal company event. Besides knowing a few people who work there, I don’t currently have any affiliations with Facebook other than as a user.

A few days ago, my friend Lou Rosenfeld announced he was leaving Facebook. He explained why:

My particular reason has to do with Facebook’s designers, researchers, and writers. They’re simply too good at what they do. Their walled garden is really a Whole Foods of lovely produce, cheeses, and meats. I can’t pop in for a quick visit. I can’t leave without a whopping bill. And I often end up with a bad case of heartburn from yelling stupid things at strangers who are yelling stupid things at other strangers.

Recently I’ve heard more people express similar sentiments. Many have had a love-hate relationship with Facebook for a while. For some, that balance has tipped towards hate. Perhaps it was a growing realization of the environment’s addictiveness, as in Lou’s case. Or maybe it’s because of the negative attention Facebook has been getting in the media recently. (For good reason.) Whatever the case, many people seem to be leaving the platform or contemplating doing so. I’m one of those people.