demand is high for designers who can create experiences that display data in useful and interesting ways. In my experience, this became much easier to do once I’d learned to speak the crisp, precise, and slightly odd language technical people use to talk about data.
What follows is a phenomenal post that clearly explains much of what you need to know to understand and speak competently about data. A must-read for anybody involved in designing for digital information environments.
Are you confused by Brexit? I am. This short video from the folks at Information is Beautiful clarified for me the conundrum the UK finds itself in now:
Information is the lifeblood of democracy. People can’t effectively govern the system if they don’t understand the choices before them. I wonder how many people who voted in the Brexit referendum truly understood the implications of their decision.
Bertrand Russell, in a 1959 interview for the BBC:
Interviewer: Suppose, Lord Russell, this film would be looked at by our descendants, like a Dead Sea scroll in a thousand years’ time. What would you think it’s worth telling that generation about the life you’ve lived and the lessons you’ve learned from it?
Russell: I should like to say two things, one intellectual and one moral. The intellectual thing I should want to say to them is this: When you are studying any matter or considering any philosophy, ask yourself only what are the facts and what is the truth that the facts bear out. Never let yourself be diverted either by what you wish to believe, or by what you think would have beneficent social effects if it were believed, but look only — and solely — at what are the facts. That is the intellectual thing that I should wish to say. The moral thing I should wish to say to them is very simple: I should say love is wise, hatred is foolish. In this world which is getting more and more closely interconnected, we have to learn to tolerate each other; we have to learn to put up with the fact that some people say things we don’t like. We can only live together in that way and if we are to live together and not die together, we must learn a kind of charity and a kind of tolerance, which is absolutely vital to the continuation of human life on this planet.
Although the interview is sixty years old, Russell’s advice is more relevant today than ever before. These instructions should be part of the onboarding process of all online social platforms:
Look only at the facts and the truths they bear out.
People will say things you don’t like; practice charity and tolerance.
As our discourse about design ethics matures, we need better models for understanding this big, squishy subject so that we’re not talking about everything all at once. What does it really mean to be an ethical designer? What is most important, and what should we care about the most? What power do we really have to make a difference, and how should we use it?
Mr. Arledge offers a model that divides the areas of concern into three layers:
The stack goes from specific and concrete at the top to systemic and abstract at the bottom. This seems like a useful way of understanding the domain — and especially the parts where designers have the most influence on the problem.
That said, design work is medium-agnostic. There’s no reason why designers should constrain themselves to only the layers that have to do with the interface. There are many problems at the business and infrastructure layers that would be well-served by strategic design.
The company is mounting a new marketing campaign describing the app as “where work happens.”
I think I’ve seen Slack market itself as a digital place where work happens before. However, this seems like confirmation that they’re going to push in this direction in a big way. I’m glad, because it acknowledges the fact that Slack is more than a tool, product, or service; it’s an environment. (I see this marketing direction as validation of Living in Information’s thesis.)
The second aspect of the post that stood out for me is this observation by its author, Casey Newton:
I’ve been feeling down on Slack ever since my colleagues at The Verge, which runs on Slack, created a channel called verge-internet for discussing the internet. We already had a channel to discuss tech (verge-tech), and a channel for longer tech discussions (verge-tech-discuss), and a channel for discussing culture (verge-culture). Wasn’t our whole website about the internet? Why did our internet website need an internet discussion forum separate from the many other forums in which we discuss the internet?
Many of us who’ve used a greenfield Slack account to coordinate activities with a group larger than a couple of people have experienced this. The environment’s design makes it easy to spin up new channels. Without an agreed-upon ontology, the result is duplication and confusion. Eventually, someone on the team either self-selects or is assigned the role of Slack channel curator. Not quite a bottom-up structuring of the environment; rather, a bottom-up nomination for the top-down role.
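The failure mode is easy to sketch. Here’s a hypothetical duplicate check against a simple agreed-upon naming ontology — the channel names echo the ones above, and nothing here uses Slack’s actual API:

```python
# Hypothetical sketch of a team's channel ontology: a shared namespace
# prefix plus a duplicate check on a proposed channel's core topic.
# Channel names are illustrative; this does not call Slack's API.

AGREED_PREFIXES = {"verge"}  # the team's agreed namespace
EXISTING_CHANNELS = {"verge-tech", "verge-tech-discuss", "verge-culture"}

def core_topic(name: str) -> str:
    """Reduce a channel name to its topic for duplicate detection."""
    parts = name.lower().split("-")
    # Drop the namespace prefix and conversational suffixes.
    return "-".join(p for p in parts
                    if p not in AGREED_PREFIXES and p != "discuss")

def review_proposal(name: str, existing: set) -> str:
    """Return a verdict on a proposed channel name."""
    if name.split("-")[0] not in AGREED_PREFIXES:
        return "rejected: missing agreed namespace prefix"
    clashes = sorted(e for e in existing
                     if core_topic(e) == core_topic(name))
    if clashes:
        return "possible duplicate of: " + ", ".join(clashes)
    return "ok"

print(review_proposal("verge-internet", EXISTING_CHANNELS))  # ok
print(review_proposal("verge-tech", EXISTING_CHANNELS))
# possible duplicate of: verge-tech, verge-tech-discuss
```

The point isn’t the code; it’s that the check only works once the team has agreed on the ontology — the prefix, the suffixes, what counts as a “topic.” That agreement is the information architecture work.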
When work — along with other important social interactions — happens in information environments, information architecture stops being the purview of a select few. These days, anyone can be called on to be an information architect.
design efforts that focus on creating visually effective pages are no longer sufficient to ensure the integrity or accuracy of content published on the web. Rather, by focusing on providing access to information in a structured, systematic way that is legible to both humans and machines, content publishers can ensure that their content is both accessible and accurate in these new contexts, whether or not they’re producing chatbots or tapping into AI directly.
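One common way to make content legible to machines as well as humans is to publish structured metadata, such as a schema.org record in JSON-LD, alongside the human-readable page. A minimal sketch in Python — the property names follow schema.org’s Article type, but every value below is a placeholder, not real content:

```python
import json

# Minimal sketch: an article's metadata expressed as schema.org JSON-LD,
# the kind of machine-legible record a crawler, chatbot, or assistant can
# consume independently of the page's visual design.
# All values are placeholders.

article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Placeholder headline",
    "author": {"@type": "Person", "name": "Placeholder Author"},
    "datePublished": "2019-01-01",
}

# Serialized, this record would typically be embedded in the page
# inside a <script type="application/ld+json"> element.
jsonld = json.dumps(article, indent=2)
print(jsonld)
```

A human never sees this record, but a machine reading it gets the same underlying structure that informs the visible page — which is exactly the separation of structure from presentation the passage above argues for.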
Digital designers have long considered user interfaces to be the primary artifacts of their work. For many, the structures that inform these interfaces have been relegated to a secondary role — that is, if they’ve been considered at all.
Thanks to the revolution sparked by the iPhone, today we experience information environments through a variety of device form factors. Thus far, these interactions have mostly happened on screen-based devices, but that’s changing too. And to top things off, digital experiences are becoming ever more central to our social fabric.
Designing an information environment in 2019 without considering its underlying structures — and how they evolve — is a form of malpractice.
According to a report on The Verge, Twitter will soon start testing new ways of displaying tweets that should give them more context. Some features clarify messages’ positions in conversations using reply threads:
I’m more intrigued by two other features: availability indicators and context tags. The former are green bubbles next to users’ names that indicate whether they’re online and using the app at any given time. (Much like other chat systems do.) The latter are tags that allow users to indicate what a tweet refers to. Having a bit more context on what a tweet is about should help avoid non-sequiturs. (I assume it would also make it easier to filter out things you don’t want to bother with.)
Features like these should drive engagement on Twitter and add clarity for users; a case of alignment between the company’s goals and those of its users.
There is now more code than ever, but it is increasingly difficult to find anyone who has their hands on the wheel. Individual agency is on the wane. Most of us, most of the time, are following instructions delivered to us by computers rather than the other way around. The digital revolution has come full circle and the next revolution, an analog revolution, has begun.
For a long time, the central objects of concern for designers have been interfaces: the touchpoints where users interact with systems. This is changing. The central objects of concern now are systems’ underlying models. Increasingly, these models aren’t artifacts designed in the traditional sense. Instead, they emerge from systems that learn about themselves and the contexts they’re operating in, adapt to those contexts, and in so doing change them.
The urgent design questions of our time aren’t about the usability or fitness-to-purpose of forms; they’re about the ethics and control of systems:
Are the system’s adaptation cycles virtuous or vicious?
Who determines the incentives that drive them?
How do we effectively prototype emergent systems so we can avoid unintended consequences?
Where, when, and how do we intervene most effectively?
Sometimes language changes slowly and inadvertently. The meaning of words can change over time as language evolves. That’s how many semantic environments become polluted: little by little. But sometimes change happens abruptly and purposefully. This past weekend, AT&T gave us an excellent example of how to pollute a semantic environment in one blow.
Today’s mobile phone networks work on what’s known as 4G technology. It’s a standard that’s widely adopted by the mobile communications industry. When your smartphone connects to a 4G network, you see a little icon on your phone’s screen that says either 4G or LTE. These 4G networks are plenty fast for most uses today.
However, the industry is working on the next generation network technology called — you guessed it — 5G. The first 5G devices are already appearing on the market. That said, widespread rollout won’t be immediate: the new technology requires new hardware on phones, changes to cell towers, and a host of other changes. It’ll likely be a couple of years before the new standard becomes mainstream.
Despite these technical hurdles, last weekend AT&T started issuing updates to some Android phones on its network that change the network label. Nothing else is different about these devices; their hardware is still the same and they still connect using the same network technology. So what’s the reason for the change? AT&T has decided to label some advanced current-generation technologies “5G E.” When the real 5G comes around, they’ll call that “5G+.”
This seems like an effort to make the AT&T network look more advanced than those of its competitors. The result, of course, is that this change confuses what 5G means. It erodes the usefulness of the term; afterward, it’ll be harder for nontechnical AT&T customers to know what technology they’re using. It’s a bold example of how to co-opt language at the expense of clarity and understanding.