Tyler Sonnemaker, reporting for Insider:
Newly unredacted documents in a lawsuit against Google reveal that the company’s own executives and engineers knew just how difficult the company had made it for smartphone users to keep their location data private.
Google continued collecting location data even when users turned off various location-sharing settings, made popular privacy settings harder to find, and even pressured LG and other phone makers into hiding settings precisely because users liked them, according to the documents.
The report also alleges that internal stakeholders didn’t understand the system’s structure:
Jen Chai, a Google senior product manager in charge of location services, didn’t know how the company’s complex web of privacy settings interacted with each other, according to the documents.
Sounds like a concept map would help. But perhaps these issues stem from more than a lack of understanding:
When Google tested versions of its Android operating system that made privacy settings easier to find, users took advantage of them, which Google viewed as a “problem,” according to the documents. To solve that problem, Google then sought to bury those settings deeper within the settings menu.
Google also tried to convince smartphone makers to hide location settings “through active misrepresentations and/or concealment, suppression, or omission of facts” — that is, data Google had showing that users were using those settings — “in order to assuage manufacturers’ privacy concerns.”
I don’t know anything about this case other than what is in the media, nor do I have firsthand experience with Android’s privacy settings. That said, these allegations bring to mind “Dark IA” — the opposite of information architecture.
Information architecture aims to make stuff easier to find and understand — implicitly, in service of empowering users. The antithesis of IA isn’t an unwittingly disorganized system, but one organized to inhibit understanding and deprive users of control.