The Language of Privacy

Is it possible to come up with a single definition for privacy? Moreover, can we craft a definition that is based on durable principles and remains robust despite developments in technology and online services? When it comes to online privacy, the Internet Society has been using the same definition for a while, and I still think it is valid. I went back to it recently, to check that I was quoting it correctly, and the more I looked at it, the more I thought it would be worth breaking it down into its elements and validating each one. Here’s the definition:

“Privacy is about retaining the ability to disclose data consensually, and with expectations regarding the context and scope of sharing.”

Everything is in there for a purpose.

  • “retaining the ability to disclose”: privacy depends on user control; if you don’t have meaningful control over whether or not you disclose something, you don’t control your own privacy.
  • “disclose data”: privacy is about disclosure, not about keeping all your data to yourself (that is secretiveness… and taken to extremes, it’s a symptom of what we might even call paranoia). Privacy is a social construct, or set of conventions, relating to the disclosure of data, not to secrecy.
  • “consensually”: orthogonal to the idea of control; not only is it important that I consent to reveal things about myself, it’s also important that I respect others’ wishes about disclosure of data relating to them.
    • This is a key principle in two respects: first, if Alice tells Bob something, her privacy depends on Bob not telling Carl. This is a problem I describe as “privacy beyond first disclosure”. It is difficult to solve by technical means, because once Bob knows the thing Alice tells him, it is hard to prevent him, technically, from further disclosure.
    • Second, if the only way Alice can tell Carl something is via Bob, Bob’s role and obligations are a factor. This is especially true in today’s online world, where all interactions are mediated via third parties.
  • “expectations”: note that, for global applicability, we don’t use the word “rights”… but bear in mind that in some cultures/jurisdictions, our expectations about privacy may well stem from the fact that privacy is regarded as a fundamental right (not necessarily an unqualified one, but fundamental nonetheless). Expectations may be well- or ill-founded, and they are often subject to being disappointed.
  • “context of sharing”: if I tell my doctor personal details in order to be treated, I don’t expect those details to crop up on a “Hilariously Embarrassing Ailment of the Month” website.
    • The healthcare context may appear to be clear-cut, but it really is not. For example: justification for disclosure of healthcare data is often claimed on the grounds that it helps medical research… but the medical research benefits are speculative, while the financial benefit to research companies is not; genetic data about one person is intimately revealing about their parents, siblings and (potential) descendants; supposedly “anonymised” clinical trial data may prove to be easily re-identifiable… This is not to argue one way or the other, just to illustrate that healthcare privacy is not a simple context.
  • “scope of sharing”: as Danah Boyd puts it, “publishing, to a wider audience, something that was intended for a narrow one may be just as much a violation of privacy as publishing something that was not meant to be published in the first place”. Scope is important, and intimately related, of course, to the first point under “consensually”, above, and to the matters of “expectations” and “control”.

For all its clarity, though, this definition raises as many questions as it answers. None of those questions is simple, and the answers are many and nuanced. Currently, the technical means we have to express our privacy preferences and wishes are still pretty crude, and mostly binary. The problem, at its heart, is the clumsy language they offer us for expressing subtle interpersonal concepts like trust and discretion.

If we start thinking of privacy as a ‘language’, we might start asking what users would like to say in that language, when they engage with apps and services. Currently, users are lucky if they can say “yes” or “no”. We need a much richer vocabulary, with verbs and tenses, (especially the conditional tense!). We need a language that can express wishes, and we need translation, so that we can make ourselves understood across different cultures of privacy. Then we’ll need to teach users how to become fluent in the language of privacy.