Thematic discourse analysis of data protection legislation

I am currently undertaking a thematic discourse analysis of three pieces of legislation: the Council of Europe’s 1981 Convention on the automatic processing of personal data; the 1995 Data Protection Directive; and the 2016 General Data Protection Regulation.

There’s certainly plenty to go through:

                                             1981      1995      2016
Number of characters (including spaces)     21276     79387    353577
Number of characters (without spaces)       17357     64802    289058
Number of words                              3503     12677     55199
Lexical density                           18.8124   11.6589    4.9204
Number of syllables                          6222     22891    101740
Figures generated with the text analyser at online-utility.org, founded by Mladen Adamovic.
Lexical density = the number of lexical (content) words divided by the total number of words.
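By way of illustration, here is a minimal Python sketch of that calculation, under the assumption that “content words” can be approximated by filtering out a small list of function words; the word list here is my own, and online-utility.org will classify words differently.

```python
# A minimal sketch of the lexical density calculation defined above:
# lexical (content) words divided by total words. The stopword list
# standing in for "function words" is an illustrative assumption.
import re

FUNCTION_WORDS = {
    "the", "a", "an", "and", "or", "but", "of", "to", "in", "on",
    "by", "for", "with", "as", "at", "that", "this", "it", "is",
    "are", "be", "was", "were", "shall", "may", "not", "such",
}

def lexical_density(text: str) -> float:
    """Proportion of words that are content (non-function) words."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    content = [w for w in words if w not in FUNCTION_WORDS]
    return len(content) / len(words)

sample = "The controller shall process personal data lawfully and fairly."
print(f"Lexical density: {lexical_density(sample):.2%}")
```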

So far, I have identified 140 different ways of referring to entities (individuals, groups, institutions, society); 37 terms relating to the flow of data; and 27 terms relating to facilitators or obstacles affecting the flow of data.
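For the mechanics of producing such counts, here is a minimal sketch; the terms, sample sentence and pattern are illustrative placeholders rather than the actual coding frame.

```python
# Illustrative sketch of tallying coded terms in a legislative text.
# The terms and sample sentence are placeholders; the real coding
# frame has 140 entity terms, 37 data-flow terms and 27
# facilitator/obstacle terms.
import re
from collections import Counter

ENTITY_TERMS = ["data subject", "controller", "processor", "third party"]

def tally(text: str, terms: list[str]) -> Counter:
    counts = Counter()
    for term in terms:
        pattern = rf"\b{re.escape(term)}\b"
        counts[term] = len(re.findall(pattern, text, re.IGNORECASE))
    return counts

sample = ("The controller shall provide the data subject with the "
          "identity of any processor or third party recipient.")
print(tally(sample, ENTITY_TERMS).most_common())
```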


Examples of posters and guidance on privacy & security

Have any libraries developed their own posters summarising the key messages that they want to share with their users about protecting their online privacy and information security?

Here are a few resources:

CryptoParty NYC first-step security checklist
https://www.bklynlibrary.org/sites/default/files/documents/learning/FirstStepSecurityChecklist.pdf

A message to our library users: your privacy is important to us
https://chooseprivacyweek.org/wp-content/uploads/2013/03/Trina_UserHandout.pdf

Protect your privacy while using public computers and wifi

EDRi: Privacy4kids booklet
https://edri.org/privacy-for-kids-digital-defenders/

Library Freedom Poster https://twitter.com/flexlibris/status/792070173242986496

Web Privacy in Practice: Assessing Internet Security and Patron Privacy in North American Public Libraries https://macsphere.mcmaster.ca/handle/11375/19016

Top 10 most frequently viewed pages on my library privacy blog

1. Privacy & libraries in corporate sector https://libraryprivacyblog.wordpress.com/further-reading/privacy-libraries-in-corporate-sector/
2. What can librarians do to protect user privacy https://libraryprivacyblog.wordpress.com/2016/12/09/what-can-librarians-do-to-protect-user-privacy/
3. Model for ontological frictions https://libraryprivacyblog.wordpress.com/2017/07/21/model-for-ontological-frictions-cf-floridi/
4. Data breaches https://libraryprivacyblog.wordpress.com/data-breaches/
5. Resources https://libraryprivacyblog.wordpress.com/resources/
6. Why the #cilipinwales conference was the wake up call I needed https://libraryprivacyblog.wordpress.com/2017/05/13/why-the-cilipinwales-conference-was-the-wake-up-call-i-needed/
7. Text of my @cilipinwales talk on privacy https://libraryprivacyblog.wordpress.com/2017/05/15/text-of-my-cilipinwales-talk-on-privacy-in-libraries/
8. Communications privacy management (CPM) theory https://libraryprivacyblog.wordpress.com/2017/08/03/communication-privacy-management-cpm-theory/
9. Why it matters https://libraryprivacyblog.wordpress.com/contact/
10. Library users’ trust in librarians to protect their privacy https://libraryprivacyblog.wordpress.com/2017/04/23/library-users-trust-in-librarians-to-protect-their-privacy/

Privacy and public goods vs common-pool resources

In economics, a public good is a good that is both non-excludable and non-rivalrous: individuals cannot be effectively excluded from its use, and use by one individual does not reduce its availability to others.

If one thinks about the nature of an individual’s informational privacy, it must surely come under the heading of a public good, since:

a) if you have a copy of someone’s personal data, that doesn’t somehow exclude others from having access to that information (including the person to whom the data relates);

and

b) your use of someone else’s information doesn’t automatically reduce the availability of that information to others.

Thinking about the “common-pool resources” category, Schlager and Ostrom (1992) identify five property rights that are most relevant for the use of common-pool resources: access, withdrawal, management, exclusion and alienation. These are defined as follows (a small sketch of how the rights bundle together comes after the list):

  1. ACCESS – the right to enter a defined physical area and enjoy non-subtractive benefits (e.g. hike, canoe, sit in the sun)
  2. WITHDRAWAL – the right to obtain resource units or products of a resource system (e.g. catch fish, divert water)
  3. MANAGEMENT – the right to regulate internal use patterns and transform the resource by making improvements
  4. EXCLUSION – the right to determine who will have an access right, and how that right may be transferred
  5. ALIENATION – the right to sell or lease exclusion, management or withdrawal rights
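To keep the bundles straight, here is a minimal sketch of these rights as composable flags, together with the four positions (authorized user, claimant, proprietor, owner) that Schlager and Ostrom associate with cumulative bundles of them; mapping personal data onto these positions is, of course, the speculative step.

```python
# Minimal sketch of Schlager & Ostrom's (1992) property rights bundles.
# The positions and their cumulative bundles follow their paper;
# applying them to personal data is the speculative step.
from enum import Flag, auto

class Right(Flag):
    ACCESS = auto()
    WITHDRAWAL = auto()
    MANAGEMENT = auto()
    EXCLUSION = auto()
    ALIENATION = auto()

POSITIONS = {
    "authorized user": Right.ACCESS | Right.WITHDRAWAL,
    "claimant": Right.ACCESS | Right.WITHDRAWAL | Right.MANAGEMENT,
    "proprietor": Right.ACCESS | Right.WITHDRAWAL | Right.MANAGEMENT
                  | Right.EXCLUSION,
    "owner": Right.ACCESS | Right.WITHDRAWAL | Right.MANAGEMENT
             | Right.EXCLUSION | Right.ALIENATION,
}

def holds(position: str, right: Right) -> bool:
    return right in POSITIONS[position]

# An individual who assumes complete ownership of their personal
# information expects the full bundle, including alienation:
print(holds("owner", Right.ALIENATION))       # True
print(holds("proprietor", Right.ALIENATION))  # False
```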

Doesn’t this go to the heart of the problem?

An individual automatically assumes that they should have (complete) ownership and control over their personal information. They wouldn’t expect their personal information to be part of a “common pool” of resources.

What about a situation in which the individual’s identity is temporarily kidnapped and modified, in the sense that classifications are made which reveal something new about the person? In other words, pieces of information are used to infer something new: with the potential of big data, people are able, through the use of data, models and algorithms, to generate new personal information that perhaps the data subjects didn’t even know, so that others will know us better than we know ourselves.

The fundamental problem is that informational privacy is non-rivalrous (and, as argued above, non-excludable), and would therefore fit into the category of “public good”.

I have to say that I have struggled with these concepts, and have been thinking about how they relate to concepts of kidnap, trespass, and cloning.

One potential problem area is that someone’s privacy can be invaded not only in a private place but also in a public space.

Floridi asserts that thinking about an invasion of privacy in terms of trespass doesn’t make any sense in a public space, so privacy shouldn’t be seen in terms of trespass.

Warren and Brandeis famously framed privacy as the right to be let alone. So, if one’s personal information were “kidnapped”, that would be a privacy violation.

Of course, one’s personal information can’t be kidnapped in the normal sense of the word, because what is really happening is that the individual’s identity is “cloned”, or “temporarily kidnapped”: a piece of personal information is gained by someone without, at the same time, depriving others of the possibility of having access to it.

If someone were to break into the offices of an examinations board and make a copy of (or even just memorise) the questions that have been set for an exam, it doesn’t mean that the examiners have somehow lost access to the master document containing those questions. This is the whole point about information being non-rivalrous.

                 EXCLUDABLE       NON-EXCLUDABLE
RIVALROUS        Private goods    Common-pool resources
NON-RIVALROUS    Club goods       Public goods
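That table reduces to a simple lookup. A minimal sketch, where treating personal information as non-rivalrous and non-excludable is the argument made above rather than settled fact:

```python
# The 2x2 typology above as a lookup: (rivalrous, excludable)
# determines the category of good.
def classify_good(rivalrous: bool, excludable: bool) -> str:
    return {
        (True, True): "private good",
        (True, False): "common-pool resource",
        (False, True): "club good",
        (False, False): "public good",
    }[(rivalrous, excludable)]

# Personal information, on the argument above: copying it deprives
# no one (non-rivalrous), and once disclosed it is hard to exclude
# others from it (non-excludable).
print(classify_good(rivalrous=False, excludable=False))  # public good
```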

Prosser (1960) identified four privacy torts, of which the first is intrusion upon seclusion. Intrusion overlaps with actions for trespass to land or chattels.

Floridi considers our personal information as constitutive of our very being. If personal information is finally acknowledged to be a constitutive part of someone’s personal identity and individuality, then one day it may become strictly illegal to trade in some kinds of personal information, exactly as it is illegal to trade in human organs (including one’s own) or slaves. (Luciano Floridi, The Fourth Revolution)

What I need to think further about is the dynamic between individuals and society, whereby an invasion of one individual’s informational privacy is harmful to society as a whole.

Bad algorithms

Cathy O’Neil doesn’t believe all algorithms are bad, but she characterizes the ones that are as having three key features:

firstly, they are widespread and important;

secondly, they are secret (black boxes);

thirdly, they are destructive: one bad design mistake will make them unfair for huge populations as they scale up.

She says that people are blinded by the fact that these systems are mathematical: when asked why they didn’t question an algorithm, people answer that they believed it because it is math.

O’Neil believes that there needs to be a measure for whom the algorithm fails – so for example, if it fails women in the recruitment process. There needs to be a measure of any harms that are experienced, and how these are distributed.
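Here is a minimal sketch of the kind of measure O’Neil is asking for: an error rate computed per group, so that the distribution of harm becomes visible. The outcomes and groups below are entirely invented for illustration.

```python
# Illustrative sketch of measuring for whom an algorithm fails:
# per-group error rates for a hypothetical screening model.
# All data below is invented for illustration.
from collections import defaultdict

# (group, model_decision, correct_decision)
outcomes = [
    ("women", "reject", "hire"), ("women", "hire", "hire"),
    ("women", "reject", "hire"), ("men", "hire", "hire"),
    ("men", "reject", "reject"), ("men", "hire", "hire"),
]

errors = defaultdict(lambda: [0, 0])  # group -> [errors, total]
for group, decision, truth in outcomes:
    errors[group][1] += 1
    errors[group][0] += decision != truth

for group, (wrong, total) in errors.items():
    print(f"{group}: error rate {wrong / total:.0%} ({wrong}/{total})")
# women: error rate 67% (2/3) -- the harm concentrates in one group
# men: error rate 0% (0/3)
```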

I keep thinking that the focus on the individual in privacy and data protection does an injustice to all of us, because potentially bad things can happen to lots of people. If we focus on the individual, how will we be able to see things in perspective, how will we know the sheer scale of the problem, and how will we see that things need to be tackled at a higher level of analysis?

Power in numbers (collective action)

Why collective action is needed

Is it realistic to expect individuals to be able to protect their privacy when faced with multinational companies who show little regard for their right to privacy?

How many individuals bring legal challenges based on a breach of their privacy?

A right to privacy is of little use if individuals don’t have a realistic way of enforcing the right.

Defending an individual’s right to privacy is important for society as a whole. It is an important public policy issue; and one which in many cases will affect the interests of the individual, the collective, and the public as a whole.

Some form of collective action is needed. And yet, even here, there are problems. How many class action settlements has the Competitive Enterprise Institute (www.cei.org/issues/class-action-fairness) challenged as unfair because all or most of the money would go to lawyers, with little or no payment to the individuals whose privacy had been breached?

Or are individuals able to be part of a class action only if they can prove that they have standing, by demonstrating actual harm or a real case of identity theft, for example?

What does the GDPR say?

Article 80 allows individuals to have the right to mandate a not-for-profit body, organisation or association (such as a consumer protection body) to exercise rights and bring claims on their behalf.

These rights include the right to lodge a complaint with the supervisory authority (in the UK, the ICO) (Art 77); the right to an effective judicial remedy against a supervisory authority (Art 78); and the right to an effective judicial remedy against a data controller or processor (Art 79).

The policy aim is to ensure that individuals are able to exercise their rights to authorise non-profit organisations to deal with claims on their behalf, and that such organisations can collect damages awarded on individuals’ behalf. The government will legislate to this effect.

Potentially, class action lawsuits can achieve the following:

  • increase access to justice
  • provide an effective remedy, allowing people to exercise their legal rights
  • save time and money, both for the parties and the courts
  • avoid inconsistent decisions in similar cases
  • act as a deterrent to unlawful or unfair behaviour by businesses
  • provide legal representation for “diffuse interests”
  • make court proceedings affordable
  • level the playing field between one-time players of the court system and repeat players, whose ongoing contact with the judicial system allows them to better plan their litigation, spread litigation costs, build up informational relationships with the decision-making institutions, spread the risks of litigation and develop strategies for future actions (Wrbka 2010)
  • overcome a lack of financial resources
  • overcome a lack of legal knowledge
  • make it cost-effective to pursue claims where the monetary amount for each individual is small
  • offer a route to court for individuals who may simply not have the right (the legal standing) to bring a case themselves
  • create power in numbers
  • go some way to redressing the balance between the individual and multinational companies

Privacy is full of paradoxes

They include: the privacy paradox, transparency paradox, and e-reader paradox, among others.

Privacy paradox: people are concerned about privacy, but they behave in ways that suggest otherwise.

“But the privacy paradox is fallacious because it only considers one party in an information relationship: the user. Users given a blunt choice between protecting their data and participating in modern society really have no choice at all, especially when the terms of any such choice are clouded by confusing technology and legal mumbo-jumbo, where long-term interests in privacy are hard to value, or where meaningful choice is an illusion. In fact, given the limited notice and choice that most of us encounter, the privacy paradox suggests that users care about their personal data in spite of the limited legal and technological choices they face in protecting it” (Richards and Hartzog 2015)

“The myth that People Don’t Care about Privacy suggests a kind of reverse privacy paradox – if people really don’t care about privacy, why do they talk about it so much? After all, if we didn’t really care about privacy, it wouldn’t be regular front page news, books on privacy wouldn’t sell, and it would not be a major topic of public debate” (Sarat 2014)

Helen Nissenbaum talks of the “transparency paradox”. Achieving transparency means conveying information-handling practices in ways that are relevant and meaningful to the choices individuals must make. If the notice were sufficiently detailed, it would be unlikely to be understood, let alone read; while summarising the policy in the style of, say, a nutrition label isn’t helpful either, because it drains away important details that are precisely the ones likely to make a difference: who the business associates are, and what information precisely is being shared with them. The transparency paradox is that transparency of textual meaning and transparency of practice cannot both be achieved; we appear unable to attain one without giving up on the other. (Nissenbaum 2011)

The e-reader paradox: e-readers create the illusion of intellectual privacy in the physical world, while they threaten intellectual privacy in the digital one (people around you might not be able to see what you are reading, but companies like Amazon will know precisely what you are reading, how many pages you have looked at, and so on).

Open inquiry requires closed doors: “Open inquiry,” it seems, requires at least a few closed doors; only by protecting our secrets can we achieve the freedom to follow information wherever it takes us.

Richards and King “highlight three paradoxes in the current rhetoric about big data to help move us toward a more complete understanding of the big data picture. First, while big data pervasively collects all manner of private information, the operations of big data itself are almost entirely shrouded in legal and commercial secrecy. We call this the Transparency Paradox. Second, though big data evangelists talk in terms of miraculous outcomes, this rhetoric ignores the fact that big data seeks to identify at the expense of individual and collective identity. We call this the Identity Paradox. And third, the rhetoric of big data is characterized by its power to transform society, but big data has power effects of its own, which privilege large government and corporate entities at the expense of ordinary individuals. We call this the Power Paradox.” (Richards and King 2013)

REFERENCES

NISSENBAUM, H., 2011. A contextual approach to privacy online. Daedalus, the Journal of the American Academy of Arts & Sciences, 140(4), pp. 32-48.

RICHARDS, N.M. and HARTZOG, W., 2015. Taking Trust Seriously in Privacy Law.

RICHARDS, N.M. and KING, J.H., 2013. Three paradoxes of big data. Stanford Law Review Online, 66, p. 41.

SARAT, A., 2014. A World without Privacy: What Law Can and Should Do? Cambridge: Cambridge University Press.