Text of talk for @infolawcentre 17 Nov 2017

Children & digital rights – privacy in libraries: talk for IALS conference

The library’s role is to provide access to material while at the same time ensuring absolute confidentiality and anonymity.

Why does the privacy of children matter?

Carrie Gardner (quoted in (Adams 2002)) says “Often, if people do not think their information requests and information-gathering activities are going to be kept private, they won’t ask for the information. They would rather suffer the consequences of not knowing”.

At the tween stage (a youngster between 10 and 12 years of age, considered too old to be a child and too young to be a teenager) they are just finding their way. They are on the cusp of recognizing who they are and learning about the world, and they need privacy as a form of freedom to develop as a person.

Then, as teenagers, as part of their process of development, they may well be looking for materials on sensitive topics (Tough topics for teens: http://lauraperenic.blogspot.co.uk/2016/03/tough-topics-bookmarks-and-poster.html ).

If children were to look for material about these topics by going online and accessing electronic resources such as ebooks or ejournals, the question arises as to who could potentially access that information, for how long it is kept, and for how long it can be linked back to an identifiable individual.

If a library user accesses an ebook from home, their personal data is processed by the library, by the ebook vendor, and by the e-reader software company. The ebook vendor may use third-party cookies, and whilst they may claim that these cookies don’t contain any personally identifiable information and can only be used to identify machines rather than individuals, the reality is that device fingerprinting can be used to fully or partially identify individual users.
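To make this concrete, here is a minimal illustrative sketch in Python – the attribute names are hypothetical, and this is not any vendor’s actual implementation – of how a handful of browser and device attributes can be combined into a fairly stable identifier even when no cookie or account is involved:

```python
# A minimal sketch (hypothetical attribute names, not any vendor's actual
# implementation) of how device fingerprinting works: a handful of browser and
# device attributes, hashed together, yield a fairly stable identifier for a
# machine even when no cookie or account is involved.
import hashlib

def device_fingerprint(attributes: dict) -> str:
    """Combine browser/device attributes into a single hash."""
    # Sort the keys so the same attributes always produce the same fingerprint.
    canonical = "|".join(f"{key}={attributes[key]}" for key in sorted(attributes))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

reader_device = {
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) ...",
    "screen_resolution": "1920x1080",
    "timezone": "Europe/London",
    "installed_fonts": "Arial;Calibri;Georgia;Verdana",
    "language": "en-GB",
}

print(device_fingerprint(reader_device))
# The same device tends to produce the same hash on every visit, so a visitor
# can be recognised and their reading linked over time without a name or email.
```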

In America, under section 215 of the PATRIOT Act, the authorities could gain access to that sort of information, whilst imposing a gagging order on the library to ensure that no one was even aware that the information had been requested. The American Library Association is dedicated to ending ongoing mass surveillance, which continues despite the important reforms made by the USA FREEDOM Act of 2015, which amended the PATRIOT Act’s surveillance provisions.

And in the UK a similar situation pertains under the Investigatory Powers Act 2016.

People often think that the only way the law enforcement authorities would be able to access such information is where they serve a warrant. But there are a number of routes through which such information could be requested, and they don’t all need a warrant.

In what ways does privacy play out in a library context?

One example relates to circulation records.

A couple of years ago there were reports that the school library borrowing history of author Haruki Murakami had been leaked (McCurry 2015). This yet again raised the issue of privacy (Finch 2015).

Another example would be requests for assistance to find information on sensitive issues. (Catherine) Beyers found herself responding to requests by teachers and parents for information on sensitive issues related to specific students she served. This aspect of her work included “helping teachers locate materials for children about suicide, having a parent in jail, or living with a disabled sibling” (Adams 2002).

(Ferguson, Thornley et al. 2016) bring together a number of ethical scenarios. One was about privacy versus potential harm to individuals. A school librarian noticed that a child had been reading about sex abuse. The librarian normally regards what children are reading as ‘off limits’ and does not inform teaching staff. A child reading about sex abuse, however, raised alarms. The outcome was that ‘the person with pastoral responsibility for that individual child was alerted.’ In this case, therefore, ‘confidentiality would not be the predominant factor’, although the interviewee added that this was ‘obviously challengeable’.

In another case study (Ferguson 2016), a library manager helped save the life of a suicidal client.

Such cases test the limits of LIS professionals’ commitment to protection of client confidentiality, where there is potential harm to that client.

Use of fingerprints as ID instead of a library card

Many school libraries across the UK have implemented technology enabling pupils to take out books by scanning their thumb prints instead of using a card. Such systems are intended to replace library cards and save time and money in managing the libraries. However, the use of electronic fingerprinting systems in this way to manage loans of library books has raised a number of privacy concerns.

In 2006 the Department for Education and Skills and the Information Commissioner said parents could not prevent schools from taking their children’s fingerprints.

Thankfully, the Protection of Freedoms Act 2012 now provides stronger protections. Chapter 2 deals with the protection of biometric information of children in schools. It says that “if, at any time, the child (a) refuses to participate in, or continue to participate in, anything that involves the processing of the child’s biometric information, or (b) otherwise objects to the processing of that information, the relevant authority must ensure that the information is not processed, irrespective of any consent given by a parent of the child”.

What has my child looked at online or checked out?

This raises issues of the intellectual freedom of young people, children’s rights to privacy, responsibility, and freedom to use library materials.

Could – or indeed should – a parent have access to data about what their child has been looking at online?

When faced with the parent-child relationship, to whom is the library responsible? The child possessing a library card is the cardholder of record. However, in many libraries the parent has traditionally been asked to sign as the financially responsible party to try to control losses and recover the cost of lost books.

(Hildebrand 1991 p3) says “Honoring privacy is a concrete expression of respect for another person. We need to start out with a belief that it is desirable for adults in our society to allow children to experience privacy and respect”.

(Symons, Harmon 1995 p13) note that while a surface acceptance of users’ rights to privacy may be easy, implementing this right throughout library operations, and in the face of pressure, is in fact difficult. Library employees end up explaining to parents, on the grounds of privacy and confidentiality, that although they may be billed for their children’s lost books, they are not entitled to a list of everything their children have checked out.

Issues around filtering and monitoring of content

(Wyatt 2006) notes that librarians are often reluctant to monitor patrons’ Internet access. Monitoring seems to be in direct conflict with the librarian’s ethical duty to honor a patron’s privacy (Skaggs 2002 p851).

At least one parent has sued a public library for allowing her child to access pornography on the Internet. In the American case Kathleen R. v. City of Livermore 104 Cal.Rptr.2d 772 (2001), a minor boy accessed pornography through the Internet and distributed it to his friends. His mother sued the library claiming that “the government had a constitutional duty to protect her minor son from the offensive material found on the Internet” (Kendall 2003 p221).

Although the public library has not been found legally liable for failure to monitor children’s unfiltered access, the question remains whether the librarian should play the role of a monitor. If parents do not see the public library as a safe place for their children, they will not allow them to go there.

By never encountering inappropriate content, individuals do not develop the ability to decipher for themselves which content might be appropriate or not. Given the unreliability of filtering software, this is an essential skill in today’s world. We also need to take account of other measures to achieve child protection that may have a less restrictive impact on information access: e.g. situating public access terminals apart from areas designated for use by minors; and providing user education, support and training (Cooke, Spacey et al. 2014).

At times it seems as though there is an unhealthy desire to monitor children’s internet activities. In August 2017 the BBC reported on a French company offering “invisible PC spy software”. The company was criticised after it said its product could be used “to find out if your son is gay”. It listed a number of clues that might cause a parent to suspect that their son might be gay, one of which was being more interested in reading and theatre than in football (BBC News Online 2017).

Many libraries use filtering software. Indeed, in America, the Children’s Internet Protection Act (CIPA) requires public and school libraries to have a policy of Internet safety for children. CIPA’s amendments specifically require protection that “blocks or filters” access to visual depictions that are obscene or harmful to minors. If libraries fail to comply, they will not receive federal funding through the E-rate program, which provides the bulk of the federal funding for library Internet connectivity.

(Ledbetter, Heiss et al. 2010 p187) draw attention to the way expectations regarding children’s privacy change throughout the course of child development, and to the fact that privacy invasions may occur frequently within parent-child relationships. (Petronio 1994) notes that the college years are a time when parental expectations of their children’s privacy can be particularly unclear. Although undergraduates view their college years as a time of increasing independence, “parents may contradict these expectations by invading the children’s privacy boundaries” when the child returns home or is away at college (Petronio 1994 p244). Therefore, parental invasive behaviors “may send a message to college-aged children that indicates the reluctance of parents to let go” (p. 245). Reported examples of such invasive behaviors include:

‘‘When I am using the computer, they come behind me and read over my shoulder for a few minutes;’’ ‘‘My dad reads the instant messages I write.’’

‘‘My mom would sometimes try to read my online blog.’’

Likewise, computer-based defences consisted of concealing online activity, for example by changing passwords, deleting emails, or closing open windows when a parent entered the room.

Children do want to hide things from parents. It may seem like a contradiction in terms, but children may well seek out the privacy that a library can offer as a public space.

Personalization & providing a tailored service

(Adams 2002) While the privacy of students and underage patrons must be protected, Pat Scales cautions against carrying protection too far. “My primary concern regarding privacy issues as it relates to children and young adults is that we cannot allow privacy to interfere with ‘best practice.’

“This is especially true with reader guidance issues. For example, I feel that it is important that I know what a child likes to read in order to lead him/her to another book they will like. I’m not going to reveal my knowledge to another person, but that child may need my guidance. I don’t believe that we should force guidance on a child, but we cannot totally serve their needs if there is complete privacy.”

Fisk (2016) says adolescents are more likely to value privacy from their parents and other local adults than from the somewhat more distant-seeming but ever-present threat of corporate surveillance.

In conclusion, libraries face a number of ethical challenges in their work, and that is true of their work with children just as it is for other age groups. Issues such as:

  • What has my child checked out?
  • Situations where children consult material on difficult or controversial topics
  • Use of fingerprinting as a form of ID instead of a library card in school libraries, and
  • The use of filtering and monitoring of internet activity
  • Personalization & providing a tailored service

These all pose challenges for a child’s privacy and confidentiality.

 

REFERENCES

ADAMS, H.R., 2002. Privacy and confidentiality: now more than ever, youngsters need to keep their library use under wraps. American Libraries, 33(10), pp. 44-48.

BBC NEWS ONLINE, 2017. French firm offered spyware to “find out if your son is gay”. BBC News Online, 23 August.

CONGER, S., PRATT, J.H. and LOCH, K.D., 2013. Personal information privacy and emerging technologies. Information Systems Journal, 23(5), pp. 401-417.

COOKE, L., SPACEY, R., MUIR, A. and CREASER, C., 2014. Filtering access to the internet in public libraries: an ethical dilemma? Globethics.net, pp. 179-190.

FERGUSON, S., THORNLEY, C. and GIBB, F., 2016. Beyond codes of ethics: how library and information professionals navigate ethical dilemmas in a complex and dynamic information environment. International Journal of Information Management, 36(4), pp. 543-556.

FINCH, D., 2015. Privacy and the young reader. Dawn Finch’s blog.

FISK, N., 2016. The limits of parental consent in an algorithmic world. LSE Media Policy Project Blog.

HILDEBRAND, J., 1991. Is privacy reserved for adults: children’s rights at the public library. School Library Journal.

KENDALL, J., 2003. Library internet access policy. J. Juv. L., 24, pp. 218-273.

LEDBETTER, A.M., HEISS, S., SIBAL, K., LEV, E., BATTLE-FISHER, M. and SHUBERT, N., 2010. Parental invasive and children’s defensive behaviors at home and away at college: mediated communication and privacy boundary management. Communication Studies, 61(2), pp. 184-204.

MCCURRY, J., 2015. Librarians in uproar after borrowing record of Haruki Murakami is leaked. The Guardian, December.

MCKINNEY, K.D., 1998. Space, body, and mind: parental perceptions of children’s privacy needs. Journal of Family Issues, 19(1), pp. 75-100.

PARKE, R.D. and SAWIN, D.B., 1979. Children’s privacy in the home: developmental, ecological, and child-rearing determinants. Environment and Behavior, 11(1), pp. 87-104.

PETRONIO, S., 1994. Privacy binds in family interactions: the case of parental privacy invasion. In: The dark side of interpersonal communication, pp. 241-257.

PETRONIO, S. and ALTMAN, I., 2002. Boundaries of privacy.

RATZAN, J.S., 2004. CIPA and the roles of public librarians: a textual analysis. Public Libraries, 43(5), pp. 285-290.

ROWAN, D., 2002. Little brother’s fingerprints all over the library. The Times.

SKAGGS, J.A., 2002. Burning the library to roast the pig: online pornography and Internet filtering in the free public library. Brook. L. Rev., 68, p. 809.

SYMONS, A.K. and HARMON, C., 1995. Protecting the right to read: a how-to-do-it manual for school and public librarians. ERIC.

WYATT, A.M., 2006. Do librarians have an ethical duty to monitor patrons’ internet usage in the public library? Journal of Information Ethics, pp. 70-79.

 


Floridi, Groups & Privacy

One aspect of my research is to consider information privacy from the perspective of entities.

The discourse analysis that I have been undertaking has helped confirm my view that we cannot really understand information privacy by focussing exclusively on the individual. Instead, we need to look at a number of different levels – the individual, the group, the company/institution, and society as a whole – and, more than that, we need to consider all of the stakeholders, including entities that have a data protection role or function.

When I started my research in February, I had been looking at Luciano Floridi’s theory of ontological friction; and then a few months later I went back to his works and started to look at what he had written about groups. The most substantial item on group privacy is at https://www.stiftung-nv.de/sites/default/files/group-privacy-2017-authors-draft-manuscript.pdf

Today I was re-reading a number of articles by Floridi, and what struck me was that he makes numerous statements which help to justify looking at the group & institutional perspective. And that one can’t really understand Floridi’s thinking on groups without also taking on board his thoughts on digital ICTs and their revolutionary nature.

Where he says that the current ethical approach is too anthropocentric and nominalist, I would add that that is also true of data protection legislation such as the General Data Protection Regulation which also focusses heavily on the individual.

The GDPR isn’t exclusively focussed on the individual, though. Article 80 looks at the representation of data subjects, and even though it doesn’t use the words “collective action”, that is what the article deals with; although one must also mention that this is one of the many derogations in the legislation and that therefore there won’t be a consistent approach across all member states.

But coming back to Floridi, these are some of the statements that, to my mind, justify the need to think beyond the individual. To be clear, I agree with all of the statements below:

  • Floridi argues for an interpretation of privacy in terms of a protection of the information that constitutes an individual – both in terms of a single person and in terms of a group
  • He believes that groups may have a right to privacy
  • He thinks that the current ethical approach is too anthropocentric (only natural persons count) and nominalist (only the single individual person counts)
  • The infosphere denotes the information environment which is constituted by informational entities
  • “The very concept of democracy takes something away from the individual to emphasise the centrality of the ‘multiagent’ system”
  • Agents need not be persons; they could, for example, be organizations
  • He acknowledges that digital ICTs treat most people not as individuals but as members of specific groups
  • “the information flow requires some friction in order to keep firm the distinction between the multiagent system (the society) and the identity of the agents (the individuals) constituting it”.

 

There are a number of areas where I don’t agree with Floridi’s views on privacy.

In The Fourth Revolution (chapter 4), Floridi says that it is common to distinguish four types of privacy, and that these are all “freedoms from”:

  • physical privacy
  • mental privacy
  • decisional privacy
  • informational privacy

There are a number of ways in which I differ from Floridi’s analysis. Firstly, he classifies the four privacy types as all being “freedoms from”, whereas there are in fact a number of different privacy types and they are not all “freedoms from”; some privacy types are “freedoms to”.

Secondly, I would categorise “decisional privacy” not as a freedom from, but as a freedom to.

Thirdly, I prefer the privacy typology of Koops et al., who put forward nine privacy types, with informational privacy overlaying all the others.

Certainly Floridi sees informational privacy as being the most important of them all, as do Koops et al., and I would agree with them both on that.

I find that there are times when Floridi uses analogies that are more confusing than helpful. The one that I had particular difficulty with was the analogy of informational privacy in terms of “my” as in “my body” rather than “my car”, and of thinking of someone using my personal data as being like an organ being ripped from my body. A key characteristic of information is its non-rivalrous nature: when someone takes my data, it doesn’t mean that I no longer have it. He speaks elsewhere about cloning, and that is more akin to what happens in practice.

Floridi believes that digital ICTs have a revolutionary impact, and that “ICTs are more redrawing than erasing the boundaries of informational privacy”, which I would agree with. He also believes that they can increase or decrease informational privacy, and that you can’t have the possibility of them increasing privacy without also having the possibility that they can decrease it; again I would agree. While digital ICTs both empower and disempower people, we don’t really get a detailed discussion of the balance or imbalance between the two. In The Fourth Revolution he gives examples of how digital ICTs can empower people, but for me the examples given are not wholly convincing. One example relates to reputation management companies, and for me there are a number of problems with reputation management. Firstly, it isn’t a one-off exercise: as soon as a search engine like Google changes its algorithm, this could undo the previous efforts of a reputation management company. Secondly, it doesn’t empower everyone equally; although I recognise that these companies have a large number of customers, many people would not be able to afford their services.

Privacy : a glossary of terms

Anonymisation – the process of turning data into a form which does not identify individuals and where identification is not likely to take place.
Blagging – knowingly or recklessly obtaining or disclosing personal data or information without the consent of the data controller.
Cipher suite – a named combination of authentication, encryption, message authentication code (MAC) and key exchange algorithms used to negotiate the security settings for a network connection using the Transport Layer Security (TLS) / Secure Sockets Layer (SSL) network protocol.
CryptoParty – a gathering of people who come together with the common goal of helping each other safeguard their digital privacy and security.
Cybersecurity – user-centred; the primary concern is keeping personally identifiable information (PII) private. (Source: “Measuring vendor cybersecurity”, presented by Chris Markman at Internet Librarian International 2016, session D203, 18 October 2016.)
Datafication – a model presented in Mai’s 2016 paper, wherein new personal information is deduced by employing predictive analytics on already-gathered data.
Dataveillance – the monitoring and evaluation of individuals by means of the data they generate.
Digital footprint – the trail or body of data that exists as a result of actions and communications online and that can in some way be traced back to an individual.
Digital literacy – the set of skills necessary to navigate, access and contribute to the new information environment.
Doxed – having your real personal information (e.g. name, address, phone number) discovered and revealed on the Internet, destroying anonymity.
Encryption – the conversion of electronic data into another form, called ciphertext, which cannot be easily understood by anyone except authorized parties (a minimal illustration follows this glossary).
Enrichment – where a catalogue or database contains cover images (for books, journals etc.).
Evercookies – Evercookie is a JavaScript-based application which produces zombie cookies in a web browser that are intentionally difficult to delete.
IMSI catcher (or Stingray) – a telephone eavesdropping device used for intercepting mobile phone traffic and tracking the movement of mobile phone users. Essentially a “fake” mobile tower acting between the target mobile phone and the service provider’s real towers, it is considered a man-in-the-middle (MITM) attack. (Wikipedia definition as at 6 July 2016.)
Loveint – the practice of spying on people you like.
Onboarding – combining our online personas with our offline selves (building psychological profiles).
Panopticism – a social theory named after the Panopticon (a prison design by Jeremy Bentham), developed by French philosopher Michel Foucault in his book Discipline and Punish.
Sniffing – a packet sniffer is a utility that has been used since the original release of Ethernet. Packet sniffing allows individuals to capture data as it is transmitted over a network. Packet sniffer programs are used by network professionals to diagnose network issues, and by malicious users to capture unencrypted data, like passwords and usernames.
Teleological – describing things in terms of their apparent purpose; the philosophical study of purpose in nature is teleology.
Third party – an entity that is involved in some way in an interaction that is primarily between two other entities.
Tracking – as users browse the web, their browsing behaviour may be observed and aggregated by third-party websites (“trackers”) that they don’t visit directly. (From Tracking Excavator: uncovering tracking in the web’s past, https://trackingexcavator.cs.washington.edu)
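As flagged in the Encryption entry, here is a minimal sketch of symmetric encryption in Python, using the third-party “cryptography” package (an assumption on my part; any symmetric encryption library would illustrate the same point):

```python
# A minimal sketch of the "Encryption" glossary entry, using the third-party
# Python "cryptography" package (pip install cryptography). Fernet is a
# symmetric scheme: the same key both encrypts and decrypts.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # keep this secret; whoever holds it can decrypt
f = Fernet(key)

ciphertext = f.encrypt(b"borrower 4411 checked out 'Tough topics for teens'")
print(ciphertext)             # unreadable without the key

plaintext = f.decrypt(ciphertext)
print(plaintext.decode())     # the original record, recovered by an authorised party
```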

Some conclusions from a discourse analysis

We found that undertaking a discourse analysis is a valuable way of testing a theoretical framework and model, whether or not you conclude that they are robust and withstand the rigours of the discourse analysis process: the exercise will either vindicate your framework and model, in whole or in part, or else it will give an indication of what they should contain.

It is likely that you will see your research project from a different perspective; that it will make you question the validity and the comprehensiveness of the concepts that you have identified; and whether or not minor or major changes are required.

In our case, the discourse analysis led us to add separate categories for values, rights and freedoms; safeguards and protections; and necessity and proportionality; although one could argue that necessity and proportionality should be treated as a subset of safeguards and protections.

The coding process is very subjective, but we found that undertaking a discourse analysis across several texts can help to overcome some of the problems because you can monitor trends across the different texts; and where the texts are spread across a long time period, it is possible to see how things have changed from a temporal perspective.

We chose to undertake a thematic discourse analysis in order to test both our theoretical framework and model. We conclude that it is impossible to entirely set aside any previous thoughts about how to approach the analysis, but it is nevertheless still feasible to learn lessons from the approach.

We found ourselves creating new themes that weren’t in our theoretical framework and model; and even where there were commonalities between the original framework and model and the groupings of categories that emerged during the exercise, these proved useful because they helped clarify our thinking on those themes, given that we had used different ways of expressing similar concepts. It was only by undertaking the discourse analysis that we broadened out the “entities” category to encompass data protection roles and functions.

Discourse analysis – how the themes emerged

The more time spent on the discourse analysis, the less progress it feels is being made. But here are some thoughts on how the themes emerged.

The themes that emerged whilst we undertook the discourse analysis were the result of an iterative process of categorising and re-categorising terms. We found the process of categorisation to be very difficult, especially for phrases that could contain several different concepts:

“easy access” could be tagged under access, but it could also be tagged under enablers/facilitators, as it makes the flow of personal data easier

“international co-operation” could be tagged under co-operation and assistance, or it could be tagged under national/international dimension

One of the limitations of the study was that each of the words or phrases identified was assigned to a single category. If we had analysed a much smaller number of terms, it would have been easier to assign multiple codes to each term.

Having noted down particular terms, it was necessary to go back and see how they were used in context. Taking a word in isolation, divorced from the context in which it has been used, can lead to mis-categorisation, or to situations where the word is used in a number of different contexts. It isn’t sufficient simply to pick out individual words that appear in more than one of the pieces of legislation; one also has to take account of the context within which they are used.

Taking as an example the word “judicial”, the term appears in both the Directive and the GDPR, but not in the Convention. Specifically, in the category of “entities”, one type that is mentioned is “judicial authorities”. In that example it is necessary to use the phrase rather than relying on all occurrences of the word “judicial”, because the word is used in a wide range of contexts. In the GDPR the word “judicial” is followed immediately by the following possibilities: authorities (5), capacity (5), tasks, system, redress, procedure, proceedings, review, authorisation, remedy (19), remedies (1), protection, independence.
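For illustration, here is a rough Python sketch of how such a tally of the words following “judicial” could be produced (the filename is hypothetical, and this is an illustration of the idea rather than the method actually used for the analysis):

```python
# An illustrative sketch (not necessarily the method used for the analysis) of
# how the words immediately following "judicial" in the GDPR could be tallied.
# "gdpr.txt" is a hypothetical local copy of the full text of the Regulation.
import re
from collections import Counter

with open("gdpr.txt", encoding="utf-8") as fh:
    gdpr_text = fh.read()

# Capture the single word that follows each occurrence of "judicial".
following_words = re.findall(r"\bjudicial\s+(\w+)", gdpr_text, flags=re.IGNORECASE)
print(Counter(word.lower() for word in following_words))
# e.g. Counter({'remedy': 19, 'authorities': 5, 'capacity': 5, ...})
```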

Once each word or phrase had been assigned a code, the complete listing of words and phrases was sorted based on the initial list of categories. It was clear that the categories or themes would require some adjustment:

The first reason for this was that there were too few entries under several of the initial themes. For example, we categorised only one item under a heading for exceptions. We could have gone back through each piece of legislation to look for any additional words or phrases that would fit within the exceptions category, but we chose not to: it is impossible to look at every possible theme given the volume of content being analysed, and the exceptions theme wasn’t felt to be a priority.

A second reason is that some of the provisional themes were misleading. For example, we initially used the heading “effort(s)”, and at face value one might think that this would fit within the heading for obstacles if something required a lot of effort. However, a number of the occurrences of “effort” were in contexts where something would require disproportionate effort, so rather than look at the word “effort(s)”, we chose instead to analyse the phrase “disproportionate effort”, and that phrase slots into the heading for legality, necessity & proportionality.

Given that data protection and privacy are so multi-faceted it is hardly surprising that some words and phrases could easily be tagged under more than one heading. Choosing the best tag can be quite subjective.

As our theoretical model draws on two levels of analysis – entities and flows – it would have been easy to see everything in terms of those two concepts. For example, one theme which emerged was safeguards or protections, and this accounted for 61 terms. We could have categorised many of these under “flows” and the sub-category of “obstacles”.

Of the 372 terms analysed, 142 were tagged as entities, 77 as flows, and 153 fell into a third category which contained the following themes:

  • Power and control
  • Safeguards
  • Values, rights and freedoms
  • Co-operation & assistance
  • Access
  • Technology
  • National/international
  • Necessity & proportionality
  • Processing of data

We tried not to pre-judge how the themes would emerge. Having said that, our theoretical framework has at its heart entities (individuals, groups, and society), and Floridi’s theory of ontological friction is all about the extent to which personal data is able to flow freely within the infosphere.

It is important for us to be as self-aware as possible and conscious of any pre-conceptions. We weren’t able to completely block out of our minds entities and flow as potential levels of analysis. The initial list of themes was built up based on second tier terms, and the list of these terms was refined many times. It was only later that we grouped these into three categories at the top level of the hierarchy: entities, flows, and then a third category for other concepts.

The broad category of “Entities” was assigned after initial coding had identified a number of different types of entity. While our theoretical model is based around entities in terms of individuals, groups and society, our coding didn’t reflect that triadic categorisation. Rather, during the discourse analysis we created a fourth heading for entities with a data protection role or function.

Under the theme for groups, we split this into four sub-categories: companies & institutions, groups of individuals, other groups, and states (to cover terms such as “member state”, “non-contracting state” and so on).

Self-awareness is essential when searching for themes, reviewing themes, and defining and naming themes. There is a real risk of gathering terms to support a narrative that one may already have arrived at. For example, using entities as one level of analysis, it would have been possible to look only for evidence that there was a concentration on the individual or natural person, and that this could be seen in terms of disempowerment – that it was motivated by a desire to take power away from the individual. It could all have been based on a presumption that where large numbers of people suffer privacy harms, bringing individual lawsuits would be unrealistic, both because of the financial burden it would place on them and because it would be a hugely inefficient use of judges’ time, with the end result that the perpetrators of privacy harms would simply be able to get away with violating the law.

Thematic discourse analysis makes me realise I need to add an important factor to my theoretical framework & model

Developing a theoretical framework and model is an iterative process. It is all well and good to come up with a provisional framework and model, but these then need to be tested over and over again using a range of research methods.

The first methodology that I opted for was to undertake a thematic discourse analysis of three pieces of data protection legislation, and that has already made me realise that I need to make the issue of proportionality much more explicit.

Privacy is not an absolute right. That is quite clear, for example, from the many cases that set the competing interests of Article 8 of the ECHR alongside those of Article 10 (right to privacy and the right to freedom of expression).

Precisely because it isn’t an absolute right, an important dimension is that of necessity and proportionality.

The ECHR derives from the Council of Europe, and one of the three pieces of legislation I selected for the thematic discourse analysis was a CoE convention. But as two of the pieces of legislation I have chosen emanate from the European Union, rather than the CoE, I think it is also important to go back to the EU treaties, and in this instance to a consolidated version of the Treaty on European Union in force 26th October 2012 (http://eur-lex.europa.eu/legal-content/en/TXT/?uri=CELEX%3A12012M%2FTXT).

Article 5 of the Treaty says that:

“The use of Union competences is governed by the principles of subsidiarity and proportionality…

Under the principle of conferral, the Union shall act only within the limits of the competences conferred upon it by the Member States in the Treaties to attain the objectives set out therein. Competences not conferred upon the Union in the Treaties remain with the Member States…

Under the principle of proportionality, the content and form of Union action shall not exceed what is necessary to achieve the objectives of the Treaties…

The institutions of the Union shall apply the principle of proportionality as laid down in the Protocol on the application of the principles of subsidiarity and proportionality”.

The right to the protection of personal data is not an absolute right. It has to be balanced against other fundamental rights, in accordance with the principle of proportionality.

So, for example, the General Data Protection Regulation recognizes this, and makes clear that the processing of personal data must be:

  • necessary and proportionate measure in a democratic society (recital 19)
  • to the extent strictly necessary and proportionate for the purposes of ensuring network and information security (recital 49)
  • a necessary and proportionate measure in a democratic society to safeguard, in particular, important objectives of general public interest (recital 50)
  • necessary and proportionate in a democratic society to safeguard public security, including the protection of human life especially in response to natural or manmade disasters (recital 73)
  • each measure should be appropriate, necessary and proportionate in view of ensuring compliance with this Regulation, taking into account the circumstances of each individual case (recital 129)
  • Member States should implement a system which provides for effective, proportionate and dissuasive penalties (recital 152)
  • In any event, the fines imposed should be effective, proportionate and dissuasive (recital 152)
  • minimising the processing of personal data in pursuance of the proportionality and necessity principles (recital 156)
  • The Union or the Member State law shall meet an objective of public interest and be proportionate to the legitimate aim pursued (article 5)
  • proportionate to the aim pursued (article 9)
  • shall in each individual case be effective, proportionate and dissuasive (article 83 fines)
  • necessary and proportionate to reconcile the right of the protection of personal data with the obligation of secrecy (article 90)

 

Article 23 has a long list of circumstances where the data processing would need to be a necessary and proportionate measure in a democratic society to safeguard:

  • (a) national security;
  • (b) defence;
  • (c) public security;
  • (d) the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, including the safeguarding against and the prevention of threats to public security;
  • (e) other important objectives of general public interest of the Union or of a Member State, in particular an important economic or financial interest of the Union or of a Member State, including monetary, budgetary and taxation matters, public health and social security;
  • (f) the protection of judicial independence and judicial proceedings;
  • (g) the prevention, investigation, detection and prosecution of breaches of ethics for regulated professions;
  • (h) a monitoring, inspection or regulatory function connected, even occasionally, to the exercise of official authority in the cases referred to in points (a) to (e) and (g);
  • (i) the protection of the data subject or the rights and freedoms of others;
  • (j) the enforcement of civil law claims.

 

 

A few observations on “society” in data protection legislation

Term           CoE 1981 Convention   1995 Directive   2016 Regulation   Total
Humanitarian   0                     0                5                 5
Humanity       0                     0                1                 1
Social         0                     10               24                34
Society        1                     3                18                22
TOTALS         1                     13               48                62

The Regulation has 18 occurrences of the word “society”. Eight of these appear in the recitals text, eight appear in the articles text (one of them in an article title), and two appear as endnotes in the form of legislative references where the word “society” appears in the titles of the cited legislation (Directive 2000/31/EC and Directive 2015/1535).

Of the 16 occurrences appearing in the recital and article text:

  • Six occurrences refer to a “democratic society”, five of them in the context of a necessary and proportionate measure in a democratic society.
  • Nine occurrences refer to “information society service(s)”.
  • The other reference, in recital 53, is to the benefit of natural persons and society as a whole, in the context of the management of health or social care services and systems.

Analysis of “society” in the regulation:

  • Democratic society – recital text: 19, 50, 73, 153; article text: 6, 23
  • Information society service(s) – recital text: 21, 32; article text: 4, 8, 17, 21; article title: 8; footnotes: 8, 19
  • Information society – article text: 97
  • Legitimate expectations of society – recital text: 113
  • For the benefit of natural persons and society as a whole – recital text: 53

So: 1 mention in an article title, 8 mentions in recital text, 7 mentions in article text, and 2 mentions in footnotes.
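For illustration, here is a small Python sketch of how per-section counts of this kind could be reproduced, assuming (hypothetically) that the recital text and the article text have been saved to separate plain-text files:

```python
# A rough sketch of how per-section counts of "society" could be reproduced,
# assuming (hypothetically) that the recital text and the article text of the
# GDPR have been saved to separate plain-text files.
import re

def count_phrase(path: str, phrase: str) -> int:
    """Count case-insensitive occurrences of a phrase in a text file."""
    with open(path, encoding="utf-8") as fh:
        return len(re.findall(re.escape(phrase), fh.read(), flags=re.IGNORECASE))

for path in ("gdpr_recitals.txt", "gdpr_articles.txt"):      # hypothetical filenames
    for phrase in ("democratic society", "information society service", "society"):
        print(path, phrase, count_phrase(path, phrase))

# Note: the bare "society" count includes the longer phrases ("democratic
# society", "information society ..."), so overlapping matches have to be
# reconciled by hand, as in the breakdown above.
```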

All of the mentions of a democratic society in the Regulation are about overriding the interests of the individual for the sake of society as a whole:

  • in cases of money laundering, forensic laboratories (recital 19)
  • To safeguard the general public interest (recital 50)
  • To safeguard public security (recital 73)
  • The right to freedom of expression in every democratic society (e.g. for journalism) (recital 153)
  • To safeguard national security, defence, public security, prevention of criminal offences, etc (articles 6 and 23).
  • Recital 113 refers to the “legitimate expectations of society for an increase of knowledge”

Article 9 of the Convention refers to “a necessary measure in a democratic society in the interests of:

(a) protecting State security, public safety, the monetary interests of the State or the suppression of criminal offences;

(b) protecting the data subject or the rights and freedoms of others”

 

Directive

recital 14: “given the importance of the developments under way, in the framework of the information society”

recital 54: “Whereas with regard to all the processing undertaken in society, the amount posing such specific risks should be very limited”

Article 33: “taking account of developments in information technology and in the light of the state of progress in the information society.”

 

Five mentions of “Humanitarian” in the Regulation

  • Recital 46 When processing is necessary for humanitarian purposes
  • Recital 46 In situations of humanitarian emergencies
  • Recital 74 Social protection, public health and humanitarian purposes
  • Recital 112 Any transfer to an international humanitarian organisation
  • Recital 112 Complying with international humanitarian law

 

One mention of “humanity” in the Regulation

  • Recital 158 “Crimes against humanity”

 

There are 24 occurrences of “social” in the Regulation, in a variety of contexts:

  • Economic and social integration
  • Economic or social disadvantage
  • Efficiency of social services
  • Health or social care
  • Health or social care services
  • Health or social care system
  • Social conditions
  • Social life
  • Social networking
  • Social progress
  • Social protection
  • Social protection law
  • Social science
  • Social security matters

 

Ten mentions of “social” in the Directive, including in the following contexts:

  • Business and social partners
  • Economic and social activity
  • Economic and social integration
  • Economic and social progress
  • Economic, cultural or social identity
  • Health and social protection
  • Social security matters