A few observations on “society” in data protection legislation

                 1981 Convention   1995 Directive   2016 Regulation   Total
Humanitarian     0                 0                5                 5
Humanity         0                 0                1                 1
Social           0                 10               24               34
Society          1                 3                18               22
TOTALS           1                 13               48               62
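Counts like those in the table above can be produced mechanically. The sketch below, in Python, shows one way to tally whole-word, case-insensitive occurrences of each term; the text contents are placeholders, not the actual legislative texts.

```python
import re

def count_term(text, term):
    """Count whole-word, case-insensitive occurrences of a word or phrase."""
    pattern = r"\b" + re.escape(term) + r"\b"
    return len(re.findall(pattern, text, flags=re.IGNORECASE))

# Placeholder texts standing in for the three instruments.
texts = {
    "convention": "...",
    "directive": "...",
    "regulation": "...",
}
terms = ["humanitarian", "humanity", "social", "society"]

counts = {t: {name: count_term(body, t) for name, body in texts.items()}
          for t in terms}
```

Note that the word-boundary anchors mean "social" will not match inside "socially", which matters when comparing counts of closely related terms across the texts.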

The Regulation has 18 occurrences of the word “society”. Eight of these appear in the recitals text, eight appear in the articles text, and two appear as endnotes in the form of legislative references where the word “society” appears in the titles of the cited legislation (Directive 2000/31/EC and Directive 2015/1535).

Of the 16 occurrences appearing in the recital and article text:

  • Six occurrences refer to a “democratic society”; five of these do so in the context of a measure being necessary and proportionate in a democratic society.
  • Nine occurrences refer to “information society service(s)”.
  • The remaining occurrence appears in recital 53, “for the benefit of natural persons and society as a whole”, in the context of the management of health or social care services and systems.

Analysis of “society” in the regulation:

  • Democratic society: recital text 19, 50, 73, 153; article text 6, 23
  • Information society service(s): recital text 21, 32; article text 4, 8, 17, 21; article title 8; footnotes 8, 19
  • Information society: article text 97
  • Legitimate expectations of society: recital text 113
  • For the benefit of natural persons and society as a whole: recital text 53

So: 1 mention in an article title, 8 mentions in recital text, 7 mentions in article text, and 2 mentions in footnotes.

All of the mentions of a democratic society in the Regulation are about overriding the interests of the individual for the sake of society as a whole:

  • In cases of money laundering or the activities of forensic laboratories (recital 19)
  • To safeguard the general public interest (recital 50)
  • To safeguard public security (recital 73)
  • The right to freedom of expression in every democratic society (e.g. for journalism) (recital 153)
  • To safeguard national security, defence, public security, the prevention of criminal offences, etc. (articles 6 and 23)

In addition, recital 113 refers to the “legitimate expectations of society for an increase of knowledge”.

Article 9 of the Convention refers to “a necessary measure in a democratic society in the interests of:

a. protecting State security, public safety, the monetary interests of the State or the suppression of criminal offences;

b. protecting the data subject or the rights and freedoms of others”.



Mentions of “society” in the 1995 Directive include:

Recital 14: “given the importance of the developments under way, in the framework of the information society”

Recital 54: “Whereas with regard to all the processing undertaken in society, the amount posing such specific risks should be very limited”

Article 33: “taking account of developments in information technology and in the light of the state of progress in the information society.”


Five mentions of “humanitarian” in the Regulation:

  • Recital 46: when processing is necessary for humanitarian purposes
  • Recital 46: in situations of humanitarian emergencies
  • Recital 74: social protection, public health and humanitarian purposes
  • Recital 112: any transfer to an international humanitarian organisation
  • Recital 112: complying with international humanitarian law


One mention of “humanity” in the Regulation:

  • Recital 158: “crimes against humanity”


There are 24 occurrences of “social” in the Regulation, in a variety of contexts:

  • Economic and social integration
  • Economic or social disadvantage
  • Efficiency of social services
  • Health or social care
  • Health or social care services
  • Health or social care system
  • Social conditions
  • Social life
  • Social networking
  • Social progress
  • Social protection
  • Social protection law
  • Social science
  • Social security matters


Ten mentions of “social” in the Directive, including in the following contexts:

  • Business and social partners
  • Economic and social activity
  • Economic and social integration
  • Economic and social progress
  • Economic, cultural or social identity
  • Health and social protection
  • Social security matters

Thematic discourse analysis of data protection legislation

Background to why the legislation was passed

Information technology has been utilized by both government and business as a way of streamlining their day-to-day activities, and this included the growing use of automated data systems holding information about individuals. Consequently, legal scholars began to consider a legal framework to protect the privacy of individuals.

This led to the Fair Information Practice Principles (FIPP). Fair information practice was initially proposed and named by the US Secretary’s Advisory Committee on Automated Personal Data Systems in the 1973 report “Records, computers and the rights of citizens”.

As countries became more economically inter-dependent, there was a need for privacy to be considered at an international level. In the private sector, the rise of the multi-national company has meant that corporations need to transfer data across national borders, both in terms of data flows within their own organisation’s offices, which are often dispersed across multiple locations and countries, and in terms of working with suppliers, partners, and third parties.

In 1980, in an effort to create a comprehensive data protection system throughout Europe, the Organisation for Economic Co-operation and Development (OECD) issued its “Recommendation of the Council concerning Guidelines Governing the Protection of Privacy and Transborder Flows of Personal Data”.

Privacy experts lobbied European institutions to address the policy issues. The Council of Europe was the venue for initial international legislative efforts and led to the 1981 “Convention for the protection of individuals with regard to automatic processing of personal data”. The Convention takes account of the increasing flow across frontiers of personal data undergoing automatic processing, whilst simultaneously recognising the need to protect privacy within the wider context of recognising the need for respect for the rule of law as well as human rights and fundamental freedoms.

The European Commission realised that diverging data protection legislation amongst EU member states impeded the free flow of data within the EU and accordingly proposed the Data Protection Directive COM(1992) 422, which became directive 95/46/EC.

The GDPR (regulation 2016/679) will supersede the data protection directive on 25 May 2018. There were a number of drivers for replacing the 1995 directive, not least to address the exponential increase in personal data flows across the world which has been witnessed in the years since the directive was originally passed. These are attributable to a number of factors, including the rise of the world wide web, the ease with which data can be gathered on a large scale, and the decreasing costs of computer storage space. The GDPR addresses the export of personal data outside the EU, and its reach stretches beyond the member states of the European Union to companies based outside the Union but which process the personal data of EU citizens.

European Regulations work differently from European Directives, in that they have direct effect without the need for each member state to introduce its own implementing legislation. In theory it means that there will be a single set of rules for data protection applying to all EU member states.

In reality, the Regulation contains a series of derogations which means that there will still be a number of discrepancies between countries; and there could also be differences in the way in which the legislation is enforced.

The 1981 Council of Europe Convention, the 1995 data protection directive, and the 2016 General Data Protection Regulation were chosen as the source material for a thematic discourse analysis. One consideration was that the dates on which each piece of legislation was passed span a period of thirty-five years, allowing us to consider how things may have developed and changed over that period.

In addition to the three pieces of substantive legislation, the COM DOCS which set out the proposed legislation for the Directive and the Regulation were also analysed:

1992 [COM(92) 422 final]

2012 [COM(2012) 11 final]

Importance of words and phrases

Words and phrases can be pregnant with meaning in different contexts, and indeed the use of language isn’t always straightforward. There are a number of phenomena in language which require people to take special care with words. These include: polysemy, the coexistence of many possible meanings for a word or phrase; homonyms, words which have the same spelling and pronunciation but different meanings; homophones, words which have the same pronunciation but different spellings and meanings; and homographs, words that are spelt the same but have different pronunciations and meanings.

Overview of the texts

As a starting point, the textual content of each piece of legislation was analysed using Mladen Adamovic’s textual analyser (online-utility.org), and a brief summary of the key characteristics of each document is given below:

                                          1981      1995      2016      1992      2012
Number of characters (including spaces)   21276     79387     353577    205976    315011
Number of characters (without spaces)     17357     64802     289058    162368    252919
Number of words                           3503      12677     55199     32512     49176
Lexical density[1]                        18.8124   11.6589   4.9204    8.1662    6.1839
Number of syllables                       6222      22891     101740    57410     90073

The total number of words across all three pieces of legislation adds up to 71,379. They vary significantly in length: the GDPR accounts for 77% of the total word count, the directive for a further 18%, and the Convention, the shortest of the three, for 5%.

These figures need to be borne in mind when looking at the frequency of word occurrences across the texts.
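The footnote’s definition of lexical density can be sketched in a few lines of Python. This is an illustrative approximation only: the stop-word list below is a tiny invented sample, and online-utility.org’s own tokenisation and word lists will differ, so it won’t exactly reproduce the figures in the table.

```python
# Function (non-content) words; a small illustrative sample only.
FUNCTION_WORDS = {
    "the", "a", "an", "of", "to", "in", "and", "or", "for", "with",
    "on", "by", "as", "at", "is", "are", "be", "that", "this", "it",
}

def lexical_density(text):
    """Lexical (content) words divided by total words, as a percentage."""
    words = [w.lower() for w in text.split()]
    if not words:
        return 0.0
    # Strip surrounding punctuation before checking against the stop list.
    content = [w for w in words if w.strip(".,;:()\"'") not in FUNCTION_WORDS]
    return 100.0 * len(content) / len(words)
```

The point of the sketch is simply that lexical density is a ratio, so it can be compared across documents of very different lengths, unlike raw word counts.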

We then looked at the structure of the texts:

1981 Convention: 7 chapters; 27 articles

1995 Directive: 72 recitals; 34 articles; 9 sections; 7 chapters

2016 Regulation: 173 recitals; 99 articles; 12 sections; 11 chapters

EU directives and regulations are split into a series of recitals followed by articles. The articles constitute the substantive legislation, while the recitals effectively provide background, setting the scene and explaining the intentions behind the legislation. Where words or phrases appear in the articles, they are significant because they form part of the substantive legislation. We also felt it worthwhile to analyse where the words and phrases appear – whether in the text of an article or recital, within the overall title of the legislation, within a section or chapter heading, or within the title of an article. It is interesting to note that in the Directive, eight of the articles don’t have a title. The recitals in the directive and the regulation all appear without individual titles.

Our central research question is “What is the best way to understand the concept of information privacy?”; question 2 is “What is the most appropriate theoretical framework to investigate this concept?”; while question 3 is “What is the most useful model of information privacy, within this framework?”

In order to make sense of information privacy it is necessary to settle upon a level of analysis, and after considerable thought, we opted to view privacy through the prism of the individual, the group, and society as a whole.

The model we have developed is based around Luciano Floridi’s theory of “ontological friction”.

In order to test whether the level of analysis and the theoretical model are appropriate, the first research method chosen was that of thematic discourse analysis. (Braun, Clarke 2006)

Braun and Clarke (2006) identify the following six phases of thematic analysis:

  1. Familiarising yourself with the data
  2. Generating initial codes
  3. Searching for themes
  4. Reviewing themes
  5. Defining and naming themes
  6. Producing the report

We also took into account grounded theory (Charmaz 2014).

The aim of the discourse analysis was to examine the treatment of two broad concepts:

  • Firstly, entities. This embraces individuals, groups, institutions, and society as a whole; natural persons, entities with a legal personality, as well as entities without a legal personality which are not natural persons.
  • Secondly, the flow of information, and the factors that facilitate or obstruct that flow

Privacy is a multi-faceted and complex topic, and it quickly became apparent that it would be necessary to take account of a wider range of issues; and these link back to our theoretical framework.

Coding system

A unique coding system was developed for the research. Codes were assigned for the following attributes:

  • The piece of legislation
  • The position of the text
  • The group or category within which the text falls and finally
  • The text itself

Legislation codes

CV          Convention

DIR         Directive

REG        Regulation


Codes for where the text appears

T              Title

CT           Chapter title

AT           Article title

REC        Recital text

ART        Article text

ST           Section title


Level of analysis

There is an emphasis on the individual in the legislation:

  • the Convention is called the “Convention for the Protection of Individuals….”
  • the Directive is called a directive “on the protection of individuals with regard to the processing of personal data …”
  • while the GDPR is called a Regulation “on the protection of natural persons with regard to the processing of personal data …”

All three refer in their overall title either to an individual or to a “natural person”. None of them have within the titles any mention of other levels of analysis.

Flow of information

The international nature of the legislation feeds into the emphasis on the flow of personal data across national boundaries. Words and phrases used include “border”, “cross-border”, and “transborder”; and the legislation doesn’t look exclusively at the flow of data within their respective organisations, the Council of Europe, or the European Union. The legislation looks both inwards and outwards: for the European Directive and the Regulation, reference is made to the “internal market”, but also to the transfer of data to “third countries”.

Some people think of cross-border and trans-border as implying relationships between countries sharing a common border.

Some discrepancies between where words do and don’t appear are merely down to the use of slightly different terminology.

In other cases, it is noticeable that the concept conveyed by the words appears in only one piece of legislation, even taking account of synonyms. For example, it is only in the GDPR that we find mentions of “child”, “children”, and “child’s”.


[1] Lexical density = the number of lexical words (content words) divided by the total number of words


Some findings from discourse analysis


The GDPR has the following words and phrases which don’t appear in the Directive or the Convention:

Certification (69 times)

Binding corporate rules (25 times)

Without undue delay (15 times)

Restriction of processing (13 times)

Facilitate (11 times)


The Directive has the following words and phrases which don’t appear in the Convention or the GDPR:

Simplification (6 times)

National law applicable (5 times)


The Convention has the following words and phrases which don’t appear in the Directive or the GDPR:

Transborder (3 times)

Resident abroad (3 times)


Comparing the 2012 COM DOC with the GDPR (i.e. the original draft versus the final version), the following differences are noticeable:

                                  2012 COM DOC    2016 GDPR
European Data Protection Board    89              5
Co-operation                      40              0
Subject to                        38              70
Empowered                         27              4
Necessary for the                 22              33
Codes of conduct                  15              21
Certification                     13              69
Restriction                       3               19

Can librarians guarantee to protect user privacy? (Text of my presentation for Internet Librarian International 2017)

Our personal data flows back and forth continually. It is a by-product of us going about our daily lives. That’s true of library usage, just as it is of other areas of life. Whether we are reading ebooks, consulting an electronic newspaper, or placing a request for an item currently being borrowed by another library user, our personal data is flowing back and forth. The key question is whether those data movements can be controlled, and if so, how.

There has been a shift over the last four decades towards delivery of library services electronically, using integrated library management systems, ebook platforms, RFID technology, self‐service issue systems, online databases, and discovery services. Many libraries utilize cloud computing.

Another dimension is the way in which users access library services from home and elsewhere using their own devices; rather than doing so solely on equipment provided by and located within a bricks and mortar library.

If, for example, a library user accesses an ebook from home, their personal data is processed by the library; by the e-book vendor; and by the e-reader software company. The e-book vendor may in turn use third party cookies, and whilst they may claim that these cookies don’t contain any personally identifiable information and can only be used to identify machines rather than individuals, the reality is that device fingerprinting can be used to fully or partially identify individual users. The privacy model of Conger, Pratt et al. (2013) has:

  • 1st parties (consumers/individuals)
  • 2nd parties (vendors)
  • 3rd parties (legal data-sharing partners)
  • 4th parties (illegal entities)

As libraries have relied more heavily on digital services, the challenge for librarians of being able to protect patron privacy has grown exponentially because of the complex ecosystem which has developed involving libraries, vendors, and third parties.

It is imperative that user privacy is extended beyond interactions with physical libraries, and this may require extensive programming and cyber security expertise.

The Firefox add-on Lightbeam is a useful tool to visualise precisely which sites have had access to your information, both in terms of the sites you have visited and the third parties who have also had access to your information. Another useful tool is the Ghostery add-on, which lists the trackers in use on a website, broken down by category (trackers for social media, for advertising, for analytics, for customer or user interaction, etc.), and gives you the option to block the ones you want to exclude.

“Librarians defend and protect reader privacy in recognition of the strong connection between the freedom to read and the right to privacy. The right to read freely depends upon the knowledge that what one is reading is not monitored or tracked. Protecting reader privacy ensures that library users can pursue any inquiry or read any book without fear of judgment or punishment” (Caldwell-Stone 2012).

Privacy is one of the most commonly featured values in the codes of ethics of library associations around the world. Indeed, Lamdan (2015) says that librarianship is one of the only professions that explicitly expresses privacy rights in its codes of ethics.

Carrie Gardner, who was library services coordinator at a school in Hershey, Pennsylvania, is quoted by Adams (2002) as saying “Often, if people do not think their information requests and information-gathering activities are going to be kept private, they won’t ask for the information. They would rather suffer the consequences of not knowing” – a chilling effect.

In 2000, Gorman published a book in which he lists the values that characterise and shape the work of librarians (Gorman 2000):

Gorman’s eight core values

  1. Stewardship
  2. Service
  3. Intellectual freedom
  4. Rationalism
  5. Literacy & learning
  6. Equity of access
  7. Privacy (ensuring the confidentiality of records of library use; overcoming technological invasions of library use)
  8. Democracy

Gorman says “Even in many democratic countries, the twin threats of an empowered surveillance state and a big technology assault on privacy make the defense of intellectual freedom harder than it was in previous generations” (Gorman 2015)

I am currently studying for a PhD and my research is from a philosophical grounding, primarily using Luciano Floridi’s concept of ontological friction, to understand the flow of personal data within the infosphere.

Given some amount of personal information available in a region of the infosphere[1], the lower the informational friction in that region, the higher the accessibility of personal information about the agents embedded in that region, the smaller the information gap among them, and the lower the level of privacy that can be expected (Floridi 2014).

I have put together an initial list, where I have identified a number of different types of “friction” which can affect the ease with which our data flows back and forth.





Among the friction types identified are Training & Awareness and Information behaviour; others appear in the scenarios below, including Regulatory, Technological, Spatial, Sensory, Temporal, and Obscurity.




Privacy is not an easy concept to define because there are so many different aspects to it. Nevertheless, Koops, Newell et al. (2017) have produced a typology consisting of eight types of privacy. In addition, a ninth type – informational privacy – overlays all of the other eight types.


Privacy Types

Bodily privacy: direct and indirect invasions of bodily integrity
Spatial privacy: privacy expectations in and around one’s home (and possibly also the workplace)
Communicational privacy: violated by, for example, intercepting personal communications, eavesdropping, or accessing stored communications without consent
Proprietary privacy: reputational and image management
Intellectual privacy: privacy of thought and mind, and the development of opinions and beliefs
Decisional privacy: the freedom to make decisions about one’s body and family; involves matters such as contraception, procreation, abortion and child rearing; freedom from interference in one’s personal choices, plans and decisions
Associational privacy: freedom to connect with whomever or with whichever group one chooses without being monitored
Behavioural privacy: activities that happen in both public and private places, encompassing sensitive issues
Informational privacy: encompassing information/data/facts about persons or their communications

The types of privacy are drawn from the (Koops, Newell et al. 2017) typology of privacy.


I have built up a database of library privacy scenarios, and have used this to illustrate how personal data flows can be adjusted using the various types of friction.

One type of friction is “obscurity”. That heading also incorporates “practical obscurity” and “obfuscation”.

Obscurity is the idea that when information is hard to obtain or understand, it is, to some degree, safe. Safety, here, doesn’t mean inaccessible: competent and determined data hunters armed with the right tools can always find a way to get it. For less committed folks, however, the effort required acts as a deterrent.


Online obscurity:

Search visibility – use of robots.txt being one example

Unprotected access – not using access controls such as passwords, biometrics, encryption, privacy settings

Identification – ability to use pseudonyms (the “nym” wars), anonymisation

Clarity – the data doesn’t make sense because it is intentionally vague or incomplete

Right to be forgotten
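As an aside on the “identification” item above, pseudonymisation can be sketched in a few lines of Python. This is an illustrative sketch, not a description of any particular library system: patron identifiers are replaced with keyed hashes, so borrowing records remain linkable for circulation purposes without exposing the raw identifier. Because anyone holding the key can re-identify patrons, this is pseudonymisation rather than true anonymisation.

```python
import hashlib
import hmac
import secrets

# The key must be stored and managed separately from the records;
# whoever holds it can reverse the mapping by re-hashing candidate IDs.
SECRET_KEY = secrets.token_bytes(32)

def pseudonymise(patron_id: str) -> str:
    """Replace a patron identifier with a stable keyed hash."""
    digest = hmac.new(SECRET_KEY, patron_id.encode("utf-8"),
                      hashlib.sha256).hexdigest()
    return digest[:16]  # truncated for readability
```

The same input always yields the same pseudonym (so records can still be joined), while different patrons get unrelated pseudonyms.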


Practical obscurity

Difficulty of collecting the data

Available only in the physical library v the digital library

Difficulty of correlating or aggregating the information

Information requires burdensome or unrealistic effort to obtain


Obfuscation (the deliberate use of ambiguous, confusing or misleading information to interfere with surveillance and data collection projects)



Scenario 1

Police and FBI agents appeared at the Newton Free Library in January 2006 after learning that a terrorist threat sent to nearby Brandeis University had been generated from a library computer. The library director, Kathy Glick-Weil, refused to hand over the computer without a warrant, and was backed by the city mayor. A warrant was eventually produced (Library Journal staff 2006).

Privacy Types: Informational Privacy; Intellectual Privacy

Friction Types: Regulatory (ALA Code of Ethics; the state law covering privacy of library records: Mass. Ann. Laws ch. 78, §  7)


Some people might portray the librarian’s actions as being obstructive; whilst others might portray them as the actions of someone wanting to obey the law and follow the library’s guidelines.


Scenario 2

A library patron leaves a printout of a highly personal email at the public printers at closing time.

Privacy Types: Informational Privacy; Communicational Privacy

Friction Types: Technological; Training & Awareness

Comment: It is possible to ensure that printouts are only released once someone has swiped their card at the printer/copier, thereby minimising the risk of someone else accidentally picking up the printout. There are times when people almost need saving from themselves, where it is the data subject who is responsible for a data breach relating to their own personal information.


Scenario 3

“Public libraries have focused on price negotiations in light of a certain sense of genuine desperation about being able to offer anything to patrons; other terms and conditions, such as privacy protections, have generally received much lower priority” (Lynch 2017)

Privacy types: Informational Privacy; Intellectual Privacy (Depending on the material being read other privacy types may be relevant such as Decisional Privacy or Bodily privacy)

Friction Types: Regulatory; Technological

Comment: It is understandable for librarians to want to make as much digital content as possible available to their users. But this should not be at the expense of their users’ privacy. As Dixon (2008, p. 156) observes, if libraries only chose vendors who had good privacy policies, the industry would have to change its standards in order to obtain library business.

Scenario 4

Indiscreet reference interviews (Voice level, private space)

Privacy Types: Informational privacy; Bodily privacy; Spatial privacy

Friction Types: Spatial; Sensory

Scenario 5

Snooping devices (keystroke loggers) were found in Cheshire library computers http://www.bbc.co.uk/news/uk-england-manchester-12396799

Privacy Types:  Informational privacy; Intellectual privacy; Communicational privacy (and, depending on the material captured by the keystroke loggers, it could involve other types of privacy such as Decisional Privacy)

Friction Types: Technological

Scenario 6

A library user wants to borrow a few items which are listed in the Books on Prescription scheme. They opt not to use the volunteer-run library nearest their home, but instead go to the central library managed by paid staff. The Books on Prescription scheme prescribes books on mental health conditions, and the library user is nervous about whether they can trust volunteers to keep to themselves the fact that they want to borrow books with titles such as “Overcoming low self-esteem”, “Break free from OCD” or “Overcoming binge eating”.

Privacy Types: Informational privacy; Intellectual privacy; Bodily privacy

Friction Types: Information Behaviour; Training & Awareness; Regulatory (ethics)

Scenario 7

In Virginia a husband requested circulation records of his wife to prove she had been “exploring avenues of divorce” before he filed the papers

Privacy Types: Informational privacy, Decisional privacy, Intellectual privacy

Friction Types:

  • Regulatory: The state laws of Virginia (http://www.ala.org/advocacy/privacy/statelaws); the library guidelines & procedures, the ALA code of ethics
  • Temporal: Keeping the information for the minimum period required
  • Obscurity: Anonymising the borrowing records as soon as items are returned to the library
  • Training & awareness: Ensuring that all library staff are fully cognisant of the library’s policies and procedures and their obligations under the law


ADAMS, H.R., 2002. Privacy and confidentiality: now more than ever, youngsters need to keep their library use under wraps. American Libraries, 33(10), pp. 44-48.

CALDWELL-STONE, D., 2012. A digital dilemma: ebooks and users’ rights. American Libraries.

CONGER, S., PRATT, J.H. and LOCH, K.D., 2013. Personal information privacy and emerging technologies. Information Systems Journal, 23(5), pp. 401-417.

FLORIDI, L., 2014. The 4th revolution: how the infosphere is reshaping human reality. Oxford, United Kingdom: Oxford University Press.

GORMAN, M., 2015. Our enduring values revisited: librarianship in an ever-changing world. Chicago: ALA Editions, an imprint of the American Library Association.

GORMAN, M., 2000. Our enduring values: librarianship in the 21st century. Chicago; London: American Library Association.

LAMDAN, S., 2015. Librarians as feisty advocates for privacy. CUNY Academic Works.

LYNCH, C., 2017. The rise of reading analytics and the emerging calculus of reader privacy in the digital world. First Monday, 22(4).


[1] We are moving from living in the biosphere to the infosphere. Information is our environment. The infosphere is a newly-created digital space built by new technology (Floridi 2014). It includes agents and objects, services, relations and processes, as well as the space within which they interact. It should not be confused with cyberspace, because it encompasses online as well as offline and analogue domains.

Raw data from discourse analysis

A quick analysis of the 1981 Council of Europe Convention, the Data Protection Directive and the GDPR – a few snippets from the raw data (I have been building up much more granular data, by word/phrase, by where they appear, and by which legislation they appear in):

                                          1981      1995      2016
Number of characters (including spaces)   21276     79387     353577
Number of characters (without spaces)     17357     64802     289058
Number of words                           3503      12677     55199
Lexical density                           18.8124   11.6589   4.9204
Number of syllables                       6222      22891     101740

Used: online-utility.org founded by Mladen Adamovic
Lexical density = the number of lexical words (content words) divided by the total number of words

1981 Convention: 7 chapters, 27 articles
1995 Directive: 72 recitals, 34 articles, 9 sections, 7 chapters (8 articles have no heading)
2016 Regulation: 173 recitals, 99 articles, 11 chapters, 12 sections (1-5 x 2 plus 1-2)

Mentions of “entities” within titles, chapter headings, section headings, and article titles:

There are 74 mentions of entities altogether across these headings.

In which legislation the references appear
14% in the Council of Europe Convention
19% in the Data Protection Directive
68% in the GDPR
(the figures do not sum to exactly 100% because each share is rounded to a whole number)
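The rounding caveat can be illustrated with a short sketch. The raw counts below are hypothetical, chosen only so that the whole-number shares mirror the 14%/19%/68% split reported above.

```python
# Hypothetical raw counts (not taken from the notes above), chosen so
# the rounded shares come out at 14/19/68 like the split reported.
counts = {"CoE Convention": 10, "DP Directive": 14, "GDPR": 50}
total = sum(counts.values())  # 74

# Round each share to a whole percentage independently...
shares = {k: round(100 * v / total) for k, v in counts.items()}

# ...and the rounded shares sum to 101, not 100.
print(shares, sum(shares.values()))
```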

Where the mentions appear (by heading type)
73% appear in Article titles
14% in Chapter titles
4% in Section titles
9% in overall document titles

Types of entities referred to
45% are natural persons
39% are procedural or governance entities
8% are parties to the agreements
5% are states
1% are society
1% are “other groups”


The visual message we can or do portray about privacy to our users

One of the great things about being a university student is that it really gets you thinking. I have just started to attend some lectures on research methodologies, and this morning instead of a lecture we had a visit to the Museum of London to think about analysing visuals. We were there to see the exhibition “The city is ours”.

We were asked to consider a number of questions, but in doing so it really made me think about applying to libraries the things we considered, and here I am thinking in terms of privacy.

If we as librarians think about the importance of privacy, what do we do to put into practice our ethical values with regard to privacy and confidentiality?

Do we think carefully enough about the message we want to convey, and how best to get that message across?

What, if anything, would there be for people to see that relates to privacy if they were to visit our libraries?

          Are there cameras in operation, and if so, how prominent are they?

          If cameras are in use, is there any supporting material in terms of notices or visual cues as to the presence of cameras, and what to do if you want to request footage of yourself under the Data Protection Act?

          Are copies of the institution’s privacy policy on display anywhere?

          What about the computer terminals: do they have separator screens or partitions?

          Do the computers have privacy screens on them, so it’s harder for passers-by to see what you are looking up? (It could be on a controversial topic, a particular religious or political viewpoint, or about something the user wouldn’t want others to see, such as stress or divorce.)

No doubt there are plenty more questions one could ask about current practices.

But what about the potential that exists to use all things visual to convey a confident message: that we in this library have thought about privacy; that we want to protect you; and that we want to help you to protect yourselves, in terms of safe internet searching and the use of tools to protect privacy?

It just made me think that this is one huge area we may well be overlooking – surely if we embrace privacy as an issue, and demonstrate that we really have thought carefully about it, our users will come to us to ask for help and advice, for training etc.

What about the use of some sort of symbols, icons, or signs to mark up resources that are particularly safe to use?

There’s an interesting article by Lilian Edwards on the use of icons http://www.create.ac.uk/publications/the-use-of-privacy-icons-and-standard-contract-terms-for-generating-consumer-trust-and-confidence-in-digital-services/

What about the library website? Is there a privacy policy on there? How easily accessible and prominent is it? And is it in plain English?

BIC has a poster template for libraries to use, where the message it contains says “RFID is in use on these premises. This may, in certain circumstances, constitute a risk to your privacy. We take this risk very seriously” http://www.bic.org.uk/161/RFID-Privacy-in-Libraries/

I know that I really need to think about this whole area a lot more. I am looking at the ontological frictions that affect the speed with which data flows back and forth, and sensory frictions are one area I had already identified. But this morning’s museum visit has really made me think about the visual, the sense of seeing, and how we may not have thought enough about this, so that we may not yet be making the most of the opportunities available to us.


Thematic discourse analysis of DP legislation

Choosing to do a thematic discourse analysis of three pieces of legislation, I keep thinking of new ways to analyse the data. Today, for example, I analysed words referring to “entities” (individuals, groups, institutions, and society), looking for where these occur across all three pieces of legislation, and then broke them down by whether they refer to individuals, groups/institutions, or society.

Then another thing I have been looking at is whether the words appear in:
The overall title
A chapter title
A section title
An article title
A recital title
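The tallying described above can be sketched as a count over (heading level, entity term) pairs. The heading samples and the term list here are illustrative stand-ins; the real analysis works over the full texts of the three instruments.

```python
from collections import Counter

# Illustrative entity terms and headings -- not the full lists used
# in the actual analysis.
ENTITY_TERMS = ["individual", "natural person", "controller", "society"]

headings = [
    ("article title", "Rights of the data subject and the individual"),
    ("chapter title", "Obligations of the controller"),
    ("overall title", "Protection of natural persons"),
]

# Count each (heading level, entity term) co-occurrence.
tally = Counter()
for level, text in headings:
    for term in ENTITY_TERMS:
        if term in text.lower():
            tally[(level, term)] += 1

for (level, term), n in sorted(tally.items()):
    print(f"{level} | {term} | {n}")
```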

Some words or phrases are unique to one of the three pieces of legislation, but this could be because of the nature of the organisation. For example, only the Council of Europe Convention mentions a “Secretary General”; the Data Protection Directive and the GDPR do not.

Then some words or phrases may not appear across all three legislative instruments in the same way. All three pieces of legislation are really focussed on the individual, but they do not all have the word “individual” in their overall title: while the Council of Europe Convention 1981 and the Data Protection Directive 1995 refer to “individuals” in their overall titles, the GDPR’s official title refers instead to “natural persons”.