Training & Awareness as a form of ontological friction

Background to ontological friction

Informational or ontological friction refers to the forces that oppose the flow of information within a region of the infosphere. It is connected with the amount of effort required for some agent to obtain, filter, or block information about other agents in a given environment; agents can alter that effort by decreasing, shaping or increasing the informational friction. Given some amount of personal information available in a region of the infosphere, the lower the informational friction in that region, the higher the accessibility of personal information about the agents embedded in that region, the smaller the information gap among them, and the lower the level of privacy that can be expected. Put simply, privacy is a function of the informational friction in the infosphere (Floridi 2014).
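Floridi does not formalise this relationship, but purely as an illustrative shorthand (the symbols are this author's, not Floridi's) the claims above can be written as a pair of monotonic relationships:

```latex
% Illustrative notation only:
%   P = expected privacy, A = accessibility of personal information,
%   F = informational friction in a region of the infosphere.
\[
  P = f(F), \quad \frac{\mathrm{d}P}{\mathrm{d}F} > 0,
  \qquad
  A = g(F), \quad \frac{\mathrm{d}A}{\mathrm{d}F} < 0
\]
```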

 

Role training & awareness can play

"…teaching patrons how to use the internet, but not how to use it safely is like showing someone how to drive a car, but not where the seatbelt is" (Beckstrom 2015 p31).

 

Privacy training & awareness can empower individuals to protect themselves with regard to the collection and use of personal data about them. Training & awareness initiatives can cover areas such as knowledge of the available technology tools that can help thwart government and corporate surveillance; "Know your rights" training detailing privacy laws; ensuring that all library staff are familiar with the library's own policies, procedures & guidelines relating to privacy & confidentiality issues; or the sharing of best practice amongst libraries in terms of addressing patron privacy matters. As (Gebhart 2017) says, "the more we share information and best practices, the more we can each fine-tune the ways we protect ourselves and each other".

In one year, 2.3 million people attended digital literacy courses in libraries across the EU (http://www.publiclibraries2020.eu/content/see-numbers).

 

Effectiveness of training & awareness as a form of ontological friction

Thinking further about training & awareness as a form of ontological friction, the effectiveness of any training & awareness initiatives will depend upon a number of factors:

  1. Availability
  2. Evaluation & Impact
  3. Reach
  4. Content
  5. Practical v theoretical
  6. Who delivers the training
  7. Training facilities
  8. Clarity over librarians’ role

 

  1. Availability

How frequently is training offered?

Is there any "on demand" support, such as recordings of webinars or online support materials?

Is ongoing employee training and awareness of privacy and/or security issues, practices and policies provided?

Is refresher training offered?

 

  2. Evaluation & impact

Are all training session attendees offered an opportunity to provide feedback?

Is there a facility for attendees to provide feedback after the event? If they don't have time on the day of the training, is there, for example, an opportunity to provide feedback online?

Does the feedback result in any modifications to the design of the training sessions?

Did the training achieve what it set out to achieve?

Were the most appropriate training methods used?

Did the training reach the people it was intended for?

 

  3. Reach

Is training provided for all library staff, regardless of their roles?

Is training provided for library volunteers?

Is the training reaching those users who most need it?

Are all categories of users able to tap into the training & awareness initiatives? For example, what about the housebound?

(Hasselbach, Tranberg 2016) say that "We are in the process of creating a new digital divide, where those who can afford it have privacy and a private life, while the economically vulnerable groups in society do not". (Richards, Hartzog 2017 p21), meanwhile, say that "it is precisely the weak and vulnerable who need help from other people, organizations, and technologies in defending themselves."

 

  4. Content

What does the training cover:

Threat modelling

Password management

Cookies, Web Beacons, Device Fingerprinting

Anonymous browsing

Digital footprint

Know your rights

The library’s policies, procedures & guidelines

Internet safety

 

  5. Practical v theoretical

What is the nature or format of the training? Is there sufficient "hands on" training?

 

  6. Who delivers the training

Who conducts the training:

Is it an external company? Clearly, external companies are not going to be familiar with the institution’s own policies, procedures, and guidelines.

Is it a company that has its own commercial interests to consider?

  • Barclays Digital Eagles
  • Google Digital Garage
  • Halifax
  • BT

“Private sector partnerships are one way forward when public funding is in short supply. Libraries have worked with Barclays and the Halifax (digital volunteers) and BT (wi-fi). Google has set up Digital Garages aimed at businesses in larger libraries. Though ostensibly “free”, such initiatives are, at least in part, commercially driven. Libraries need to be aware, if not wary, of that” (Khan 2016 p45).

 

Where commercial companies have been brought in, have the libraries involved sought any assurances regarding privacy of library users?

 

  7. Training facilities

Are the training facilities adequate?

Are they conducive to providing the required training?

Do they offer attendees/delegates ample opportunity for sufficient hands-on practice?

 

  8. Clarity over the librarians' role

Do library staff recognise that they have a key role to play in providing training & awareness to their users, or do they believe that it is someone else’s responsibility?

Do they "Bring passion for privacy (and teaching about privacy) to the community"? (Ayala 2017 slide 46)

 

Library examples

Examples of training & awareness initiatives, or scenarios where training could potentially have made a difference:

  • Library uses posters to draw attention to the risks of using public computers and wi-fi. Example: "Protect your privacy while using public computers & wi-fi" https://chooseprivacyweek.org/wp-content/uploads/2013/05/CPWPrivacyPublicComputingTips.pdf
  • Printouts of reports from the library management system were left lying around unattended
  • Head of a library service says “We don’t specifically train staff to address privacy issues in any detail”
  • "Before libraries can act ethically with regard to social networking sites, they must first have a nuanced understanding of the potential consequences of these sites.… Yet libraries are also committed to outreach and social networking sites provide a forum where libraries can create an online presence and spread awareness about their services." (Fernandez 2009)
  • Patrons walking away from computer screens that displayed their banking records or credit card information
  • Library organises a cryptoparty
  • Develop a forum for discussion of privacy issues, sharing best practice etc
  • Create an area on library website dedicated to privacy issues (useful example is sjpl.org/privacy)
  • Include privacy within any digital literacy training offered to users

 

General issues

Awareness as empowerment

Knowledge of library procedures

Librarians' & users' technical knowledge of how to protect privacy

Postcode lottery? Whilst the Department for Culture, Media and Sport has responsibility for library policy, the services are delivered by individual local authorities. In addition, the significant increase in the number of volunteers working in libraries and the increasing number of community-managed libraries raise the question of whether there is any sense of consistency across the country.

 

REFERENCES

AYALA, D., 2017. Privacy is the future: the library's role as educator, defender and enforcer. ALA Annual 2017, 24 June 2017 (webinar).

BECKSTROM, M., 2015. Protecting patron privacy: safe practices for library computers. Libraries Unlimited.

FERNANDEZ, P., 2009. Online social networking sites and privacy: revisiting ethical considerations for a new generation of technology. Library Philosophy and Practice.

FLORIDI, L., 2014. The 4th revolution: how the infosphere is reshaping human reality. Oxford, United Kingdom: Oxford University Press.

GEBHART, G., 2017. For data privacy day, play privacy as a team sport. Deeplinks blog, (January 27).

HASSELBACH, G. and TRANBERG, P., 2016. Privacy is creating a new digital divide between the rich and poor. Daily Dot, (October 23).

KHAN, A., 2016. The future of libraries in the digital age. CILIP Update, (December).

RICHARDS, N. and HARTZOG, W., 2017. Privacy's trust gap. Yale Law Journal, (17-02).

 

 

A definition of privacy in 2,200 words

Privacy is a slippery subject that eludes straightforward definition. I have spent four or five months now trying to come up with a definition of privacy. It is an iterative process: each time I return to it, I weave in further thoughts. Here is how my attempts stand at the moment.

Before thinking about a definition of the term "privacy" it is necessary to make a distinction between "data protection" and "privacy". These terms are sometimes used interchangeably in the literature, but that is misleading because it overlooks an important distinction between the two concepts: data protection is not the same as privacy. The right to data protection and the right to be forgotten are procedural rights, whereas the right to identity and the right to privacy are substantive rights.

Procedural rights cover the rules, methods and conditions through which the substantive rights are enforced. Procedural requirements include things like transparency, accessibility and proportionality.

Privacy is commonly defined in terms of control and freedom. But the present researcher would add that privacy is also closely related to the concept of human dignity (Bloustein 1964). (Greene 2014) goes further and says that “privacy is said to be intimately related … to a host of other values, including freedom, intimacy, autonomy, integrity, respect, dignity, trust, and identity.”

(Westin 1967) defines privacy as “the claim of individuals, groups, or institutions to determine for themselves when, how and to what extent information about them is communicated to others”.  He says that privacy is “the ability of the individual to control the terms under which personal information is acquired and used”. This can be taken as a definition of informational privacy.

“Informational privacy in a normative sense refers typically to a non-absolute moral right of persons to have direct or indirect control over access to (1) information about oneself, (2) situations in which others could acquire information about oneself, and (3) technology that can be used to generate, process or disseminate information about oneself” (van den Hoven, Blaauw et al. 2014).

(Floridi 2014a p102) says that it is common to distinguish four kinds of privacy: physical privacy, mental privacy, decisional privacy, and informational privacy. For Floridi, informational privacy is the protection of personal identity; and that informational privacy is of primary importance.

We take privacy to be both a right of "freedom from" (well expressed by Warren and Brandeis (1890) as "the right to be let alone") and a right of "freedom to" (self-development).

(Curry 1997) describes Warren & Brandeis' definition of privacy as the "right to be let alone" as "one of the earliest and most succinctly annunciated definitions of privacy". This widely quoted definition frames privacy as a right. Brandeis's dissenting opinion in Olmstead v. U.S., 277 U.S. 438 (1928) expands on this, referring to privacy as "the most comprehensive of rights, and the right most valued by civilized men".

Whilst a short & succinct definition of privacy may be doomed to failure (because it would be too restrictive and would be likely to get out of date very quickly), attempts to map the ground covered by the concept of privacy have made significant progress.

(Koops, Newell et al. 2017) have developed a typology of privacy in which they categorise eight different types of privacy, across all of which they overlay informational privacy. They divide the eight categories between privacy types that are "Freedom from" (the right to be let alone): bodily privacy, spatial privacy, communicational privacy, and proprietary privacy; and privacy types that are "Freedom to" (self-development): intellectual privacy, decisional privacy, associational privacy and behavioural privacy.

One omission from their list is privacy of counsel: the right to private consultation with counsel, or the right not to have one's legally privileged material subjected to surveillance.

A number of commentators acknowledge the difficulty of defining privacy, and the lack of consensus on a definition:

  • (Greene 2014) says that privacy is “a multifaceted concept which eludes easy definition”.
  • (Greenland 2013) says that “there is no single agreed definition of privacy in the literature”, recognising that this can be a strength: “a range of conceptions of privacy is valuable because it encompasses several issues”.
  • (Taylor, Floridi et al. 2017) acknowledge that “the concept of privacy is notoriously difficult to define and has varied and sometimes conflicting interpretations”.

(Gutwirth 2002) says “The notion of privacy remains out of the grasp of every academic chasing it. Even when it is cornered by such additional modifiers as “our” privacy, it still finds a way to remain elusive.”

Coming up with a definition of the term "privacy" is incredibly difficult, especially if one hopes to find an authoritative definition which will stand the test of time. Indeed, (Hartzog, Selinger 2013) consider it a Sisyphean task. Commenting on the concept of obscurity within the context of privacy, they say that "Discussion of obscurity in the case law remains sparse. Consequently, the concept remains under-theorized as courts continue their seemingly Sisyphean struggle with finding meaning in the concept of privacy" (Hartzog, Selinger 2013).

Scholars argue that the privacy concept is nebulous, and that attempts at definitional clarification have produced no universally accepted definition (Bennett, 1992; Flaherty, 1989).

As the influential privacy scholar Alan Westin put it, privacy as a notion is “part philosophy, some semantics and much pure passion” (Westin 1967).

 

It is hardly surprising, therefore, that some commentators make a conscious decision to avoid even attempting a definition of the word, let alone a definitive description of what privacy is. It is, nevertheless, imperative that people do try to come up with a definition, for a number of reasons. Two in particular are especially worthy of consideration:

  • Privacy is widely recognized as an important good that deserves protection. But what cannot be described, defined and understood cannot be defended or regulated. Hence, how the meaning of privacy is constructed in the first place has far-reaching implications for how privacy boundaries and behaviors are negotiated and potentially re-settled in light of a changing world. Arguably then, the understanding of the meaning content of the category of privacy holds the key to the future of privacy in the digital world (Bajpai, Weber 2017).
  • (Cannataci 2016b) believes that the existence and usefulness of an international legal framework for privacy is seriously handicapped by the lack of a universally agreed and accepted definition of privacy, since without one we do not have a clear understanding of what we have agreed to protect: "While the concept of privacy is known in all human societies and cultures at all stages of development and throughout all of the known history of humankind it has to be pointed out that there is no binding and universally accepted definition of privacy"

Even if there were a universally agreed and accepted definition of privacy, another handicap is what Cannataci refers to as TPET: the Time, Place, Economy and Technology dimensions. The passage of time and the impact of technology, taken together with the different rates of economic development and technology deployment in different geographical locations, mean that legal principles established fifty years ago (the ICCPR), or even thirty-five years ago (e.g. the European Convention on Data Protection), let alone seventy years ago (the UDHR), may need to be re-visited, further developed and possibly supplemented and complemented to make them more relevant and useful to the realities of 2016. (Cannataci 2016b)

Agre says that “One constant across this history is the notorious difficulty of defining the concept of privacy. The lack of satisfactory definitions has obstructed public debate by making it hard to support detailed policy prescriptions with logical arguments from accepted moral premises. Attempts to ground privacy rights in first principles have foundered, suggesting their inherent complexity as social goods” (Rotenberg, Agre 1998).

(Greene 2014) says that “privacy applies to a curious mix of disparate acts, events, things, states of mind, and information. We speak of privacy with regard to our body parts, personal papers, important life decisions, financial status, homes, genetic inheritance, past actions, and our physical selves even when out in public, to name just a few examples”.

(Finn, Wright et al. 2013) say that “privacy” has proved notoriously difficult to define.

One dictionary defines privacy as “the right to be free from unwarranted intrusion and to keep certain matters from public view” (Law 2015)


(Reiman 1995) says “Privacy is the condition in which others are deprived of access to you” which he explains further as people being deprived of access to either some information about you or some experience of you.

(Tripathi, Tripathi 2010) say that "Privacy can be defined as an individual's freedom to decide to what extent he/she likes to share their intellectual, social and cultural life with others, or in other words to what extent others can invade into his/her private life".

(Garoogian 1991) says that "Privacy, as the term is commonly used, means the unavailability to others of information about oneself". She presents moral, legal and professional arguments for the protection of a patron's privacy, and picks out a number of definitions of privacy:

  • the claims of individuals …to determine for themselves when, how and to what extent information about them is communicated to others (Buchanan 1982)
  • the condition enjoyed by one who can control the communication of information about himself. (Lusky 1972)
  • selective control of access to the self or to one’s group (Altman 1976)
  • control over when and by whom various parts of us can be sensed by others. By "sensed" is meant simply seen, heard, touched, smelled or tasted. (Thomson 1975)
  • [the] right that certain steps shall not be taken to find out facts [private facts] and …[the] right that certain uses shall not be made of [these] facts. (Thomson 1975)
  • having control over information about oneself. (Decew 1987)

The right to privacy has arguably always been used to get at thorny and hard-to-define problems because it touches on various more concrete rights – those of autonomy and intellectual freedom, freedom from surveillance and interference, and the right to behave in ways that may be inconvenient for the authorities. Edward Snowden says that "privacy is the fountainhead of all other rights" (Schrodt 2016).

Defining the elements of privacy, (Sturges 2002) outlined solitude, anonymity, bodily modesty, psychological integrity and confidentiality (in terms of shared information) as important components of privacy.

(Petronio, Altman 2002 p6) defines privacy "as the feeling that one has the right to own private information, either personally or collectively".

(Greenland 2013 p223) says “privacy refers to notions such as an individual’s right to be respected, personal space, dignity and autonomy, as well as those aspects of a person’s life they wish to restrict access to or keep control of (McCullagh 2008)”.

The goal of privacy is not to protect some stable self from erosion but to create boundaries where this self can emerge, mutate and stabilize (Morozov 2014). The boundaries metaphor is used by a number of scholars, such as (Altman 1977) and (Petronio, Altman 2002).

The right to privacy protects an interest that has been defined as a "personal condition of life characterised by seclusion from, therefore absence of acquaintance by, the public" (Norberto Nuno Gomes de Andrade, in Ghezzi, Pereira et al. 2014).

 

(Mai 2016) believes that we need to shift away from definitions of privacy and look instead to models of privacy. He supplements Agre's surveillance and capture models with a third model, namely the datafication model, where data is deduced by predictive analytics.

In a library context, privacy is thought of as a right to open inquiry.

The American Library Association defines a right to privacy in a library, whether physical or virtual, as “the right to open inquiry without having the subject of one’s interest examined or scrutinized by others” (American Library Association 2014).

(Caldwell-Stone 2012) defines users’ information privacy as the right to read and inquire anything, without the fear of being judged or punished.

These two statements, by trying to encapsulate a brief definition of library privacy, are in many regards too restrictive, because they seem to relate only to someone actively seeking out information.

Writing in the Intellectual Freedom Manual (American Library Association 2010), Deborah Caldwell-Stone says “Confidentiality exists when a library is in possession of personally identifiable information about library users and keeps that information private on their behalf”.

(Richards 2015 p95) argues for a certain kind of privacy, one which is different from tort privacy, which he refers to as “intellectual privacy”, and which he believes to be essential if we care about freedom of expression. Indeed, he says that privacy and (freedom of) speech are not conflicting values but are mutually supportive. The three most important elements of intellectual privacy that Richards identifies are:

  1. the freedom of thought;
  2. the right to read (and engage in intellectual exploration),
  3. and the right to communicate in confidence

and he acknowledges that all three elements are related and build on the others.

(Richards 2015 p95) says that “intellectual privacy is the protection from surveillance or unwanted interference by others when we are engaged in the process of generating ideas and forming beliefs – when we’re thinking, reading and speaking with confidants before our ideas are ready for public consumption”.

(Richards 2015 p95) believes that, because it is only comparatively recently that surveillance and monitoring have been happening on a large scale, intellectual privacy is under-developed and under-appreciated. It is increasingly under threat, and under constant assault.

REFERENCES

ALTMAN, I., 1977. Privacy regulation: Culturally universal or culturally specific? Journal of Social Issues, 33(3), pp. 66-84.

ALTMAN, I., 1976. A conceptual analysis. Environment and Behavior, 8(1), pp. 7-29.

AMERICAN LIBRARY ASSOCIATION, 2014. Privacy: an interpretation of the Library Bill of Rights. Adopted June 19, 2002, by the ALA Council; amended July 1, 2014.

AMERICAN LIBRARY ASSOCIATION, 2010. Intellectual freedom manual. 8th edn. American Library Association.

BAJPAI, K. and WEBER, K., 2017. Privacy in public: Translating the category of privacy to the digital age. From Categories to Categorization: Studies in Sociology, Organizations and Strategy at the Crossroads. Emerald Publishing Limited, pp. 223-258.

BLOUSTEIN, E.J., 1964. Privacy as an aspect of human dignity: an answer to Dean Prosser. New York University Law Review, 39(6), p. 962.

BUCHANAN, R., 1982. Garbage in, mischief out. Library Review, 31(1), pp. 30-34.

CALDWELL-STONE, D., 2012. A digital dilemma: ebooks and users' rights. American Libraries.

CANNATACI, J.A., 2016. Report of the Special Rapporteur on the right to privacy. A/HRC/31/64.

CURRY, M.R., 1997. The digital individual and the private realm. Annals of the Association of American Geographers, 87(4), pp. 681-699.

DECEW, J., 1987. Defending the “private” in constitutional privacy. Journal of Value Inquiry, 21, pp. 171-184.

FINN, R.L., WRIGHT, D. and FRIEDEWALD, M., 2013. Seven types of privacy. In: S. Gutwirth et al., eds, European Data Protection: Coming of Age. Springer Netherlands, p. 3.

FLORIDI, L., 2014. The 4th revolution: how the infosphere is reshaping human reality. Oxford, United Kingdom: Oxford University Press.

GAROOGIAN, R., 1991. Librarian/patron confidentiality: an ethical challenge. Library Trends, 40(2), pp. 216-233.

GHEZZI, A., PEREIRA, A. and VESNIĆ-ALUJEVIĆ, L., 2014. The ethics of memory in a digital age: interrogating the right to be forgotten. Houndmills, Basingstoke, Hampshire: Palgrave Macmillan.

GREENE, J.K., 2014. Before Snowden: privacy in an earlier digital age. International Journal of Philosophy and Theology, 2(1), pp. 93-118.

GREENLAND, K., 2013. Negotiating self-presentation, identity, ethics, readership and privacy in the LIS blogosphere: a review of the literature. Australian Academic & Research Libraries, 44(4).

GUTWIRTH, S., 2002. Privacy and the information age. Lanham, Maryland: Rowman & Littlefield.

HARTZOG, W. and SELINGER, E., 2013. Obscurity: a better way to think about your data than "privacy". The Atlantic, (January 17).

KOOPS, B., NEWELL, B.C., TIMAN, T., SKORVANEK, I., CHOKREVSKI, T. and GALIC, M., 2017. A typology of privacy. University of Pennsylvania Journal of International Law, 38(2).

LAW, J., ed, 2015. Oxford dictionary of law. 8th edn. Oxford University Press.

LUSKY, L., 1972. Invasion of privacy: a clarification of concepts. Columbia law review, 72(4), pp. 693-710.

MOROZOV, E., 2014. To save everything, click here: the folly of technological solutionism. New York: PublicAffairs.

PETRONIO, S. and ALTMAN, I., 2002. Boundaries of privacy: dialectics of disclosure. Albany: State University of New York Press.

REIMAN, J.H., 1995. Driving to the panopticon: a philosophical exploration of the risks to privacy posed by the highway technology of the future. Santa Clara High Technology Law Journal, 11, pp. 27-43.

RICHARDS, N., 2015. Intellectual privacy: rethinking civil liberties in the digital age. Oxford University Press.

ROTENBERG, M. and AGRE, P.E., 1998. Technology and privacy: the new landscape. MIT Press.

SCHRODT, P., 2016. Edward Snowden just made an impassioned argument for why privacy is the most important right. Business Insider, (September 15).

STURGES, P., 2002. Remember the human: the first rule of netiquette, librarians and the Internet. Online Information Review, 26(3), pp. 209-216.

THOMSON, J.J., 1975. The right to privacy. Philosophy & Public Affairs, 4(Summer), pp. 295-314.

TRIPATHI, S. and TRIPATHI, A., 2010. Privacy in libraries: the perspective from India. Library Review, 59(8), pp. 615-623.

VAN DEN HOVEN, J., BLAAUW, M., PIETERS, W. and WARNIER, M., 2014. Privacy and information technology. In: E.N. ZALTA, ed, Stanford Encyclopedia of Philosophy.

WESTIN, A.F., 1967. Privacy and freedom. 1st edn. New York: Atheneum.

 

A worked example of library privacy using ontological friction

For each library privacy scenario:

They will all have an INFORMATIONAL PRIVACY component

They will also involve ONE or MORE of the following privacy types

  • Bodily privacy
  • Spatial privacy
  • Communicational privacy
  • Proprietary privacy
  • Intellectual privacy
  • Decisional privacy
  • Associational privacy
  • Behavioural privacy

The flow of personal data (and therefore the privacy level) can be adjusted by one or more of the following ontological frictions

  • Temporal
  • Spatial
  • Sensory
  • Obscurity
  • Training & Awareness
  • Information behaviour
  • Context
  • Regulatory
  • Technological

A worked example:

In Virginia, a husband requested the circulation records of his wife to prove she had been "exploring avenues of divorce" before he filed the papers.

THIS SCENARIO INVOLVES:

  • Informational privacy
  • Decisional privacy
  • Intellectual privacy

And the flow of personal data is determined by the following types of friction:

Regulatory

Temporal

  • Keeping the information for the minimum period required

Obscurity

  • Anonymising the borrowing records as soon as items are returned to the library

Training & awareness

  • Ensuring that all library staff are fully cognisant of the library’s policies and procedures and their obligations under the law
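One way to make this mapping exercise repeatable across many scenarios is to encode each one as a small structured record. The sketch below is purely illustrative: the class, field names and string labels are this author's hypothetical encoding of the Koops et al. typology and the friction types listed above, not part of either framework.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PrivacyScenario:
    """Hypothetical record for analysing a library privacy scenario."""
    description: str
    privacy_types: List[str]   # drawn from the Koops et al. typology
    frictions: List[str]       # ontological frictions adjusting the data flow
    mitigations: List[str] = field(default_factory=list)

# The worked example above, encoded as a record:
virginia_case = PrivacyScenario(
    description=("Husband requests his wife's circulation records to prove "
                 "she had been 'exploring avenues of divorce'"),
    privacy_types=["informational", "decisional", "intellectual"],
    frictions=["regulatory", "temporal", "obscurity", "training & awareness"],
    mitigations=["Keep the information for the minimum period required",
                 "Anonymise borrowing records as soon as items are returned",
                 "Ensure staff know the library's policies and legal obligations"],
)
```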

Communication privacy management (CPM) theory

Sandra Petronio defines privacy "as the feeling that one has the right to own private information, either personally or collectively" (Petronio, Altman 2002 p6).

Petronio developed the idea of communication privacy management (CPM) theory in 1991. She presents a theoretical approach that gives us a rule-based system to examine the way people make decisions about balancing disclosure and privacy. CPM makes private information a focal point. It is not just about the self: it’s a communicative process. CPM is a conceptual framework based on an explicit philosophical and theoretical underpinning.
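Purely as a toy illustration of what a "rule-based system" for disclosure decisions might look like in code, the sketch below treats privacy rules as predicates over a disclosure situation. The rules and their encoding are this author's hypothetical gloss, not Petronio's own formalisation.

```python
from typing import Callable, Dict, List

# A privacy rule maps a disclosure situation to a permit/deny verdict.
Rule = Callable[[Dict], bool]

def contextual_rule(situation: Dict) -> bool:
    # Hypothetical: health details are only disclosed in a clinical context
    return situation["topic"] != "health" or situation["context"] == "clinical"

def risk_benefit_rule(situation: Dict) -> bool:
    # Hypothetical: disclose only if perceived benefit outweighs perceived risk
    return situation["benefit"] > situation["risk"]

def decide_disclosure(situation: Dict, rules: List[Rule]) -> bool:
    """Disclose only if every active privacy rule permits it."""
    return all(rule(situation) for rule in rules)

print(decide_disclosure(
    {"topic": "health", "context": "clinical", "benefit": 0.8, "risk": 0.3},
    [contextual_rule, risk_benefit_rule],
))  # True under these hypothetical rules
```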

Petronio's CPM theory is an important extension of Irwin Altman's privacy regulation theory. In the foreword to (Petronio, Altman 2002 pxv) Altman writes "Sandra Petronio offers a comprehensive framework that incorporates systematically many aspects of privacy-disclosure that had previously been neglected, or treated in only a fragmentary way. Her integrative model is an excellent map or guide for students, teachers, practitioners, and scholars to address an array of privacy-disclosure issues" and (bearing in mind that he was writing in 2002) (pxvi) he describes Petronio's communication privacy management theory as the most comprehensive conceptual framework presently available.

Petronio treats privacy and disclosure as inseparable aspects of a unified dialectical process. People make choices about revealing or concealing based on criteria and conditions they perceive as salient, and individuals fundamentally believe they have a right to own and regulate access to their private information.

Individuals, Groups, Society: Petronio conceptualizes group privacy management as coordinating unified boundaries, while individual privacy management is conceptualized as the coordination of privacy rules around the self. Private life only makes sense in relation to public life. People develop individual and group privacy management strategies or rules to coordinate disclosure behavior. As we create individual boundaries around the self, we also create group boundaries with others.

In a group context, individual disclosures grow into private information that belongs to everyone in the group. There are more possibilities for violations within group privacy boundaries than in dyadic boundaries. Some group members may not accept responsibility for co-owned private information. Some may use different privacy rules than those established by the larger group. Loyalty to the group is demonstrated by keeping the disclosures of its members confidential. Some groups work hard to reinforce unwavering adherence to privacy rules.

In CPM theory, privacy boundaries can range from complete openness to complete closedness or secrecy. An open boundary reflects willingness to grant access to private information, through disclosure or by giving permission to view that information, thus representing a process of revealing. On the other hand, a closed boundary represents information that is private and not necessarily accessible, thus characterizing a process of concealing and protecting. The relationship between the boundaries is dialectical, consistent with Altman's thesis: we continuously adapt our level of privacy and disclosure to internal and external states, because we simultaneously need to be open and social as well as private and autonomous. (Petronio, Altman 2002)

While Petronio's CPM theory dates back to 1991, it has been further developed in the decades since then, both by Petronio herself and by other scholars who have applied the theory to a range of different contexts. It is, for example, particularly suited to the study of social networking – see, for example, (Mazer, Murphy et al. 2007).

(Graves, Healy et al. 2010) considered interpersonal communication for library and information professionals, including CPM theory. They say that librarians are receivers or co-owners of private information, which comes with its own set of boundaries, responsibilities, and expectations. The informer (patron) should be protected when he or she shares information with the receiver (librarian). Librarians must learn how to manage boundaries when deciding whether to disclose or retain private information.

Petronio envisages how her CPM theory will continue to grow and change as it is applied to practical problems.

SUPPOSITIONS

  1. Private information
  2. Boundaries
  3. Control and ownership
  4. Rule-based management system
  5. Management dialectics

RULE MANAGEMENT PROCESSES

  1. Privacy rule foundations – exercising control to manage privacy boundaries (the way the rules develop and their properties)
  2. Boundary co-ordination operations – collectively owned boundaries, where people co-own private information (reflects how privacy is regulated through rules when people are engaged in managing collective boundaries)
  3. Boundary turbulence

Rule management process 1: privacy rule foundations

Privacy rule development criteria:

  • Cultural criteria
  • Gendered criteria
  • Motivational criteria
  • Contextual criteria
  • Risk-benefit ratio criteria

Rule attributes:

  • Rule acquisition
  • Rule properties

 

Rule management process 2: boundary coordination operations

  • Boundary linkages
  • Boundary permeability
  • Boundary ownership
  • Boundary co-ownership: private disclosure confidants

Rule management process 3: boundary turbulence

Under certain circumstances the boundary coordination process malfunctions, yielding "boundary turbulence". Boundary turbulence reflects the assumption that coordination doesn't always function in a synchronised way: people don't always work together to achieve a smooth coordination process, and the rules can become asynchronised. Boundary management may become turbulent when there's an invasion from outside sources or when the management system doesn't work. Types of boundary turbulence include:

  • Intentional rule violations
  • Boundary rule mistakes
  • Fuzzy boundaries
  • Dissimilar boundary orientations
  • Boundary definition predicaments
  • Privacy dilemmas

CPM also predicts that this boundary turbulence requires the owners and co-owners to recalibrate and readjust privacy management practices because it becomes clear that they are not functioning adequately or as intended. Original owners of the information often expect that co-owners, composing the collective privacy boundaries, will know and follow the privacy rules they use for management. However, when boundary turbulence occurs, individuals discover that information they have moved into a collective boundary is not appropriately being managed by the individuals within the collective. Thus, boundary turbulence occurs when violations, disruptions, or unintended consequences occur as a result of privacy management practices (Petronio, 2002).

Petronio lists a variety of factors that can lead to boundary turbulence, which (West, Turner 2013) group into three categories:

  • Fuzzy boundaries – where people haven't discussed what can and can't be revealed, and there is no mutual recognition of where the boundaries lie. The onus is then on the friend, family member, etc. to judge what may be passed on
  • Intentional breaches – these could be intended to hurt the original owner, or made simply because breaking the confidence works to the confidant's personal advantage
  • Mistakes – letting secrets slip out when one's guard is down after having a few drinks; errors of judgment when discussing private cases in public places; or a miscalculation in timing

CPM Axioms (Petronio 2013)

  1. predicts that people believe they are the sole owners of their private information and trust that they have the right to protect it or to grant access
  2. predicts that, when these "original owners" grant others access to private information, those others become "authorized co-owners" and are perceived by the "original owner" to have fiduciary responsibilities for the information
  3. predicts that, because individuals believe they own rights to their private information, they also justifiably feel that they should be the ones controlling their privacy
  4. predicts that the way people control the flow of private information is through the development and use of privacy rules
  5. predicts that successful and continued control post-access is achieved through coordinating and negotiating privacy rules with "authorized co-owners" regarding third-party access
  6. predicts that co-ownership leads to jointly held and operated collective privacy boundaries, to which all members may contribute private information
  7. predicts that collective privacy boundaries are regulated through decisions about who else may become privy, how much others inside and outside the collective boundary may know, and rights to disclose the information
  8. predicts that privacy regulation is often unpredictable, and disruptions can range from minor turbulence in the privacy management system to complete breakdowns

Three boundary rule management operations

  • Boundary linkage rules. Linkages refer to the establishment of mutually agreed upon privacy rules that are used to choose others who might be privy to the collectively held information.
  • Privacy rules for co-ownership. The degree and kind of ownership is negotiated. Because we live in a world where we manage multiple privacy boundaries, sometimes people find it difficult to know when one boundary ends and another begins. The level and type of ownership may vary: shareholders have knowledge of private information because they have been given permission to know it; stakeholders are confidants who are perceived as worthy of some level of access because they serve a functional role, providing the original owner with a needed outcome – for example, someone disclosing financial information to their bank. Physicians and healthcare personnel also represent this category of co-ownership.
  • Privacy rules for boundary permeability – these rules coordinate the extent to which collectively held privacy boundaries are opened or closed once they have been formed. The confidant and the original owner negotiate how much control there should be over restricting or granting third-party access to the information. These rules regulate the depth, breadth, and amount of private information to which access is given. Permeability is a matter of degree. Many coordinated access rules are crafted to be filters, letting some private information seep through, while other related facts are closely guarded. (Petronio, Altman 2002)

(Petronio, Altman 2002) differentiates three general patterns in how people manage group boundaries: inclusive boundary coordination, intersected boundary coordination, and unified boundary coordination:

  • Inclusive boundary coordination refers to person A giving up privacy control to person B in order to get something in return (e.g., a patient talking about their eating habits to a doctor so the doctor can provide adequate consultation with regard to his or her health status).
  • In intersected boundary coordination, the concealed information is perceived as comparable, and persons A and B are considered as equals (e.g., two friends mutually disclosing the troubles they face at home).
  • Unified boundary coordination is a pattern whereby everyone is in control of the private information, whilst no one really owns it. Here, the power of person A over B, or the equal sharing of information between persons A and B, is not the most important aspect (e.g., members of a sports club concealing that they have cheated during a game). Rather, "the body of private information typically found in this type of coordination often predates all members and new members make contributions, yet the information belongs to the body of the whole" (Petronio, Altman 2002 p134).

The process whereby people regulate privacy boundaries as they make choices about the flow of their private information is guided by six principles:

  • people believe that they own private information, which defines the parameters of what constitutes the meaning of private information;
  • because people believe they own private information, they also believe that they have the right to control that information;
  • to control the flow of their private information, people use privacy rules they develop based on criteria important to them;
  • once they tell others their private information, the nature of that information changes, becoming co-owned by the confidant;
  • once the information is co-owned, ideally the parties negotiate collectively held and agreed-upon privacy rules for third-party dissemination; and
  • because people do not consistently, effectively, or actively negotiate collectively held privacy rules, there is the possibility of "boundary turbulence", meaning disruptions in the way that co-owners control and regulate the flow of private information to third parties.

The flow of information can be visualised in terms of the thickness or thinness of a boundary wall that allows information to be known. The thicker the boundary, with dense and impenetrable walls, the less access is given and so the less is known about the private information. Avoiding speaking about certain topics may serve as a safeguard to preserve one's identity. When people are negotiating the collective privacy boundary, there is the possibility that the type of information being considered is so volatile that the co-owners decide to sustain a thick boundary wall with rigid protection rules, by declaring the topic taboo.

Co-ownership

Co-ownership of private information involves a joint responsibility for its containment or release. But not all boundary ownership is 50-50. One person may have a greater stake in how the information is handled or feel that they should have total control of how it’s used. If so, that person is usually the original owner.

The different types of confidants

There are at least two ways that people become confidants. First, serving as a confidant may result from soliciting private information belonging to someone else. Second, people may find they are recipients of private information, although reluctantly so.

The deliberate confidant intentionally seeks private information from others, either directly or indirectly, or gains permission from them to know the information. They often seek out the information in order to help others. For example, doctors, counselors, attorneys, and clergy solicit personal information only after they assure clients that they have a privacy policy that severely limits their right to reveal the information to others. The common thread for deliberate confidants is that they purposely seek to know someone else's private information.

Conversely, a reluctant confidant doesn’t want the disclosure, doesn’t expect it, and may find the revealed information an unwelcome burden. Picture the hapless airplane travelers who must listen to their seatmates’ life stories. Even though reluctant confidants often feel a vague sense of responsibility when they hear someone else’s private information, they usually don’t feel a strong obligation to follow the privacy guidelines of the discloser.

CPM theory contends that the first place privacy rules are learned is within the family. Families influence members of the group across time by providing orientations for use when interacting both with other family members and with individuals external to the family unit. The interior family privacy orientation indicates how much a family shares or protects private information with other family members; it is cultivated through both direct communication within the family and the occurrence of such practices among sub-groups within the family, such as concealing secrets.

Risks & benefits of disclosure

Risks

  • Making disclosures to the wrong people
  • Disclosing at a bad time
  • Telling too much about ourselves (“oversharing”)
  • Compromising others

Benefits

  • Increase social control
  • Validate our perspectives
  • Become more intimate with our relational partners

There are many reasons besides intimacy why people disclose personal information (Petronio, Altman 2002):

  • Relieve a burden
  • Gain control
  • Enjoy self-expression
  • Develop intimacy

Trust is the willingness to assume risk. A lower level of perceived privacy risk should be related to a higher level of trust in the other party’s competence, reliability and safekeeping of personal information.

Five principles about privacy management

CPM stipulates five principles about privacy management that offer a route to better understanding both when access to information is granted and when access is denied (Petronio, 2002).

  • Principle one states that individuals equate private information with personal ownership. That is, from a behavioural standpoint, people feel they own their private information in the same way that they own other possessions (Child et al., 2009).
  • Principle two predicts that because people believe they own their information, they also believe that they have the right to control the flow of the information to others.
  • Principle three predicts that people develop and use privacy rules to control the flow of information to others.  These rules are driven by motivations and frequently take into account a risk-benefit ratio.
  • Principle four predicts that once private information is disclosed or others are granted access, the information moves from individual ownership to collective ownership. CPM stipulates that typically, people coordinate three different types of privacy rules to manage a collectively held privacy boundary. Thus, the original owner and co-owners coordinate the management of information through the use of privacy boundary permeability rules, privacy boundary ownership rules, and privacy boundary linkage rules.
  • Principle five concerns the prediction that if owners and co-owners do not coordinate the privacy rules to regulate information flow, disruption will occur and boundary turbulence will result. When this type of disruption happens, the outcome exposes implicit or taken-for-granted expectations that have been violated.

REFERENCES

GRAVES, J., HEALY, H., HEITHUAS, A., LEHANE, D., MCCELLAN, K., MITCHELL, W. and SCHMIDT, K., 2010. Interpersonal communication for library and information professionals.

MAZER, J.P., MURPHY, R.E. and SIMONDS, C.J., 2007. I’ll see you on “Facebook”: The effects of computer-mediated teacher self-disclosure on student motivation, affective learning, and classroom climate. Communication Education, 56(1), pp. 1-17.

PETRONIO, S., 2013. Brief status report on communication privacy management theory. Journal of Family Communication, 13(1), pp. 6-14.

PETRONIO, S. and ALTMAN, I., 2002. Boundaries of privacy: dialectics of disclosure. Albany: State University of New York Press.

WEST, R. and TURNER, L.H., 2013. Communication privacy management theory of Sandra Petronio (Chapter 13). In: Introducing communication theory: analysis and application.

 

Types of friction determining the level of privacy (Floridi's ontological friction)

Temporal

Related to or limited by time

 

Spatial

Spaces

Walls (e.g. are they soundproof?)

Doors (locked/open)

Partitions (thickness/thinness)

Materials used (e.g. glass)

Architecture

Physical separation

 

Sensory

Smell

Taste

Touch

Sight

Hearing

 

Obscurity

Search visibility

Unprotected access

Identification (pseudonyms/anonymization)

Clarity

Right to be forgotten

Difficulty of collecting

In physical library v digital library

Difficulty of correlating/aggregating

Burdensome to obtain

Deliberate use of misleading information

 

Training & awareness

Users' & librarians' technical knowledge to protect privacy

Advising on websites to use (based on behavioural tracking)

Knowledge of library procedures

Knowledge of the law

Awareness as empowerment

 

Information behaviour

Self censorship

Conforming to expectations (because of chilling effect)

Privacy calculus

 

Context

Information type

Transmission principle (consent, coerced, stolen, buying, selling, confidentiality, stewardship, acting under the authority of a court with a warrant, and national security)

Who is the sender

Who is the recipient

Ethical concerns

 

Regulatory

Law: international, supra-national, national, local (in federal systems), case law

Norms: market norms, social norms, moral/non-moral norms, cultural norms

Self-regulation: terms & conditions, contracts, ethical codes, guidelines, standards

Codified library policies and practices

 

Technology

Privacy enhancing technologies

Privacy invasive technologies

Differential privacy

Privacy by design

 

Examples of temporal friction

  • Patron borrowing records anonymised once items have been returned
  • Library has CCTV cameras in place in all the public areas, but automatically wipes the tapes after 28 days
  • Paper based records for computer bookings destroyed at the end of each week
  • Transactional logs generated by access control software and network authentication anonymized/destroyed when no longer needed
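As a minimal sketch of how temporal friction of this kind might be implemented, the code below anonymises a loan record at the point of return and purges stale logs after a retention window. The schema, table and field names are hypothetical, not those of any real library management system.

```python
import sqlite3
from datetime import datetime, timedelta

RETENTION_DAYS = 28  # hypothetical retention window, mirroring the CCTV example

def anonymise_on_return(conn: sqlite3.Connection, loan_id: int) -> None:
    """Mark the loan returned and break the link to the patron immediately."""
    conn.execute(
        "UPDATE loans SET returned_at = ?, patron_id = NULL WHERE loan_id = ?",
        (datetime.utcnow().isoformat(), loan_id),
    )
    conn.commit()

def purge_stale_logs(conn: sqlite3.Connection) -> None:
    """Destroy transactional logs once they exceed the retention period."""
    cutoff = (datetime.utcnow() - timedelta(days=RETENTION_DAYS)).isoformat()
    conn.execute("DELETE FROM access_logs WHERE logged_at < ?", (cutoff,))
    conn.commit()
```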

 

Examples of spatial friction

  • Try to avoid indiscreet reference interviews (voice level, private space)
  • Reserved items placed in an open area of the library. The books don't have any paper wrapped around them to disguise their contents, making it easy to spot titles that are racy, on controversial topics, or which the user would be embarrassed for others to know they had asked for; and the user's complete surname is visible on a piece of paper placed inside the book
  • Berwick’s new-look library has been criticised for the lack of privacy it offers customers coming in to discuss sensitive issues. The building now houses a range of services offered by Northumberland County Council. It means library, registration, tourism and customer service facilities are housed in the same place. However, concerns have been raised that potentially sensitive matters, such as conversations about housing benefits and council tax and even personal details, could easily be overheard. There is a private area for customers but it is understood it does not have computer access (http://www.berwick-advertiser.co.uk/news/privacy-concern-raised-following-library-revamp-1-4199427)

Is it ever right to impose privacy on someone against their will?

Anita Allen believes that there are times when privacy is seen as being unpopular, unwanted, resented, not preferred, or even despised by its intended beneficiaries or targets. She poses the question: "Could certain privacies be so important that they should be legally protected and the legal protections not subject to voluntary waiver by their intended beneficiaries?" (Allen 2011 p8).

Her response to people having a need for privacy, even if they themselves don't recognise that need, is to put forward the idea of what she calls "unpopular privacy": "In a free society there is a place for freedom of choice about privacy, and a place for nudging and persuasion about privacy, and a place for coercing privacy. Demanding, imposing and coercing privacy which is unpopular – that is, unwanted and even resented – to prevent serious harm and to preserve foundational privacy" (Allen 2011 pxii).

Anita Allen's view is that some forms of privacy are so important that moderate forms of paternalism, consistent with a fairly broad understanding of political liberalism, may be warranted in order to impose privacy laws "for the benefit of uneager beneficiaries" (Allen 2011 pxi).

If people were able to make rational decisions for themselves, having been fully informed of the facts and able to weigh up the advantages and disadvantages of giving out personally identifiable information, then the case for imposing privacy on someone against their will would be difficult to justify. However, those conditions cannot be assumed to hold. The reality can be quite different:

“Rampant practices of surreptitious as well as flagrant data gathering, dissemination, aggregation, analysis and profiling mean that it isn’t appropriate to think of the data subject as a free and rational agent able to make decisions without interference from third parties such as government regulators” (Nissenbaum 2011 p34).

Considering privacy-protective behaviours in the light of the privacy calculus model, where the theory anticipates a rational calculus of the costs and benefits, (Wang, Yan et al. 2017 p1429) observe that "Not all individuals are rational, and an individual cannot be rational at all times".
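For illustration only, the rational calculus that the privacy calculus model anticipates can be sketched as an expected-value comparison. The function and parameter values below are hypothetical; the point of the surrounding critique is precisely that real data subjects rarely know these values or weigh them this neatly.

```python
def privacy_calculus(benefit: float, harm: float, p_misuse: float) -> bool:
    """Disclose iff the expected benefit exceeds the expected cost of misuse."""
    return benefit > harm * p_misuse

# A perfectly informed, rational agent per the model:
print(privacy_calculus(benefit=5.0, harm=20.0, p_misuse=0.1))  # True: discloses
print(privacy_calculus(benefit=5.0, harm=20.0, p_misuse=0.5))  # False: withholds
```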

(Hull 2015 p91) cites three types of reasons why self-management can be expected to underprotect privacy:

  • users do not and cannot know what they are consenting to;
  • privacy preferences are very difficult to effectuate; and
  • declining to participate in privacy-harming websites is increasingly not a viable option for many.

There are times when one could argue that people need saving from themselves, but in democratic societies the question of imposing privacy is clearly a controversial one. If the premise of unpopular privacy is credible, the question becomes: when is the imposition of privacy on someone against their will appropriate? Allen specifically refers to situations where it would prevent serious harm and preserve foundational privacy.

REFERENCES

ALLEN, A.L., 2011. Unpopular privacy: what must we hide? Oxford: Oxford University Press.

HULL, G., 2015. Successful failure: what Foucault can teach us about privacy self-management in a world of Facebook and big data. Ethics and Information Technology, 17(2), pp. 89-101.

NISSENBAUM, H., 2011. A contextual approach to privacy online. Daedalus, the Journal of the American Academy of Arts & Sciences, 140(4), pp. 32-48.

WANG, L., YAN, J., LIN, J. and CUI, W., 2017. Let the users tell the truth: Self-disclosure intention and self-disclosure honesty in mobile social networking. International Journal of Information Management, 37(1), pp. 1428-1440.

 

Privacy scenarios in libraries

Here are a few examples of privacy scenarios relating to libraries:

OverDrive and Amazon did a tie-in which required synchronisation of the two accounts (2011)

Receipts from self-service machines. Years ago retailers realized that they were putting too much information onto till receipts, notably the full credit or debit card number. Given the threat of identity fraud, they stopped displaying complete card numbers, and instead only showed some of the digits while using asterisks to mask the rest. With the prevalence of self-issue machines, libraries need to think carefully about the information that is printed on transaction receipts.
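A minimal sketch of the masking practice described above (the function and receipt format are illustrative, not taken from any particular library or retail system):

```python
def mask_card_number(pan: str, visible: int = 4) -> str:
    """Replace all but the last `visible` digits with asterisks, keeping spacing."""
    total_digits = sum(ch.isdigit() for ch in pan)
    digits_seen = 0
    out = []
    for ch in pan:
        if ch.isdigit():
            digits_seen += 1
            out.append(ch if digits_seen > total_digits - visible else "*")
        else:
            out.append(ch)
    return "".join(out)

print(mask_card_number("4929 1234 5678 9012"))  # **** **** **** 9012
```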

Record of what articles a user has downloaded within a library discovery service

"Toronto Public Library reveals its website searches in real time" (Toronto Metro, 21 July 2016) … provides a window into the range of interests and needs people bring to the library. Using Google Analytics, they pull search topics into one place, where the information is shown in real time.

16 of the top 20 research journals let ad networks spy on their users https://go-to-hellman.blogspot.co.uk/2015/03/16-of-top-20-research-journals-let-ad.html

Internet sign up logs (destroy regularly)

Library asks a parent to reimburse them for a book that their child has lost, but is not prepared to divulge the title of the book for privacy reasons

Library brings in commercial interests (BT, Barclays Digital Eagles, Google Digital Garage) to provide digital literacy training

Posting overdue notices in the mail using postcards showing names and titles

Snooping devices (keystroke loggers) were found in Cheshire library computers

User sets up a number of saved searches on an online database

Library website uses Google Analytics https://go-to-hellman.blogspot.co.uk/2017/02/how-to-enabledisable-privacy-protection.html

This is a selection from the 100+ that I have been building up. If you have others, do let me know.