Model for ontological frictions (cf @Floridi)

Many of the frictions that determine the ease/difficulty with which one can access personally identifiable information fall under the heading of technology.

Whilst technology accounts for many frictions, it certainly doesn’t cover all of them. So I have been trying to think about how to encapsulate the various frictions in a model.

Yesterday, I very tentatively posted a slide on Twitter about the opposing forces of privacy-invasive technologies versus privacy-enhancing technologies, and one response was simply “brilliant”. I had been quite wary of posting the slide, because it only tells part of the story. Other aspects might include privacy by design and differential privacy.

My efforts to come up with a suitable model are only in their initial stages, and right now it’s very much a work in progress.

Thinking of the technology segment, the nitty-gritty might cover things like the following (the first item, encryption, is illustrated with a short sketch after the list):

  • Encryption
  • Secure networks
    – VPNs
    – Separation of staff wifi from user wifi etc
  • Strong passwords
  • 2FA
  • Use of blocking to inhibit tracking mechanisms
  • Password protections/password encoding
  • Specifically devised protocols or services
  • Warning systems (for externally captured data)
  • Limited disclosure technology (eg Sudoweb, Facecloak)
  • Pro-active information security measures
  • Network penetration testing
  • Limiting editing/access rights to those who really need them
  • Ensuring ability to undertake a forensic audit
  • Firewall
  • Proactively take measures to protect privacy
  • Clear cookies and browser history
  • Delete/edit something you posted in past
  • Set your browser to disable or turn off cookies
  • Adblockers
  • Addons to prevent tracking (PrivacyBadger, Ghostery etc)
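
To make the first item concrete, here is a minimal sketch of encryption acting as a friction, using Python and the widely used cryptography package’s Fernet recipe (the package choice and the sample record are my assumptions, not part of the model): with the key, the effort needed to read personally identifiable information is trivial; without it, the information flow is effectively blocked.

```python
# pip install cryptography  (assumed third-party dependency)
from cryptography.fernet import Fernet, InvalidToken

# The key is the friction differential: trivial access with it,
# practically prohibitive effort without it.
key = Fernet.generate_key()
record = b"Patron 4711 borrowed 'Privacy and Freedom' on 2017-06-12"

ciphertext = Fernet(key).encrypt(record)   # what an eavesdropper sees

# With the right credentials the friction is negligible...
assert Fernet(key).decrypt(ciphertext) == record

# ...without them, the flow of personal information is blocked.
try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)
except InvalidToken:
    print("No key, no personally identifiable information")
```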

The technology piece could be seen in the larger context of regulation – cf. Lessig’s “code is law”. So I then took David Haynes’ regulation model, which covers four elements – norms, law, code, and self-regulation – and tried to think from there about all of the other types of friction that aren’t covered by those headings.

It is definitely not easy to make sense of and categorise the other elements, not least because they aren’t necessarily mutually exclusive.

For example, I decided to create a heading for “Obscurity” to cover obscurity, practical obscurity and obfuscation. These can in many cases be achieved through technology, but not necessarily. Making a deliberate decision NOT to digitise a work would be a means of achieving practical obscurity, and of ensuring that access to its content was far more restricted than it would ever have been had the work been digitised. And if the work contained sensitive personal data about individuals, the decision not to digitise will have restricted the flow of information, and would therefore be one of the frictions that Floridi refers to as “informational friction” or “ontological friction”.

For the moment, other than the regulation segments, the headings I have come up with are:

  • Temporal
  • Spatial
  • Sensory
  • Nature of the data
  • Obscurity
  • Digital information literacy (thinking specifically of digital privacy literacy)
  • Digital privacy protection behaviour
  • and finally a category for the moment headed “Other” to cover things like context (cf. Nissenbaum’s theory of contextual integrity); lack of resources; etc.

Privacy and orders of worth

Sometimes it is in the most unexpected of places that you find some of the most useful information. A book entitled “From categories to categorization” (edited by Rodolphe Durand et al) contains a chapter on privacy which looks at the topic from another angle.

(Bajpai, Weber 2017) analyse emerging notions of informational privacy in public discourse and policymaking in the United States.

They say that conceptions of privacy were tied to institutional orders of worth. Those orders offered theories, analogies and vocabularies that could be deployed to extrapolate the concept of privacy into new domains, to make sense of new technologies and to shape policy.

Drawing on the work of Boltanski & Thevenot (2006 [1991]), they list the following orders of worth as applied to privacy:

  • Inspired
  • Domestic
  • Fame
  • Civic
  • Market
  • Industrial

A question I would ask is whether privacy is now seen through the “market” perspective at the expense of other perspectives. The market world values competition, winning and self-interest, and devalues loss and scarcity; whereas the inspired world, for example, values spontaneity, independence and authenticity, and devalues habit, regulation and routine. So in the inspired world data must be free for creative use, and data controls are deeply personal decisions.

Bajpai and Weber say that, according to Habermas (1991, p. 319), a person’s life-world is divided into a private sphere (traditionally family, private household, and intimate relationships) and a public sphere (traditionally political and civic life, and public spaces). Clearly that distinction has come under strain, and I think that it is misleading to see things in such black and white terms.

A particularly noteworthy observation of Bajpai and Weber is that organisational research on categories has drawn mostly on theories of what cognitive psychologists and anthropologists call “object concepts” at the expense of “abstract concepts” such as truth, rights, self, democracy or privacy.

They say that “informational privacy is an abstract concept that rests on translating ideas of privacy from the predigital to the digital era. The reformulation of privacy as informational privacy entails political struggles over epistemic control that are only weakly bounded by “objective” qualities of the category”.

“The policy actors involved in translating the concept of privacy to digital privacy are arguably less constrained by material properties of privacy practices and conventional understandings, as the technologies, practices, and conventions in the digital domain are less settled and rapidly emerging. Policy actors are then involved in creating subsequent constraints in the form of legal doctrine and public policy rather than responding to them”.

REFERENCES

BAJPAI, K. and WEBER, K., 2017. Privacy in public: Translating the category of privacy to the digital age. From Categories to Categorization: Studies in Sociology, Organizations and Strategy at the Crossroads. Emerald Publishing Limited, pp. 223-258.

HABERMAS, J., 1991. The structural transformation of the public sphere: an inquiry into a category of bourgeois society. Translated by T. Burger and F. Lawrence. Cambridge: MIT Press.

UK data protection & privacy research in LIS sector

In the UK there are a number of initiatives in the LIS sector relating to data protection and privacy issues.

These include research by Jo Bailey in Sheffield to investigate data protection management in libraries. Specifically, this research will establish the level of training or support available to library and information professionals, the prevalence of specific policies in organisations, and the opinions and perspectives on data protection legislation amongst library and information professionals.

Research by David McMenemy, Nik Williams, and Lauren Smith on the chilling effect https://twitter.com/D_McMenemy/status/885085826023084037

The Carnegie UK Trust and CILIP are working together on a project relating to privacy https://www.carnegiecouncil.org/studio/briefings/20170510-privacy-in-a-digital-age-video-highlights

and https://www.carnegieuktrust.org.uk/project/balancing-privacy-public-benefit/

The distinction between public & private

The law tends to treat information in only two ways: either as public or as private.

But is the public / private dichotomy also a false dichotomy?

The dichotomy is challenged by Nissenbaum’s theory of contextual integrity.

Is the public/private distinction a quaint norm from an irrelevant past? (Showers 2015)

Many younger internet users see things in a far more nuanced way than simply in terms of public versus private; these categories are multiple and overlapping.

One might, for example, restrict access and visibility to family, to friends or to employers.

It isn’t as straightforward as information being either public or private; nor is it enough to come up with a third category of “semi-public” without going on to develop these concepts further.

The lines between these designations are at times blurred, mutable, even non-existent.

Technology (such as data mining) creates interconnections between what were formerly separate spaces.

Is the information restricted by technological features?

  • Passwords
  • Privacy settings
  • etc

Where there are barriers to access, the very existence of barriers communicates to those with the right credentials that there is a desire for or an expectation of privacy.

Solove (2004) argues that the secrecy paradigm “fails to recognise that individuals want to keep things private from some people but not from others”.

The two privacy torts that are most relevant to the public/private distinction are:

  1. Public disclosure of private facts (which limits liability to defendants who publicize information that is private, not of legitimate public concern and which is disseminated in a highly offensive manner)
  2. Intrusion upon seclusion

When can a plaintiff reasonably expect information about himself to remain “private” after he has shared it with one or more persons?

A workable definition of online obscurity is needed.

REFERENCES

SHOWERS, B., 2015. Library analytics and metrics: using data to drive decisions and services. London: Facet Publishing.

SOLOVE, D.J., 2004. The digital person: technology and privacy in the information age. NYU Press.

 

Musings on a definition of “privacy”

Definitions of privacy

A number of commentators acknowledge the difficulty of defining privacy, and the lack of consensus on a definition:

  • (Greene 2014) says that privacy is “a multifaceted concept which eludes easy definition”.
  • (Greenland 2013) says that “there is no single agreed definition of privacy in the literature”, recognising that this can be a strength: “a range of conceptions of privacy is valuable because it encompasses several issues”.
  • (Taylor, Floridi et al. 2017) acknowledge that “the concept of privacy is notoriously difficult to define and has varied and sometimes conflicting interpretations”.

(Gutwirth 2002) says “The notion of privacy remains out of the grasp of every academic chasing it. Even when it is cornered by such additional modifiers as “our” privacy, it still finds a way to remain elusive.”

Coming up with a definition of the term “privacy” is incredibly difficult, especially if one hopes to find an authoritative definition which will stand the test of time. Indeed, (Hartzog, Selinger 2013) consider it to be a Sisyphean task. Commenting on the concept of obscurity within the context of privacy, they say that “Discussion of obscurity in the case law remains sparse. Consequently, the concept remains under-theorized as courts continue their seemingly Sisyphean struggle with finding meaning in the concept of privacy” (Hartzog, Selinger 2013).

It is hardly surprising, therefore, that some commentators make a conscious decision to avoid even attempting to define the word, let alone trying to come up with a definitive description of what privacy is. It is, nevertheless, imperative that people do try to come up with a definition. There are a number of reasons why it is important to strive for a useful definition. Two in particular are worthy of consideration:

  • Privacy is widely recognized as an important good that deserves protection. But what cannot be described, defined and understood cannot be defended or regulated. Hence, how the meaning of privacy is constructed in the first place has far-reaching implications for how privacy boundaries and behaviors are negotiated and potentially re-settled in light of a changing world. Arguably then, the understanding of the meaning content of the category of privacy holds the key to the future of privacy in the digital world (Bajpai, Weber 2017).
  • (Cannataci 2016) believes that the existence and usefulness of an international legal framework for privacy is seriously handicapped by the lack of a universally agreed and accepted definition of privacy: without one, we do not have a clear understanding of what we have agreed to protect. “While the concept of privacy is known in all human societies and cultures at all stages of development and throughout all of the known history of humankind it has to be pointed out that there is no binding and universally accepted definition of privacy”

Even if there were a universally agreed and accepted definition of privacy, another handicap is what Cannataci refers to as TPET: the Time, Place, Economy and Technology dimensions. The passage of time and the impact of technology, taken together with the differing rates of economic development and technology deployment in different geographical locations, mean that legal principles established fifty years ago (the ICCPR) or even thirty-five years ago (e.g. the European Convention on Data Protection), let alone seventy years ago (the UDHR), may need to be revisited, further developed and possibly supplemented and complemented to make them more relevant and useful to the realities of 2016 (Cannataci 2016).

Agre says that “One constant across this history is the notorious difficulty of defining the concept of privacy. The lack of satisfactory definitions has obstructed public debate by making it hard to support detailed policy prescriptions with logical arguments from accepted moral premises. Attempts to ground privacy rights in first principles have foundered, suggesting their inherent complexity as social goods” (Rotenberg, Agre 1998).

(Greene 2014) says that “privacy applies to a curious mix of disparate acts, events, things, states of mind, and information. We speak of privacy with regard to our body parts, personal papers, important life decisions, financial status, homes, genetic inheritance, past actions, and our physical selves even when out in public, to name just a few examples”.

(Finn, Wright et al. 2013) say that “privacy” has proved notoriously difficult to define.

Privacy is commonly defined in terms of control and freedom. But the present researcher would add that privacy is closely related to the concept of human dignity. (Greene 2014) goes further and says that “privacy is said to be intimately related … to a host of other values, including freedom, intimacy, autonomy, integrity, respect, dignity, trust, and identity.”

Warren & Brandeis’ 1890 article defines privacy as “the right to be left alone”. (Curry 1997) describes this as “one of the earliest and most succinctly annunciated definitions of privacy”. This widely quoted definition frames privacy as a right. The judgment in Olmstead v. U.S., 277 U.S. 438 (1928) further expands on this in referring to privacy as “the most comprehensive of rights, and the right most valued by civilized men”, while Edward Snowden says that “privacy is the fountainhead of all other rights” (Schrodt 2016).

(Westin 1967) defines privacy as “the claim of individuals, groups, or institutions to determine for themselves when, how and to what extent information about them is communicated to others”.  He says that privacy is “the ability of the individual to control the terms under which personal information is acquired and used”.

(Reiman 1995) says “Privacy is the condition in which others are deprived of access to you” which he explains further as people being deprived of access to either some information about you or some experience of you.

(Tripathi, Tripathi 2010) say “Privacy can be defined as an individual’s freedom to decide to what extent he/she likes to share their intellectual, social and cultural life with others, or in other words to what extent others can invade into his/her private life”.

(Garoogian 1991) says that “Privacy, as the term is commonly used, means the unavailability to others of information about oneself”. She presents moral, legal and professional arguments for the protection of a patron’s privacy.

(Garoogian 1991) picks out a number of definitions of privacy:

  • the claims of individuals … to determine for themselves when, how and to what extent information about them is communicated to others (Buchanan 1982)
  • the condition enjoyed by one who can control the communication of information about himself. (Lusky 1972)
  • selective control of access to the self or to one’s group (Altman 1976)
  • control over when and by whom various parts of us can be sensed by others. By “sensed” is meant simply seen, heard, touched, smelled or tasted. (Thomson 1975)
  • [the] right that certain steps shall not be taken to find out facts [private facts] and … [the] right that certain uses shall not be made of [these] facts. (Thomson 1975)
  • having control over information about oneself. (Decew 1987)

The right to privacy has arguably always been used to get at thorny and hard-to-define problems because it touches on various more concrete rights – those of autonomy and the right to intellectual …

Scholars argue that the privacy concept remains definitionally nebulous, and that no universally accepted definition has emerged (Bennett, 1992; Flaherty, 1989).

As the influential privacy scholar Alan Westin put it, privacy as a notion is “part philosophy, some semantics and much pure passion” (Westin 1967).

Defining the elements of privacy, (Sturges 2002) outlined solitude, anonymity, bodily modesty, psychological integrity and confidentiality (in terms of shared information) as important components of privacy.

(Greenland 2013) says that privacy refers to notions such as an individual’s right to be respected, personal space, dignity and autonomy, as well as those aspects of a person’s life they wish to restrict access to or keep control of (McCullagh 2008).

(Morozov 2014) says that the goal of privacy is not to protect some stable self from erosion but to create boundaries where this self can emerge, mutate and stabilize.

The right to privacy protects an interest that has been defined as a “personal condition of life characterised by seclusion from, therefore absence of acquaintance by the public” (Norberto Nuno Gomes de Andrade, in Ghezzi, Pereira et al. 2014).

People confuse privacy with secrecy. I know what you do in the bathroom. It’s not secrecy you want in that instance, it’s privacy.

(Mai 2016) believes that we need to shift from definitions of privacy to models of privacy. He puts forward three models: the surveillance model, the capture model, and the datafication model, in which data is deduced by predictive analytics.

In a library context privacy is thought of as a right to open inquiry.

The American Library Association defines a right to privacy in a library, whether physical or virtual, as “the right to open inquiry without having the subject of one’s interest examined or scrutinized by others.” (American Library Association 2014).

(Caldwell-Stone 2012) defines users’ information privacy as the right to read and inquire anything, without the fear of being judged or punished.

These two statements, by trying to encapsulate a brief definition of library privacy, are in many regards too restrictive, because they seem to relate only to someone actively seeking out information.

Writing in the Intellectual Freedom Manual (American Library Association 2010), Deborah Caldwell-Stone says “Confidentiality exists when a library is in possession of personally identifiable information about library users and keeps that information private on their behalf”.

(Richards 2015) argues for a certain kind of privacy, one which is different from tort privacy, which he refers to as “intellectual privacy”, and which he believes to be essential if we care about freedom of expression. Indeed, he says that privacy and (freedom of) speech are not conflicting values but are mutually supportive. The three most important elements of intellectual privacy that Richards identifies are:

  1. the freedom of thought;
  2. the right to read (and engage in intellectual exploration);
  3. the right to communicate in confidence;

and he acknowledges that all three elements are related and build on one another.

(Richards 2015) (page 95) says that “intellectual privacy is the protection from surveillance or unwanted interference by others when we are engaged in the process of generating ideas and forming beliefs – when we’re thinking, reading and speaking with confidants before our ideas are ready for public consumption”.

(Richards 2015) believes that, because it is only comparatively recently that surveillance and monitoring have been happening on a large scale, intellectual privacy is under-developed and under-appreciated.

Intellectual privacy is under increasing threat and constant assault.

REFERENCES

ALTMAN, I., 1976. Privacy: a conceptual analysis. Environment and Behavior, 8(1), pp. 7-29.

AMERICAN LIBRARY ASSOCIATION, 2014. Privacy: an interpretation of the Library Bill of Rights Adopted June 19, 2002, by the ALA Council; amended on July 1, 2014.

AMERICAN LIBRARY ASSOCIATION, 2010. Intellectual freedom manual. 8th edn. American Library Association.

BAJPAI, K. and WEBER, K., 2017. Privacy in public: Translating the category of privacy to the digital age. From Categories to Categorization: Studies in Sociology, Organizations and Strategy at the Crossroads. Emerald Publishing Limited, pp. 223-258.

BUCHANAN, R., 1982. Garbage in, mischief out. Library Review, 31(1), pp. 30-34.

CALDWELL-STONE, D., 2012. A digital dilemma: ebooks and users’ rights. American Libraries.

CANNATACI, J.A., 2016. Report of the Special Rapporteur on the right to privacy. A/HRC/31/64.

CURRY, M.R., 1997. The digital individual and the private realm. Annals of the Association of American Geographers, 87(4), pp. 681-699.

DECEW, J., 1987. Defending the “private” in constitutional privacy. Journal of Value Inquiry, 21, pp. 171-184.

FINN, R.L., WRIGHT, D. and FRIEDEWALD, M., 2013. Seven types of privacy. In: S. Gutwirth et al., eds, European Data Protection: Coming of Age. Springer Netherlands, pp. 3.

GAROOGIAN, R., 1991. Librarian/patron confidentiality: an ethical challenge. Library Trends, 40(2), pp. 216-233.

GHEZZI, A., PEREIRA, A. and VESNIĆ-ALUJEVIĆ, L., 2014. The ethics of memory in a digital age: interrogating the right to be forgotten. Houndmills, Basingstoke, Hampshire: Palgrave Macmillan.

GREENE, J.K., 2014. Before Snowden: privacy in an earlier digital age. International Journal of Philosophy and Theology, 2(1), pp. 93-118.

GREENLAND, K., 2013. Negotiating self-presentation, identity, ethics, readership and privacy in the LIS blogosphere: a review of the literature. Australian Academic & Research Libraries, 44(4).

GUTWIRTH, S., 2002. Privacy and the information age. Lanham, Maryland: Rowman & Littlefield.

HARTZOG, W. and SELINGER, E., 2013. Obscurity: a better way to think about your data than “privacy”. The Atlantic, January 17.

LUSKY, L., 1972. Invasion of privacy: a clarification of concepts. Columbia law review, 72(4), pp. 693-710.

MAI, J., 2016. Big data privacy: the datafication of personal information. Information Society, 32(3), pp. 192-199.

MCCULLAGH, K., 2008. Blogging: self presentation and privacy. Information & Communications Technology Law, 17(1), pp. 3-23.

MOROZOV, E., 2014. To save everything, click here: the folly of technological solutionism. New York: PublicAffairs.

REIMAN, J.H., 1995. Driving to the panopticon: a philosophical exploration of the risks to privacy posed by the highway technology of the future. Santa Clara High Technology Law Journal, 11, pp. 27-43.

RICHARDS, N.M., 2008. Intellectual privacy. Texas Law Review, 87, p. 387.

RICHARDS, N., 2015. Intellectual privacy: rethinking civil liberties in the digital age. Oxford University Press.

ROTENBERG, M. and AGRE, P.E., 1998. Technology and privacy: the new landscape. MIT Press.

SCHRODT, P., 2016. Edward Snowden just made an impassioned argument for why privacy is the most important right. Business Insider, September 15.

STURGES, P., 2002. Remember the human: the first rule of netiquette, librarians and the Internet. Online Information Review, 26(3), pp. 209-216.

THOMSON, J.J., 1975. The right to privacy. Philosophy & Public Affairs, 4(4), pp. 295-314.

TRIPATHI, S. and TRIPATHI, A., 2010. Privacy in libraries: the perspective from India. Library Review, 59(8), pp. 615-623.

WESTIN, A.F., 1967. Privacy and freedom. 1st edn. New York: Atheneum.

 

Personal data ownership & control

What are the key issues around personal data ownership & control?

Here are some of my thoughts, which are a bit rough and ready; but I would be interested to know if you have other points that I haven’t included, or if any of the ones I have listed aren’t really key issues.

OWNERSHIP
Ownership or possession of oneself and one’s property, material & intellectual
Ownership of inferred data
Co-ownership
Sole ownership
Shared ownership
Are the lines of ownership ambiguous or clear?
Have ownership expectations been violated?

CONTROL
Do people have control over the information about themselves?
Can they control the flow of their own data?
Is control in the hands of the state, or of a corporate entity?
Is there any type or level of control that the data subject can exercise over their personal data?
Do they have control over the lifecycle of their data (generation, access, recording, usage)?
Level of control (full, partial, none)
Is it direct control, or indirect control?
Are they vulnerable to having their behaviour controlled by others? (social control)
Control over one’s own informational image?
Interpersonal boundary control
Genuine control over the dissemination of our personal information?
Control over the disclosure of personal data to third parties
Control over personal data flow
How is control achieved?
– notice and choice (the control illusion)
Do they have ex post control, in terms of the ability to check whether people are actually keeping to their obligations & promises?
Use of personal data stores or data vaults
Ability to manage one’s digital footprint
Ability to negotiate boundaries (cf Sandra Petronio’s communication privacy management theory)

The control illusion: Profiling, discrimination, and other inferential harms happen so remotely from the source as to remove any doubt that the “choice” offered to users who disclose personal information in the modern world is usually an illusion (Richards, Hartzog 2015)

The control paradox: There is a limit to how much control is helpful to consumers. In fact, too many privacy options may lead to users making poorer choices about their privacy by confusing them. For example, letting users specify that only their self-designated “friends” can see blog posts can be helpful for ensuring privacy, but giving users multiple definitions of friends (e.g. work friends, Tennessee friends, college friends) who all access the same profile can actually lead users to make more information about themselves available than if they were offered fewer, but easier-to-understand, choices (Flatow 2008)
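
Flatow’s point can be made concrete with a toy sketch. The profile, group names and members below are hypothetical, not any real platform’s API; the point is simply that when several overlapping “friend” categories all unlock the same profile, the effective audience is the union of those categories, which is easy to underestimate.

```python
# Hypothetical data: group names and members are invented for
# illustration and do not reflect any real platform's API.
friend_groups = {
    "work friends": {"asha", "bob"},
    "tennessee friends": {"bob", "carol", "dev"},
    "college friends": {"dev", "erin", "farid"},
}

# The user grants each category access to the same profile,
# reasoning about one small, familiar group at a time...
granted = ["work friends", "tennessee friends", "college friends"]

# ...but the audience that can actually see the profile is the union.
effective_audience = set().union(*(friend_groups[g] for g in granted))

print(f"Categories granted access: {len(granted)}")
print(f"People who can see the profile: {sorted(effective_audience)}")
# Three "small" groups add up to six distinct people, likely more
# disclosure than fewer, easier-to-understand choices would produce.
```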

REFERENCES

FLATOW, I., 2008. Web privacy concerns prompt Facebook changes. Science Friday.

RICHARDS, N.M. and HARTZOG, W., 2015. Taking trust seriously in privacy law.

 

Ontological frictions

Luciano Floridi’s (@floridi) concept of “ontological friction” relates to the forces that oppose the flow of information within a region of the infosphere. It is connected with the amount of effort required for some agent to obtain, filter, or block information about other agents in a given environment, by decreasing, shaping or increasing informational friction (Floridi 2014).

The key is getting the optimal level of friction in the infosphere.

I have tried to list a range of types of “frictions”, taking into account the analogue and the digital world. At the moment it’s very much an initial draft, and I am sure that it can be improved. But it would be interesting if anyone has any comments or feedback about the sorts of things that I have included.

FIVE SENSES

  • Smell
  • Taste
  • Touch
  • Sight
  • Hearing
  • Sound/noise

Relevant to bodily privacy, but not limited to that; it impinges on other areas such as the spatial, since – for example – the ability to see can be hampered by the use of opaque rather than clear glass, etc.

SPATIAL

  • Spaces
  • Office space – design and layout
  • Territories
  • Walls
  • Hidden spaces
  • Soundproofing
  • Doors / locked doors
  • Physical separation
  • Partitions / thin partitions
  • Curtains / closed curtains
  • Length, height, depth, distance
  • Materials (glass etc)
  • Architecture / material structures
  • Examples: unisex bathrooms in libraries
  • Aspatial (the internet)

DATA

  • Volume/amount of personal information in that region of the infosphere
  • Complexity of the information
  • Data localisation (confining data within a country’s borders, whether required by law or otherwise)
  • International interoperability in data protection & privacy
  • Why information is dispersed over space and time
  • Data locked away in corporate silos

TECHNOLOGY

  • Encryption
  • Secure networks
    – VPNs
    – Separation of staff wifi from user wifi etc
  • Strong passwords
  • 2FA
  • Use of blocking to inhibit tracking mechanisms
  • Password protections/password encoding
  • Firewalls
  • Specifically devised protocols or services
  • Warning systems (for externally captured data)
  • Privacy invasive technologies
  • Privacy enhancing technologies
  • Limited disclosure technology (eg Sudoweb, Facecloak)
  • Pro-active information security measures
  • Network penetration testing
  • Limiting editing/access rights to those who really need them
  • Ensuring the ability to undertake a forensic audit
  • Proactively taking measures to protect privacy
  • Clearing cookies and browser history
  • Deleting/editing something you posted in the past
  • Setting your browser to disable or turn off cookies
  • Not using a website because it asked for your real name
  • Using a temporary username/email address
  • Adblockers
  • Addons to prevent tracking (PrivacyBadger, Ghostery etc)

DIGITAL INFORMATION LITERACY

  • Knowledge is power – awareness of the risks, of how to minimise those risks, and of how to deal with things in the event of a data breach
  • Training
  • Awareness raising
  • Cryptoparties
  • Managing one’s digital footprint effectively

LEGAL / CONTRACTUAL BARRIERS

  • Contractual restrictions on user behaviour (eg prohibitions on scraping data, ensuring that only humans, as opposed to bots, can access online information)
  • Negotiation & enforcement of data handling conditions before a product/service is ordered online
  • Adoption of standards and guidelines such as the NISO privacy principles

OTHER ISSUES

  • Social norms
  • Context (is the information being used in a way it wasn’t given to the data processor for?)
  • Lack of resources (memory or time)
  • Obscurity
  • Practical obscurity
    – Difficulty of collecting
    – Available only in a physical library v a digital library
    – Difficulty of correlating
  • Obfuscation = “the deliberate use of ambiguous, confusing or misleading information to interfere with surveillance and data collection projects”, eg to camouflage users’ search queries or to stymie online advertising
  • Online obscurity – Hartzog & Stutzman (2013) identify four major factors:
    1. Search visibility (eg use of robots.txt, privacy settings, passwords or other access restrictions)
    2. Unprotected access (not using access controls such as a password, biometrics, encryption or privacy settings)
    3. Identification (ability to use pseudonyms [cf. the “nym wars”]); anonymisation
    4. Clarity (the information doesn’t make sense because it is intentionally vague or incomplete)
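
As a footnote to the obfuscation entry above, here is a minimal sketch of the decoy-query idea, in the spirit of tools like TrackMeNot: genuine search queries are interleaved with plausible dummies, so that a log of the traffic no longer cleanly profiles the user. The decoy vocabulary and the mixing ratio are invented for illustration.

```python
import random

# Invented decoy vocabulary for illustration; a real obfuscation tool
# would draw on a much larger, periodically refreshed corpus.
DECOY_QUERIES = [
    "weather tomorrow", "pasta recipes", "local bus times",
    "film reviews", "garden birds", "exchange rates",
]

def obfuscated_stream(real_queries, decoys_per_query=3):
    """Interleave each genuine query with shuffled decoys so that an
    observer of the log cannot cleanly separate signal from noise."""
    for query in real_queries:
        batch = [query] + random.sample(DECOY_QUERIES, decoys_per_query)
        random.shuffle(batch)
        yield from batch

for q in obfuscated_stream(["symptoms of measles"]):
    print(q)  # the sensitive query is buried among plausible noise
```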