Ontological frictions

Luciano Floridi’s (@floridi) concept of “ontological frictions” refers to the forces that oppose the flow of information within a region of the infosphere. It is connected with the amount of effort required for an agent to obtain, filter, or block information about other agents in a given environment; that friction can be decreased, shaped, or increased (Floridi 2014).

The key is getting the optimal level of friction in the infosphere.

I have tried to list a range of types of “frictions”, taking into account both the analogue and the digital world. At the moment it’s very much an initial draft, and I am sure that it can be improved. But I would welcome any comments or feedback about the sorts of things I have included.

Analogue (physical world)

Senses through which information flows:
– Smell
– Taste
– Touch
– Sight
– Hearing
– Sound/noise

Relevant to bodily privacy, but not limited to that. Impinges on other areas such as spatial privacy, since – for example – the ability to see is hampered by the use of opaque rather than clear glass, etc.

Physical frictions that shape those flows:
– Spaces
– Office space – design and layout
– Territories
– Walls
– Hidden spaces
– Doors / locked doors
– Physical separation
– Partitions
– Curtains / closed curtains
– Thin partitions
– Materials (glass etc)
– Material structures
– Examples: unisex bathrooms in libraries
Aspatial (internet)
Characteristics of the information flow:
– Volume/amount of personal information in that region of the infosphere
– Complexity of the information
– Data localisation (confining data within a country’s border, whether required by law or otherwise)
– International interoperability in data protection & privacy
– Why information is dispersed over space and time
– Data locked away in corporate silos

Protective measures:
– Encryption
– Secure networks
– VPNs
– Separation of staff wifi from user wifi etc
– Strong passwords
– 2FA
– Use of blocking to inhibit tracking mechanisms
– Password protections/password encoding
– Specifically devised protocols or services
– Warning systems (for externally captured data)
– Privacy Invasive Technologies
– Privacy Enhancing Technologies
– Limited disclosure technology (eg Sudoweb, FaceCloak)
– Pro-active information security measures
– Network penetration testing
– Limiting editing/access rights to those who really need them
– Ensuring the ability to undertake a forensic audit
– Proactively taking measures to protect privacy:
  – Clearing cookies and browser history
  – Deleting/editing something you posted in the past
  – Setting your browser to disable or turn off cookies
  – Not using a website because it asked for your real name
  – Using a temporary username/email address
  – Add-ons to prevent tracking (Privacy Badger, Ghostery etc)
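Some of the measures above, such as strong passwords and 2FA, are very concrete pieces of friction. As a small illustration of how a second factor raises the effort required of an attacker, here is a minimal sketch of an HMAC-based one-time password (RFC 4226) and its time-based variant (RFC 6238, the scheme authenticator apps use), written with only the Python standard library; the shared secret shown in the usage note is the RFC's published test value, not anything real.

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password."""
    msg = struct.pack(">Q", counter)                      # 8-byte counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                            # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, period: int = 30) -> str:
    """RFC 6238 time-based variant: the counter is the current 30s window."""
    return hotp(secret, int(time.time()) // period)
```

For example, `hotp(b"12345678901234567890", 0)` yields `"755224"`, the first test vector from RFC 4226; a server and a phone that share the secret will agree on the code without it ever crossing the network.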
Knowledge is power – awareness of the risks, of how to minimise those risks, and of how to deal with things in the event of a data breach:
– Training
– Awareness raising

Contractual and governance measures:
– Contractual restrictions on user behaviour (eg prohibitions on scraping data, ensuring that only humans, as opposed to bots, can access online information)
– Negotiation & enforcement of data handling conditions before a product/service is ordered online
– Adoption of standards and guidelines such as the NISO privacy principles
– Managing digital footprint effectively
Social norms:
– Context (is the information being used in a way it wasn’t given to the data processor for?)
– Lack of resources (memory or time)

Obscurity:
– Practical obscurity
  – Difficulty of collecting
  – Available only in a physical library v a digital library
  – Difficulty of correlating
– Obfuscation – “the deliberate use of ambiguous, confusing, or misleading information to interfere with surveillance and data collection projects”, eg to camouflage users’ search queries or to stymie online advertising
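The obfuscation idea can be sketched in a few lines: mix a genuine search query in with decoys, so that an observer of the traffic cannot tell which query reflects the user’s real interest. This is roughly what tools in the TrackMeNot family do; the decoy vocabulary below is an invented placeholder, where a real tool would draw decoys from popular-query lists or news feeds so they blend in.

```python
import random

# Placeholder decoy vocabulary (invented for illustration).
DECOYS = ["weather tomorrow", "easy pasta recipes", "train times",
          "football results", "cheap flights"]

def obfuscated_queries(real_query: str, n_decoys: int = 3) -> list[str]:
    """Return the real query hidden among randomly chosen decoys,
    in shuffled order, so an observer sees n_decoys + 1 queries."""
    batch = random.sample(DECOYS, n_decoys) + [real_query]
    random.shuffle(batch)
    return batch
```

The friction here is deliberately asymmetric: the user pays almost nothing, while anyone profiling the query stream has to work out which items are noise.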
Online obscurity (Hartzog & Stutzman 2013 identify four major factors):
1. Search visibility (eg use of robots.txt, privacy settings, passwords or other access restrictions)
2. Unprotected access (not using access controls such as a password, biometrics, encryption, or privacy settings)
3. Identification (ability to use pseudonyms [cf. the “nym wars”]); anonymisation
4. Clarity (whether the information makes sense, or is intentionally vague or incomplete)
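On the first factor, search visibility: a robots.txt file is a very low-cost friction that asks well-behaved crawlers to keep certain pages out of search indexes (it does not stop a determined actor). A minimal check using Python’s standard library, with an invented domain and paths:

```python
from urllib.robotparser import RobotFileParser

# An example robots.txt asking all crawlers to skip a "private" area.
# Note this is a polite request, not an access control.
rules = """
User-agent: *
Disallow: /private/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

print(parser.can_fetch("*", "https://example.org/private/notes.html"))  # False
print(parser.can_fetch("*", "https://example.org/public/index.html"))   # True
```

A compliant crawler never indexes the disallowed pages, so they become practically obscure to search even though they remain technically reachable.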