Model for ontological frictions (cf. @Floridi)

Many of the frictions that determine the ease/difficulty with which one can access personally identifiable information fall under the heading of technology.

Whilst technology accounts for many frictions, it certainly doesn't cover all of them, so I have been trying to think about how to encapsulate the various frictions in a single model.

Yesterday, I very tentatively posted a slide on Twitter about the opposing forces of privacy-invasive technologies versus privacy-enhancing technologies, and one response was simply to say "brilliant". I had been quite wary of posting the slide, because it only tells part of the story. Other aspects might include privacy by design and differential privacy.
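To make the differential privacy point a little more concrete, here is a minimal, illustrative sketch (in Python; the function names and parameters are my own, not any particular library's API) of the Laplace mechanism, the textbook way of answering a counting query with enough calibrated random noise that any one individual's presence or absence in the data is obscured:

```python
import random

def laplace_noise(scale: float) -> float:
    # The difference of two i.i.d. exponential draws is Laplace(0, scale).
    return random.expovariate(1 / scale) - random.expovariate(1 / scale)

def private_count(records, predicate, epsilon: float = 0.5) -> float:
    # A counting query has sensitivity 1: one person joining or leaving the
    # data changes the true answer by at most 1, so noise scale = 1 / epsilon.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1 / epsilon)

# Illustrative use with made-up data: how many people are over 40?
people = [{"age": 34}, {"age": 51}, {"age": 47}, {"age": 29}]
print(private_count(people, lambda p: p["age"] > 40, epsilon=0.5))
```

For the purposes of the model, the interesting thing is that the noise itself is a deliberately engineered friction placed between the raw personal data and whoever is querying it.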

My efforts to come up with a suitable model are only in their initial stages, and right now it's very much a work in progress.

Thinking of the technology segment, the nitty gritty might cover things like

  • Encryption
  • Secure networks
    – VPNs
    – Separation of staff wifi from user wifi, etc.
  • Strong passwords
  • 2FA
  • Use of blocking to inhibit tracking mechanisms
  • Password protections/password encoding (see the sketch after this list)
  • Specifically devised protocols or services
  • Warning systems (for externally captured data)
  • Limited disclosure technology (eg Sudoweb, Facecloak)
  • Pro-active information security measures
  • Network penetration testing
  • Limiting editing/access rights to those who really need them
  • Ensuring ability to undertake a forensic audit
  • Firewall
  • Proactively take measures to protect privacy
  • Clear cookies and browser history
  • Delete/edit something you posted in past
  • Set your browser to disable or turn off cookies
  • Adblockers
  • Addons to prevent tracking (PrivacyBadger, Ghostery etc)
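To ground one of these items, here is a minimal, hypothetical sketch of the "password protections/password encoding" point: salted password hashing with Python's standard library, one common way a service keeps stored credentials from being directly readable. The function names, salt length and iteration count are illustrative assumptions, not any particular system's implementation.

```python
import hashlib
import hmac
import os

ITERATIONS = 600_000  # illustrative work factor; higher = slower to brute-force

def hash_password(password: str) -> tuple[bytes, bytes]:
    # A fresh random salt per password defeats precomputed rainbow-table attacks.
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, stored: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    # Constant-time comparison avoids leaking a match through response timing.
    return hmac.compare_digest(candidate, stored)

salt, stored = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, stored))  # True
print(verify_password("guess", salt, stored))                         # False
```

The per-password salt and the constant-time comparison are the two details that most often go missing in naive implementations, and both are frictions in exactly the sense the model is trying to capture: they make personal data harder to extract even after a breach.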

The technology piece could be seen in the larger context of regulation (cf. Lessig's notion of code as law). So I took David Haynes' regulation model, which covers four elements (norms, law, code, and self-regulation), and tried to think from there about all of the other types of friction that aren't covered by those headings.

It is definitely not easy to make sense of and categorise the other elements, not least because they aren't necessarily mutually exclusive.

For example, I decided to create a heading for "Obscurity" to cover obscurity, practical obscurity and obfuscation. These can in many cases be achieved through technology, but not necessarily. Making a deliberate decision NOT to digitise a work would be a means of achieving practical obscurity, and of ensuring that access to its content was far more restricted than it would ever have been had it been digitised. And if that work contained sensitive personal data about individuals, the decision not to digitise would have restricted the flow of information, and would therefore be one of the frictions that Floridi refers to as "informational friction" or "ontological friction".

For the moment, other than the regulation segments, the headings I have come up with are:

  • Temporal
  • Spatial
  • Sensory
  • Nature of the data
  • Obscurity
  • Digital information literacy (thinking specifically of digital privacy literacy)
  • Digital privacy protection behaviour
  • Contextual integrity (Nissenbaum)