10 library-related examples of temporal ontological friction

First of all, I should make clear what I am talking about when I refer to “ontological friction”. I am thinking of the forces that oppose the flow of information within a region of the infosphere. It is connected with the amount of effort required for some agent to obtain, filter, or block information about other agents in a given environment, by decreasing, shaping, or increasing informational friction (Floridi 2014).

In other words, those things that make it easier or harder to have direct access to personal data.

And, more specifically, frictions in the temporal dimension – so time-related frictions.


Gutwirth, Leenes et al. (2014) refer to temporal ontological restrictions (like the opening times of libraries). To me that seemed a very odd choice. Clearly library opening hours determine ease of access to printed books (though not to the electronic resources offered), but I am stumped as to how they would limit access to personally identifiable information.

Nevertheless, it spurred me on to think of temporal ontological frictions in a library context. So here are ten I came up with:

  1. How long is a library user’s reading history retained by default?
  2. Does the user have a choice as to whether the data is retained or not?
  3. And can they personalise precisely how long they want the data kept?
  4. How long are details of previous searches kept on each of the online services the library subscribes to?
  5. Is user data anonymised at a certain point in time? If so, when?
  6. What is the average response time for data subject access requests?
  7. How long are any paper-based records for computer bookings kept?
  8. For what length of time is any library CCTV footage held?
  9. Are public access computers restored to their native state after each user has finished their session?
  10. When was the last time any data protection training was provided for library staff?
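Several of these questions boil down, in practice, to retention schedules. A minimal sketch of how such a schedule might be encoded and enforced in a library management system (the record kinds and retention periods here are hypothetical, not taken from any real system or policy):

```python
from datetime import datetime, timedelta

# Hypothetical retention periods; real values would come from the
# library's data protection policy, ideally user-configurable.
RETENTION = {
    "reading_history": timedelta(days=90),
    "search_logs": timedelta(days=30),
    "computer_bookings": timedelta(days=14),
    "cctv_footage": timedelta(days=28),
}

def purge_expired(records, now=None):
    """Drop any record older than the retention period for its kind."""
    now = now or datetime.utcnow()
    return [r for r in records
            if now - r["created"] <= RETENTION[r["kind"]]]

records = [
    {"kind": "reading_history", "created": datetime(2017, 1, 1)},
    {"kind": "search_logs", "created": datetime(2017, 6, 1)},
]
# The reading-history record is well past 90 days old and is purged;
# the 9-day-old search log survives.
kept = purge_expired(records, now=datetime(2017, 6, 10))
```

The point of making the schedule an explicit data structure is that questions 1–3 above then have inspectable answers rather than being buried in ad hoc deletion scripts.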



FLORIDI, L., 2014. The 4th revolution: how the infosphere is reshaping human reality. Oxford, United Kingdom: Oxford University Press.

GUTWIRTH, S., LEENES, R., DE HERT, P. and SPRINGERLINK EBOOK COLLECTION, 2014. Reloading Data Protection: Multidisciplinary Insights and Contemporary Challenges. Dordrecht: Springer Netherlands.


The “nothing to hide” response – what it should really say

When people object to surveillance they are told rather tritely that if they have nothing to hide, they have nothing to fear. I think that’s a bogus argument because it conveniently overlooks the dimension of power and control. But what I found interesting was the comment in “The spy in the coffee machine” (O’Hara & Shadbolt, 2008), where they say:

A response that would be correct, but somewhat less persuasive, would be “if you keep within the law, and the government keeps within the law, and its employees keep within the law, and the computer holding the database doesn’t screw up, and the system is carefully designed according to well-understood software engineering principles and maintained properly, and the government doesn’t scrimp on the outlay, and all the data are entered carefully, and the police are adequately trained to use the system, and the system isn’t hacked into, and your identity isn’t stolen, and the local hardware functions well, you have nothing to fear”.


Using “ontological friction” to achieve privacy in libraries

Luciano Floridi envisages “ontological friction” as referring to the forces that oppose the information flow within (a region of) the infosphere, and hence (as a coefficient) to the amount of work required for a certain kind of agent to obtain information.

Utilising the typology of privacy by Koops et al (https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2754043) I have tried in the table below to think of examples for each of the privacy categories, and for these to be relevant to the work of the library & information sector.

This is very much a work in progress, because I have only just started to think of it in terms of the privacy classifications used by Koops et al. So by all means if you have any comments or feedback, do let me know (@priv_lib).

Typology of privacy & ontological friction

Bodily privacy
Description: Direct and indirect invasions of bodily integrity.
Examples of ontological friction: An indirect physical intrusion would cover use of a facial recognition system, as it enables information to be obtained about a person’s body without physical contact. Use of fingerprints as a means of authentication in a school library (to do so compulsorily and against the will of the pupil would contravene the Protection of Freedoms Act 2012).

Spatial privacy
Description: Privacy expectations in and around one’s home (and possibly also the workplace).
Examples of ontological friction: Privacy screens around bookable public access computers. Thick walls/partitions between the different functions in a shared services building (co-location of library, customer services, housing etc.). The level of friction can be determined by the thickness of the walls, the use of glass for the partitions, etc.

Communicational privacy
Description: Violated by, for example, intercepting personal communications (such as opening or reading mail or using bugs), eavesdropping, or accessing stored communications without consent.
Examples of ontological friction: The level of informational friction will be determined by the type of communication used, such as verbal communication, email, or social networking.

Proprietary privacy
Description: Reputational and image management.
Examples of ontological friction: Librarian bloggers being respectful of user privacy if they write blog posts, books, or articles which cover their interactions with patrons (the “Refgrunt” genre).

Intellectual privacy
Description: Privacy of thought and mind; development of opinions and beliefs.
Examples of ontological friction: Records of intellectual activity in the hands of third-party vendors. “Social reading”: automatic disclosure of reading habits to one’s friends, giving internet users suggestions of new and interesting things to read. Care over the use of embedded content, specifically book covers in library catalogues, to ensure you aren’t leaking catalogue searches to Amazon. No right to possess stolen library books.

Decisional privacy
Description: Concerns the freedom to make decisions about one’s body and family: matters such as contraception, procreation, abortion, and child rearing. Freedom from interference in one’s personal choices, plans, and decisions.
Examples of ontological friction: A school library banning books on sensitive topics such as abortion, homosexuality, or sexual intercourse.

Associational privacy
Description: Freedom to connect with whomever or with whichever group one chooses without being monitored.
Examples of ontological friction: Library recruiters undertaking invasive and unnecessary background checks which capture sensitive information about respondents’ affiliations and memberships.

Behavioural privacy
Description: Activities that happen in both public and private places, encompassing sensitive issues.
Examples of ontological friction: Banning filming in the library without permission. Requirements to keep conversation levels low or quiet.

Informational privacy
Description: Encompasses information/data/facts about persons or their communications.
Examples of ontological friction: Use of https:// on library websites.
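The book-cover example under intellectual privacy can be made concrete. If the catalogue page embeds cover images directly from a third-party URL containing the ISBN, every record a user views is disclosed to that third party by the browser’s image request. One mitigation is to rewrite those URLs so they point at a same-origin proxy on the library’s own server. A minimal sketch, in which the CDN hostname and URL pattern are invented purely for illustration:

```python
import re

# Hypothetical third-party cover URL pattern; a real catalogue would
# match whichever CDN it actually embeds.
THIRD_PARTY = re.compile(
    r"https://covers\.example-cdn\.com/isbn/(\d{10,13})\.jpg")

def rewrite_cover_urls(html):
    """Point cover images at a same-origin proxy path, so the ISBN a
    user is viewing is never disclosed to the third party by their
    browser. The proxy endpoint would fetch and cache covers itself."""
    return THIRD_PARTY.sub(r"/cover-proxy/\1.jpg", html)

page = '<img src="https://covers.example-cdn.com/isbn/9780140449136.jpg">'
rewritten = rewrite_cover_urls(page)
```

The third party then only ever sees requests from the library’s server, not individual patrons’ browsers, which is precisely an increase in informational friction.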


Privacy from a company perspective

At a recent professional event, one of the speakers made the point that companies only act ethically when it is in their financial interests to do so, which – if it were true (and of course one shouldn’t generalise) – would mean that they aren’t driven by ethics at all.

So it was interesting today to be reading about Company Information Privacy Orientation (CIPO) (Greenaway, Chan et al. 2015). Far too often, people concentrate on privacy from the perspective of the individual. I say “too often”, because that comes at the expense of considering things from a wider view (of society as a whole, at a group level, or, as in this case, from the perspective of companies). Greenaway et al frame privacy in terms of control and justice, and believe that firms are influenced by three determinants:

– ethical obligations

– information management strategies

– legal risk assessments

And the result is that they put forward four categories of privacy orientation:

1. privacy ignorers

2. privacy minimisers

3. privacy balancers

4. privacy differentiators

and they even give real-life examples of companies which exemplify each of these categories.


Privacy ignorers – these companies provide little to no control or procedural justice for their customers regarding the control of information collected by the firm, and they rely on individuals taking personal control of their information by choosing when to provide it.


Privacy minimisers – these companies engage in only as many privacy behaviours as are necessary to avoid legal action. Responding to the minimum requirements of government regulation is considered to be a key driver of an organisation’s privacy approach.


Privacy balancers – these companies adhere to industry or professionally based privacy codes, including those codes that exceed legal requirements. In this manner, they embrace both the letter and the spirit of privacy laws.


Privacy differentiators – these companies see privacy as a competitive advantage. They are most likely to offer significantly enhanced privacy protection as part of their business strategy in order to set themselves apart from their peer group.


Greenaway et al raise a number of interesting points worthy of future research:

1. Why do different companies adopt particular privacy orientations?

2. Is the CIPO explicit or emergent, in other words do they choose a specific privacy orientation from the outset or does it emerge based on a review of their past decisions?

3. Is the CIPO static or does it change over time in the light of decisions and deliberate behavioural changes?



GREENAWAY, K.E., CHAN, Y.E. and CROSSLER, R.E., 2015. Company information privacy orientation: a conceptual framework. Information Systems Journal, 25, pp. 579-606.


Privacy & library user registration

Today I had a quick look at the information required by a number of different library authorities for anyone wishing to join the library.

The first point I would make is: why don’t all library authorities automatically refer to a privacy policy statement at the time that they collect this data, whether on the printed form or the online registration form? Indeed, one could also ask whether any of the library staff who receive completed (printed) application forms mention privacy to the person applying for library membership. Wouldn’t it be more reassuring if they did? Wouldn’t it demonstrate that they take it seriously?

Secondly, a number of them want date of birth. But they don’t explain why. If a library authority lets users above a certain age threshold enjoy services for free that are chargeable to others (such as a charge for borrowing CDs or DVDs), why don’t they make clear that this is the case? In any event, is that the sole reason for wanting the date of birth?

The online registration forms may well be set up so that the date of birth field is a compulsory field which must be completed in order to successfully submit an application to join the library.

But it is interesting to see just how many other things some of the forms ask for. For example, in one case they want to know:

  • Gender (including an option for “transgender”)
  • Age
  • Disability, where there are choices for mental health condition, physical impairment, non-visible impairment, visual impairment, hearing impairment
  • Religion or belief
  • Sexual orientation
  • Ethnicity (which includes a choice for gypsy/traveller)

Whilst a few of the choices may have a “prefer not to say” option, that isn’t the case with all of the ones that data protection law would consider to be “sensitive personal information” requiring added protections.

The amount of information required varies from one authority to another, which raises the question of why so much information is required by some authorities, when others don’t ask for, and evidently don’t need, that amount of information at all.



Is there any evidence of a chilling effect on reading/web searching behaviour from fear of being watched?

In a privacy context, the term “chilling effect” refers to the way in which people may modify their behaviour, or self-censor, because they fear that they are being watched, or know for certain that they are. The term can be applied in a number of different areas of law. For example, in the law of libel, somebody might choose to exclude a particular paragraph or sentence from an article, or indeed might opt not to write the piece at all, for fear of falling foul of libel law.

Jeremy Bentham designed a prison, known as the panopticon, as a means of keeping prisoners under control. Its design consisted of cells arranged radially in a circle around a central tower. The cells were backlit so that anyone in the tower could see everything going on in them, but, crucially, the prisoners themselves weren’t able to see whether anyone was in the tower. The idea behind the building’s design was that prisoners would assume that the guards were watching them, and would act accordingly. The panopticon is therefore an instrument for exerting power and control.

Michel Foucault used the panopticon as a metaphor for the impact of surveillance where you feel visible and exposed, and feel the power of the potential surveillance (Foucault 1977).

“Records must be protected from the self-appointed guardians of public and private morality and from officials who might overreach their constitutional prerogatives. Without such protection, there would be a chilling effect on our library users as inquiring minds turn away from exploring varied avenues of thought because they fear the potentiality of others knowing their reading history” (Quad/Graphics v. S Adirondack 174 Misc.2d 291 (1997), 664 N.Y.S 2d 225. 1997).

You might not visit certain websites if you thought that the government would use this as evidence against you. You might choose to limit your reading choices to the mainstream, to the bland and the boring rather than read something controversial, racy, or embarrassing.

Is there any evidence that library users change their behaviour when they think or know that they are being watched?

Some of the examples below are specifically in a library context. Given that many library visits aren’t to borrow books, but to use the computers, the other examples are also relevant as they cover searching the internet:

  • A study at Central Michigan University’s Park Library found that LGBT material was borrowed 20% more often via self-checkout than at the traditional circulation desk (Mathson, Hancks 2008).
  • Following the Snowden revelations of mass surveillance, users’ search behaviour changed, including a drop in traffic for search terms rated as personally sensitive (Marthews, Tucker 2015).
  • In Norway you can look up anybody’s tax records. However, people must now log onto the tax system and thereby leave a trail of which records they have looked at. The number of searches fell by 90 percent from 2013 to 2014 as a result.
  • A young woman stopped short of printing out her research on sexually transmitted diseases when she learnt that the printer was at the front desk. Source: “Privacy concern, printer control clash at library” (Fillo 1999).

The right to read

Neil Richards argues for a certain kind of privacy, one which is different from tort privacy, which he refers to as “intellectual privacy”, and which he believes to be essential if we care about freedom of expression (Richards 2015). Indeed, he says that privacy and (freedom of) speech are not conflicting values but are mutually supportive. The three most important elements of intellectual privacy that Richards identifies are:

  1. the freedom of thought;
  2. the right to read (and engage in intellectual exploration); and
  3. the right to communicate in confidence.

Reading is often an act of fantasy, and fantasy cannot be made criminal without imperilling the freedom to think as one wants. Moreover, the chilling effect of such an intrusion into intellectual privacy could cause others to skew their reading habits for fear of attracting the attention of the government (Richards 2008).

The United States Supreme Court first protected the right to read anonymously in Lamont v Postmaster General 381 U.S. 301 (1965), in a decision that struck down as unconstitutional a law requiring individuals to identify themselves in order to receive publications that were allegedly Communist propaganda. The Court’s opinion relied on the “chilling effect”.

Mass surveillance by the state results in the chilling effect of limiting our intellectual privacy, inhibiting our ability to seek out new ideas that conflict with the status quo, and therefore inhibiting our autonomy (Clark 2016).


Libraries are not the only place that private records of intellectual activity can be found. Other examples include:

  • lists of who purchased which books from bookstores; or
  • publishers’ records of who subscribes to a particular journal title.

In re Grand Jury Subpoena to Kramerbooks & Afterwords, Inc., 26 Med. L. Rptr. 1599 (1998), the U.S. District Court for the District of Columbia held that the government must meet strict scrutiny when defending subpoenas to a bookstore for customers’ book-purchase records. The case involved Independent Counsel Kenneth Starr’s attempt to obtain Monica Lewinsky’s book-purchase records. Starr was trying to establish whether or not Lewinsky had purchased a novel by Nicholson Baker entitled “Vox”, which chronicles an intimate and graphically sexual telephone conversation. In holding that the government would have to overcome strict scrutiny, the court stated that “[t]he bookstores and Ms. Lewinsky have persuasively alleged a chilling effect on their First Amendment rights.”

In Tattered Cover Inc v City of Thornton 44 P. 3d 1044 (Colo 2002) six police officers entered the Tattered Cover Bookstore in Denver with a search warrant for a specific customer’s book-purchase records. Joyce Meskis, who was the store’s owner, refused to hand over the records because she was concerned that complying with the warrant would violate the customer’s First Amendment and privacy rights.

In holding that Tattered Cover did not have to turn over the customer’s book records, the court first explained that its decision vindicated both the rights of the bookstore and of the book-buying public in general. Both the United States Constitution and the Colorado Constitution safeguard the right of the public to buy and read books anonymously, free from governmental intrusion.

The court stated that “when a person buys a book at a bookstore, he engages in activity protected by the First Amendment because he is exercising his right to read and receive ideas and information.” Otherwise, the right to free expression would be meaningless if the right to receive the thoughts that someone else was free to express was not similarly protected. But the court went further, expressing the right in more affirmative terms: “Everyone must be permitted to discover and consider the full range of expression and ideas available in our “marketplace of ideas”. Additionally, fearful that any governmental inquiry into the book buying habits of the public would “almost certainly chill their constitutionally protected rights,” the court emphasized the importance of anonymity.

Accurate information can tell inaccurate stories:

– reading a murder mystery does not make someone a murderer

– tracking someone’s location to outside a shop minutes after it has been robbed doesn’t make them the robber

– reading a book about aphasia doesn’t automatically mean that the person reading the book is afflicted by the condition

People read books for an infinite variety of reasons, and drawing generalized conclusions from another’s reading choices wrongly assumes that the most obvious reason is always the correct one.

Clifford Lynch says: “remember that knowledge of actual reading activity rather than simply knowing what texts have been accessed or acquired still does not guarantee understanding of the values, beliefs, opinions, or intentions within a given human mind. We can only hope that governments, and commercial data collectors and exploiters, know this as well” (Lynch 2017).

Why does this matter? It matters because of the way in which courts have used records of what people have read as an indication of intent (see, for example, United States v Curtin 489 F.3d 935, 956 (9th Cir. 2007)).

Penney (2016) undertook an empirical study providing evidence of regulatory “chilling effects” on Wikipedia users associated with online government surveillance. The study explored how traffic to Wikipedia articles on topics that raise privacy concerns for Wikipedia users decreased after the widespread publicity about the NSA/PRISM surveillance revelations in June 2013. Using an interdisciplinary research design, the study tested the hypothesis, based on chilling effects theory, that traffic to privacy-sensitive Wikipedia articles fell after the mass surveillance revelations. Penney found not only a statistically significant immediate decline in traffic for these Wikipedia articles after June 2013, but also a change in the overall secular trend in the view count traffic, suggesting not only immediate but also long-term chilling effects resulting from the NSA/PRISM online surveillance revelations. This study is among the first to evidence—using either Wikipedia data or web traffic data more generally—how government surveillance and similar actions may impact online activities, including access to information and knowledge online. (Penney 2016)
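Penney’s design is, in essence, an interrupted time-series (segmented) regression: fit a level and a trend before the June 2013 revelations, then test for a discontinuity in level and a change in slope afterwards. A toy sketch on synthetic view-count data (the numbers are invented, purely to show the shape of the model, not Penney’s actual data or code):

```python
import numpy as np

# Synthetic monthly article views: steady upward trend, then an
# abrupt drop and a flatter trend after month 16 (standing in for
# the June 2013 revelations).
t = np.arange(32)
post = (t >= 16).astype(float)
views = 1000 + 5 * t - 200 * post - 3 * post * (t - 16)

# Design matrix: intercept, pre-existing trend, immediate level
# change at the interruption, and change in slope afterwards.
X = np.column_stack([np.ones_like(t), t, post, post * (t - 16)])
coef, *_ = np.linalg.lstsq(X, views, rcond=None)
level_change, slope_change = coef[2], coef[3]
```

On real, noisy traffic data the interesting questions are whether `level_change` (the immediate decline) and `slope_change` (the shift in secular trend) are statistically distinguishable from zero, which is what Penney tests.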

Proving a causal link to the “chilling effect” is not easy. As Penney acknowledges: “Privacy theorists, security researchers, and social scientists have also expressed skepticism about the possibility of large scale chilling effects caused by online surveillance. One reason for such skepticism is increasing public acceptance of, or desensitization to, privacy and surveillance concerns, particularly in new technological contexts. Indeed, some research in the field suggests that any chilling effects would, at the very most, be temporary or ephemeral, as online users have changed their behavior in response to shifting norms” (Penney 2016).

Social science research has long illustrated that self-reported or expressed concerns about privacy do not necessarily reflect people’s actual behavior online, a phenomenon sometimes referred to as the “privacy paradox.”

The theory of “chilling effects” received a comprehensive exploration in Schauer’s “Fear, Risk and the First Amendment: Unraveling the Chilling Effect”. Schauer conceived of chilling effects as primarily resulting from people’s fear of prosecution or legal sanction and the uncertainties of the legal process (Schauer 1978).

Marthews and Tucker (2015) found a statistically significant 5% reduction in Google searches for certain privacy-sensitive search terms after Edward Snowden’s revelations in June 2013. Their study not only provides evidence of chilling effects, but also offers a research design that may be employed to study chilling effects in other online contexts (Penney 2016). However, as Penney says, “the authors obtained their data from Google Trends, which provides Google search data in “normalized” or adjusted format. The search data is normalized in two ways. First, the data represents only a percentage of total Google searches for any given term. Second, Google then “adjusts” the search data to render comparisons across regions more easily; these results are further “scaled to a range of 0 to 100”. This, the authors admitted, meant it was “harder to make projections” based on the findings of the study—such as resulting “economic outcomes”.
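That “scaled to a range of 0 to 100” adjustment is the crux of why absolute volumes (and hence economic projections) cannot be recovered from Trends data. One plausible reading of the transformation is peak-relative scaling: each point in the normalised series is divided by the series maximum and multiplied by 100, so the peak reads 100. A sketch of that reading (the share figures are invented):

```python
def trends_scale(series):
    """Scale a series of search shares so that its peak is 100,
    as (on one reading) Google Trends does. Absolute search volumes
    are lost: only shape relative to the peak survives."""
    hi = max(series)
    return [round(100 * x / hi) for x in series]

# Hypothetical weekly shares of total searches for one term.
weekly_share = [0.021, 0.035, 0.014, 0.028]
scaled = trends_scale(weekly_share)
```

Two terms with very different real volumes can thus produce identical Trends curves, which is exactly the limitation Marthews and Tucker concede.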

PEN America’s research documents the chilling effect of encroaching surveillance on creativity and free expression:

  • 28% have curtailed or avoided social media activities, and another 12% have seriously considered doing so;
  • 24% have deliberately avoided certain topics in phone or email conversations, and another 9% have seriously considered it;
  • 16% have avoided writing or speaking about a particular topic, and another 11% have seriously considered it;
  • 16% have refrained from conducting Internet searches or visiting websites on topics that may be considered controversial or suspicious, and another 12% have seriously considered it;
  • 13% have taken extra steps to disguise or cover their digital footprints, and another 11% have seriously considered it;
  • 3% have declined opportunities to meet (in person, or electronically) people who might be deemed security threats by the government, and another 4% have seriously considered it. (PEN America 2013)

Stoycheff undertook a study exploring how perceptions and justification of surveillance practices may create a chilling effect on democratic discourse by stifling the expression of minority political views. Using a spiral of silence theoretical framework, knowing one is subject to surveillance and accepting such surveillance as necessary were found to act as moderating agents in the relationship between one’s perceived climate of opinion and willingness to voice opinions online. (Stoycheff 2016)


CLARK, I., 2016. The digital divide in the post-Snowden era. Journal of Radical Librarianship, 2, pp. 1-32.

FILLO, M., 1999. Privacy concern, printer control clash at library. The Hartford Courant.

FOUCAULT, M., 1977. Discipline and punish: the birth of the prison. Pantheon Books.

LYNCH, C., 2017. The rise of reading analytics and the emerging calculus of reader privacy in the digital world. First Monday, 22(4).

MARTHEWS, A. and TUCKER, C., 2015. Government surveillance and internet search behavior.

MATHSON, S. and HANCKS, J., 2008. Privacy Please? A comparison between self-checkout and book checkout desk circulation rates for LGBT and other books. Journal of Access Services, 4(3-4), pp. 27-37.

PEN AMERICA, 2013. Chilling effects: NSA surveillance drives US writers to self-censor. New York: PEN American Center.

PENNEY, J., 2016. Chilling effects: Online surveillance and Wikipedia use.

Quad/Graphics v. S Adirondack 174 Misc.2d 291 (1997), 664 N.Y.S 2d 225.

RICHARDS, N.M., 2008. Intellectual privacy. Tex. L. Rev., 87, pp. 387.

RICHARDS, N., 2015. Intellectual privacy: rethinking civil liberties in the digital age. Oxford University Press.

SCHAUER, F., 1978. Fear, risk and the first amendment: Unraveling the chilling effect. BUL rev., 58, pp. 685.

STOYCHEFF, E., 2016. Under Surveillance: Examining Facebook’s Spiral of Silence Effects in the Wake of NSA Internet Monitoring. Journalism & Mass Communication Quarterly, 93(2), pp. 296-311.


Literature on privacy of individual, the group, and society

BERNAL, P., 2013. Individual privacy vs. collective security? No! Paul Bernal’s blog, October 17.

BLOUSTEIN, E.J., 2004. Individual and group privacy. 2nd printing edn. New Brunswick: Transaction Publishers.

CANNATACI, J.A., 2016. Getting things done in privacy protection part 2: another dimension of privacy: communal privacy, privacy of the community and personality. Privacyandpersonality.org, June 5.

DE WOLF, R., 2016. Group privacy management strategies and challenges in Facebook: A focus group study among Flemish youth organizations. Cyberpsychology, 10(1).

DE WOLF, R., WILLAERT, K. and PIERSON, J., 2014. Managing privacy boundaries together: Exploring individual and group privacy management strategies in Facebook. Computers in Human Behavior, 35, pp. 444-454.

DOURISH, P. and ANDERSON, K., 2006. Collective Information Practice: Exploring Privacy and Security as Social and Cultural Phenomena. Human-Computer Interaction, 21(3), pp. 319-342.

DUMSDAY, T., 2008. Group privacy and government surveillance of religious services. Monist, 91(1), pp. 170-186.

EDWARDS, L. and VEALE, M., 2017. Slave to the algorithm? Why a “right to explanation” is probably not the remedy you are looking for. May 23.

FAIRFIELD, J. and ENGEL, C., 2015. Privacy as a public good. Duke Law Journal, 65(3), pp. 385-457.

FLORIDI, L., 2017. Group Privacy: a defence and an interpretation. In: L. TAYLOR, L. FLORIDI and B. VAN DER SLOOT, eds, Group privacy: new challenges of data technologies. Springer, pp. 83-100.

FLORIDI, L., 2016. On human dignity as a foundation for the right to privacy. Philosophy & Technology, 29(4), pp. 307-312.

FLORIDI, L., 2014. Open Data, Data Protection, and Group Privacy. Philosophy & Technology, 27(1), pp. 1-3.

FRENCH, P., 1984. Collective and corporate responsibility. Columbia University Press.

JONES, P., 2016. Stanford encyclopedia of philosophy (entry for group rights).

KUPRITZ, V.W., 2011. Individual and group privacy needs across job types: Phase 1 study. Journal of Architectural and Planning Research, 28(4), pp. 292-313.

MAY, L., 1989. The morality of groups. University of Notre Dame Press.

MITTELSTADT, B., 2017. From Individual to Group Privacy in Big Data Analytics. Philosophy & Technology.

NARAYANAN, A. and SHMATIKOV, V., 2005. Obfuscated databases and group privacy, Proceedings of the ACM Conference on Computer and Communications Security 2005, pp. 102-111.

NEWMAN, D.G., 2004. Collective Interests and Collective Rights. The American Journal of Jurisprudence, 49(1), pp. 127-163.

PETRONIO, S. and ALTMAN, I., 2002. Boundaries of privacy.

PIETKIEWICZ, I.J. and WLODARCZYCK, M., 2015. Crossing the boundaries of privacy in accidental encounters: interpretative phenomenological analysis of therapists’ experiences. Clinical Psychology and Psychotherapy, 22, pp. 708-721.

STRAHILEVITZ, L.J., 2010. Collective privacy. In: M. NUSSBAUM and S. LEVMORE, eds, The offensive internet: speech, privacy and reputation. Harvard University Press.

TAVROV, D. and CHERTOV, O., 2016. Evolutionary approach to violating group anonymity using third-party data. SpringerPlus, 5(1), pp. 1-32.

TAYLOR, L., FLORIDI, L. and VAN DER SLOOT, B., eds, 2017. Group privacy: new challenges of data technologies. Dordrecht: Springer.

ZHAI, R., ZHANG, K. and LIU, M., 2016. A dynamic group privacy protection mechanism based on cloud model, Proceedings of the 2015 5th World Congress on Information and Communication Technologies, WICT 2015 2016, pp. 83-88.

ZHAI, R., ZHANG, K. and LIU, M., 2015. Static group privacy protection mechanism based on cloud model, Proceedings – 15th IEEE International Conference on Computer and Information Technology, CIT 2015, 14th IEEE International Conference on Ubiquitous Computing and Communications, IUCC 2015, 13th IEEE International Conference on Dependable, Autonomic and Secure Computing, DASC 2015 and 13th IEEE International Conference on Pervasive Intelligence and Computing, PICom 2015 2015, pp. 970-974.