A model is a metaphor; a construct of philosophical indirection that is especially useful in understanding new and/or complex concepts. It can be quite literal. A model sailing ship can offer significant insights into the structure and operation of its larger kin. It can be picked up and easily turned this way and that, offering observational perspectives that are difficult if not impossible to achieve when looking at a ‘real’ ship. Extending well beyond the literal, however, some models can be far more abstract. In this article, Timothy Jurgensen offers some rationale for a modelling approach to the concepts of privacy and identity.
Among the more complex models of constructs that behave largely according to basic physical processes are the simulators for high‑performance aircraft. Such flight simulators are able to reproduce an airplane’s operational characteristics, including responses to external environmental stimuli. Used mainly for training, a simulator can allow a pilot to experience most of the sensations of actually flying an airplane. An administrator of the simulator can even induce a wide variety of error conditions, allowing the trainee an opportunity to learn how to deal with unusual situations from a position of relative safety. Many simulators also include an autopilot that can emulate pilot actions for controlling the aircraft.
By building recursive sets of models that encompass fuselage, wing and control surfaces, the flight characteristics of new designs can be evaluated in great detail, for example, in wind tunnel testing. These subcomponents of the complete airplane can be viewed as a collection of simple machines from which the final product can be aggregated. This aggregation can be modelled using a simulator that can show behaviours of the entire design in the face of a complete range of flight configurations and environments. Most new aircraft have quite literally flown for many hours and in many environmental situations before the first complete rendition of the plane is actually built. It would be extremely beneficial if similar facilities were available for social systems, allowing one to learn something of dealing with the almost infinite variety of social orders from outside the reality of those systems.
The ability to form and function in social groups is an intrinsic strength of the human species. It is an interactive process that entails cognitive functions with their subjective assessment processes, quite distinct from the objective cause and effect found in the purely physical world. From the simplest social order to the most complex, social interaction invites understanding through the use of comprehensive models. In this short paper some rationale is offered for a modelling approach to the commonly perceived concepts of privacy and identity.
The most basic social order seems a good starting point; that is, two people who interact. Perhaps they meet accidentally on the street or perhaps they meet in an office to engage in business. Perhaps one attacks another in a dark alley. Perhaps one pulls the other from a burning building. In every circumstance, each person seeks a desired outcome of the interaction. It might be a trivial desire, perhaps to simply walk past the other on a busy street. It might be a more profound desire, to conclude an agreement to buy and sell a product. These simple examples suggest that an interaction might result from a conscious act on the part of a person, or it might be forced by another. It might just be the result of seemingly random luck.
However it arises, within the context of an interaction each person expresses their motivations relative to the situation through actions intended to influence the outcome. They assess the probability that their actions will achieve the desired result. In the course of this assessment, each person also assesses their understanding of the other person. This can be crucial to achieving a perceived successful outcome. To provide a means of homeostatic regulation to the model, an assessment of the validity of the anticipated outcome is needed. This evaluation of a completed interaction can then provide corrections to the interaction process going forward.
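The homeostatic regulation described above can be sketched in a few lines of code. This is my own minimal illustration, not the article's model: trust is held as a probability estimate and corrected toward each observed outcome of a completed interaction, with `rate` (a name I have chosen) controlling how strongly the correction applies.

```python
# Illustrative sketch (not from the article): trust as a probability
# estimate that is corrected after each completed interaction.

def update_trust(trust: float, outcome_matched: bool, rate: float = 0.2) -> float:
    """Move the trust estimate toward the observed outcome.

    `trust` is the prior probability that an action yields the desired
    consequence; `outcome_matched` records whether it actually did.
    """
    observed = 1.0 if outcome_matched else 0.0
    return trust + rate * (observed - trust)

# A run of successful interactions raises trust; a failure lowers it.
t = 0.5
for matched in (True, True, False, True):
    t = update_trust(t, matched)
```

The exponential-moving-average form is one arbitrary choice among many; what matters for the model is only that the a posteriori evaluation feeds back into the a priori estimate.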
Model of social interaction
So, we can recognise the basic constructs of a model of social interaction. An interaction proceeds within a defined context. Each participant in the interaction is driven by some personal motivation. Each has some anticipation of the other’s motivations and potential actions based at least in part on the history of prior interactions. From this beginning, we can define some of the seminal concepts:
• The probability that an action will result in the desired consequence is termed trust.
• The assertion of actions based on personal motivation is termed privacy.
• The understanding that one person has of another person is that other person’s identity.
• The probability that a consequence resulted from a specific action is defined as truth.
• The consequences of assertions of privacy by a person result in the formation of that person’s identity in the minds of others; a process we refer to as provisioning.
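The definitions above can be rendered as a minimal data structure. This sketch is my own construction (the names are not the article's); its one substantive point follows the model directly: identity is held in the observer's mind, not in the observed person, and is provisioned by the consequences of that person's assertions of privacy.

```python
# Illustrative sketch (names are mine, not the article's): identity as
# the perception of a person held by *others*, built up through the
# consequences of interactions (provisioning).
from dataclasses import dataclass, field

@dataclass
class Person:
    name: str
    # This person's perceptions of others: the identities of others as
    # they exist in this person's mind.
    perceptions: dict = field(default_factory=dict)

    def observe(self, other: "Person", consequence: str) -> None:
        # Provisioning: an observed consequence extends the identity of
        # `other` as held by this observer.
        self.perceptions.setdefault(other.name, []).append(consequence)

alice, bob = Person("alice"), Person("bob")
alice.observe(bob, "kept the agreement")  # bob's identity forms in alice
```

Note that `bob` carries no identity attribute of his own: his identity exists only as entries in other people's `perceptions`, which is precisely the claim made in the formal definition.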
It is then useful to consider the approaches to this conceptualisation in a bit more rigorous fashion. The derived model is recursive, recognising that interactions proceed through multithreaded courses.
Of significant interest to the current discussion, the model suggests that in the assessment of trust, the concepts of privacy and identity are complementary facets of interactions among people. In recognition of this, two slightly more formal seminal definitions are made:
• Identity is self from the perspective of others.
• Privacy is the provisioning of identity.
This recognises that identity is not an internal characteristic of a person, but rather is the perception of that person by other people. Conversely, in order to exert influence over the identity thus fashioned, privacy encompasses a person’s control of interaction involvement, including the motivation of actions.
When interaction consequences take on physical form, they become manifestations of identity. Identity is experienced in the minds of those others as the sensation of reputation. Through pairwise interactions, a person asserts privacy in order to establish identity. Identity is important because it is the source of reputation that others use to gauge the strength of their willingness to interact with us. Privacy is cause. Identity, and hence reputation, is effect.
Behaviours addressed by the model have evolved over the ages from the interpersonal interactions among people. They are behaviours based on the seminal processes of social engagement: expression and impression through the human sensorimotor system, including language. In other words, it is a model that tracks the evolutionary development of these behaviours since the emergence of the human species. The identity facilities observed within the model have two distinct facets:
• Differential‑identity: repetitively distinguishing people (telling them apart).
• Experiential‑identity: committing to memory the experiences of people.
Physiological mechanisms give rise to these distinct facets of identity; mechanisms that can be simulated in the technological realm. Over the ages, these physiological mechanisms have enabled the effective engagement of people through small social orders.
Mechanisms to assert privacy
The assertions of personal privacy that mark involvement in social interactions on a small scale can be extended through technical mechanisms to effective assertions of privacy that enable larger social orders. These technical extensions form a collection of simple machines through which privacy can be achieved beyond simple interpersonal interactions; extensions into the distributed world of digital interactions.
A partial list of these simple machines includes:
• Opacity
• Integrity
• Identity
• Authority
• Attribution
It is through these mechanisms that a person can successfully assert personal privacy. With a nod toward cause and effect, it seems obvious that these mechanisms all have a direct effect on the assessment of trust on which interactions are based.
Opacity refers to the degree to which the various aspects of an interaction can be observed. In the world of small social orders, we seek to control opacity by restricting access to the context of an interaction. We put letters in envelopes before we mail them. We conduct meetings in closed rooms. We put locks on the doors. As we seek to extend the same capabilities of opacity to large social orders, we adopt more complex mechanical or electrical means. In the digital domain, we make use of cryptography.
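The envelope analogy can be illustrated with a deliberately tiny cipher. This is my own toy example, not a mechanism from the article, and a one-time pad is chosen only because it fits in a few lines; practical systems use vetted ciphers and protocols. The point is the opacity itself: only a holder of the key can observe the content of the interaction.

```python
# Toy sketch of opacity (my example): a one-time pad as the digital
# analogue of sealing a letter in an envelope. Without the key, the
# sealed bytes reveal nothing about the message.
import secrets

def seal(message: bytes, key: bytes) -> bytes:
    assert len(key) == len(message)  # a pad must match the message length
    return bytes(m ^ k for m, k in zip(message, key))

unseal = seal  # XOR is its own inverse: applying the key again unseals

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # fresh random key per message
sealed = seal(message, key)
```

Restricting who holds `key` is the digital counterpart of the closed room or the locked door: it controls who may observe the context of the interaction.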
The initial assessment of trust relies on the integrity of the various aspects of an interaction. Integrity refers to the observed ‘trustworthiness’ of those aspects. A less circular definition is: integrity is truth derived from observation. In other words, integrity will generally rely on a recursive evaluation of truth based on an objective, as opposed to subjective, assessment of consequence versus action. A priori, one trusts that a specific action will result in a specific consequence. A posteriori, one objectively observes the truth of whether the consequence actually resulted from the action.
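The a priori/a posteriori distinction above maps naturally onto a keyed digest. This sketch is my own (the article names no particular mechanism): a tag computed over a message lets an observer later make an objective, rather than subjective, check that a received consequence really did result from the claimed action of a key holder.

```python
# Sketch (my example): integrity as truth derived from observation.
# A priori, one trusts that the tagged message came from a key holder;
# a posteriori, the tag is objectively checked against the message.
import hashlib
import hmac

key = b"shared-secret"  # assumed pre-agreed between the two parties
action = b"transfer 10 units to bob"
tag = hmac.new(key, action, hashlib.sha256).digest()

def observe_truth(message: bytes, tag: bytes, key: bytes) -> bool:
    """Objectively verify that the message results from a keyed action."""
    expected = hmac.new(key, message, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)
```

Any tampering with the message breaks the correspondence between action and consequence, and the observation of truth fails.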
While integrity is a desired characteristic for all aspects of an interaction, it is particularly relevant in attributing an assessment of truth to the actions of individuals participating in an interaction. In this discussion of the simple machines of privacy, identity refers to the means to establish and propagate differential‑identity and experiential‑identity. The mechanisms to establish differential‑identity involve enrolment in a closed system and means to authenticate that enrolment at arbitrary times and places. This is the realm of biometric characterisation of individuals.
In small social orders, these mechanisms are grounded in physiological facilities of the individual person. In large social orders, particularly in the digital domain, mechanisms of biometric measurement coupled with cryptography allow the physiological behaviours to be replicated by technical means.
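The enrolment-then-authentication pattern for differential-identity can be sketched as follows. This is my own simplified illustration: a salted hash stands in for whatever biometric template a real system would store, and exact-match hashing sidesteps the fuzzy matching that genuine biometric measurement requires.

```python
# Sketch (names are mine): differential-identity as enrolment in a
# closed system plus authentication of that enrolment at a later time.
import hashlib
import secrets

_enrolled: dict = {}  # name -> (salt, template)

def enrol(name: str, characteristic: bytes) -> None:
    """Register a distinguishing characteristic in the closed system."""
    salt = secrets.token_bytes(16)
    template = hashlib.pbkdf2_hmac("sha256", characteristic, salt, 100_000)
    _enrolled[name] = (salt, template)

def authenticate(name: str, characteristic: bytes) -> bool:
    """Re-measure the characteristic and compare against enrolment."""
    if name not in _enrolled:
        return False
    salt, template = _enrolled[name]
    candidate = hashlib.pbkdf2_hmac("sha256", characteristic, salt, 100_000)
    return secrets.compare_digest(candidate, template)

enrol("alice", b"alice-fingerprint-reading")
```

Authentication at "arbitrary times and places" then amounts to re-presenting the characteristic wherever the enrolment record can be consulted.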
Authority refers to the ability to invoke an action. In the absence of effective social systems, authority is limited only by physical and physiological means. This is the domain of “might makes right”. If you are able to invoke a particular action then you have the authority to invoke that action. However, a social order that supports a rigorous means of authority establishment and enforcement allows some actions to be proscribed and others prescribed. This control of authority is generally perceived as policy, law, or perhaps as a moral code. A necessary capability of an effective social order is to convey some credential of authority that is strongly associated with differential‑identity.
In the assessment of trust and truth during an interaction, a means of remembering the details of the interaction is necessary; a recording of experiential‑identity if you will. Integrity demands that there be reliable means of attribution: associating instances of authenticated differential‑identity with instances of experiential‑identity.
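Attribution, as described above, can be sketched as an append-only record that binds an authenticated differential-identity to each remembered interaction. This construction is my own, not the article's: a per-actor key (a stand-in for whatever authenticated credential a real system would use) tags each entry so the association can later be checked.

```python
# Sketch (my construction): attribution as an append-only record
# associating a differential-identity with each instance of
# experiential-identity (a remembered interaction).
import hashlib
import hmac
import json
import time

log: list = []

def record(actor: str, actor_key: bytes, detail: str) -> None:
    """Remember an interaction, attributed to an authenticated actor."""
    entry = {"actor": actor, "detail": detail, "when": time.time()}
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["attribution"] = hmac.new(actor_key, payload, hashlib.sha256).hexdigest()
    log.append(entry)

def attributed_to(entry: dict, actor_key: bytes) -> bool:
    """Check that this remembered interaction belongs to the key holder."""
    payload = {k: v for k, v in entry.items() if k != "attribution"}
    expected = hmac.new(actor_key, json.dumps(payload, sort_keys=True).encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, entry["attribution"])

key = b"alice-key"
record("alice", key, "sold one widget to bob")
```

Integrity's demand for reliable attribution is met exactly when such a check succeeds: the instance of experiential-identity is bound to the authenticated differential-identity that produced it.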
So, with this brief introduction of the means and requirements in mind, what observations might be drawn from this model of privacy, and hence of identity?
First, the model suggests that personal privacy is the central facility through which people engage in social activities. This, in turn, suggests that the primary behaviours to be addressed by any model of social interactions are:
• To enable the arbitration of competing privacy demands of participants.
• To enable the arbitrage of the infringement of personal privacy to benefit the social order.
• To provide means to remediate the unwarranted infringement of the personal privacy of interaction participants.
We can observe these behaviours in any social order, from an impromptu gathering on a street corner to the most profound aggregations of the nation‑state. Indeed, arbitration, arbitrage and remediation should probably be considered as additional simple machines in the modelling of privacy.
To realise the complete model, analogous to the aggregation flight simulator mentioned above, a comprehensive digital infrastructure is required; an infrastructure we designate as an Identity, Authority and Attribution (IAA) System. Currently, more primitive incarnations of such systems offer means to address some of the behaviours noted. However, such piecemeal implementations of IAA Systems can be as much a threat to personal privacy as a support for it. A concluding observation is that comprehensive forms of IAA Systems offer significant enhancements to the personal privacy of the individual; particularly when such systems are buttressed by appropriate expressions of public policy. The details of such forms are topics for future consideration.