
Social Complacency Model of Personal Privacy – The Faulty Discourse

[Photo: the old Don Jail, Toronto]

Something I learned during my graduate studies is that jails – like the old Don Jail as it once existed in Toronto (photographed by me above) – have special significance among those who study the history of people with disabilities.  For one thing, there are many people with disabilities in jail – sometimes placed there because there isn’t a more suitable facility.  A person can receive care or treatment in jail that is not available outside the correctional system.  On the other side of the coin, people are literally “disabled” by being put in jail.  They become confined and in certain ways immobilized.  (There is a difference between moving and getting anywhere.)  Their ability, and especially their opportunity, to communicate freely with others in the world is limited.  Of course, there is a tremendous loss of personal autonomy.  So jail is actually a place of great social and personal significance.

When I was growing up – thinking about the ordeals of the people inside – I thought that working at a correctional facility would be an ideal job for me.  I don’t have the stature to be a correctional officer.  Instead, I wanted to be responsible for checking and maintaining records of the jail facilities, supplies, and the needs of the inmates – handling and analyzing the resulting data.  Apart from the negative aspects of imprisonment that I mentioned earlier, jail is a place where people are closely watched.  Surveillance is persistent.  It should come as no surprise, then, that members of the general public are alarmed by the thought of being studied or examined by information companies – not at correctional facilities but at routine places such as parking lots, malls, airports, transit stations, and workplaces.  There is a deep, and I sometimes suspect almost genetic, apprehension that comes not just from confinement but from the sense of being monitored in an uncontrolled or unwanted manner.

There is a difference between gathering data to exploit people, gathering it to serve them, and gathering it to control situations.  At the moment, conflating the three appears to be the norm, perhaps because even the mere hint of being in a jail-like environment is so horrifying.  I appreciate the cause of protecting public privacy by placing limitations on data-collection technologies and the companies that make use of them.  But I sense that the conversation is being steered to places and in ways that caricature the nature of privacy.  Although I don’t doubt their good intentions, people are creating silos in the discourse by reducing a multidimensional concern to a rudimentary point of contention – consent.  Some local issues that have recently been raised by the media include the following: 1) the placement of facial recognition cameras in mall directory stands; 2) the placement of cameras on elevator advertisement screens; and 3) the involvement of an information company in planning waterfront development in Toronto.  I understand that the basic concern raised by some opponents is the apparent lack of consent as corporate interests seem to expand into the most intimate day-to-day routines of citizens – or so the portrayal goes.

I understand the conversation surrounding the incursion of commercial interests into private spaces – such as the unauthorized activation of video and voice recording equipment by an external entity.  While I have my concerns about the need to obtain consent, I should point out that many people are quite willing to allow routine aspects of their lives to be shared with others; in fact, they put real effort into sharing their lives, for example on social media.  I see people whose heads seem to be permanently attached to their smartphones.  I am left thinking that these users probably want more services to help bring them closer to others.  They certainly need to control the manner of data collection and the recipients of the data.  So discussions over privacy in this area are completely understandable.

I have a program loaded on my smartphone that monitors my driving for my auto-insurance company.  In the homes of many people today there are devices constantly listening for instructions – e.g. to turn on lights, play music, or order products online.  When I go to a bank or certain sections of a department store, I am accustomed to encountering cameras reminding me that I am being monitored.  I know this because they show an image of me shopping.  For my part, I tend to smile at the cameras so the proprietors get a good shot.  So the weirdness of being monitored is becoming less weird by the day.  Of course, the point of contention is consent, although I don’t recall anyone ever asking for mine.  I suppose that consent is “presumed.”  For me, consent is not a contentious point at all, since I entirely agree that it is necessary, or at least highly desirable, in private places.

I believe that when people have private thoughts and moments, they sometimes expand their preconceptions of privacy to include the location or surroundings.  Not for me.  When I am deep in thought in the laxatives section of a store, and people start crowding around me, presumably all hopeful for improved bowel movements, I know that I am actually in a “public space” – one that, of course, is owned by a commercial operation.  So my claim to privacy there is extremely limited.  I leave for a moment until all of the people go away.  Then I return to reflect once again on the selection of laxatives.  The moments of reflection do belong to me.  But I can be monitored and joined by other shoppers at any moment.  Indeed, the cameras are in the store for a reason.  If I don’t behave myself, I can be pushed out of that space and compelled not to return.

My main counterargument in this blog is that “personal privacy” is relevant in places and situations where it can be reasonably expected and is genuinely desirable.  For instance, a person can be shot, stabbed, or raped in an elevator.  Privacy should not be protected in an elevator.  If anything, the activities in an elevator should probably be completely open to the public.  Similarly, an active shooter in a mall should not be protected by privacy provisions.  A mall is not a private place.  There are no intimate details to protect, since any “intimacy” that occurs there is public – meaning that there can be no reasonable expectation of intimacy.  I support the idea of a company like Google being involved in waterfront planning in Toronto.  We are, after all, talking about waterfront space.  This is a big space that many people are expected to eventually occupy – out in the open, by which I mean not in private settings such as a bedroom or apartment.  People will of course have their private spaces, which should be protected to maintain that privacy.  But outside these spaces, they are out in the open, in public spaces shared by strangers.  Any space that might be shared with a stranger cannot, by its nature, be defined as personal or private.

The question, then, might not be whether something is private by nature of the place from which the data is extracted, but rather the extent to which control is relinquished for public or in some cases commercial use.  Here my focal point would be the “consentability” of the data – i.e. the extent to which consent should be required or reasonably expected.  Can I demand that a stranger in a public space obtain my permission before looking at me – and before maintaining any recollection of my face?  Is the social expectation that people keep their eyes on the ground and have no idea who else occupies shared spaces in the community?  Of course, we don’t normally expect people to spend a whole lot of time looking.  I would argue that up to a point, consent is not expected.  I would even suggest that “socio-environmental awareness” is a right – that is to say, the right to be aware of the people and things around us.

Even without consent, I can quickly determine everything that I “need” to know about the people around me: the number of individuals; general appearances; familiar faces; usually some context behind the encounter (e.g. this person is walking their dog; this person seems to be going to the recycling bin; this person seems to be going to work).  I don’t care about anything else, really.  I also appreciate that I personally have no right to know anything else, since I would be invading the privacy of others if I did.  I suggest that many commercial entities operate within this “non-consentability threshold.”  Without necessarily having laws and regulations in place, a bank or store can have surveillance and monitoring equipment.  Such an entity is asserting the right of socio-environmental awareness.  It must know what is happening within its space.  Imagine how much higher the insurance premiums would be – and how much greater the risk of liability – if the entity were indifferent, complacent, or negligent about events occurring in its spaces.

Issues surrounding secret surveillance – for instance, by a government intelligence agency over citizens of the state – should probably be interpreted in relation to competing “control” objectives.  Freedom might be curtailed if such an agency wishes to control a situation, perhaps to prevent the emergence of a domestic threat.  On the other hand, citizens do not wish to give up control without just cause.  In relation to the private sector, perhaps citizens have some concern over ownership, given that the private sector seems likely to use the data collected for commercial purposes.  “Why should they make money from my data?”  This is a moot question if people are wondering why companies are trying to make money.  They exist to make money.  What people might be implying is that the proceeds of data use belong to them – like a copyright.

True enough, during the early years of the internet, I recall receiving a cheque from one company after I allowed them to collect data from my internet surfing.  The more data they collected, the more money I could expect to receive.  Of course, casual surfing is something that I almost always do in private.  I use the internet at work, but more to help me complete my tasks.  These days, it is no longer really necessary to follow where individuals surf, since site traffic stats combined with cookies already provide so much data.  But if a company genuinely wishes to track where I surf from inside my computer, this would be a problem, given that I use my computer to access financial accounts.  So the data belongs to me not in a discursive sense but in a legal sense – where unauthorized data collection seems to point to criminal activity.  Beyond this, it is important to appreciate that companies already know where we go when and where it matters most – that is to say, in relation to their own websites.  Who does the data belong to in relation to site traffic?  I dispute the idea that companies are not entitled to data pertaining to events that occur on their property.

Clearly, being in an elevator is quite different from being on a website.  For one thing, it is much more difficult to monetize the data.  Certainly the data has to be commodified to have value: it is probably useful for “commercial purposes” only in bulk.  There isn’t much money to be made from a glance at a face.  It could be argued that the interpretation of that data might be inappropriate – e.g. asserting the meaning or significance of different racial groups, or even the methodology behind how race is expressed.  Still, if people are hoping to receive a cheque for their data footprint in public spaces, I am going to suggest it would be a really tiny payout.  As for people simply being entitled to privacy in the elevator, I have already expressed my opinion that they are not, given that total strangers can occupy the same space.  In fact, these strangers are entitled to know some superficial details about the people inside.  For instance, I am entitled to know if somebody in the elevator has a knife or gun, is about to sneeze, or has a dog about to pee.  I have a right to socio-environmental awareness.

Let me begin to paint a different picture.  I believe, like many people do, that data should be used ethically.  Perhaps unlike other people, I also feel that collecting data is a matter of social responsibility and sensitivity.  We are responsible for ensuring that people, to the greatest extent possible, lead responsible, peaceful, and happy lives.  People in the private sector might not relate to this last statement – but only if they look at a single side of the ledger.  The larger a company gets, the more difficult it is to ignore the other sides of the ledger.  Miserable and impoverished people cannot purchase products and services.  People who commit suicide – or who decide to disengage from society – represent market shrinkage.  So the insulation of an organization from the market is a tragic and catastrophic thing, since it is inherently unsustainable from both a business and a social standpoint.  True growth only comes from the prosperity of clients.

Data must be collected to ensure that society operates in a manner that serves people.  Imagine a world where governments didn’t care what their citizens do – whether they kill each other or themselves, vandalize property, sell narcotics, go unemployed, or take control of neighbourhoods.  Now consider companies that feel only apathy towards the general public.  Maybe they sell products mostly to the rich and affluent – a tiny and ever-shrinking market.  To the poor, they might sell the least desirable food and clothes.  They don’t provide services to, or even acknowledge the value of, those who are unemployed or elderly, or who have medical conditions or any kind of disability.  Imagine being invisible in society – because those with the ability to provide products and services do not care to recognize your existence.  Think about being in a wheelchair in a city with curbs, steps, and stairs everywhere – because the designers didn’t care to take your existence into account, maybe because people in your situation don’t matter to them.

When driving home in the afternoon, I often wonder why planners seem to ignore my difficulties on the highway.  Lanes are closed for months with equipment sitting on them doing nothing.  I am not taken into account.  The lived experiences of citizens don’t seem to matter.  When I go through an intersection and recall that the driver of a van deliberately killed a number of pedestrians at that very place, I wonder why our policing seems so unprepared to prevent future attacks.  I go down a road where a young man once started randomly shooting people, and I am baffled that nobody was watching over members of the public and protecting them.  We give up certain freedoms in order to gain others.  One freedom that I expect to have is the freedom to live securely.  In essence, I am suggesting that we probably should have much more systematic surveillance – probably by computers.  Human surveillance is costly, and it doesn’t allow for systematic algorithmic analysis or event-driven response systems.  Not to mention that humans are sometimes full of peculiar beliefs, distorted ideals, and socially constructed biases.

On the lighter side of things, I routinely have difficulty finding the products that I need or want.  When companies do a bad job providing service, and I want to complain to help them resolve the situation, it is sometimes nearly impossible to reach a human, or even some kind of online page, to express my grievances.  When I want to contact my own government to access my tax account, in the past I experienced great difficulty getting through.  In fact, I still haven’t gotten through despite numerous attempts over perhaps 15 years.  I stopped counting, although I try again periodically.  I don’t need privacy in these situations.  I want people to come to me – to help me.  I want the streets monitored by computer sentinels.  I want traffic control algorithms.

I think that the issue of privacy has become a bit insulated from reality.  The reality is that when a city has a lot of people, buildings, roads, and public resources, high levels of complexity emerge in relation to actionability, accountability, and systems of interaction.  It becomes difficult to serve people properly and effectively.  They transform from citizens into silent masses.  They are silent because technologies are not in place to convey, analyze, and respond to their needs.  Many great civilizations have fallen.  I am sure that each case is unique.  But my suspicion in relation to ours is that we are encountering some logistical difficulties.  I think that the last thing we should be trying to do is create roadblocks that impair promising technologies – especially when “personal privacy” is not genuinely in danger.

When the Soviet Union declined – I am old enough to remember – there were many years before Putin during which Russia was chaotic – run by thugs and oligarchs.  There is no place for complacency in a complex society.  Personal privacy has a context.  First, it involves personal things.  Second, it deals with private matters.  An elevator ride is not personal – like maybe a wart or a rash on my neck.  Nor is it private – like my banking information or medical history.  It is an elevator ride.  A murderer can’t say – and yes, apparently murderers share elevator rides with people – “Hey, shut off that camera.  I didn’t authorize you to record my ride with these people.”  Say what – it’s not a private place – no authorization needed.  Perhaps the emphasis should be less on the collection of data itself and more on how that data is used.  It is necessary to ensure that the use of data genuinely serves the public good – or in any event creates no risk to the public.  To this end, an open policy of full disclosure might mitigate many concerns from members of the public.

I am going to suggest that, if the powers that be forwarded each citizen the following contract, many people would sign it:  “You promise to share much more of your data with us so that we can follow your life; ensure you have a place to live and money to pay for food and transportation; see that you benefit from proper education and healthcare; assure your safety and security; and support a meaningful life in which you can care for yourself and those dear to you and contribute positively to society.  Please sign this contract.”  Of course, there would be different versions of this contract depending on the political slant or the services available from the organization.  So I don’t believe that the conversation on the use of these new technologies should be cheapened or taken lightly.  We should not simply raise barriers and say that this solves the problem.  There was a problem even before the technologies were introduced – in the form of social complacency.

Complacency is not the sort of normal that should be perpetuated, because there is a desperate need for improved stewardship.  The normal should be that every organization genuinely wants to know what people want in order to serve their needs; and this service should include protecting and helping the public in all aspects of life.  So yes, privacy protection is important.  But in spaces that are not private – because they can be occupied at the same time by other people, including total strangers – reducing the conversation to the need for consent creates unreasonable and perhaps illogical barriers to promising new technologies and systems.