Most of our public utterances and interactions quickly fade into obscurity. Chances are that you don’t remember the faces of the people at the grocery store or the license plate of the last car that passed you, and, thanks to moral codes, human evolutionary limitations, and our imperfect memory, others most likely don’t remember these details about you either. This is the concept of obscurity.
Obscurity means that information is relatively safe when it is hard to understand or obtain. In daily life, it denotes the “publicly private” details of things, people, and conversations that remain inconspicuous to others. In other words, the information is out there, not hidden, yet not readily understood or acquired; it is intertwined with context and buried in people’s social circles and public peregrinations.
When you stand in line at the coffee shop and overhear the person in front of you speaking, moral codes tell you that it is wrong to try to find out more about what is being said, and most likely you have no motive to do so anyway. It takes an almost intrusive amount of effort to identify the content, the context, and the identities of people who do not belong to your social environment.
In addition to motive and effort, two inherent evolutionary limitations, memory and perception, further lock information into obscurity: first, the passage of time ensures that human memory fades and many details cease to exist in people’s minds; second, space and distance physically conceal much of what is happening and being said around us.
Although it affords a sense of security and privacy over our information in public (think of the times you have relayed deeply personal and sensitive information to your companion at a restaurant), obscurity always involves risk, for it is not the same as privacy. It merely raises the level of protection probabilistically: the cost of identifying and making sense of the information becomes much higher.
However, all it takes is a trigger event for those protections to come undone: a vengeful former lover who decides to aggregate the pieces of information and place them in public view, say, or a political opponent who decides to search for and unearth one’s past actions and behaviors. Those are legitimate fears with plausible, detrimental effects, but an even greater danger to obscurity is technology.
Obscurity in big data and surveillance
In the age of big data, where society overwhelmingly interacts online, publicly and privately, how safe are our conversations and actions from being grabbed by advanced algorithms and stitched into an ever-expanding web of interrelationships that seeks to predict people’s behaviors, personalities, and all sorts of other attributes? How safe are the secrets we once uttered in public, expecting that they would never come back to bite us? In the debate around privacy, all sides of the argument tend to neglect the publicly private existence of personal information, confining it to one of two extremes and leaving nothing in between.
Compiling and presenting deeply personal information about individuals to interested parties was once the job of the quintessential detective, whose expertise lay in unearthing and connecting personal information scattered across public and private domains. Today’s detectives are not humans but algorithms, and the only expertise they require is the upfront effort to make them operational.
Woodrow Hartzog and Evan Selinger write that in a 1989 Supreme Court case, Department of Justice v. Reporters Committee for Freedom of the Press, the court recognized a privacy interest in the “practical obscurity” of information that is distributed in the public sphere but can be found only by employing an unrealistic amount of effort and time.
Since then, courts and legislatures have not invoked the concept to defend against intrusions of privacy, and technology companies have slowly encroached on obscurity’s territory. One notable example is Clearview AI.
Clearview AI
Clearview is a facial recognition application that scours the web for existing public information about individuals, but unlike other facial recognition solutions that rely on mass surveillance and limited datasets, Clearview’s strength lies in its code. In essence, the user of the application, be it a random person on the street or a police officer, takes or uploads a photo of a person, and within seconds the algorithm returns its findings: the possible identity of the individual, their connections, locations, interests, personal characteristics, sexual orientation, and so on. The New York Times reported that the company had mined over 3 billion pictures as of January 2020, from Facebook to Google’s products to millions of other websites, allowing it to triangulate information and build strong connections around it, a feat that without its algorithms would require immense human effort.
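Clearview’s actual pipeline is proprietary, so the sketch below is only a plausible reconstruction of the core mechanic, a nearest-neighbor search over face embeddings; `embed_face` is a hypothetical stand-in for a trained face-recognition network, and the names and threshold are illustrative, not Clearview’s:

```python
import numpy as np

def embed_face(image: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in: map a face image to a unit-length vector.

    A real system would run the image through a trained face-recognition
    network; flattening and normalizing just keeps the sketch runnable.
    Assumes all images share the same shape.
    """
    vec = image.astype(np.float64).ravel()
    return vec / np.linalg.norm(vec)

# The scraped index: one embedding per crawled photo, plus its provenance.
index_embeddings = []  # list of unit vectors, one per scraped face
index_sources = []     # parallel list of (name, source_url) pairs

def add_to_index(image, name, source_url):
    """Run once per scraped photo; this is the expensive, upfront work."""
    index_embeddings.append(embed_face(image))
    index_sources.append((name, source_url))

def identify(query_image, threshold=0.9):
    """Return every indexed photo whose face closely matches the query.

    On unit vectors, cosine similarity is a plain dot product, so the
    entire index is searched with one matrix-vector multiply.
    """
    if not index_embeddings:
        return []
    query = embed_face(query_image)
    sims = np.stack(index_embeddings) @ query
    order = np.argsort(sims)[::-1]
    return [(index_sources[i], float(sims[i]))
            for i in order if sims[i] >= threshold]
```

The sketch makes the asymmetry concrete: building the index of billions of photos is the expensive, one-time effort, while each subsequent identification collapses into a single cheap lookup, which is precisely why the algorithmic detective’s expertise reduces to the upfront effort of becoming operational.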
Its main goal is to help law enforcement solve crimes, but as with other surveillance tools, its chilling effects do not stop there, because the tool navigates the public sphere and extracts its information from everyone in it.
There is no law against what the company does, and Clearview brags about this on its website. Yet government officials, news organizations, and the public find Clearview’s capabilities intrusive and disturbing. Senator Edward J. Markey remarked on Clearview’s behavior in a letter to the company’s CEO: “the product appears to pose particularly chilling privacy risks, and I am deeply concerned that it is capable of fundamentally dismantling Americans’ expectation that they can move, assemble, or simply appear in public without being identified.” Kashmir Hill, a New York Times reporter who tested the application, stated that uploading her picture into the app “returned numerous results, dating back a decade, including photos of myself that I had never seen before.”
It is becoming increasingly easy for governments and corporations (and soon the public) to access people’s personal information. It wasn’t long ago that we could venture out in public with an expectation of reasonable privacy. Today’s legal doctrine has evolved to the point that nobody can expect privacy in public, and companies like Clearview prove it. Clearview takes what we once considered semi-private and puts it at the forefront of our lives, creating an impeccable repository of public memory. In that domain, obscurity stands undefended, and our information is no longer relatively safe, for it is easier than ever to understand and obtain. If Clearview or its copycats become widely available, our lives will become open books, inviting the eyes of the curious, compelling us to suppress our words, thoughts, behaviors, and public wanderings, and forcing us to live with the fear that our movements and identities are open to scrutiny at any time and anywhere.
The possibility of being watched and identified in every domain of our lives results in the ultimate prison of the mind. We all know what it feels like. Perhaps you wanted to search for a sensitive topic on the internet, visit a questionable place by yourself, or discuss a controversial idea with a friend. Even though you know that nothing physically holds you back, pursuing some things feels like nudging up against an ambiguous threshold, a boundary that advises, even screams at you, to turn back. Sometimes you decide to cross it; other times you heed its warnings and succumb to the uncertainty. There is no longer a middle ground where privacy and public exposure intersect.
In defense of obscurity
Now more than ever, we need to embrace obscurity to prevent that. Obscurity operates as a shield that allows fleeting, ill-advised, temperamental thoughts and actions, the ones that do not represent who we are, to be forgotten. This is especially important while growing up, because children and adolescents can only adequately grasp the world through experimentation and failure, even at the risk of future regret. When thoughts and actions turn into permanent records, society and all its individuals become conformists. Without obscurity, it would be impossible to take risks, unapologetically fail, and temporarily embarrass ourselves. Eccentricity, challenging ideas, and productive confrontation would take a deadly hit. The very notions that make us better people would turn against us.
We need to preserve society’s fading collective memory as we steadily become inseparable from technology. We need to lift obscurity out of its ambiguous position and communicate its value to the public, and we certainly need to add friction between technology and public life by devising solutions that render our publicly private existence inscrutable to algorithms. There is much to be done and not enough time, for technology always moves orders of magnitude faster than enacted law.