    Facial Recognition and the Convenience of Injustice

    Some people are concerned that the latest generation of powerful technology tools now being developed and deployed may undermine important features of civilized societies and human life itself. Notably, Elon Musk is so worried about the danger of artificial intelligence (AI) that he invested in accelerating its development.

    Musk has voiced his concern while simultaneously expressing two hopes: that good, stable and responsible people such as Elon himself will develop AI before the evil people out there get their hands on it, and that his company SpaceX will succeed in moving the human race to Mars before AI’s quest to enslave humanity is complete. Fearing people may not make it to Mars in time, Musk launched Neuralink, a company that promises to turn people into cyborgs. Its technology, implanted in people’s brains, will presumably put a transformed human race on the same level as AI and therefore allow it to resist AI’s conquistadors.

    Although this story of the race to the future by opposing forces of good and evil may sound like the plot of a sci-fi comic book, Musk has on various occasions said things that actually do resemble that scenario. In the meantime, AI is being put to use in numerous domains, theoretically with the idea of solving specific problems but, more realistically, according to the time-honored laws of consumer society as a response to the perennial pursuit of convenience.

    In the quest for convenience, one of the tasks people have assigned to AI is facial recognition. Apparently, it has now become very good at using the image of a face to identify individuals. It may even perform better than Lady Gaga in the knotty problem of distinguishing Isla Fisher from Amy Adams.

    Law enforcement in the US has demonstrated its interest in the added productivity facial recognition promises. Like everyone else, the police like to know who they are talking to as well as who they should go out and arrest. The problem is that they sometimes arrest and incarcerate people whom AI’s facial recognition has mistakenly identified. The accuracy of existing tools diminishes radically with non-Caucasian faces, which means more wrongful arrests and indictments for black suspects.

    The New York Times makes a timid attempt to delve into this ethical issue in an article that bears the somewhat tendentious title, “A Case for Facial Recognition.” The article sums up the case in the following terms: “The balancing act that Detroit and other U.S. cities have struggled with is whether and how to use facial recognition technology that many law enforcement officials say is critical for ensuring public safety, but that tends to have few accuracy requirements and is prone to misuse.”

    Here is today’s Daily Devil’s Dictionary definition:

    Critical (for):

    A deliberately imprecise term used to evaluate the importance of an act that exists on a sliding scale between absolutely essential and probably useful, making it a convenient way of creating the belief that something is more important than it really is.

    Contextual Note

    The adjective “critical” derives from the Latin word “criticus” and relates to the idea of “crisis.” It came into the English language in the mid-16th century with the meaning “relating to the crisis of a disease.” When The Times article tells us that “many law enforcement officers” say facial recognition “is critical for ensuring public safety,” we need to realize that those officers are not referring to a crisis but to their own convenience. Facial recognition can, of course, produce a crisis when it misidentifies a suspect. But the crisis is for the suspect, not for the police.

    The expression “critical for” has come to signify “really important in my view,” a very subjective appreciation in contrast to the far more factual-sounding “crucial,” which comes from the idea of the “crux,” or the core of a problem. The article underlines this question of subjectivity when it reports that a black officer it quotes “still believed that, with oversight, law enforcement would be better off using facial recognition software than not.” “Better off” is not quite the same thing as “critical.”

    But let’s take a closer look at the claim that “law enforcement would be better off.” How do we parse the subject, “law enforcement,” in this sentence? The term “law enforcement” can be an abstract noun meaning the official function of monitoring behavior in a community to ensure optimal compliance with laws. This appears in sentences such as “one of government’s responsibilities is to provide the community with the means of law enforcement.”

    But law enforcement can also refer to the police themselves, the officials who are empowered to apprehend and deliver to the judicial system those who are suspected of infringing the laws. Which one is “better off” thanks to facial recognition? In the first case, that of abstract law enforcement, we are speaking of the safety of the community: “better off” would then mean more optimally and more fairly conducted. In the second case, it is the men and women doing the job, for whom “better off” basically means improved convenience.

    So which one is the article’s author, Shira Ovide, referring to? Clearly, the following explanation indicates that for her, law enforcement refers to the police and not to the needs of the community. “That’s the position of facial recognition proponents: That the technology’s success in helping to solve cases makes up for its flaws, and that appropriate guardrails can make a difference.” It’s all about the job of “helping to solve cases.” Ovide is a technology specialist at The Times, which might explain her focus on convenience rather than the ethics of policing.

    What Ovide says is superficially true, but the same logic could be applied to slaveholding. If we admit the argument that slavery helped to boost agricultural production — which of course it did — we could point out that the boost it provides makes up for its flaws. That was how slaveholders reasoned. The crucial difference — rather than critical — lies in examining the nature and the impact of those flaws. After all, slaveowners also thought about “appropriate guardrails.” They simply called them “slave patrols.”

    This is where, for Ovide, bureaucracy comes to the rescue of the logic of convenience and reveals the underlying logic of modern law enforcement: “The new guidelines limited the Police Department’s use of facial recognition software to more serious crimes, required multiple approvals to use the software and mandated reports to a civilian oversight board on how often facial recognition software was used.”

    The article ends on a slightly ambiguous note but fails to go into any depth on the moral question and its civic consequences. It seems to endorse the idea that with the right procedures, the gain in efficiency is worth the random damage it will produce.

    Historical Note

    The above reference to slave patrols may sound exaggerated, but it is not trivial. As Chelsea Hansen at the Law Enforcement Museum points out, slave patrols were “an early form of American policing.” The strategies and organizational principles that grew out of slave patrols influenced the evolution of policing in the United States. Race has always been a major but usually unacknowledged feature of American law enforcement culture.

    The late anthropologist David Graeber put it brutally when he noted that the criminal justice system in the US “perceives a large part of that city’s population not as citizens to be protected, but as potential targets for what can only be described as a shake-down operation designed to wring money out of the poorest and most vulnerable by any means they could.” Mass incarceration has, among other things, enabled a modern form of slave labor.

    In other words, there is a vast historical and cultural problem the US needs to solve. That is precisely what’s behind the idea formulated as “defund the police.” The slogan itself is misleading. What it really means is “rethink the police.” But asking Americans to rethink any problem appears to be beyond their capacity. It’s always easier just to point to one simple practical task, like defunding.

    The debate about facial recognition in policing should not focus on the tasks of police officers and the convenience the technology affords them, but on the relationship between law enforcement and the community. That, however, would ultimately require weaning consumer culture off its addiction to the idea of convenience.

    *[In the age of Oscar Wilde and Mark Twain, another American wit, the journalist Ambrose Bierce, produced a series of satirical definitions of commonly used terms, throwing light on their hidden meanings in real discourse. Bierce eventually collected and published them as a book, The Devil’s Dictionary, in 1911. We have shamelessly appropriated his title in the interest of continuing his wholesome pedagogical effort to enlighten generations of readers of the news. Read more of The Daily Devil’s Dictionary on Fair Observer.]

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.