The Dangerous Fantasy of Cybernetic Cops
The political and cultural theorist Paul Virilio famously wrote that we live in a state of permanent (or “pure”) war. By this, he meant that there is an increasing “perversion” of any clear-cut distinction between civilian and military institutions and, by extension, civilian and military life. According to Virilio, after the Second World War, Western economies and societies were permanently reorganized to support military power. The focus shifted from battlefield tactics to a broader strategy aimed at restructuring industrial society to rapidly produce rockets, missiles, and other armaments — or, more recently, computational, algorithmic, and sensing technologies.
It is in the context of pure war that the technologies, techniques, and logics of militarization spill into the everyday. One prominent example is the increasing militarization of technologies used in the governance of civil society, notably by police forces. The geographer Stephen Graham refers to this as the “militarization of everyday life”: the “insidious” creep of both technologies that track, sort, and profile, and military logics that have increasingly come to characterize approaches to urban governance. Cops see urban environments as sites of “threat” and are outfitted with rugged tactical armor, high-powered assault weapons, and facial recognition software. Just like that, the logics of militarism — flowing through technology — extend into quotidian life.
Augmented reality (AR) is one example of how militarization has seeped into everyday life, particularly through the work of police forces. For AR firms that supply software and hardware to police, there is an enduring fantasy of creating “smarter” cops who operationalize real-time data flows in ways that mirror the cyborgs of popular fiction. For example, Vuzix — a major manufacturer of augmented reality headsets, which has developed applications incorporating facial recognition (working with companies like the controversial Clearview AI) — draws a (curious) comparison between a wearable augmented reality headset and Paul Verhoeven’s 1987 film “RoboCop.” As the company states on its website: “While the dystopian society envisaged in RoboCop is nothing to emulate, the innovative tools its main character uses to protect those in need is within reach.”
Like Palmer Luckey or Michael Abrash in their desire to recreate “The Matrix,” Vuzix seemingly misses the subtext of Verhoeven’s “RoboCop”: a social satire and commentary on the corporatization of urban governance amid the growing privatization of social services under Reagan-era austerity. It also seems lost on Vuzix that, as a purveyor of technologies of violence and control to police forces, it plays a role much like that of Omni Consumer Products — the nefarious corporation and antagonist of the film, which facilitates and seeks to profit from the chaos of a dystopian Detroit by supplying extremely harmful technologies to its police force. As we have pointed out elsewhere, in a more comprehensive review of AR police tech, the goal of many of these AR applications for policing is to mediate how police think, feel, and act while doing police work: to give police a kind of anticipatory edge over the “threats” of the urban environment by feeding and visualizing streams of data — such as those derived from facial recognition systems.
Virtual reality (VR) policing applications envision a different kind of cybernetic cop. Unlike AR, which overlays digital elements onto physical environments, the simulated modeling of the world through VR seeks to shape how police think, feel, and act before they do police work. It introduces a unique kind of anticipatory logic, rooted in the largely fallacious assumption that VR has the capacity to instill the user with preparedness.
Already, VR is being used to train police forces worldwide, including in the United States, the United Kingdom, and Australia. One market-leading firm in this space is Axon (previously Taser), a company that chiefly develops technology and ostensibly nonlethal weapons for militaries, police, and civilians (most famously the Taser electroshock weapon, widely used by police in the United States). Following its 2017 rebranding from Taser to Axon, the company increasingly centered its business on bodycams for police forces, sold on the premise of transparency, accountability, and more ethical policing, their adoption driven by a spate of high-profile police killings of predominantly Black and brown people in the United States. According to its press releases, Axon’s cameras were a means to “capture truth” (while also operating as a “rugged communications beacon”), allowing information to pass between cops — a tool for further weaponizing police under the guise of security.
Extending this reformist vision of policing, in 2021 Axon began developing VR simulation tools (as part of a VR-based learning management system) to be sold to police departments — framed as providing both weapons skills and empathy training. As Axon put it:
Combining the HTC VIVE Focus 3 VR headset and the VIVE Wrist Tracker from industry leading partner, HTC VIVE, Axon VR provides an all-in-one, portable platform that’s simple to deploy and use, with no extraneous hardware, time or space constraints. The VR training provides connectivity for both in-person and remote experiences, creating an on-demand platform that can be accessed anytime.
Axon also offers simulator training for weapons, including its own Taser energy weapons as well as firearms — equipping users with an accurately modeled and weighted handgun, outfitted with sensors so that its movements are tracked in the VR simulation. The risks of making cops more lethal in the United States — a country with a particularly murderous police force — should be obvious. But Axon’s selling point for its VR is that “unlike traditional simulators that only offer use-of-force training, Axon VR enhances an officer’s ability to de-escalate many of the most common calls for service.” Elsewhere, Axon puts it differently, offering “empathy training”: “immersive content designed to encourage critical thinking and de-escalation.” In the company’s view, de-escalation training aims to bring about “improved civilian interactions” and to help “rebuild the fractured relationship between cops and communities.” Axon sees itself as “spearheading the dialogue that helps heal society.”
A particular point of focus for Axon in its promotional materials is a VR-based module simulating an encounter between a police officer (played by the user) and a person (modeled in the system) with various forms of cognitive impairment (with modules including autism and schizophrenia — options, as one journalist notes, that are “laid out on a menu screen like the levels of an early 2000s platformer game”). Axon reports that its VR training modules have been adopted by over 1,000 police departments across North America. For Axon, the modules are “not about stopping criminals”; rather, they are about equipping first responders with the skills to “recognize and successfully deal with a range of mental and psychological conditions.” Further, “the goal of these modules isn’t about ‘catching the bad guy’ but rather work to ensure the safety of the subject as well as the officer.”
In a similar fashion, Google’s Jigsaw division — which focuses on “threats to open societies” — has developed Trainer, a VR platform for “adaptive scenario-based training.” As Google describes it:
Recent advances in virtual reality (VR) have demonstrated the potential for technology to create scalable opportunities for law enforcement and other public safety professionals, particularly in the realm of education. In training contexts, VR can create a uniquely immersive experience, employing heightened tensions to build critical skills in an environment that mimics the same physiological responses as those generated in real-world interactions.
VR here is part of an evaluation assemblage that hinges on Google’s natural language processing AI, Dialogflow. Google claims that its software can categorize a user’s likely intent from speech inputs to the system’s interactive voice-response interface, through which users converse with virtual characters.
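Neither Google nor Axon has published the internals of these pipelines, but the basic intent-classification step that Dialogflow performs is straightforward to illustrate. Below is a minimal sketch using Dialogflow’s standard Python client; the project ID, session ID, and example intents are hypothetical, and a real deployment such as Trainer’s would feed it transcribed speech from inside a VR scenario rather than typed text.

```python
# A minimal sketch of intent classification with Dialogflow's Python client
# (pip install google-cloud-dialogflow). Requires Google Cloud credentials;
# the project ID, session ID, and intents below are hypothetical.
from google.cloud import dialogflow


def classify_utterance(project_id: str, session_id: str, text: str) -> tuple[str, float]:
    """Send one utterance to a Dialogflow agent and return the matched
    intent's display name and the detection confidence (0.0-1.0)."""
    session_client = dialogflow.SessionsClient()
    # A session groups the turns of one conversation for context tracking.
    session = session_client.session_path(project_id, session_id)

    query_input = dialogflow.QueryInput(
        text=dialogflow.TextInput(text=text, language_code="en-US")
    )
    response = session_client.detect_intent(
        request={"session": session, "query_input": query_input}
    )
    result = response.query_result
    return result.intent.display_name, result.intent_detection_confidence


# Hypothetical usage: score a trainee's line against intents an agent
# might define for a scenario, e.g. "de-escalate" vs. "issue-command".
intent, confidence = classify_utterance(
    "my-training-project",
    "trainee-session-001",
    "I'm here to help. Can you tell me what's going on?",
)
print(f"Matched intent: {intent} (confidence {confidence:.2f})")
```

Whatever the surrounding evaluation machinery looks like, the output of this step is a label and a confidence score, which is presumably what allows a trainee’s dialogue choices to be scored at all.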
For both Google and Axon, contra a wider range of mixed reality policing technologies and other policing tech (e.g., “predictive policing”), training is framed as explanatory rather than predictive. How police perform in these training simulations has formed part of the evaluative criteria for training: a way to divine which cops are “good” and which are “bad” (and a data point for backing up arguments about police reform).
It is hard to take seriously such propositions of VR as a techno-fix for the deeply sedimented, structural racism, ableism, and authoritarianism that pervade police forces and underlie the violence they enact in the United States (currently the largest market for these training simulations). Additional training — a common suggestion from those arguing for police reform (as opposed to abolition) — is unlikely to dislodge the entrenched “warrior” culture of police departments across the United States, characterized by a belief in the “thin blue line”: the worldview, common among cops, that they are the only thing keeping society in check and must do so at any cost.
Such VR systems of simulation feed into a wider societal problem, what abolitionist scholar Jackie Wang calls carceral capitalism. As Wang argues, hypermilitarized, surveillance-heavy policing — in which new technologies facilitate the criminalization and prosecution of relatively nonserious offenses — has proven incredibly lucrative for private-sector firms in the carceral industry (such as those that provide telecommunications systems for prisons, for which they charge exorbitant usage fees).
The carceral industry sees new opportunities for profit in VR. One particularly egregious example is Global Tel Link (GTL), a prison contractor that provides telecommunications systems and payment services to prisons in the United States. In 2017, GTL filed a patent for a “system and method for personalized virtual reality experience in a controlled environment.” Put plainly, GTL wants to charge prisoners to use its VR software, allowing the incarcerated to, “for a brief time, imagine himself outside or away from the controlled environment.” VR is at once a mechanism for training cops to target and persecute difference and a mechanism through which a parasitic carceral technology industry seeks to profit from that same oppression.
Fantasies of simulation — of VR’s capacity to model real-world phenomena — are often considered in neutral terms. Yet simulation is neither made nor deployed in a vacuum. It is closely entangled with the values, aims, and logics of the institutional systems within which it is embedded. For all its supposed novelty, VR entrenches forms of thought and action that reinforce the status quo. This is particularly apparent when adopted by police forces and militaries — where a fantasy of simulation might be more accurately construed as a fantasy of violence: a technology bringing about injury, harm, and even death, or exacerbating existing forms of inequality, discrimination, and bias.
Ben Egliston is Lecturer in Digital Cultures at the University of Sydney and an Australian Research Council DECRA Fellow.
Marcus Carter is Professor in Human-Computer Interaction at the University of Sydney and an Australian Research Council Future Fellow. He is the author of “Treacherous Play” and “Fifty Years of Dungeons & Dragons.”
Egliston and Carter are coauthors of “Fantasies of Virtual Reality,” from which this article is adapted.