“Mercy:” Judge Maddox Will See You Now
Poster art for the movie Mercy directed by Timur Bekmambetov – Fair Use
The Chair Is Already Built
There is a moment early in Mercy, Timur Bekmambetov’s Amazon MGM techno-thriller, when Detective Christopher Raven—played by a sweaty, straining Chris Pratt—is strapped into what the film calls the Mercy Chair. He is accused of murdering his wife. The evidence is overwhelming: her blood was found on his clothes, and doorbell camera footage placed him at the scene. Presiding over his case is Judge Maddox—not a human being but an AI rendered on an oversized screen as Rebecca Ferguson, cool and expressionless and algorithmically serene. Raven has ninety minutes to prove his innocence. His guilt probability, displayed in real time for the audience, sits at 97.5%. If he cannot drive it below 92%, he dies in the chair itself. No appeals. No jury. No second chances.
It is meant to be frightening. And it is—but not quite for the reasons the filmmakers intended.
What Bekmambetov and screenwriter Marco van Belle have made is a thriller that stumbles onto one of the most urgent political questions of our moment and then flinches. Mercy wants to interrogate AI justice; by the final reel, it has endorsed it. The machine makes a mistake—but so do humans, the film shrugs, and anyway, the system mostly works. Detective Raven, it turns out, was one of the architects of the Mercy Court. He championed it. And when it nearly kills him, his response—and the film’s—is chastened reconciliation. AI is our friend. It just needs a little fine-tuning. Blech!
This is some pukey propaganda. You almost prefer the jackbooted, torch-lit kind to the comfortable, popcorn-flavored kind that Amazon MGM does so well. It matters, because the Mercy Chair is not merely science fiction: it is essentially here. The film places the technology in Los Angeles, 2029, but it was first field-tested in Gaza, right now, at this moment, while you read this. But before Gaza, there were the Terror Tuesdays in Washington.
Terror Tuesday and the American Pedigree
Techno-fascism was at work before Trump. It was engineered, carefully and methodically, under a Nobel Peace Prize winner. Every Tuesday, Barack Obama convened what insiders called Terror Tuesday—a gathering in which the President of the United States reviewed a deck of baseball cards bearing photographs and dossiers of men the administration had designated for assassination. The President himself selected who would die that week. His aides called it the disposition matrix, a bureaucratic euphemism so bloodless it might have been lifted from a software manual. It was, in effect, an early-version kill-list algorithm—human-curated but operating on precisely the same logic of profiling, risk-scoring, and authorized execution that would later be automated in Gaza.
One of those playing cards bore the face of Anwar al-Awlaki, an American citizen living in Yemen. On September 30, 2011, a CIA drone killed him without trial, without indictment, without any judicial process whatsoever. This despite family pleas to the Obama administration to spare his life. Two weeks later, another drone struck a group of young men gathered around an outdoor grill. Among those killed was Abdulrahman al-Awlaki, sixteen years old, born in Denver, Colorado. He had gone to Yemen looking for his father. He found a missile instead. The others at that barbecue—his friends, his age, eating with him—have no names recorded anywhere. They were data points cleared from the screen.
Strip the names away, and the story becomes something older. A father killed by a machine that had decided he was a threat. A son, a boy, who went looking for him and was killed too, at a meal with his friends, by a machine that registered bodies in proximity to a target. The father and the son. What makes Obama’s version of this more frightening than anything Trump has improvised is precisely that it was systematic. Crazy evil makes noise, draws resistance, and burns itself out. Systematic evil writes policy, classifies the documentation, and retires with its security clearance intact. (Not to be outdone by Obama, Trump managed to have al-Awlaki’s 8-year-old daughter, Nawar, murdered in a commando raid in Yemen.)
Obama also refined the double-tap: strike a target, then strike again when rescuers arrive. Strike a wedding. Then strike the funeral. The double-tap is not merely a military tactic. It is a social message. It tells every community within drone range that grief itself is dangerous, that gathering to mourn the dead makes you a potential target. This is torture at the population level — the kind of predatory watching I have examined in published work in the Torture Journal. The gaze that is waiting for permission to strike is not passive observation. It is an ongoing assault against everyone who knows the drone is overhead. And the disposition matrix that directed those drones did not stay in Yemen. It came home, as these things always do.
What the Film Is, and What It Does
Let’s be fair to Mercy as a film before we autopsy it as ideology.
Bekmambetov is the pioneer of what critics call the screenlife format—films told entirely through device interfaces: phone cameras, laptop screens, drone feeds, doorbell cams, and surveillance footage. His earlier screenlife films, Profile, which he directed, and Searching, which he produced, used the format well. Here, the trial unfolds in real time across ninety minutes of screen-mediated evidence gathering, and at its best it does generate dread. The setup has genuine dramatic promise: a detective strapped to a chair, his guilt probability ticking upward, the city’s Municipal Cloud giving him access to every camera and database in Los Angeles to mount his own defense.
Rebecca Ferguson is doing something intriguing in a role that gives her almost nothing physical to work with. She is a talking head, a chatbot with cheekbones—and yet when Maddox glitches, recalibrates, and begins to improvise beyond her programming, Ferguson makes you feel the uncanny wrongness of it. Her performance is a glimpse of what the movie could have been. Pratt, on the other hand, is a problem. His physicality drives his best performances, and a ninety-minute close-up of a man arguing with a screen tests the limits of his range. He tries hard. You can see him trying. The flashback structure is jangly, the pacing uneven, and the confined setting, which should generate claustrophobia, instead produces a theatrical flatness that no amount of drone footage compensates for.
The film’s visual strategy—an endless barrage of private phone clips, police drone footage, restaurant cameras, street corner cameras, and a neighbor’s bird cam—is simultaneously its greatest strength and its deepest ideological tell. What we get is a surveillance state rendered as cinematic grammar, shot by shot, training the audience to find it not threatening but useful. Every camera that catches a clue is a camera we are glad exists. Every database that Maddox mines to exonerate Raven is one we are implicitly invited to approve. The Municipal Cloud saves an innocent man (and a cop). Critics savaged the film: Rotten Tomatoes settled at 25%, with a consensus deeming it tedious to the point of tears. Audiences were far more forgiving. Some of that generosity is earned. The film moves.
What it does not do is think. The Straits Times came closest to naming the failure: despite a near future that is patently the dystopia of today, the movie gives no serious thought to the ethics of its own premise. The final scene delivers the thesis: AI makes mistakes, but humans do too. The Mercy Chair survives. The Municipal Cloud survives. The guilt percentile survives. Only the corrupt cop is removed. The machine is exonerated. Somewhere in a server farm, the real Judge Maddox is taking notes.
The Technology Is Already Here, and Gaza Tested It
The fusion database that Judge Maddox draws on in Mercy—instant access to every camera feed, every device, every personal record, and every behavioral pattern—has a real-world name. Edward Snowden gave it that name in his memoir, Permanent Record, written from exile in Moscow. The database is a living, constantly updated record of your searches, purchases, location history, associations, and patterns of movement and desire. Everything you have ever fed into a connected device has been collected, stored, and made retrievable. Snowden’s most unsettling point, the one that should haunt anyone who found Raven’s eventual exoneration comforting, is that the record can be deployed retroactively. Something innocent today can be reframed as criminal tomorrow under a law that did not exist when you did it. The permanent record is not just surveillance. It is a standing threat, waiting for the political climate to shift.
When General Paul Nakasone, the former director of the NSA under Trump, was appointed to OpenAI’s board of directors while simultaneously being placed in charge of a national security AI program at Vanderbilt University—where researchers are developing algorithms for voice stress detection and predictive interrogation scripting—Snowden issued a statement from Moscow calling it a willful, calculated betrayal of the rights of every person on Earth. His reasoning was blunt: there is only one reason to put an NSA director on an AI company’s board. The revolving door between intelligence agencies and private tech firms does not swing randomly. It is installed deliberately, and it opens in one direction.
That direction leads to Gaza.
Gaza is the laboratory. Military whistleblowers, investigative journalists, and researchers have documented three AI systems operating there that together constitute a fully functional version of what Mercy imagines as 2029 speculation. Gospel generates bombing targets—buildings, infrastructure—at a rate no human analyst could match. Lavender assigns human targets, building profiles of suspected militants by cross-referencing communications metadata, social associations, location data, and behavioral patterns. It produces a probability score. A percentile of guilt. If the guilt meter ticking on screen during Mercy made your stomach tighten, that response was not misplaced.
Where’s Daddy? is the name of the third program. Someone named it that. Someone thought it appropriate for a weapons system. The program locates a male target at home, in a domestic setting, when he is most likely to be present with his family. A drone—silent, unremarkable, the kind you might order online—passes over a residential building. It looks through a window. It identifies a face. It cross-references the face against the Lavender database. If the score meets the threshold, the strike is authorized. There are documented cases: a family sat at dinner when the drone passed. The father’s face matched a profile. The missile came through the ceiling. The family died together at the table.
Whistleblowers who spoke to +972 Magazine documented that the military’s internal calculus permitted up to fifteen or twenty civilian deaths for each low-ranking suspected militant. Gospel operated with even greater latitude. The algorithm does not know that the man it matched through a window was sitting with his children. It has no mechanism for that knowledge. The concept of a family at dinner is not a variable in its equation. This is not a flaw to be corrected. It is the deliberate removal of the kind of contextual human judgment that the Mercy Chair also removes, while performing the theater of due process to make the removal palatable.
American corporate capital built the surveillance backbone of this system. Palantir provides targeting infrastructure. Amazon Web Services provides the cloud. This is not incidental. As I have documented in work published in the Torture Journal on the Gaza necropolis, the surveillance apparatus being deployed against Palestinians is not purely Israeli. It is American surveillance, licensed and exported, and the same companies whose technologies enable drone strikes on families at dinner are the companies building the Municipal Cloud that Mercy presents as the benign infrastructure of future justice. Gaza was the proof of concept. The product is now available. Other governments are buying.
Google’s own record deserves a mention here. The company spent two years secretly developing a censored search engine for China, codenamed Dragonfly, designed to track users’ search histories, identify dissidents, and link that data to their phone numbers. It was not a government program. It was a product built by one of the world’s leading democratic technology companies for sale to one of the world’s leading authoritarian states. The irony is considerable: China, routinely invoked in American political discourse as the surveillance dystopia we are fighting against, was the intended customer for a tool built in California. Dragonfly was shut down only after whistleblowers leaked its existence to The Intercept in 2018, forcing a public reckoning that Google had no intention of initiating itself. The lesson is not that Google was uniquely corrupt. The lesson is that the market for surveillance tools does not distinguish between authoritarian and democratic clients. It distinguishes between paying and non-paying ones. The same logic that built Dragonfly for Beijing built Rekognition for ICE.
Why the Algorithm Cannot Be the Judge, and Whose Values It Would Encode
There is a philosophical tradition that has thought carefully about what justice actually requires, and it reaches a conclusion the film studiously avoids: AI cannot adjudicate cases involving extenuating circumstances without destroying the ethical foundations of justice itself. This is not a technical limitation that better programming will eventually resolve. It is a structural impossibility rooted in what justice is for.
Aristotle, in the Nicomachean Ethics, distinguished between legal justice and epieikeia—equity, the corrective principle that steps in when applying the rule strictly would produce injustice. The law must be general, he argued, but particular cases are always singular. A just judge must be capable of bending the rule when applying it without imagination or compassion would lead somewhere monstrous. This is not a loophole in justice. It is its moral core. It is why we distinguish between premeditated murder, crimes of passion, accidents, coercion, and self-defense—the outcome may be identical, but the moral weight of the act differs enormously depending on who did it, why, under what conditions, and what alternatives were available.
Consider the cases that fall outside the algorithm’s clean boundaries. A family living under criminal occupation of their neighborhood is told to assist in a crime or watch their children die. They assist. Someone is hurt. The rule-based system issues a verdict. A human judge might hear what actually happened. A teenager who, after years of domestic violence, kills his abuser. A refugee coerced into drug smuggling by threats against his family. A woman who steals food for a child the welfare system has abandoned. These are not edge cases. They fill courts. They are precisely the cases where justice either earns its name or forfeits it—and they are the cases the Mercy Chair cannot hear, because hearing them requires a moral imagination that no probability score can replicate.
Now add the problem the film never touches, because touching it would unravel the whole premise: whose values does the algorithm encode? We live in a multicultural world of contested values, different conceptions of justice, different histories, and different relationships to state authority. Not every society has chosen liberal democracy. Not every community carries the same relationship to property, to violence, to family obligations. The legal traditions of the Global South, indigenous communities, and non-Western civilizations differ profoundly from the Anglo-American common law tradition that built COMPAS (the recidivism risk-scoring tool already used in American courtrooms), trained its data sets, and defined its risk categories.
When COMPAS assigns a high risk score to a Black defendant, it is applying the accumulated bias of a system that has over-policed, over-arrested, and over-convicted Black Americans for generations—and calling that history data. The algorithm learned from the record. The record is not neutral. It is the sediment of racist enforcement, and the algorithm mines that sediment and presents what it finds as objective risk. Judge Maddox, drawing on the Municipal Cloud of 2029 Los Angeles, faces the same problem, amplified to the scale of a whole city’s permanent record.
Emmanuel Levinas argued that the ethical demand arises from encountering the face of the Other—the irreducible human presence that claims us prior to any calculation. A screen is not a face. A database is not an encounter. The percentile is not a verdict. Ronald Dworkin conceived of law not as a rulebook but as an evolving moral narrative—judges as moral authors, interpreting law in light of dignity, fairness, and coherence. To replace that authorship with a probability score is not progress. It is the surrender of the most difficult and most essential human responsibility, dressed up as efficiency.
Mercy gestures at all of this. But the film’s corrupt cop is a release valve. Remove him, the film says, and the machine works. This is its most fundamental lie. The machine does not need a corrupt cop to produce injustice. It needs only to be trusted with an authority it is philosophically incapable of exercising.
The Predatory Gaze, the White Room, and the New Face of Torture
There is a dimension to what is being built that goes beyond wrongful conviction or biased sentencing. In work I have published and am developing further for the Torture Journal, I make a claim that sounds extreme until you sit with it: surveillance itself, in its current militarized form, is a species of torture. Not metaphorically. Functionally.
The surveillance deployed against Palestinians in Gaza and the West Bank was not passive observation. It was predatory watching—a gaze with a face that wanted you dead and was waiting for justification. Every Palestinian who knew the drones were overhead, who knew their phone was being monitored, who knew their face had been scanned and their movements logged, lived inside a condition of anticipated violence. The permanent record was building against them. The algorithm was scoring them. The strike could come at any moment, to them or to someone beside them, perhaps at dinner. That is not surveillance in the liberal sense—the uncomfortable feeling of being watched. It is an ongoing psychological assault, the condition of a prisoner in solitary confinement who does not know if today is the day they are taken out and hurt.
What is being created now—brain-computer interfaces combined with large language models, databases that include neural data, and AI systems that analyze and exploit people’s weaknesses in real time—marks the final stage of what started with the disposition matrix and was tested in Gaza. The goal is not only to kill. The goal is to confine. The goal is to imprison an individual within their own mind, using a system that knows nearly everything about them and is under no obligation to stop. Think of what was done to detainees at Guantanamo: kept in white rooms, under white light, with white noise, interacting only with the machinery of control—no human face, no human response, the chatbot as the interrogator, and the white room as the permanent condition of the dissenting mind. The NSA-Vanderbilt-OpenAI nexus is quietly advancing the next generation of this, classified behind academic research programs and corporate governance structures designed to deflect accountability.
The film’s ninety minutes of simulated due process are the retail packaging. The permanent record assembled on every citizen, the total surveillance that makes the trial possible, and the infrastructure that knows your memories, your desires, and your associations before you open your mouth—that is what survives at the end of the film, intact and endorsed. The corrupt cop has been arrested. The system has proven it can correct its errors. Everything continues.
It cannot correct this error. The error is the system.
What Amazon Doesn’t Want You to Think
Mercy was produced by Amazon MGM Studios. Amazon—the company whose AWS cloud underlies a significant portion of the global surveillance economy. Amazon, whose Ring doorbell footage has been shared with police departments across the United States without warrants, under a program Ring describes as a public safety partnership. Amazon, which offered its Rekognition facial recognition software to ICE and police forces until a shareholder rebellion slowed the rollout. Amazon, whose web services power intelligence agency operations under contracts worth billions.
Amazon filmed the dangers of AI surveillance and decided, in the final cut, that the system is fundamentally sound.
This decision should surprise no one. But it should anger everyone.
The Mercy Chair is being built right now. It’s not in Hollywood or 2029; it’s in the defense contracts, police procurement budgets, and fusion center expansions of 2025 and 2026. Gaza was the prototype. The West Bank is an extended trial. Europe and the United States are the intended market. The algorithms that assigned a kill probability to a face through a window at a family dinner in Khan Younis are the direct predecessors of the risk-scoring tools that will determine whether your bail is granted, whether your name appears on a domestic threat list, or whether a drone lingers over your neighborhood. The Trump administration’s moves toward secret lists of domestic threats—algorithmic profiles, permeable definitions of what constitutes a danger to national security, and no public accountability—are not a departure from the Obama-era disposition matrix. They are its domestic application, the homecoming of a system that was always intended to travel.
The permanent record Snowden described is not a future threat. It operates now, and it can be used retroactively. Something you did forty years ago, entirely legal at the time, can be retrieved and reframed once the political climate shifts and someone with database access needs a reason. Its genius—terrible though it is—is that everyone has something in their record that can be made to look like something else. The question of who is prosecuted has never really been about the record. It has always been about who is empowered to interpret it.
The liberal democratic tradition built this. The post-9/11 consent was given willingly, traded away in small increments for a feeling of security, and ratified through bipartisan legislation, court decisions, and the comfortable silence of those who believed the apparatus would never be turned on them. That apparatus is now being handed to people who recognize no restraint at all. The Mercy Chair did not come from nowhere. We built it ourselves, cheering at every step, and the film celebrating its completion was produced by one of its principal contractors.
Coda
The most chilling thing about Mercy is that Detective Raven, having been nearly executed by the system he built, walks away still believing in it. The system needed a tune-up. The percentage needed recalibrating. The chair needed a better operator.
This is the dream the powerful need the rest of us to share: the machine works; the problem is only ever the rogue individual. Total surveillance and algorithmic judgment are neutral tools, available equally to the innocent and the guilty, requiring only the right people in charge.
A father and a son were killed by that dream in Yemen with American missiles, authorized by a president who kept baseball cards on his desk and called the process a matrix. Their names were Anwar and Abdulrahman. A family was killed eating dinner in Gaza by a drone that looked through their window and decided the man at the table had to go. Their names are not on record anywhere that matters. They were data points cleared from the screen.
The Mercy Chair is not coming. It is here. The only question is whether we will sit quietly or refuse.
Mercy (2026) is directed by Timur Bekmambetov, written by Marco van Belle, and stars Chris Pratt and Rebecca Ferguson. It is available on Prime Video. The surveillance state that inspired it is available everywhere, at no additional charge, and enrolling new members daily.