When Justice Becomes Automatic: The Warning Behind Mercy
Technological systems increasingly make decisions once reserved for human judgment. From warfare to administration, speed and automation promise efficiency, but they also shift responsibility away from human reasoning.
Paul Scharre, in Army of None, explains why this shift matters. Machines can accelerate decisions, but they cannot understand them. The danger is not intelligence in machines — it is responsibility leaving humans. Machines apply rules. Humans interpret meaning.
If removing human judgment creates an ethical gap in warfare, the same concern appears in law. Legal decisions carry irreversible consequences: liberty, property, reputation, sometimes life itself. A justice system therefore does more than determine facts — it legitimizes authority. People accept outcomes not only because they are correct, but because they were heard.
That is why mediation exists.
Mediation does not simply solve disputes; it produces acceptance. Participation transforms a decision into justice. Technology can support this process, but a system that decides without listening risks turning justice into administration.
The film Mercy imagines what happens when that participatory element disappears. In its fictional “Mercy Court,” a centralized artificial intelligence platform acts as investigator, judge, jury, and executioner — evaluating evidence, calculating guilt statistically, and immediately enforcing punishment. There is no dialogue, persuasion, or discretion, only computation.
Yet the accused challenges the system. As its creator, he refuses to accept the numerical certainty of his conviction and begins independently searching for the truth. The system permits him to access records and interactions, and through this process genuine dialogue emerges — not because the machine reasons, but because the human insists on understanding. As he continues questioning assumptions, his calculated guilt percentage steadily decreases until the system declares him innocent and the trial formally ends.
But he does not stop.
Even after his guilt percentage reaches zero, he continues investigating, risking everything to uncover what truly happened. The outcome has been corrected, yet he still seeks an explanation. This distinction matters: justice is not satisfied merely by an accurate result; it demands understanding. Human intuition, driven by doubt, curiosity, and the relentless pursuit of truth, reaches further than programmed certainty. The film suggests that truth is not a percentage to be calculated, but a reality to be discovered.
Participation matters precisely because it is the mechanism through which truth is tested. A system that ends discussion ends correction. When authority becomes unchallengeable, error becomes permanent.
Modern legal systems are intentionally designed around this principle. Investigation, prosecution, defense, and adjudication are separated so decisions remain contestable. The right to challenge authority is not inefficiency — it is legitimacy.
Rwanda’s justice system is moving in a different direction from the one imagined in Mercy. Court-annexed mediation, community mediation, and digital court infrastructure aim to increase efficiency while preserving participation. Technology is used to facilitate communication — scheduling hearings, sharing documents, enabling remote attendance — not to replace reasoning.
Online Dispute Resolution continues this logic. A platform may organize information, but acceptance still depends on explanation. A mediator interprets; an algorithm classifies. The difference determines whether parties comply because they agree — or merely because they must.
The warning behind Mercy is therefore not about rejecting technology. It is about the purpose technology is made to serve.
Courts will digitize. Mediation will move online. These changes are inevitable. The real risk is forgetting what justice is for.