0200 - 300 kills - 2021.08.02
Ah, strip #200, time for a fairly big reveal!
Yes, the characters haven't been mentioning it (for the same reason that characters in our time don't go around casually discussing the Geneva Conventions), but international conflict in 2167 is bloodless. You aren't allowed to shoot an enemy soldier in the head, in the same way that you aren't allowed to deploy landmines or pretend to be a medic.
...and now I finally get to talk about Operation Slit Throat.
In 2017, before I drew Panel One of this webcomic, I wrote up a timeline of world history that helped to shape the world of 2167. On that timeline, we see a slow buildup of automation that results in widespread unemployment, the gradual development of the colonies on Mars that would eventually become a powerful independent nation, the transition of the world's power grid to hyper-efficient solar energy, and the side effects of the election of US President Gordon Smith on January 19, 2061.
You see, the United States in 2061, as now, had powerful military bases scattered throughout the world, and they worked using TIARA, the Threat Intel Analysis and Response Algorithm. This highly sophisticated program used Deep Learning (which is programmer-speak for "this built itself, and I'm not entirely certain what's under the hood, but it works") to analyze intercepted enemy communication, pinpoint threats, and launch drone strikes at terrorist targets.
Now, the Powers That Be wanted to make sure they wouldn't be accused of prejudice, so TIARA was specifically designed to be colour-blind. It was hard-coded so that it wouldn't pick targets based on their gender, ethnicity, religion, nation of birth, or level of melanin. And yes, for years, it worked perfectly well and never blew up the wrong person (or, at least, everyone it blew up could be, retroactively, declared to be the right person)... until the millisecond that Gordon Smith put his hand on a Bible and swore to defend the constitution.
Unbeknownst to its programmers, one of the factors that TIARA had taught itself to use in determining whether or not someone could be a terrorist was commonness of surname. (You may have noticed that Muslim communities often have a lot of people whose last names are all variations of "Mohammed".) Thus, when the POTUS changed from Vanderbilt to Smith, a switch flipped. TIARA was now aware of an individual with 1) a common surname, 2) a lot of money and resources, 3) the allegiance of thousands of armed soldiers, 4) many alternate aliases (like "POTUS"), 5) frequent travel, 6) bases of operation around the world, 7) frequent mentions in terrorist chatter, etc, etc, etc.
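If it helps to see the failure mode concretely, here's a toy sketch of how a single learned feature can tip a score over a threshold. Everything here is invented for illustration (the weights, the threshold, the feature names); TIARA is fictional and nothing in the strip specifies how it scored anything:

```python
# Toy illustration: a learned "common surname" feature flipping a threat score.
# All weights, names, and the threshold are invented for this example.

def threat_score(profile):
    # Hypothetical learned weights; note how much "common_surname" carries.
    weights = {
        "common_surname": 0.35,
        "wealth": 0.15,
        "armed_followers": 0.20,
        "aliases": 0.10,
        "frequent_travel": 0.10,
        "global_bases": 0.05,
        "chatter_mentions": 0.05,
    }
    return sum(weights[k] for k, v in profile.items() if v)

THRESHOLD = 0.8

# Same individual profile in every respect except the surname feature.
vanderbilt = {"common_surname": False, "wealth": True, "armed_followers": True,
              "aliases": True, "frequent_travel": True, "global_bases": True,
              "chatter_mentions": True}
smith = dict(vanderbilt, common_surname=True)

print(threat_score(vanderbilt) >= THRESHOLD)  # rare surname: stays below threshold
print(threat_score(smith) >= THRESHOLD)       # common surname: tips over the line
```

The point isn't the arithmetic; it's that a self-taught model can hinge on a feature nobody hard-coded, so the hard-coded fairness constraints (no gender, no religion, no ethnicity) never touch it.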
And yes, of course, when TIARA launches a drone strike, it notifies a human operator, who can immediately countermand it. This is, unfortunately, not useful when the drone strike mission has a travel time of zero seconds.
Thousands of intelligent weapons, finding themselves right on top of a known terrorist's assets, immediately did their job and detonated. In less than fifteen minutes, over ten thousand people lost their lives, and the damage was estimated in the trillions of dollars.
The United States initially tried to blame either Chinese or Russian hackers, but, thanks to some very brave (and tragically short-lived) whistleblowers, indisputable evidence of what actually happened was made public, which led to a contentious summit in South Africa.
The Johannesburg Accord was signed, banning any automated or machine-assisted targeting of humans in warfare. Shortly thereafter, the use of humans in installations or vehicles that were capable of functioning unmanned was also banned, so that nations couldn't use their own people as human shields.
As of this conversation in Lee's apartment, it's been slightly over a century since those rules were put in place, and others were gradually layered on top of them. Caleb's military service was arduous and stressful and risky, yes, but it was never in any way homicidal. As Caleb told Orb back in strip #115, they felt guilty about participating in the international war machine, but never because they were hurting people directly. They felt guilty about the waste of trillions of creds, all going into ever-more-elaborate drones that continually destroy each other, all so that the nations of the world can take turns liberating each other.
Having rules for warfare at all may seem counter-intuitive - after all, in a fight for something truly important, why wouldn't you use every possible tool at your disposal? - but they are important for the simple reason that using a tactic gives your enemy permission to use the same tactic. If you're in a fistfight with someone and you pull a knife, you don't get to bitch when they also pull a knife, because you're the one who turned this from a fistfight into a knife fight.
This is one of many reasons why I (along with millions of others) so vehemently opposed the normalization of "enhanced interrogation" back in the post-9/11 era. Aside from the ineffectiveness and moral depravity of torture, waterboarding captured enemy combatants necessarily excuses your enemies doing the same thing to your own captured troops.
....man, this is a long entry.
Anyway, one of the themes of Forward is that, for all that we may think of Human Rights as somehow inherent and sacred and immutable, we have seen them change over time, and, therefore, should not expect them to remain the same in the coming centuries. Slavery wasn't always regarded as an abomination. Freedom of speech and religion weren't always guaranteed. Who's to say what rights will be considered fundamental and unquestionable, many generations from now?
A world in which soldiers aren't allowed to kill each other may seem far-fetched, but I've always said that if your vision of 2100 isn't as different from 2000 as 2000 was from 1900, then why are you bothering to write sci-fi?