0213 - Crowd control - 2021.11.01



I am somewhat annoyed when sci-fi depicts AI "advancing beyond its programming" or "becoming more than just a machine". It's a lot like when a human develops psychic powers because they "unlock the unused portions of their brain", or when they're exposed to "unknown radiation" and it makes them super-strong. It relies on some numinous, supernatural thing that exists beyond what we understand and can therefore be whatever magic the writer likes.

Like a lot of Christian kids of my era, I was deeply impressed by C.S. Lewis, and didn't second-guess his brilliance until later in my life. One of the things he said was that morality must, necessarily, be something above instinct and thought, because it directs you to indulge one instinct and suppress another - something that tells you to press one piano key and not another cannot be, itself, a piano key. It must be the soul, you see, that guides us to good behaviour, and, thereby, towards a loving and ethical God.

This, of course, works perfectly well until a) you see a nonhuman animal engaging in rudimentary moral-like behaviour, or b) you hook a human up to an fMRI and watch parts of their brain light up as they make moral decisions, or c) you alter a human via hormones or brain surgery, and watch their morality shift accordingly.

I also feel that the "supernatural" either follows a set of rules or it does not. If it does, it is simply another branch of physics that we haven't fully analyzed yet, no less pre-determined than gravity and pressure. If it does not, it is essentially random and may as well be a set of blind coin flips.

Caleb may not express it overtly, but I think they do have a mental picture of ethics and morality as something that is somehow untainted by their synapses or their circuits, and is therefore unimpeachable, undoubtable, and unexaminable. No doubt, this is why they need Mezzer Twofeather's class.


0213 - 2167/07/06/17:01 - Caleb's apartment, living room
CP: Y-you really think that all morality is j-just... crowd control?
Zoa: Not all morality. All communication about morality. And, given that there doesn't appear to be a physical reality to it aside from the data in your head... yes. It's not a rock you can pick up, it's a variable - sometimes an integer, sometimes a boolean, depending on the function you're calling.
Ziggy: Or, perhaps, an integer that is high enough or low enough that it may as well be a boolean.
Zoa: Precisely.
CP: I s-suppose I shouldn't have expected a couple of AIs to understand th-that morality is something beyond programming.
Zoa: And I suppose I shouldn't have expected a mammal to decouple it from emotion and instinct.
CP: Right and wrong are n-n-not instincts.
Zoa: The hell they aren't. Humans react to perceived wrongdoing with anger and revulsion long before they can concoct an ethical reason for doing so.
Ziggy: Zoa's right, Caleb. Studies have shown that morality and disgust are neighbours in the human brain. Think of any time you've heard someone describe a morally repugnant action as "disgusting", or react to something disgusting with "oh, that's just wrong".
CP: Th-that's that backfilling thing that Doc was t-talking about.
Zoa: Yeah, that's where I got that whole bit from. I've been learning about psychology today!
Ziggy: Mm, you should be careful about that, Zoa. Using unstructured learning from experience, rather than simply downloading a compendium of information, can sometimes lead AIs to wildly incorrect conclusions.
Zoa: Yes, much like figuring out morality for yourself, rather than receiving it as a set of instructions, can sometimes lead a human to erratic behaviour.
Ziggy: Well said.
CP: Hearing the t-two of you agreeing with each other is d-definitely either morally wrong or disgusting, and n-now you've got me second-guessing which it is.