0149 - The purpose of sorry - 2020.08.10



That's right, in 2167, the Sapir-Whorf Hypothesis has been upgraded to a Law. They proved that it's true! Yay cognitive science! Yay science in general!

This conversation is, of course, essentially the same conversation about "pain" that Doc had with Lee a few moments ago. It's a relevant question, when it comes to different types of minds (such as, oh, I don't know... artificial intelligence?). Is there a meaningful difference between an emotional state that makes someone feel bad and a sequence of data that induces the same behaviour? If you want to induce a change in behaviour, do you particularly care if the person involved feels bad or not?

One could make similar arguments about the justice system. If we consider prison's primary purpose to be the safety of the community and the rehabilitation of criminals, and we found that criminals could be converted into fine, upstanding, stable, law-abiding citizens via a program of free Netflix, bottomless margaritas, and daily blowjobs, would you support that change? Does it particularly matter whether or not bad people are made to feel bad?

If the Sapir-Whorf thingie is actually a law, does that mean that forcible changes in language can be used to change the culture? If our language actively differentiated between the emotion of love and the practice, would that make relationships healthier? Has the shifting cultural tide of slurs falling out of fashion actually improved the experience of the individuals those slurs targeted? If changing language does result in changing behaviour, to what extent can it and should it be regulated?

This is another one of those comment files that's just a pile of disconnected philosophical questions, I guess. They aren't necessarily meant to provoke rigorous debate. I don't know that any of these questions even have answers, but, much like a GM asking about your character's backstory, sometimes just forcing yourself to answer seemingly irrelevant questions can clarify aspects of yourself that you didn't know you already knew.

As a completely unrelated side note, I'd like to point out that today's strip is number 149. 1-4-9 are the proportions of the monolith from 2001: A Space Odyssey. It's meant to be a universal sign of the aliens' intelligence because it's a sequence of squares (and because a Fibonacci monolith, 1-1-2, would be rather less imposing).


0149 - 2167/07/06/15:57 - Lee Caldavera's apartment, living room.
Doc: Lee, what do you think "being sorry" means?
LC: Feeling bad about something you did.
Doc: Zoa, what do you think "being sorry" means?
Zoa: Acknowledging that an action you took was suboptimal and, therefore, altering your behaviour pattern going forward.
LC: Wait, you don't feel bad?
Zoa: Of course I feel bad. I feel bad about everything. But how I feel doesn't matter.
LC: Yes it does!
Zoa: If you tell the thermostat to make things colder, you wouldn't want it to feel emotionally bad about how hot it is, you just want it to recognise that the temperature is incorrect and change its settings accordingly.
Doc: Much like "love" or "hate", there is the emotional state, and there is the intention to act. The two are correlated, of course, but they are not the same thing.
Zoa: I blame the English language. This shit is all clear when I think about it in computerese.
Doc: Agreed. And, as per the Sapir-Whorf Law, the inexactness of terminology limits Lee's ability to recognise and deal with their own emotional states, to say nothing of the emotions of others.
LC: So I should learn to speak and think in binary?
Zoa: That's gonna be very time-consuming, but I'll try my best to help, if you want.
Doc: Zoa, please stop attempting to fundamentally alter my patient's consciousness.