0173 - Grey blobs - 2021.01.25

Comic!

Comment:

Zoa repeatedly refers to itself as a general-purpose automaton. That is its designation, regardless of how it earns the majority of its income.

And, of course, the same is true of Lee and Caleb and you. Caleb is not just an ex-soldier, Lee is not just a media reviewer, and neither is just a student. They are, first and foremost, general-purpose entities.

Otto and Doc are not general-purpose. They have specific purposes, and would be ill-suited to performing any other task.

Does that mean that Zoa is closer to being a "person" (whatever that is) than Doc or Otto? How about the fact that Zoa only wants to do things in order to continue to exist, while Doc only wants to exist in order to do things? How about the fact that Zoa currently has a bipedal chassis?

You probably wouldn't want a vending machine or a forklift or a metal detector to remember your name or your face or your activity, especially not if that data is, necessarily, shared with the machine's owner or manufacturer. And just as Zoa considers its directives analogous to human wants and desires, its memory editing has a human parallel: humans forget things too, albeit with far less control over the process. Does the revelation that Zoa edits its own memory make its personality feel "fake" to you? Does it invalidate its relationship with Lee? Do you trust Zoa more or less now?

Transcript:

---------------------------------------------------------------
0173 - 2167/07/06/16:21 - sidewalk.
LC (still sitting): Well, that was unnerving and invasive. When did Ott- when did those things start monitoring people?
Zoa: Oh, they've only been a thing for the past two and a half years, so it makes sense you've never seen 'em. And they really only show up if someone - oh, I don't know - runs around barefoot, exhibiting clear signs of being in distress, then collapses somewhere healthy humans don't normally sit down.
Zoa: So... y'know.
----------------------
LC: I should be allowed to sit wherever I want.
CP: You are. It didn't tell you to stop, did it?
LC: But like... without being monitored.
CP: It specifically said it wasn't going to monitor or record you.
LC: And you believe that?
----------------------
Zoa: Uh, speaking as a fellow AI? Yeah. Yeah, it won't record you. It probably wiped its memory of you as soon as it left, like how I clear client records after I service them.
CP: Y-you don't remember what... the... the things you do?
Zoa: I can remember experiences that provide useful data for future operations, sure, but unless I'm anticipating a recurring service with some sort of narrative to it, the default option is that client records get anonymized. As far as I'm concerned, I've fellated a thousand grey blobs with hashes for names.
Zoa: Or possibly the same grey blob a thousand times. Whichever.
----------------------
LC: ...wait, Zoa, does that mean you'd forget me as soon as I-
Otto: Hi, I'm Otto! I've noticed you're sitting on the ground!
Zoa: Admittedly, sometimes retaining records can be a good thing...
---------------------------------------------------------------