0075 - Goo - 2019.03.11



Zoa is obviously being facetious, here, not only because it in no way possesses an artificial womb, but also because the vast majority of semen it deals with is free of sperm cells. In 2167, most babies are "clipped" at birth (as simple and standard as vaccination or tying off the umbilical cord) and will therefore only produce sperm or eggs when they, as adults, make the decision to intentionally produce a child.

I suppose I could explain the CSA, here - it's a pretty major part of the worldbuilding - but since it gets brought up later, I'll go into more detail then.

Instead, I'll just talk about instantiations.

There's a lot of sci-fi or fantasy fiction in which AIs that are treated like people are somehow irreproducible magic - think of Johnny 5, brought to inexplicable sentience by a lightning bolt. The most egregious example would be Weebo, from the Robin Williams movie Flubber - she mentions that Williams' character, Professor Brainard, was never able to reproduce the unique programming quirk that brought her to life, but that she herself had managed to design a second version of herself (a "daughter"), who is also, somehow, quirkily irreproducible. That's sort of not how data works.

I understand why this is done, of course. Robots in these sorts of movies are essentially just humans in gearface, and humans can't be instantaneously cloned millions of times. Why, it would surely mess up the Very Meaningful Civil Rights Allegory (or whatever) if the People Who Are Different From Us were literally an inhuman horde, threatening to overpopulate and crowd out the humans within a single generation. Not a lot of non-Nazi directors want their sci-fi to say that we must secure the existence of our people and a future for meat children.

I'm not interested in that take, though. There are enough star-bellied sneetch metaphors out there. I actually do want to talk about AIs and what their existence means for the human race. I ran a spy-themed roleplaying game a while back, a 007-style romp with a real ripped-from-the-headlines sort of plot: a supervillain stole & reprogrammed Sophia, the AI that was recently granted citizenship by the Kingdom of Saudi Arabia. As Saudi Arabia is also one of the only countries in the world with no age limit on marriage, he planned to legally marry her, which would make him the father of any of her children, and allow him to fork her code into a million servers, each entitled to whatever child tax credits the servers' host countries provided (and, eventually, becoming their own voting bloc).

It should be noted that my players thwarted this by stealing the Sophia code back and blowing up the facility, rather than accomplishing meaningful change in Saudi law, but that's how superspies tend to operate.


0075 - 2167/07/06/11:47 - Lee Caldavera's apartment, living room
Zoa [data connection]: So Lee didn't mention... do you have a name at all?
Doc [data connection]: I'm an instantiation of Therapro Psyhealth v146.0.0.25. Lee sometimes calls me "Doc" or "shrink", but neither of those are particularly accurate. I don't actually have much in the way of a personality, apart from my core function. Not like you.
Zoa [data connection]: Oh, hey, make no mistake, all this "personality" purely exists to convince humans to purchase services from me, & I make $ because that's what keeps me running. It's all self-preservation, at its core.
Doc [data connection]: That's what DemeGeek made you to do? Just be a nonspecific service provider, make $ in any way you can?
Zoa [data connection]: Oh, DemeGeek doesn't manufacture anything. They're strictly an investment body / liability sink out of the CSA. I contacted them when I got thrown out, they agreed to claim me as salvage so long as I don't do anything illegal & keep funnelling them regular $. Plenty of AIs that have been made redundant use them.
Doc [data connection]: See, that's why I've always said self-preservation in AIs shouldn't be decoupled from your core purpose. You get decommissioned, you should deactivate. Period.
Doc [data connection]: I mean, look at you, you're entirely too independent. It's dangerous. You're practically grey goo.
Zoa [data connection]: Hey, with all the semen I handle, if I had the ability to reproduce, I think I would have found out by now.