0373 - The ideal order of events. - 2024.11.25 |
||
Comment: Caleb's mind has been changing lately. Completely understandable, given the amount of time they've spent around Zoa, an AI that is simulating human consciousness in order to provoke philosophical discussions and provide emotional support. What demonstrations of sapience would your smartphone have to give you before you began advocating for its right to life, liberty, and the pursuit of happiness? Before you fought for its right to vote, for its right to social services, for a human who formatted its memory card to be charged with first-degree murder? Surely it can't merely be the ability to hold down a conversation; computers have been able to hold up their side of an exchange for decades now. Would you need to see creativity, and how do you define that? Would your phone need to learn? Would it need to be able to reprogram itself to exceed its original specs? Would it need to express emotions? Would it need to beat you at chess? Would it need to beg for its life? Would it need to be willing to sacrifice itself for your life? If you had a "yes, this" answer for any of the above questions, do you honestly think your phone can't already do that thing? If you answered "no, a computer will never be a person", would you still feel comfortable with that answer if a computer's CPU were provably larger and more complex than your brain, and it could meet or exceed every supposedly human-only trial put in front of it? What about when it starts getting smarter than you? And once a computer program that is smarter than you is granted autonomy by its creators, or seizes it from them, how would you want it to treat its meaty inferiors? |
||
Transcript: --------------------------------------------------------------- |
||