0304 - Footsie go up-up - 2023.07.31 |
||
Comment: As a writer and artist, I try to be aware of the biases inherent to my perspective. It's hard, of course - I can only be one person with one mind and one origin, and striving for empathy and cultural literacy only goes so far. I remember, a few years back, seeing a viral video of a soap dispenser that wouldn't dispense to dark-coloured hands. White hands got soap, black hands didn't; the black person grabbed a white paper towel, and the dispenser dispensed onto that. Obvious issue. Now, I don't think that anybody involved in the creation of that soap dispenser was racist. The engineers made it, tested it, and shipped it, no problem. Heck, maybe there were people of colour on the team who just happened to have sufficiently light-coloured palms, and the bug never manifested. I certainly don't think that whoever ordered and installed the soap dispenser was racist; they probably didn't call up the company and say "give me your most racist bathroom accessories, I want the darkies to have filthy hands, mwa ha ha". Nevertheless, a racist system - literally - was installed, and it would be racist, now that the problem has been uncovered, to refuse to remedy the situation.

As I write this, we're seeing more and more "AI"* being used to generate text and pictures and music and video. Some of it is good, some of it is bad, some of it is funny, but all of it pulls its training data from people, which means that there will always be the possibility of bias. I think it'll be important, in the coming decades, to work to a) ferret out that bias, and b) correct it where it inevitably appears (ideally, without placing undue blame or shame on those who inadvertently propagate it). I have a soft spot for tabula rasa protagonists, but truly blank slates are impossible, both in people and in machines. Every synapse in your skull and every circuit in your phone is what it is because of the chain of causality that led to it, and somewhere in that chain, one type of people was unfair to another type of people. So it goes.

*It goes without saying that much of the current hullabaloo is around programs that are "intelligent" in the same way that a slime mold or a Markov chain is "intelligent". Which is to say, when you have a problem with an "AI", the problem is actually with the human using it, usually because they're using it to skirt a law and make money. That's an important distinction, but the day will come when it no longer holds, and it's worth thinking about and talking about and preparing for now, rather than confidently asserting that the Model T will never replace the horse. |
||
Transcript: --------------------------------------------------------------- |
||