The "is self-awareness a requirement for sentience"
Wait, now, you didn't say self-awareness; you said self-preservation. I don't see those as the same thing, do you?
I think part of the ethical question relates not only to whether you are harming the AI, but also to whether the AI self-perceives harm.
Mmm...by that standard, you could conclude that it's unethical to deny somebody homeopathic nostrums, if he believes in them.
Part of the barbarism was to devolve people to the status of things.
True. And part of the complexity of AI is that, at some point, it elevates things to the status of people.
I find myself OK with a pig in a large healthy pen living with other pigs and eventually being slaughtered, but I am definitely not OK with factory farming.
Yeah. My dividing line is further, but not for any good reason. We almost always buy non-factory-farm chicken. And I won't eat veal, since it's excessively cruel. (Well...I won't contribute to veal being made. When I took a piece of meat at Google lunch a few months ago, and then read the sign—no, I don't know why I did it in that order—I didn't throw out the veal; that wouldn't reduce the suffering. I'm more careful now.)
I've always wondered how an ordinary 3-Laws robot would deal with a situation where it would take too long to decide who to save.
Come to think of it, that was the problem with at least one of the early JGs: it could make moral judgments, but too slowly to be of any use.
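The latency problem both of those examples turn on is what algorithms people call an "anytime" decision: when the deadline arrives, act on the best judgment refined so far rather than deliberating to completion. A minimal sketch in Python, with every name hypothetical and refine_judgment a stand-in for whatever the actual moral evaluation would be:

import time
import random

def refine_judgment(candidates, scores):
    """One increment of deliberation: hypothetically, gather a bit more
    evidence and nudge each candidate's estimated 'who to save' score."""
    for c in candidates:
        scores[c] += random.gauss(0.0, 1.0)  # stand-in for real evaluation

def decide_who_to_save(candidates, deadline_s=0.05):
    """Return the best candidate found before the deadline expires.

    A perfect judge that finishes after the deadline is useless; an
    anytime judge commits to its current best guess when time runs out.
    """
    scores = {c: 0.0 for c in candidates}
    start = time.monotonic()
    rounds = 0
    while time.monotonic() - start < deadline_s:
        refine_judgment(candidates, scores)
        rounds += 1
    best = max(scores, key=scores.get)
    return best, rounds

if __name__ == "__main__":
    choice, rounds = decide_who_to_save(["person_a", "person_b"])
    print(f"After {rounds} refinement rounds, act to save: {choice}")

The design point is simply that the loop's exit condition is the clock, not convergence, so the robot (or early JG) always produces some answer in time to matter, at the cost of it possibly being a worse one.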