CMV: Torturing a cookie from Black Mirror is as immoral as torturing a human

My argument was against the top-level comment's analogy. I don't think you should change your view at all (this doesn't break Rule 1).

Of course I'm not arguing in bad faith, because the pain the AI could be feeling is far more plausible than a random "what if" scenario.

In fact, this seems to be the default opinion in the White Christmas universe, since the cops leave the device on over the weekend with the express purpose of torturing the cookie. So yes, it's indeed not the same as Pascal's Wager.

But then it would end up being a "benefit (using the device) vs. risk of cost (the risk that the AI is actually sentient)" question, which is not the subject of my CMV.

Actually, my CMV is exactly about that "risk of cost": I'm trying to determine whether the cost is real and to remove the doubt from the "risk" part.

To reframe my argument a bit, I am just trying to point out that the chance of sentience switches the calculus from a utilitarian analysis to a deontological one. If you take the stance that human-like sentient beings are ends in themselves, then any risk of such torture happening should be immediately proscribed.

I suppose it depends on your own emotional makeup. It's like "The Ones Who Walk Away from Omelas": some people think it's okay to torture the kid, others don't, and the difference comes down to fundamental moral disagreements that can't really be reconciled.

I personally think the value of deontological thinking is that it reduces the risk of ending up in a Black Mirror-type world in the first place, whereas I'm sure anyone can eventually cook up a utilitarian justification for a horrific situation.
