Nebulasaurus

Perhaps my wording wasn't clear, but I was talking about the ability to, for example, feel pain when you get pricked with a pin. Empathy is when you can intuit that someone else feels pain when they get pricked by a pin. But sentience is when you feel the pain yourself.

And this ability to feel pain (and happiness) is what makes people special. Perhaps not uniquely special, because other animals can do it too. But it's what makes us worth caring about, and being concerned for. It's why we worry about hurting other people and animals, but we don't worry about hurting a rock or a computer.

And that's what's at stake when we talk about AI being sentient. We're talking about whether we should start to care about the well-being of computers - which could mean yielding some human autonomy in favor of a computer's. Which I think would be a bad decision.

As for 'turing-sentience": people get confused, but the turing test is for testing intelligence, not sentience. Intelligence is the property of a system by which it is able to produce novel and adaptive reactions to stimuli. It doesn't necessarily have anything to do with sentience. Intelligence and sentience may frequently go hand-in-hand, but they are not the same thing. We can't directly test for sentience in the same way we can test for intelligence.

You also expressed concern about people's inability to agree with each other. That concern comes up again when you mention the problem of an "accepter".

But the solutions you provide in the third paragraph don't actually solve this problem - because you’d still have to get people to agree to use those metrics.

The real solution is that we don't actually need every single person to agree on anything. How many people do you think would agree with the idea that "a government should forcibly punish or imprison people convicted of murder"?

Probably 99 out of 100, right? But definitely not everyone. And yet we do it anyway, because at some point every society has to accept a certain tyranny of the majority. There's just no way around it sometimes.

So at some level, society has to just make a decision about who gets "accepted" as being sentient. We already do this, in that we protect dogs and cats, but don't do the same with bugs. But we're getting to the point where we have to take a stance on AI as well.
