Nebulasaurus
2 min read · Jun 17, 2022
The ultimate signature of sentience has NOTHING TO DO with mimicking human communication, nor with any other display of intelligence, for that matter.

The real mystery of sentience is the ABILITY TO WITNESS. It's the ability to notice when you feel good or bad, or warm or cold, or any other of the vast spectrum of experiences we witness via our senses.

Unfortunately, as you say, we don't have a way of detecting this directly. The only sentience we actually know of beyond all doubt is our own. This is why solipsism is a thing.

But most of us are willing to draw an analogy between ourselves and other people, and make a small logical jump to assume that other humans also witness things - and are therefore sentient - like us. And we make a similar jump to assume that our favorite animals also witness things.

And this logic can take us pretty far. Because if dogs and cats are sentient, then presumably, so too, are birds, lizards, and fish. And potentially smaller things, like bugs. And maybe even other things, like plants, fungi, single-celled organisms, or atoms.

It's possible that the witnessed experience of smaller things, like bacteria, is less vibrant and varied than that of bigger things, like humans. But it's hard to know. All we can really do is guess how things might feel, based on how analogous they are to ourselves.

And with respect to all of those things I mentioned above, we have an advantage, in that we share a common lineage. All life forms on Earth are related, and part of the same family tree. And so when we assume similarities between ourselves and other earthly life forms - like that it hurts when we get injured, and it feels good when we eat a nourishing meal - our analogy actually has a good chance of being relevant.

But when we try to draw the same analogies between ourselves and the machines we create, we have a much lower chance of those analogies being relevant. Because we in fact have no idea what would make a machine feel good or bad, despite the words it might type back to us. So even if a machine can have a more complex interaction with us than a bacterium, it is nevertheless less analogous to us than a bacterium, because it is not part of the same family tree.

None of this is to say that an AI couldn't potentially be sentient. It is just to say that the Turing test, which LaMDA clearly passed in this case, is not an adequate, or even a relevant, test to see if it has sentience.
