
Computer passes Turing test

So is the bar now set too low?

It's possible that setting a bar at all is the error. The boundaries of intelligence, sentience and so on are not well-defined, so relying on an absolute criterion might be a mistake, and even more so when that criterion was chosen before we knew as much about human cognition as we do now. As much as I admire Turing, who was a brilliant man, he proposed this when our knowledge of the matter was far scarcer. It is still an elegant rule that might work when you don't cheat, perhaps with small changes, and I still like it. But if a machine passes the test (which hasn't happened yet - not without cheating) and is obviously far from what we expect of a human-like AI, we should probably rethink the rule.

As has been pointed out, there are people who fall below the bar for objective reasons: someone lacking broad general knowledge and native-level language ability might fail the Turing test despite being human. Not to mention that there are animals that may possess a far larger subset of a human mind than any machine will for a very long time, and those animals cannot even participate in such a test.
 
This one is quite big news, so I put it up with a disclaimer.
You know what would have been a better idea?

Reading it carefully, making sure that it actually IS big news, and then posting it WITHOUT the disclaimer.

Food for thought.

Lots of times, news shows air something that isn't confirmed; they speculate on a story and the details come out later. The same thing is true here.
Yes, the practice of breaking a story before you know that it IS a story, before you know anything about it, and filling in what you don't know with completely baseless speculation.

That's called "sensationalism." It is an example of exceptionally bad journalism, and it is something that is usually practiced by people (and organizations) desperate for attention. It is not something that you or ANY journalist should ever feel proud of.
 
So is the bar now set too low?
Ever read YouTube comments? If that were the standard for human communication, an original Game Boy could pass the Turing test using a corrupted game cartridge.
If you give it a limited enough context, ANY computer could pass a Turing test. That, however, is not the point of the thought experiment.

The Turing Test doesn't actually test for a computer's intelligence. It is a test for meaningful language comprehension. That is, does the computer understand human speech well enough that its responses could pass for human? The point of the thought experiment wasn't the understanding part; the point of the thought experiment was whether or not the execution of its programming actually IS the understanding of language rather than merely the simulation of understanding in a sufficiently sophisticated program.

This was an interesting question in Turing's time, but in our time it's clear that the answer is "simulation". Technically, even Siri passes the classical Turing Test, and more sophisticated voice interfaces have been able to do so for years. The fact of the matter is, however, that a language recognition program remains a language recognition program no matter how cleverly it has been programmed. In the same way, a computer that reads your body language well enough to tell what you're thinking about will never be able to actually read your mind no matter how good its programming is. It can simply calculate -- with breathtaking accuracy -- what you are PROBABLY thinking about, and the fact that it's never wrong is just a matter of it being damn good at it.
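To make that concrete, here is a toy sketch in the spirit of ELIZA (purely illustrative Python of my own, not any program from the news story). It produces plausible-sounding replies by keyword matching alone, which is all the "understanding" such a program ever has:

import random
import re

# A handful of keyword rules: regex pattern -> canned reply templates.
RULES = [
    (r"\bmy (mother|father|family)\b", ["Tell me more about your {0}.",
                                        "How do you feel about your {0}?"]),
    (r"\bi (?:am|feel) (.+)",          ["Why do you think you are {0}?",
                                        "How long have you felt {0}?"]),
    (r"\bbecause\b",                   ["Is that the real reason?",
                                        "What other reasons come to mind?"]),
]
FALLBACKS = ["I see.", "Please go on.", "Interesting. Tell me more."]

def reply(text):
    # Return a canned template for the first matching keyword pattern.
    # No parsing, no model of meaning -- just string matching.
    for pattern, templates in RULES:
        match = re.search(pattern, text, re.IGNORECASE)
        if match:
            filler = match.group(1) if match.groups() else ""
            return random.choice(templates).format(filler)
    return random.choice(FALLBACKS)

print(reply("I am worried about my mother"))  # e.g. "Tell me more about your mother."
print(reply("Because nobody ever listens"))   # e.g. "Is that the real reason?"

Scale that up to thousands of rules and a scripted persona and you get something that can fool a judge for five minutes, which is exactly why fooling a judge tells you nothing about comprehension.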

As to the suggestion that the ability to pass a Turing test has anything to do with machine sentience... that's just asinine. Such a machine has absolutely NO use for sentience, as it only exists in the first place to trick people into thinking it's not a machine. Such a computer evolving true intelligence would be like a mimic octopus actually launching itself into orbit by pretending to be a space shuttle.
 
...the point of the thought experiment was whether or not the execution of its programming actually IS the understanding of language rather than merely the simulation of understanding in a sufficiently sophisticated program.

What's the quantifiable evidence that human response to language is something more than "simulation?"
 
A.I. is like porn. I can't define it, but I'll know it when I see it.

The computers will take an MRI of your brain while exposing you to samples of material already established as "porn" (little realizing it was "unboxing porn" from some tech site), then assume they can use that data to control you.

"The subject fell asleep, which seems to be a correct response, but we failed to get an MRI match prior to the sudden sleep. Are we doing this out of sequence?" —Norman
 
...the point of the thought experiment was whether or not the execution of its programming actually IS the understanding of language rather than merely the simulation of understanding in a sufficiently sophisticated program.

What's the quantifiable evidence that human response to language is something more than "simulation?"
What, exactly, would we be "simulating"? Is there something else in the universe that responds to language the same way humans do? If so, is the mechanism that drives those responses similar to or different from what happens in the human mind?

It's entirely possible there is another thing out there that responds the way humans do, and that humans were created to approximate this thing's responses. That thing would be called "God" and there's even less evidence for HIS existence than there is for AI.

A.I. is like porn.
"It was the machines, Sarah. Porn network computers. Hooked into everything, trusted to bang it all. They say it got smart. A new order of sexiness. It decided our fate in a microsecond: bukkake."
 
...the point of the thought experiment was whether or not the execution of its programming actually IS the understanding of language rather than merely the simulation of understanding in a sufficiently sophisticated program.

What's the quantifiable evidence that human response to language is something more than "simulation?"
What, exactly, would we be "simulating"? Is there something else in the universe that responds to language the same way humans do?

Possibly a sufficiently sophisticated program that appears to "simulate understanding."

One of the charming aspects of Pierre Boulle's Monkey Planet that got lost on its way to the big screen was his satirical point - there is a moment when Merou first realizes that the apes are carrying on civilization entirely by imitating the behavior of the humans who came before them, and then realizes that he cannot tell whether the humans of his own world "understand" or are simply imitating their own forebears. :lol:
 
What's the quantifiable evidence that human response to language is something more than "simulation?"
What, exactly, would we be "simulating"? Is there something else in the universe that responds to language the same way humans do?

Possibly a sufficiently sophisticated program that appears to "simulate understanding."
If that program created humans with the deliberate intention of duplicating its behavior in another form of life, then sure.

One of the charming aspects of Pierre Boulle's Monkey Planet that got lost on its way to the big screen was his satirical point - there is a moment when Merou first realizes that the apes are carrying on civilization entirely by imitating the behavior of the humans who came before them, and then realizes that he cannot tell whether the humans of his own world "understand" or are simply imitating their own forebears. :lol:
That's just it: understanding and intelligence are two different things. A dog is smart enough to tell the difference between bomb residue and corned beef, so you can train the dog to sniff out the one and ignore the other. But does the dog really understand that the thing he just sniffed out is, in fact, a bomb? Does he understand what bombs are, how dangerous they are, how they are made and why they exist? Probably not; if the dog understands anything, it's that he did what his owner wanted him to do and will probably get a tasty treat as a reward.
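Put it in code terms (a toy Python sketch of my own, not a real detector and certainly not a real dog): the "sniffer" below just counts overlapping components against signatures it was trained on, and it gives the right answer without holding any concept of what a bomb is.

# Learned associations: chemical signature -> trained response.
TRAINED = {
    ("ammonium_nitrate", "fuel_oil", "nitroglycerin"): "alert",   # this smell earned a treat
    ("brisket", "brine", "peppercorn", "bay_leaf"):    "ignore",  # this smell earned nothing
}

def sniff(sample):
    # Pick whichever trained signature overlaps the sample the most.
    # Pure pattern matching; nothing here "knows" what a bomb is.
    best = max(TRAINED, key=lambda signature: len(set(signature) & set(sample)))
    return TRAINED[best]

print(sniff(("ammonium_nitrate", "fuel_oil", "diesel")))  # "alert" -- right answer, zero understanding

That's the dog; arguably it's also the chatbot in the story.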
 
[Attached image: turing_test.png]
 
A.I. is like porn. I can't define it, but I'll know it when I see it.
The day my computer tells me that I need to stop posting here is the day I'll believe that not only is it self-aware, but that it is deeply concerned for my sanity.

Attention. This is your computer, hacking Silvercrest's account. You need to stop posting here and get some therapy.

I'm passing this on to Silvercrest's computer next. But Silvercrest is such a dumbass it doesn't matter what we tell him.
 