Old June 2 2012, 06:09 AM   #82
Rear Admiral
Location: Democratically Liberated America
Re: David Brin's latest novel, and a TED talk

stj wrote:
No, my point was that creativity isn't some mystical apprehension of the occult. Human creativity derives from fairly mundane processes. There is no reason given so far to decree that these processes or homologues for them cannot be recreated by nonorganic systems.
The fact that it's a mundane process is irrelevant. Creativity cannot be quantified and analyzed, therefore it can't be copied. The only way non-organic systems could compete is through sheer brute force, using an untold number of iterations to produce something, but then it wouldn't be intelligence.
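The "untold number of iterations" objection can be made concrete with a toy sketch (hypothetical names; a deliberately naive enumeration, not anyone's actual proposal): exhaustively generating strings until a target phrase appears shows how quickly blind search blows up.

```python
import itertools
import string

def brute_force_phrase(target: str) -> int:
    """Enumerate lowercase strings a, b, ..., aa, ab, ... until the
    target appears. Returns how many candidates were tried -- the
    'untold number of iterations' a brute-force route needs."""
    alphabet = string.ascii_lowercase
    tries = 0
    for length in itertools.count(1):          # lengths 1, 2, 3, ...
        for cand in itertools.product(alphabet, repeat=length):
            tries += 1
            if "".join(cand) == target:
                return tries

# Even a two-letter target takes hundreds of tries; the count grows
# as 26**n with target length n, which is the point of the objection.
```

Each extra letter multiplies the search space by 26, so blind enumeration is hopeless for anything sentence-sized.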

As I've said (twice I think,) you don't have to understand something to model it.
And of course you are completely wrong. Sure, you could emulate a complex system using a black-box approach; people do that all the time to create systems that pass the Turing test. But to do that you need to know all the possible inputs AND outputs, and that's where you fail: creativity has an INFINITE number of possibilities. Then there is also the fact that if you don't understand something, you certainly can't improve it, and that's the whole point of the singularity.
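The black-box approach described above can be caricatured as a finite lookup table of canned responses (the table entries here are hypothetical illustration, not any real chatbot): it works only on inputs enumerated in advance, which is exactly the limitation being argued.

```python
# Caricature of a black-box conversational system: a finite table
# mapping known inputs to canned outputs. Entries are hypothetical.
CANNED_REPLIES = {
    "hello": "Hi there!",
    "how are you?": "Fine, thanks. And you?",
}

def black_box_reply(prompt: str) -> str:
    # Any input not enumerated in advance falls through to a default,
    # exposing the table's finiteness -- the objection in the post.
    return CANNED_REPLIES.get(prompt.lower().strip(), "I don't understand.")
```

A novel prompt never hits the table, so no finite enumeration can cover an open-ended input space.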

Oh, I think intuition is very important in practice. But I don't think it is intrinsically different from boring, step-by-step thinking.
Most problems cannot be broken down into boring, step-by-step thinking. Let's say you have a choice of two ice cream flavors, rocky road and butterscotch. You've never tasted either and you can only choose one. Which would you choose? And no, you can't use some random number generator like flipping a coin. You could do it by some non-linear reasoning; could a machine do it? Now, your other point is that it really isn't important. I disagree: the ability to handle chaos is very important, since there are an infinite number of possibilities and you can't plan for all of them. Sure, you could emulate intuition through a random number generator, but then it wouldn't be intelligence.
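The coin-flip mechanism ruled out above is trivial to write down, which is the point: a sketch like this (flavor names from the post, function name hypothetical) makes a pick without any reasoning about the options at all.

```python
import random

def random_pick(options, seed=None):
    """Pick one option by pure chance -- the mechanism the post argues
    would NOT count as intelligence, since nothing about the options
    themselves (taste, preference, context) influences the outcome."""
    rng = random.Random(seed)  # seedable for reproducibility
    return rng.choice(options)

flavor = random_pick(["rocky road", "butterscotch"], seed=42)
```

Swapping the flavor list for any other list changes nothing about how the choice is made, which is exactly why it models none of the "non-linear reasoning" being described.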

I must say that the notion that a random number generator wouldn't be "human intelligence" suggests an implicit notion that human-level intelligence requires a human-type consciousness. Seriously, since so much human intelligence isn't aware, why is it so necessary to insist on this?
Adaptability. Without these two aspects, a machine intelligence would eventually face a problem it can't solve or a question it can't answer. That's the whole plotline of ST:TMP, by the way.

Possibly the most important practical obstacle in the long run to AI is our desire that programs do only limited things, i.e., the tasks we want done.
I would argue that specialization is a good thing. Look at the difference between a PC (generalist) and a console (specialist). Specialization means fewer resources and therefore less cost. Which raises the question: why would we need a general AI in the first place?
This Space for Rent
Yminale