First, you're missing the point: most people don't create new ideas, but ABSOLUTELY NO expert system can.
My point was that creativity isn't some mystical apprehension of the occult. Human creativity derives from fairly mundane processes. There is no reason given so far to decree that these processes, or homologues for them, cannot be recreated by nonorganic systems.
Second, expert systems can only reconfigure ideas within narrow parameters.... Creativity may be one aspect of human intelligence we can't emulate, simply because it can't be quantified.
As I've said (twice, I think), you don't have to understand something to model it. But don't feel bad: you're not the first who didn't read closely.
Actually, intuition is very important. Often we don't have enough information to make an informed decision, so we "guess" and hope for the best. You might not think intuition is important, but remember that without it we would be paralyzed. You could emulate intuition with a random number generator, but then it wouldn't be intelligence.
Oh, I think intuition is very important in practice. But I don't think it is intrinsically different from boring, step-by-step thinking. It's "merely" automatic thinking, in which we don't consciously articulate reasons for our conclusions, don't consciously trace the path of thought. I think intuition selects the most probable choice, though, so I don't think a random-number generator would be useful in devising an equivalent to human intelligence.
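The contrast here can be made concrete. A toy sketch (all names and probabilities below are purely illustrative, not anything from this thread): a random-number-generator "intuition" ignores the evidence entirely, while the kind of automatic selection described above picks whichever conclusion the system estimates as most probable, without articulating reasons.

```python
import random

# Toy model: candidate conclusions, each with an estimated (never
# consciously articulated) probability of being right.
# The options and numbers are hypothetical, for illustration only.
candidates = {"take umbrella": 0.7, "leave umbrella": 0.3}

def random_guess(options):
    """Pure RNG 'intuition': ignores the evidence entirely."""
    return random.choice(list(options))

def intuitive_pick(options):
    """Automatic, unarticulated selection of the most probable option."""
    return max(options, key=options.get)

print(intuitive_pick(candidates))  # always the likelier option
print(random_guess(candidates))    # right only by chance
```

The point of the sketch: both functions produce an answer without tracing a path of thought, but only the second one tracks probability, which is why a bare random-number generator seems a poor stand-in for intuition.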
I must say that the notion that a random number generator wouldn't be "human intelligence" suggests an implicit notion that human-level intelligence requires a human-type consciousness. Seriously, since so much human intelligence isn't aware, why is it so necessary to insist on this?
As I said before, "consciousness" can be substituted by "point of view." The supposed illusion of consciousness can somehow enable a person with closed eyes to identify the location of a limb and communicate this to another human. Quite aside from the peculiar ability of an illusion to generate true information, if I had written "The supposed illusion of a point of view can somehow...." you'd have supposed I'd lost my mind. Now, a human needs its neural homunculus, it needs its point of view to regulate its body and navigate the world. Why would a CPU need a point of view?
Possibly the most important practical obstacle to AI in the long run is our desire that programs do only limited things, i.e., the tasks we want done. By our standards, in a world of thousands of expert systems embodied in furniture, houses, and implants, constantly interacting in increasingly complex tasks, programs that started to rewrite themselves to achieve whatever bizarre aims emerged from this chaotic mass would be ghosts in the machine, fit only for exorcism.
PS: The famous Ken MacLeod quote (which keeps getting modified, because it's hard to distinguish nerds, geeks, etc.) is from his first novel, The Star Fraction. It's volume one of his Fall Revolution series, which, so far as I know, is unique in having two separate endings.