I'm just going to respond to this part:
Exponential growth is very slow at first. The relatively belated appearance of extraordinarily rapid growth in a brief period is precisely the disjunction that prompts the name "Singularity." The only real basic criticism of the AI aspect of the Singularity lies in demonstrating some real reason why exponential increases in computing power cannot by brute calculation produce AI. That's the problem for the critics here, they can't make such an argument. They argue only from personal incredulity.
Brute calculation will not produce AI. It simply won't. How do I know this? For one thing, I know how computers work, from the software level down to the microscopic electronic hardware. And why should I have to prove that you can't brute-force AI? Isn't the onus on those who say you can?
In any case, computers just don't work that way. An extremely powerful computer won't magically do things that a less powerful computer can't--it will just do them faster. That's all.
We can write algorithms that are very good at handling certain complex tasks--things like natural language processing and map routing. But today's computers are no good at devising novel solutions to problems they weren't programmed for, which is exactly what one would expect a genuine AI to be capable of.
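Map routing illustrates the point: it's handled by well-understood shortest-path algorithms such as Dijkstra's, which grind out optimal routes mechanically, with no insight involved. A minimal sketch (the toy road network and names are my own invention):

```python
import heapq

def dijkstra(graph, start):
    """Shortest-path distances from start in a weighted graph.
    graph maps each node to a list of (neighbor, weight) pairs."""
    dist = {start: 0}
    heap = [(0, start)]  # priority queue of (distance, node)
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry; a shorter path was already found
        for nbr, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return dist

# Toy road network: edge weights are travel costs.
roads = {
    "A": [("B", 1), ("C", 4)],
    "B": [("C", 2), ("D", 5)],
    "C": [("D", 1)],
}
print(dijkstra(roads, "A"))  # {'A': 0, 'B': 1, 'C': 3, 'D': 4}
```

A faster machine runs this same procedure over a bigger map in less time, but it never does anything the algorithm doesn't already specify.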
Making computers faster and more powerful does exactly that and nothing more. The assumption that it will somehow produce AI through brute power is nonsensical.