It's so nice when people make things into individual bullet points I can address/rebut.
Yeah, well, when it's history and fact come talk to me about it. Until then it's as likely as the Rapture, which you can also come talk to me about when it's history and fact.
See, but it can't be history because it hasn't happened yet (should be obvious, right?). I think that's a weak argument... in fact, all the arguments against a singularity are pretty weak. First, while I've already conceded the ultimate end results are uncertain, if I postulate accelerated change that leads to "runaway" AI, and this seems to be the thing most people ARE certain about, it then follows that this AI will reach a saturation point. AI experts, and then others, assumed that supplanting human intelligence would mean a change in evolution where the smarter beings would replace the others.
The most common arguments against the possible singularity:
1. Knee-jerk human centrism, we are "unique": Neither machine-derived nor human-derived AI would be a proper human being, so people must reject the notion that it could exist. But we have discovered we are not necessarily unique in almost any respect compared to the natural world... planets exist everywhere and many might contain life. We are also no longer the center of the universe.
That's certainly a strawman. Humans are "unique" in the sense that we don't know of any species like ourselves, but not unique in the sense that something like us couldn't exist elsewhere or be created artificially.
It's not that the human brain is too complex, it's that it works in a totally different way from all our modern computing technology. You may think this is a minor hurdle, but it isn't. It takes exponentially more digital computing power to simulate an analog brain than a genuine analog brain requires. They work completely differently. It's not a question of complexity, but mechanics.
Come on, those kinds of arguments are so silly as to not even need rebutting. But I'm noticing a lot of your points are really just strawmen so you can make it look like there's no credible criticism of the Singularity hypothesis.
It didn't take a lot of effort to convince me. Another strawman.
Where's the exponential growth in AI? I rest my case.
The bottom line is, the Singularity as described requires strong AI. It doesn't exist. It's been "just around the corner" for decades. Instead, we've only been able to come up with expert systems, nothing we'd call a self-aware intelligence. We don't even know how to do this, because we don't know how human consciousness works.
It's not a question of how powerful our computers are. There seems to be this assumption that, if we simply make a computer powerful enough and feed it tons of information, it will become self-aware and intelligent. There is no reason to believe this. Computers are inherently deterministic. They don't just magically do things without being told to, unless they have flawed hardware/software.
Now, you could make the more existential argument that a facsimile of human consciousness is indistinguishable from the real thing, but in that case you should get back to us when you've seen one.
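The determinism point above is easy to demonstrate concretely. A minimal sketch (the function name is just for illustration): even a computer's "randomness" is seeded internal state, so the same program with the same inputs yields the same outputs on every run.

```python
import random

def deterministic_run(seed):
    """Same program, same seed, same inputs -> same outputs, every time."""
    rng = random.Random(seed)  # even "randomness" is just seeded state
    return [rng.randint(0, 100) for _ in range(5)]

# Two independent runs with the same seed are identical.
print(deterministic_run(42) == deterministic_run(42))  # True
# Change the input and the output changes predictably, not spontaneously.
print(deterministic_run(42) == deterministic_run(43))  # almost certainly False
```

Nothing in the machine "decides" to do otherwise; unexpected behavior traces back to a changed input, or to flawed hardware or software.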
6. It's doomsday: Only if you assume #1. You can look at it two ways: that advanced human-level AI is a great evolutionary step, or, if we are cast aside through indifference or possibly even war, that the machines will be representatives of a past human culture. I don't find either too horrible really, though the latter is not my preference.
That sounds like a pretty fringe notion.
7. There is no infinite growth: You don't need infinite growth to get AI supra-intelligent to man. Exponentials do indeed come to an end, but the numbers we are talking about are more than enough over the next 50 years for the singularity to take place.
Who argued for infinite growth? "Infinite growth" is an oxymoron, anyway. We live in a universe of physical limits, "infinite" anything is physically impossible. Again, a strawman.
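For what it's worth, the arithmetic behind "bounded exponentials are still enough" is simple to check. A back-of-the-envelope sketch, where the 2-year doubling period is purely an illustrative assumption (not a prediction about any real technology):

```python
# Exponential growth over a finite window, no infinities required.
years = 50
doubling_period = 2              # illustrative assumption only
doublings = years // doubling_period   # 25 doublings in 50 years
growth_factor = 2 ** doublings         # 2**25 = 33,554,432
print(f"{doublings} doublings -> roughly {growth_factor:,}x growth")
```

A finite run of doublings compounds to a roughly 33-million-fold increase; whether that increase ever translates into strong AI is, of course, the actual point of dispute here.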
I realize there are many people who are in love with the idea that we'll all be Singularity transhumans within our lifetime, but there is no good reason to believe this. While our materials technology and chemistry are highly advanced, and I have no doubt medical technology will totally blow us away in the decades to come, our computing technology remains more or less unchanged since its inception. We still use binary digital systems with processors made up of transistors and volatile storage for memory. We've improved the scale immensely, but the basic mode of operation is the same. And building AI using this technology, in the sense most people think of it, has been a research dead-end since at least the '60s.
I have no doubt we will have some awesome technology in the decades to come, but strong AI? Without some major breakthrough in computing technology, it's not happening in our lifetimes. And as far as I understand it, the Singularity hinges very much on the existence of strong AI.