I don't see why they can't raise the "optimal balance". You can raise brain performance, but what's to say they can't also control other elements of the AI-human brain: a sort of self-aware safety net within the brain itself (yes, I think I've seen this in SF before) that eliminates by-products of increased performance like schizophrenia? Or who is to say a virtual human/foglet brain simply isn't much hardier than a totally natural or augmented biological brain?
Are you arguing from science or from the desire to believe? Too many people cling to the Singularity as a matter of religious faith -- "the Rapture for geeks," as Ken MacLeod calls it. Science demands healthy skepticism. And in general, the future never turns out the way people expect it to. With so many people today utterly convinced that the Singularity is inevitable, I'm all the more convinced that it won't happen, certainly not the way people expect.
Yes, I've seen that idea, of course. The difference with the Singularity is that there is a lot of data, modeling, and an accurate prediction track record, not simply faith. This is where it separates itself from end-of-the-world cults and past futurists, who were often far more speculative and relied on linear models. Some of the best minds in their fields agree with many of the end results, if not all the specifics of the currently predicted date of the Singularity... I'm fully able to admit the date can vary, but it's not a pie-in-the-sky idea... there's a lot of groundwork. Others also admit the Singularity scenario may come to pass, but not in a positive light. This is also likely, which is why I argue that we need to accelerate as humans even more so.
In terms of details... well, the Singularity might happen, yet many of the details could be off... one technology might be substituted for another. If you are doubting the technology, there are lots of examples of foglet work and AI, and nanotech is now a $2 billion industry... after how many years? Roughly 20 since Engines of Creation.
One thing people are missing... it occurs to me that in a time of accelerating change (which we are demonstrably in), we are going to be able to make more predictions, and better predictions, of the future than we ever have, at least until a Singularity-type breakdown, if it indeed happens.
One of the chief supporters of a positive Singularity answers the skeptics point by point in his book and on his website...
Kurzweil
Finally, regardless of the outcome, the discussion of the Singularity has changed my point of view on both the future of SF and the world. It's no longer enough, most of the time, for me to see mundane ideas of the future with no information technology woven into the fabric of the culture, where staid, conventional, brute-force technologies exist that don't take into account programmable matter and the like. "In Time" was a very good movie to me, but I don't see it as a realistic future in any way; its value lies in its parable. I recall a recent interview with a famous SF writer (I forget who at the moment) who said hard SF literature is in a holding pattern as it takes into account the implications of the Singularity...