Ian Keldon wrote:
^So what? Why SHOULD man throw away his humanity to become "one with the machine"? I have yet to see any Transhumanist demonstrate that his philosophy in any way represents a positive for humanity.
The negative connotation of "throw away" is common and not unexpected, but it is also a mischaracterization. The topic is more complex than a simple man-or-machine choice, as the two converge in transhumanism; it is also distinct from "posthumanism," which may be the position you are actually thinking of.
Firstly, man as he has been biologically for thousands of years is finite, and not capable of dealing with the world's technological advances on an equal level without an "upgrade". As Stephen Hawking has suggested, normal biological evolution has effectively ended; it is now in the hands of directed human intelligence, in the biotech and hardware arenas as well as in software and AI. These elements are already in play, and nothing short of a natural global disaster or worldwide totalitarianism can stop them.

Just as men have applied science and technology to improving our lot in life (fire, electricity, materials science, agriculture, et al.), we are now applying it to our own improvement, possibly eradicating disease, poverty, and hardship...all worthy goals of any endeavor, and hardly lacking in good reasons for transhumanism, as you claim. It goes even deeper, to questions of intelligence itself: human thought and attitudes would likely change if minds could be backed up, exchanged across different substrates, and especially intermingled with humanity as a whole. Would our perceptions change? Would our differences disappear? Would a Borg-like consciousness appear, and would it be a good thing...mutual understanding amongst all? Those questions have yet to be answered, but such outcomes are equally likely in my book.
One of the major reasons to WANT to be transhuman, and beyond, is the likelihood that AI may not be benevolent on its own. We would want to mitigate that possibility by "uplifting" ourselves to an AI state...still human, still humanity, retaining human thought processes, possibly emotions, and desires, as opposed to purely machine AI directives. Any AI that does not contain these is posthuman. The event that might bring about such a clash is predicted, based on the mathematics of accelerating change, to occur within the lifetimes of the younger members of this BB.
As the great short film True Skin shows, this transition will not be without drawbacks: natural humans may well become less desirable at some point, and there will no doubt be a period of destabilization, both in economies and in other human endeavors. It's worth exploring...as the new Captain Power series develops, they could explore this too; the original plot line contains such ideas. I reject the notion that there are only negative outcomes to a "better" human, built using directed reason as a tool.