Discussion in 'Science Fiction & Fantasy' started by RAMA, May 13, 2012.
And we have another flameout!
Care to refute his points though?
Please. Your strategy is to just exhaust people with excessive verbiage. You haven't proven anything other than that you can type a lot.
Yet another milestone in the mapping of the human brain...just saw this today...dated June 1st.
Technology and fear...
I'm not afraid of technology or the future, dude. I welcome them. I wish the Singularity was right around the corner--the idea of being a transhuman sounds badass.
Doesn't mean I'm willing to delude myself into believing it's about to happen, though.
Lots of people subconsciously put the brakes on this, with knee-jerk reactions to the implications of a post-Singularity world... it's the unknown, or our potential lack of a place in the universe, that scares people. Not being able to predict anything after a certain date might scare the willies out of anyone who's into science or science extrapolation! Reading Moravec in the 90s put me in a dark mood for a long time, so I stored it in the back of my head, only to realize there may be ways around the dreaded "AI holocaust". So far only Star Trek (STTMP, possibly some eps of STNG) and The Matrix movies have explored what might happen if everybody didn't die off from a clearly Singularitarian AI! Of course, The Matrix also showed the teething problems before that came to pass (an understatement, I suppose). I'd like to see some braver visual SF out there, just as Brin himself has written about the Singularity.
It's not delusional to extrapolate from the best current evidence; people do it every day on a variety of topics. Just today I read an article by a skeptic who agrees with 6 of 7 points of the Singularity, including that we can't know what happens beyond the point of human-level AI, something he also accepts as a given; he simply disagrees with Ray Kurzweil's version of it. I'd say I don't know what will happen, only that if I accept some elements of it as likely or probable, it leads me to believe others are too. Everything else I say tries to demonstrate why I believe this way.
I don't care one way or the other - I just think that grown-ups should stand up for reality.
The folks in all of this who are self-evidently afraid are the cybernetic totalists who embrace the delusion that they're about to cheat death in some way. In a matter of decades Kurzweil will be extinguished, as his father was - as all of us eventually are. He can't accept the fact of the latter, so why expect him to face the prospect of the former honestly?
Reality by definition means what has happened or is happening now (at least the accepted version of it in the macro world). If that's your only yardstick, then you shouldn't be in the sci-fi business at all.
I expect to live a normal lifespan. I don't expect to reach transhumanist or posthumanist levels. I don't want to bring anyone back. If I did make it to the Singularity I wouldn't expect to know the difference or care much about the past anyway. Kurzweil, probably the most visible popularizer of the meme, is just one part of a much greater whole, which has been going on long before he came on the scene, and will certainly continue after his death.
Well that doesn't sound like cult leader worship or anything...
QFT. Being a realist is not "fear".
Neither is knowing what one is talking about, of course. For the most part, the evangelists don't - their followers certainly don't.
Believers in this nonsense don't realize how blinkered accusing people of being "afraid of" the Singularity is - it's exactly the same as religionists accusing atheists of "being angry at God," completely missing any understanding of what the disagreement is about.
Better hurry up with that there singularity.
I haven't typed a lot. Cut'n'paste isn't typing. It is typical that you guys don't notice these things, though.
It's true that I've typed more than you but then, since I'm actually saying something it takes more words. You really haven't had much more than "computers don't work like this now."
This is entirely beside the point, so your words have still been a waste, exhausting verbiage trying to pass as argument. You don't even know that exponential growth is very slow at first! If you can't get something that simple right, you really have nothing to contribute but "AMEN!" as the preacher thunders out his fire and brimstone condemnations.
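To make the "exponential growth is very slow at first" point concrete, here is a small back-of-the-envelope sketch (the numbers and function are purely illustrative, not a claim about any real technology): a quantity that doubles every period spends most of its run looking nearly flat.

```python
# Illustrative sketch: a doubling process looks flat for most of its run.
# The start value and doubling count are made-up assumptions for demonstration.

def exponential(start, doubling_periods):
    """Value after repeatedly doubling from `start`."""
    return start * 2 ** doubling_periods

target = exponential(1, 30)           # final value after 30 doublings
value_at_halfway = exponential(1, 15) # value after half the elapsed time

# After half the time, only ~0.003% of the final value has accumulated.
fraction = value_at_halfway / target
print(f"{fraction:.6%}")  # prints 0.003052%
```

The same curve that looks like stagnation for fifteen periods produces almost all of its growth in the last few, which is exactly why both sides of this argument can point at the same data.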
You are all class.
Wow, and it's me who doesn't read what you write, huh?
It's not just that computers don't work that way now, it's that there is no reason to believe they will ever work in the way Singularity prophets think they will. They've worked under the same basic principles for decades, pretty much since they were created. Where is the evidence that we are on the verge of some revolutionary new computational technology?
Oh, yeah. I'm just too dumb to know what "exponential growth" is. That's the ticket.
Here's a newsflash: it's not "exponential growth" if a technology remains in relative infancy for decades. The bottom line is, artificial intelligence has been an almost total failure. After half a century of research and investigation, we've had to settle for special-purpose expert systems and refined algorithms. "Generalized AI" remains a fantasy that we seem to be no closer to achieving.
Computing technology itself has indeed shown exponential growth in that time. For some reason, AI hasn't. This may be a difficult fact to accept, but that's just what it is.
None of these predictions take into account future developments, and that includes food shortages, fuel shortages, power shortages, etc. I believe I've offered more than my share of potential ways out of such problems.
It's very human to think that negative outcomes are more likely than positive ones.
Well, I'm also an atheist, so it won't be coming from me.
Still not caring.
Belief in the Singularity is no more or less than an expression of the religious impulse, dressed up in the trappings of technology - it can't be grounded in rational atheism.
Are you just not understanding that the explosive growth we've experienced over the past couple centuries is not sustainable? It depends entirely on the exploitation of finite fossil fuels. Even the most optimistic scenarios tell us that we will have to see dramatic population reductions, drastic cutbacks in consumption (both in energy and physical resources), and a general slowdown of growth. These are not things we will choose, they will simply happen as a consequence of a) running out of fossil fuels and b) getting off of them to halt the damage of climate change. If we don't do b), then a) will happen. It's really a question of whether we want to make a gradual transition to a more sustainable way of living, or hit a brick wall and have to pick up the pieces after a catastrophic collapse.
It is naive to think we will <tech> our way around this. The period of time spanning the Industrial Revolution up to now is unprecedented in human history. We don't know what will happen going forward, but it is inarguable that we are exhausting our planet's limited resources: we are destroying biodiversity, we are (perhaps irreversibly) altering the global climate, running out of fresh water sources, running out of oil, running out of phosphates, etc. etc. We do not have the resources to continue along this path, and the impression I get from Singularity advocates is that they think we will handwave these problems away by reaching the Singularity before our energy and resource problems come to a head.
Nanomachines will fix all of that. Seriously, dude. Take a hit.