Discussion in 'Science Fiction & Fantasy' started by RAMA, May 13, 2012.
^And a real puff piece.
The difference will be visible when you call a Scot English to his face: visible in the form of an oncoming fist, usually!
It's not very rigorous, and don't expect it to be; it's a short article in Time. However, it can make a good entry point for the meme, since after all it's likely to be one of the two most important events in human history.
Existence comes out on June 19th, right before my birthday; I've reserved my copy.
Yes...I still buy real books even though I have an e-book reader.
Yeah, well, when it's history and fact, come talk to me about it. Until then it's as likely as the Rapture, which you can also come talk to me about when it's history and fact.
In person, seeing the differences between English and Scottish wouldn't leave time for making claims about the differences, or lack of them. Re MacLeod in particular, I must say that I'm having trouble remembering whether I'm reading him or Charles Stross. And trouble sometimes distinguishing Paul McAuley and Geoff Ryman, too. I'm not a literary person (obviously) and mostly talk about the obvious stuff that people like to overlook. For some writers, the difference between Scots and English is too subtle for some of us over here.
See, but it can't be history because it hasn't happened yet (should be obvious, right?)... I think that's a weak argument. In fact, all the arguments against a singularity are pretty weak. First, while I have already conceded that the ultimate end results are uncertain, if I postulate accelerated change that leads to "runaway" AI, and this seems to be the thing most people ARE certain about, it then follows that this AI will reach a saturation point. AI experts, and then others, assumed that supplanting human intelligence would mean a change in evolution where the smarter beings would replace the others.
The most common arguments against the possible singularity:
1. Knee-jerk human centrism, we are "unique": Neither machine- nor human-derived AI is a proper human being, so people must reject the notion that they could exist. We have also discovered that we are not necessarily unique compared to the natural world in almost every human endeavor... planets exist everywhere and many might contain life. We are also no longer the center of the universe.
2. The human brain is too complex: In fact, the greatest era of brain discovery is happening right now as we speak, and it turns out the brain is actually quantifiable. This is in direct contradiction to all we've heard about how complex the brain is. For this I'll refer you here: http://singinst.org/overview/whatisthesingularity/
3. Religion, ethics: Only God created man, or can create intelligence. Ethically, does creating facsimiles, then superior intelligence, diminish us? Will creating something that may destroy or bypass us be self-defeating? Everyone has a personal answer here... I reject religious reasoning out of hand. I answer the second question elsewhere in the list. The third is the most valid: we may still have the potential to stop the technology from advancing, but not without a huge, concerted effort, one not likely to happen for many reasons, including economic ones. We MAY be able to program in an Asimovian robotic law, but I doubt that will work with exponential AI.
4. Linear thinking: It takes a tremendous amount of effort to convince the average human that things are not always as we perceive them, i.e., that there are microscopic organisms, that we are made of atoms, that no creator was needed to create the universe, and consequently that our slow perception of time and our limited, 80-odd-year average lifespan keep us from seeing the big picture.
5. We are far behind, technologically, where we would need to be for a singularity: This always assumes a lack of exponential growth, and always fails on those terms. This is the clearest evidence out there... exponentials end, but only to make way for the next paradigm change. The existence of the meme also makes the prediction self-fulfilling: there are many creators of this technology, and money to back it, working hard to make it appear now. I also feel that while the math makes the singularity possible before 2050, it may not happen by then, but it is probable before 2100.
6. It's doomsday: Only if you assume #1. You can look at it two ways, that advanced human AI is a great evolutionary step, or, if we are cast aside through indifference, or possibly even war, then the machines will be representatives of a past human culture. I don't find either too horrible really, though the latter is not my preference.
7. There is no infinite growth: You don't need infinite growth for AI supra-intelligent to man. Exponentials do indeed come to an end, but the numbers we are talking about are more than enough over the next 50 years for the singularity to take place.
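The arithmetic behind point 7 is easy to sketch. Here's a minimal back-of-the-envelope version, assuming a Moore's-law-style doubling period; the two-year figure is an assumption for illustration, not a law of nature:

```python
# Rough arithmetic behind the "exponentials are enough" claim:
# assume computing capability doubles every 2 years (a contested,
# Moore's-law-style assumption) and see how far 50 years gets you.

def growth_factor(years, doubling_period_years=2.0):
    """Total multiplier after `years` of steady doubling."""
    return 2 ** (years / doubling_period_years)

# 50 years at a 2-year doubling period is 25 doublings,
# i.e. a factor of 2**25, roughly 33.5 million.
print(f"{growth_factor(50):,.0f}x")
```

Even if the doubling period stretches to three or four years, the multiplier over half a century is still in the thousands, which is the point being argued: the curve doesn't need to run forever, just long enough.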
^Sigh. Like I said, when it's history and fact, come talk to me.
Are you talking Lawnmower Man 3 here? Not that I understood 2 at all. Or Demon Seed/Colossus: The Forbin Project? Or Frankenstein? Plus we may be due for another backwards step via a world war to ultimately perfect man for sure. There may be higher powers on Earth preventing it from happening as well - like time travellers or women in general - a hot flash for a woman president might be akin to a thermonuclear explosion.
There is an astoundingly huge difference between surfing the web today and surfing the web 17 years ago!
When discussing computer technology, things seem to operate in shifts and rises. In other words, technology is linear until a major shift in technology, such as the integrated circuit. I expect we will continue to see linear growth in computing power until the next major shift, be it holographic technology, DNA-based computing, or something nobody has thought of.
However, it is difficult to deny that change in technology is accelerating. Humans alive today enjoy technology that couldn't be imagined in 1900. Humans in 1900, however, weren't living all that differently, from a technology perspective, than humans in 1800.
Whether the next 100 years will offer the same range of advancements as the last 100 is pure speculation, however.
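The "shifts and rises" picture, steady growth within a paradigm punctuated by jumps like the integrated circuit, is often modeled as a stack of S-curves whose combined envelope can look exponential. A toy sketch of that idea, with all paradigm names and numbers invented purely for illustration:

```python
import math

def logistic(t, midpoint, ceiling, steepness=1.0):
    """One technology paradigm: slow start, rapid rise, saturation."""
    return ceiling / (1 + math.exp(-steepness * (t - midpoint)))

def capability(t):
    # Three hypothetical successive paradigms (think vacuum tubes,
    # transistors, integrated circuits); each new one saturates at a
    # ~10x higher ceiling. Parameters are illustrative only.
    paradigms = [(5, 1.0), (15, 10.0), (25, 100.0)]
    return sum(logistic(t, mid, cap) for mid, cap in paradigms)

for t in (0, 10, 20, 30, 40):
    print(f"t={t:2d}: {capability(t):8.2f}")
```

Each individual curve flattens out, which matches the "linear until a major shift" observation, yet as long as a new paradigm keeps arriving, the overall trajectory keeps climbing. Whether the next paradigm actually arrives is, as the post says, pure speculation.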
Ben Bova's novel 'Star Brothers' was about nanotechnology vastly improving humans and creating a new and better breed, bringing about an alien visitation. Sounds like a euphoric rapture. Certainly different things are gonna happen when you increase man's capacity for thought, creatively and otherwise, and a machine's ability to be more human-like and interfacing. Control is certainly the key issue. How does one control creativity? It seems the more one tries to control it, the more creative it will become. The only limit would be God herself, as she does have the power to turn wise men into fools or the reverse very quickly. Considering she has every hair on your head numbered and all. Very hard to outwit God. Now creating a soulless Antichrist sounds a lot more likely. Like the fear was in creating a test tube baby or what not. Maybe creativity itself comes from invisible aliens?
It's basically geography. You don't call a Texan a New Yorker and vice versa. Want a catch-all term, you have 'British.'
More like the difference between "American" and "Canadian"
Texas and New York are both states of America with their own government, flag, stereotypes etc. but are still also both part of a single country, same currency, etc.
The United States and Canada are two different countries. Comparison for that would more likely be the United Kingdom and the Republic of Ireland.
Texas HAS no comparables...
The Brin Existence page is in full effect now...extended trailer with artwork for the novel.
David Brin excerpt asks how we'll make our machine AI behave...
It's not deniable; facts are facts. People wouldn't recognize how we live now from 1970. Culture, events, and technology are so different. The difference is magnified against 1950s-era technology. It used to be that people could ignore technology; now it's extremely hard to do and requires effort. Even so, what humans perceive in our limited way doesn't even scrape the surface of the real change happening underneath. It's as I said: it's not about something as pedestrian as flying cars, it's about the info technology!
Edit: This brought back memories of me on the internet in the early '90s, and my own first PC... upgraded from IE2 to 3, the simple pages loaded slowly, RealAudio was a pain in the ass... to think I now STREAM VIDEO to all my video-playing screens all over the house seems unimaginable!
Extrapolation is not "fact". It's well informed guessing at most. It can be wrong.
So, in the '90s, I would turn on my computer and get online. Go to Start or an icon on my desktop and open a browser to surf. Content was generated by HTML that was either static or generated by PHP/Java/Flash. I would browse information sites and social sites and download media. And today? I do it faster. Yep, big difference.
Now, go back another ten years and you see a huge difference. No browsers, no widespread internet access, BBS's with local phone numbers rule the day. Win3.1 if you're lucky.
It's so nice when people make things into individual bullet points I can address/rebut.
That's certainly a strawman. Humans are "unique" in the sense that we don't know of any species like ourselves, but not unique in the sense something like us couldn't exist elsewhere or be created artificially.
It's not that the human brain is too complex, it's that it works in a totally different way from all our modern computing technology. You may think this is a minor hurdle, but it isn't. It takes exponentially more digital computing power to simulate an analog brain than a genuine analog brain requires. They work completely differently. It's not a question of complexity, but mechanics.
Come on, those kinds of arguments are so silly as to not even need rebutting. But I'm noticing a lot of your points are really just strawmen so you can make it look like there's no credible criticism of the Singularity hypothesis.
It didn't take a lot of effort to convince me. Another strawman.
Where's the exponential growth in AI? I rest my case.
The bottom line is, the Singularity as described requires strong AI. It doesn't exist. It's been "just around the corner" for decades. Instead, we've only been able to come up with expert systems, nothing we'd call a self-aware intelligence. We don't even know how to do this, because we don't know how human consciousness works.
It's not a question of how powerful our computers are. There seems to be this assumption that, if we simply make a computer powerful enough and feed it tons of information, it will become self-aware and intelligent. There is no reason to believe this. Computers are inherently deterministic. They don't just magically do things without being told to, unless they have flawed hardware/software.
Now, you could make the more existential argument that a facsimile of human consciousness is indistinguishable from the real thing, but in that case you should get back to us when you've seen one.
That sounds like a pretty fringe notion.
Who argued for infinite growth? "Infinite growth" is an oxymoron, anyway. We live in a universe of physical limits, "infinite" anything is physically impossible. Again, a strawman.
I realize there are many people who are in love with the idea that we'll all be Singularity transhumans within our lifetime, but there is no good reason to believe this. While our materials technology and chemistry are highly advanced, and I have no doubt medical technology will totally blow us away in the decades to come, our computing technology remains more or less unchanged since its inception. We still use binary digital systems with processors made up of transistors and volatile storage for memory. We've improved the scale immensely, but the basic mode of operation is the same. And building AI using this technology, in the sense most people think of it, has been a research dead-end since at least the '60s.
I have no doubt we will have some awesome technology in the decades to come, but strong AI? Without some major breakthrough in computing technology, it's not happening in our lifetimes. And as far as I understand it, the Singularity hinges very much on the existence of strong AI.
Nowhere is this more true than in medical science. Miracle cures for cancer and limb regrowth have been "just around the corner" since I was first old enough to read.
Artificial intelligence researchers agree less and less even on what it is they're trying to accomplish or create.
Most proponents of the idea that people might one day transfer themselves into computers don't even seem to ask intelligent questions about what people are - and the questions aren't even new ones. Kurzweil in particular often seems driven by father issues rather than imagination, much less science. In fact the prophets of the Singularity behave as if they've noticed nothing about the world except those facets of technology that fascinate them - least of all economics.