(MacLeod is an English writer [he says he's Scottish but there's no difference visible in his work from this side of the Atlantic])
^And a real puff piece.
(MacLeod is an English writer [he says he's Scottish but there's no difference visible in his work from this side of the Atlantic])
The difference will be visible when you call a Scot English to his face: visible in the form of an oncoming fist, usually!
Yeah, well, when it's history and fact, come talk to me about it. Until then it's as likely as the Rapture, which you can also come talk to me about when it's history and fact.
I remember being astounded by my first 50MB hard drive. Then again by my 500MB drive, and my 2GB, 200GB, and 1TB drives over the past 20+ years. In 1995 I could get on the internet and surf websites using Windows 95 and Netscape. Not all that different from how I surf the web today.
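That drive progression is itself a tidy exponential. A back-of-envelope sketch, assuming the 50MB and 1TB endpoints above and a 20-year span (the exact interval is my assumption):

```python
import math

# Rough check of the storage progression mentioned above:
# 50 MB to 1 TB (~1,000,000 MB) over roughly 20 years.
start_mb = 50
end_mb = 1_000_000
years = 20

growth = (end_mb / start_mb) ** (1 / years)      # annual growth factor
doubling_years = math.log(2) / math.log(growth)  # implied doubling time

print(f"annual growth: {(growth - 1) * 100:.0f}%")
print(f"doubling time: {doubling_years:.1f} years")
```

On those assumed endpoints, capacity was roughly doubling every year and a half, which is why each new drive felt astounding even though the trend was steady.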
Not quite. It's basically geography: you don't call a Texan a New Yorker, and vice versa. Want a catch-all term? You have 'British.' For some of us over here, the difference between Scots and English writers is just too subtle.
More like the difference between "American" and "Canadian"
I remember being astounded by my first 50MB hard drive. Then again by my 500MB drive, and my 2GB, 200GB, and 1TB drives over the past 20+ years. In 1995 I could get on the internet and surf websites using Windows 95 and Netscape. Not all that different from how I surf the web today.
There is an astoundingly huge difference between surfing the web today and surfing the web 17 years ago!
When discussing computer technology, things seem to operate in shifts and rises. In other words, progress is roughly linear between major shifts in technology, such as the integrated circuit. I expect we will continue to see linear growth in computing power until the next major shift, be it holographic technology, DNA-based computing, or something nobody has thought of yet.
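The "shifts and rises" picture can be sketched as stacked S-curves: growth flattens within one paradigm and jumps when a new one arrives. The curve shapes and numbers below are entirely made up, purely to illustrate the shape of the idea:

```python
import math

# Toy model: capability within one paradigm follows an S-curve (logistic),
# and each new paradigm restarts the curve at a larger scale.
def logistic(t, midpoint, scale=1.0):
    return scale / (1 + math.exp(-(t - midpoint)))

def capability(t):
    # Three successive (hypothetical) paradigms, each an order of
    # magnitude bigger than the last.
    return (logistic(t, 10, 1)
            + logistic(t, 25, 10)
            + logistic(t, 40, 100))

for t in range(0, 51, 10):
    print(t, round(capability(t), 2))
```

Sampled at any one stretch the curve looks nearly linear or flat, which is the point: whether the long-run envelope reads as "linear with jumps" or "exponential" depends on how far back you stand.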
However, it is difficult to deny that technological change is accelerating. Humans alive today enjoy technology that couldn't be imagined in 1900. Humans in 1900, however, weren't living all that differently, from a technology perspective, than humans in 1800.
Whether the next 100 years will offer the same range of advancements as the last 100 is pure speculation, however.
It's not deniable, facts are facts.
Extrapolation is not "fact". It's well-informed guessing at most. It can be wrong.
RAMA
Yeah, well, when it's history and fact, come talk to me about it. Until then it's as likely as the Rapture, which you can also come talk to me about when it's history and fact.
See, but it can't be history because it hasn't happened yet (should be obvious, right?)... I think that's a weak argument... in fact, all the arguments against a singularity are pretty weak. First, I have already conceded that the ultimate end results are uncertain, but if I postulate accelerating change that leads to "runaway" AI (and this seems to be the thing most people ARE certain about), it then follows that this AI will reach a saturation point. AI experts, and then others, assumed that supplanting human intelligence would mean a change in evolution, where the smarter beings would replace the others.
The most common arguments against the possible singularity:
1. Knee-jerk human centrism, the idea that we are "unique": neither machine-derived nor human-derived AI would be a proper human being, so people reject the notion that it could exist. But we have discovered we are not necessarily unique in almost any respect: planets exist everywhere and many might contain life, and we are no longer the center of the universe.
2. The human brain is too complex: in fact, the greatest era of brain discovery is happening right now as we speak, and it turns out the brain is actually quantifiable. This is in direct contradiction to all we've heard about how complex the brain is. For this I'll refer you here: http://singinst.org/overview/whatisthesingularity/
3. Religion and ethics: only God created man, or only God can create intelligence. Ethically, does creating facsimiles, and then superior intelligences, diminish us? Would creating something that may destroy or bypass us be self-defeating? Everyone has a personal answer here... I reject religious reasoning out of hand. I answer the second question elsewhere in this list. The third is the most valid: we may still have the potential to stop the technology from advancing, but not without a huge, concerted effort, one not likely to happen for many reasons, including economic ones. We MAY be able to program in an Asimovian robotic law, but I doubt that will work with exponential AI.
4. Linear thinking: it takes a tremendous amount of effort to convince the average human that things are not always as we perceive them, i.e. that there are microscopic organisms, that we are made of atoms, that no creator was needed to create the universe, and consequently that our slow perception of time and our limited average lifespan of 80-odd years keep us from seeing the big picture.
5. We are technologically far behind where we would need to be for a singularity: this always assumes a lack of exponential growth, and it always fails on those terms. This is the clearest evidence out there: exponentials end, but only to make way for the next paradigm change. The existence of the meme also makes the prediction somewhat self-fulfilling; there are many creators of this technology, with money behind them, working hard to make it appear now. I also feel that while the math makes the singularity possible before 2050, it may not happen by then, but it is probable before 2100.
6. It's doomsday: only if you assume #1. You can look at it two ways: either advanced human-level AI is a great evolutionary step, or, if we are cast aside through indifference or possibly even war, the machines will be representatives of a past human culture. I don't find either too horrible, really, though the latter is not my preference.
7. There is no infinite growth: you don't need infinite growth for AI that is supra-intelligent relative to man. Exponentials do indeed come to an end, but the numbers we are talking about are more than enough over the next 50 years for the singularity to take place.
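The arithmetic behind point 5 is easy to make concrete. A minimal sketch, assuming a quantity that doubles every 18 months (an illustrative rate, not a measured one), projected from 2012 to 2050:

```python
# Toy arithmetic: how far a steady exponential gets you.
# The 18-month doubling period is an illustrative assumption.
doubling_months = 18
years = 2050 - 2012
doublings = years * 12 / doubling_months  # ~25 doublings in 38 years
factor = 2 ** doublings

print(f"{doublings:.1f} doublings -> growth factor ~{factor:.2e}")
```

A growth factor in the tens of millions is what "the math makes it possible" means here; whether the underlying trend actually holds that long is exactly what point 5's critics dispute.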
See you in 20 years when things haven't changed as much as you think.