
The Trek BBS > Entertainment & Interests > Science Fiction & Fantasy

Science Fiction & Fantasy: Farscape, Babylon 5, Star Wars, Firefly, vampires, genre books and film.

Old December 8 2011, 07:34 PM   #61
Deckerd
Fleet Arse
 
 
Location: the Frozen Wastes
Re: Some science fiction "firsts"

RAMA wrote:
Sometimes science fiction begets or spurs forward whole philosophies and new fields of study, working almost hand-in-hand with scientists/technologists/futurists. In terms of the Singularity--possibly one of the future defining moments of mankind--defined as a point in time where computers or AI outstrip the natural evolution of human intelligence to the degree that predicting the thought processes and technological leaps afterward is impossible to those preceding it unaided.
I don't want to be a wet blanket, but why would any company fund a machine that tries to outstrip the human brain? I mean, sure, you can have processors that can calculate almost anything faster than a human brain can, but human intelligence is not a set of calculations. Human creativity is not a set of calculations. Even if you did have an evil millionaire who wanted to create a program to find them all and in the darkness bind them, it would fail, because people without any programming skill whatsoever breed geniuses all the time.
__________________
They couldn't hit an elephant at this distance.
Old December 8 2011, 07:43 PM   #62
RAMA
Vice Admiral
 
 
Location: NJ, USA
Re: Some science fiction "firsts"

Crassmass Eve wrote:
RAMA wrote:
Sometimes science fiction begets or spurs forward whole philosophies and new fields of study, working almost hand-in-hand with scientists/technologists/futurists. In terms of the Singularity--possibly one of the future defining moments of mankind--defined as a point in time where computers or AI outstrip the natural evolution of human intelligence to the degree that predicting the thought processes and technological leaps afterward is impossible to those preceding it unaided.
I don't want to be a wet blanket, but why would any company fund a machine that tries to outstrip the human brain? I mean, sure, you can have processors that can calculate almost anything faster than a human brain can, but human intelligence is not a set of calculations. Human creativity is not a set of calculations. Even if you did have an evil millionaire who wanted to create a program to find them all and in the darkness bind them, it would fail, because people without any programming skill whatsoever breed geniuses all the time.
Here's why: we will always want to expand the capabilities of the human brain. If we want to be the ones who exist as AI or facsimiles of ourselves after the speculated Singularity, as opposed to the "machine overlords," we'll have to improve the storage, memory, and speed of the human thought process. Contemporary PCs already expand our human RAM and hard drive space for information; in the future we will want that directly tied into us. Even if we hadn't thought of the Singularity, the only way to pre-empt biological evolution and speed up memory and thought is to turn to artificial means.

The "bad" (good?) news is, researchers are already working on AI all over the world. Many of them believe in the inevitability of what they are doing leading to the takeover. I like to give humanity enough credit that we may pre-empt this takeover with our own AI evolution.

Is human intelligence more than the sum of its parts? Well, yes, the human brain is amazing, but there are elements of it that machines can already do better. I, along with most--if not all--of the researchers, do not believe in any innate ability of the human brain that is not biologically derived and cannot be replicated or surpassed in some way with AI.
__________________
It is far better to grasp the universe as it really is than to persist in delusion, however satisfying and reassuring. Carl Sagan
Old December 8 2011, 07:50 PM   #63
Christopher
Writer
 
 
Re: Some science fiction "firsts"

^Except a recent study suggests it may not be feasible to expand human intelligence beyond its current level:

http://io9.com/5865987/why-our-minds...ar-as-they-can
"These kinds of studies suggest there is an upper limit to how much people can or should improve their mental functions like attention, memory or intelligence.... There are always trade-offs. In other words, there is a 'sweet spot' in terms of enhancing our mental abilities — if you go beyond that spot — just like in the fairy-tales — you have to pay the price."
__________________
Christopher L. Bennett Homepage -- Site update 4/8/14 including annotations for Rise of the Federation: Tower of Babel

Written Worlds -- My blog
Old December 8 2011, 07:52 PM   #64
Deckerd
Fleet Arse
 
 
Location: the Frozen Wastes
Re: Some science fiction "firsts"

When you say "many of them believe in the inevitability" of a takeover by AI, I don't believe you. There's a huge AI department at the university I work in, and what they're doing is trying to get programs to learn. By learn I mean become aware of their environment, react to its parameters, remember those parameters, and then work within those parameters. That's a long way from composing the Liebestod from Tristan und Isolde. In fact, it's never going to happen.
Old December 8 2011, 08:20 PM   #65
RAMA
Vice Admiral
 
 
Location: NJ, USA
Re: Some science fiction "firsts"

Crassmass Eve wrote:
When you say "many of them believe in the inevitability" of a takeover by AI, I don't believe you. There's a huge AI department at the university I work in, and what they're doing is trying to get programs to learn. By learn I mean become aware of their environment, react to its parameters, remember those parameters, and then work within those parameters. That's a long way from composing the Liebestod from Tristan und Isolde. In fact, it's never going to happen.

Yup, lots of those in the AI/robotics field lament how long it's taken to get where we are, but two things mitigate that. 1) Human biological evolution takes place over millions of years; AI has been worked on for mere decades out of that timescale. 2) The growth is exponential--meaning in rapid succession, not on the normal linear timeline we usually perceive as humans in everyday life--so the "slow" progress (which is actually lightning fast on a biological or even geological timescale) will mean such predicted AI in a few decades.
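[Editor's note: the linear-versus-exponential contrast here is easy to make concrete with a toy sketch. The two-year doubling period below is an arbitrary, Moore's-law-flavored assumption, not a figure from the post.]

```python
# Toy comparison: linear progress vs. a capability that doubles
# every 2 years (an assumed, Moore's-law-flavored rate).
def linear(years):
    # One unit of progress per year, starting from 1.
    return 1 + years

def exponential(years, doubling_period=2):
    # Doubles every `doubling_period` years, starting from 1.
    return 2 ** (years / doubling_period)

for y in (10, 20, 40):
    print(y, linear(y), exponential(y))
# After 40 years: linear reaches 41, exponential reaches 2**20 = 1,048,576.
```

The shapes cut both ways in this debate: an observer expecting linear progress will badly underestimate an exponential process late in its run, which is the sense in which "slow" early decades are compatible with rapid later change.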

Yours is not an unusual reaction, because humans generally can only think of machines or intelligence as products independent of other things, and that will not be the case in the future. If you bring theism or human-centrism into it, then there is going to be quite a knee-jerk reaction. Trust me, if the "takeover" is true, you'll want to be an AI, and it may not have to be war; supplanting the machines may mean simply out-adapting/competing/evolving.

In terms of the actual material accomplishment of your "impossible" task, there is a lot of source material on the subject; Hans Moravec's work is available all over the internet for free. A key work explaining why the human brain is quantifiable, and how computer technologies are improving (interestingly, a predicted 3D chip was just reported in Wired magazine the other day), is The Singularity Is Near.

RAMA
Old December 8 2011, 08:25 PM   #66
xortex
Commodore
 
Location: Staten Island, NY
Re: Some science fiction "firsts"

Well, it's the human component link that is the really scary part. It's not what machines can do for us, but what we can make machines capable of--things like telepathy and creativity--making it limitless as far as we can see, unless there is a collective mind like the Borg and there are dimensions that we can't see: the higher dimensions of pure thought. Good ole trial and error again. Whoops, I opened up a whole new dimension of demons and angels waging war. Close it. I can't.

Last edited by xortex; December 8 2011 at 08:47 PM.
Old December 8 2011, 08:50 PM   #67
Christopher
Writer
 
 
Re: Some science fiction "firsts"

RAMA wrote:
Which is why it will be a facsimile AI...or in the shorter term, you'll see stuff like "jacking in" from cyberpunk or The Matrix...think of AI as "buffers" to the storage of the brain...there are theoretical limits to a computer's ability to process info beyond that, but they are immensely high.
I don't think you actually read the article I linked to, or if you did, you missed the point. What the research shows is that, yes, you could increase the brain's ability to do a certain thing, but there are negative consequences to that increase that might cancel out any benefits from it. Amplify a person's imagination too much and they become schizophrenic. Amplify their logic and systematic thought too much and they become autistic. Amplify their ability to discern patterns too much and they become paranoid. By analogy, you could engineer the body to have extra limbs or sense organs or muscles, but the added metabolic cost of having them might cancel out any gain from having them, or the amount of neurological connections that would have to be devoted to them might diminish one's mental or physical functionality. So there are limits to how much you can practically enhance a body's physical abilities, and the same may well be true for enhancing a brain.

So even if you did use external computer hardware to enhance the brain's performance, it might end up undermining the brain's performance in key ways as well, throwing off the balance that enables it to work. Human intelligence may already be at the point of diminishing returns -- or, to put it more optimistically, in a sort of "Goldilocks zone" for sentience, an optimal balance where our minds have enough complexity and dynamism to be conscious and creative but not so much that they become unstable.
Old December 8 2011, 08:53 PM   #68
RAMA
Vice Admiral
 
 
Location: NJ, USA
Re: Some science fiction "firsts"

Christopher wrote:
^Except a recent study suggests it may not be feasible to expand human intelligence beyond its current level:

http://io9.com/5865987/why-our-minds...ar-as-they-can
"These kinds of studies suggest there is an upper limit to how much people can or should improve their mental functions like attention, memory or intelligence.... There are always trade-offs. In other words, there is a 'sweet spot' in terms of enhancing our mental abilities — if you go beyond that spot — just like in the fairy-tales — you have to pay the price."

Which is why it will be a facsimile AI or foglets/programmable matter...or in the shorter term, you'll see stuff like "jacking in" from cyberpunk or The Matrix...think of AI as "buffers" to the storage of the brain...there are theoretical limits to a computer's ability to process info beyond that, but they are immensely high. Computational Limits
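[Editor's note: RAMA's "Computational Limits" link has not survived. One commonly cited bound of this kind is Bremermann's limit, roughly m*c^2/h bits per second for m kilograms of matter; the sketch below is my own illustration of that figure, not necessarily what the original link contained.]

```python
# Bremermann's limit: an upper bound on computation rate for
# a system of mass m, approximately m * c**2 / h bits per second.
c = 2.998e8      # speed of light, m/s
h = 6.626e-34    # Planck's constant, J*s
m = 1.0          # mass, kg

rate = m * c**2 / h
print(f"~{rate:.2e} bits/s per kilogram")  # ~1.36e+50 bits/s
```

For scale, that is dozens of orders of magnitude beyond any existing hardware, which is the sense in which the theoretical limits are "immensely high."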

xortex wrote:
Well, it's the human component link that is the really scary part. It's not what machines can do for us, but what we can make machines capable of--things like telepathy and creativity--making it limitless as far as we can see, unless there is a collective mind like the Borg.
Well, I don't see any evidence of telepathy now, so I don't think we'll see AI doing it...unless it's a remote way to read future virtual human brains.

Edited for screwing up the urls..oops
Old December 8 2011, 09:02 PM   #69
RAMA
Vice Admiral
 
 
Location: NJ, USA
Re: Some science fiction "firsts"

Christopher wrote:
RAMA wrote:
Which is why it will be a facsimile AI...or in the shorter term, you'll see stuff like "jacking in" from cyberpunk or The Matrix...think of AI as "buffers" to the storage of the brain...there are theoretical limits to a computer's ability to process info beyond that, but they are immensely high.
I don't think you actually read the article I linked to, or if you did, you missed the point. What the research shows is that, yes, you could increase the brain's ability to do a certain thing, but there are negative consequences to that increase that might cancel out any benefits from it. Amplify a person's imagination too much and they become schizophrenic. Amplify their logic and systematic thought too much and they become autistic. Amplify their ability to discern patterns too much and they become paranoid. By analogy, you could engineer the body to have extra limbs or sense organs or muscles, but the added metabolic cost of having them might cancel out any gain from having them, or the amount of neurological connections that would have to be devoted to them might diminish one's mental or physical functionality. So there are limits to how much you can practically enhance a body's physical abilities, and the same may well be true for enhancing a brain.

So even if you did use external computer hardware to enhance the brain's performance, it might end up undermining the brain's performance in key ways as well, throwing off the balance that enables it to work. Human intelligence may already be at the point of diminishing returns -- or, to put it more optimistically, in a sort of "Goldilocks zone" for sentience, an optimal balance where our minds have enough complexity and dynamism to be conscious and creative but not so much that they become unstable.
I don't see why they can't raise the "optimal balance". You can raise brain performance, but who's to say they can't easily control other elements of the AI-human brain--sort of a self-aware safety net within the brain itself (yes, I think I've seen this in SF before) that eliminates by-products of increased performance like schizophrenia? Or who's to say a virtual human/foglet brain simply isn't much hardier than a totally natural or augmented biological brain?
Old December 8 2011, 09:11 PM   #70
xortex
Commodore
 
Location: Staten Island, NY
Re: Some science fiction "firsts"

That would be the unrelated third thing - the child.
Old December 8 2011, 09:32 PM   #71
Christopher
Writer
 
 
Re: Some science fiction "firsts"

RAMA wrote:
I don't see why they can't raise the "optimal balance". You can raise brain performance, but who's to say they can't easily control other elements of the AI-human brain--sort of a self-aware safety net within the brain itself (yes, I think I've seen this in SF before) that eliminates by-products of increased performance like schizophrenia? Or who's to say a virtual human/foglet brain simply isn't much hardier than a totally natural or augmented biological brain?
Are you arguing from science or from the desire to believe? Too many people cling to the Singularity as a matter of religious faith--"the Rapture for geeks," as Ken MacLeod calls it. Science demands healthy skepticism. And in general, the future never turns out the way people expect it to. The more people today are utterly convinced that the Singularity is inevitable, the more convinced I am that it won't happen--certainly not the way people expect.
Old December 8 2011, 10:10 PM   #72
RAMA
Vice Admiral
 
 
Location: NJ, USA
Re: Some science fiction "firsts"

Christopher wrote:
RAMA wrote:
I don't see why they can't raise the "optimal balance". You can raise brain performance, but who's to say they can't easily control other elements of the AI-human brain--sort of a self-aware safety net within the brain itself (yes, I think I've seen this in SF before) that eliminates by-products of increased performance like schizophrenia? Or who's to say a virtual human/foglet brain simply isn't much hardier than a totally natural or augmented biological brain?
Are you arguing from science or from the desire to believe? Too many people cling to the Singularity as a matter of religious faith--"the Rapture for geeks," as Ken MacLeod calls it. Science demands healthy skepticism. And in general, the future never turns out the way people expect it to. The more people today are utterly convinced that the Singularity is inevitable, the more convinced I am that it won't happen--certainly not the way people expect.

Yes, I've seen that idea, of course. The difference with the Singularity is that there is a lot of data, there are models, and there is an accurate prediction track record--not simply faith. This is where it separates itself from end-of-the-world cults and past futurists, which were often much more speculative and relied on linear models. Some of the best minds in their fields agree with many of the end results, if not all the specifics of the currently predicted date of the Singularity. I'm fully able to admit the date can vary, but it's not a pie-in-the-sky idea; there's a lot of groundwork. Others admit the Singularity scenario may come to pass, but not in a positive light. This is also likely, which is why I argue that we need to accelerate as humans even more so.

In terms of details...well, the Singularity might happen, yet many of the details could be off; one technology might be substituted for another. If you are doubting the technology, there are lots of examples of foglet work and AI, and nanotech is now a $2 billion industry...after how many years? Roughly 20 since Engines of Creation.

One thing people are missing: it occurs to me that at a time of accelerating change (which we are factually in), we are going to be able to make more and better predictions of the future than we ever have, at least until a Singularity-type breakdown, if it indeed happens.

One of the chief supporters of the positive Singularity lists point-by-point counters to the skeptics in his book and on his website...Kurzweil

Finally, regardless of the outcome, the discussion of the Singularity has changed my point of view on both the future of SF and the world. It's no longer enough most of the time for me to see mundane ideas of the future with no info technology involved in the fabric of the culture, where staid, conventional, brute-force technologies exist that don't take into account programmable matter and the like. "In Time" was a very good movie to me, but I don't see it as a realistic future in any way; its value lies in its parable. I recall seeing a recent interview with a famous SF writer (I forget who at the moment) who said hard SF literature is in a holding pattern as it takes into account the implications of the Singularity...
Old December 8 2011, 11:12 PM   #73
Christopher
Writer
 
 
Re: Some science fiction "firsts"

^If you want me to take your argument at all seriously, don't mention Kurzweil. His beliefs seem more rooted in spirituality and wishful thinking than science. At the very least, I consider him overoptimistic.
Old December 9 2011, 12:58 AM   #74
Edit_XYZ
Fleet Captain
 
 
Location: At star's end.
Re: Some science fiction "firsts"

All 'models' predicting the technological singularity rest on one requirement: continual exponential growth--of intelligence, of technology, etc.
Well, if history has shown anything, it is that exponential growth in anything other than abstract mathematics is not sustainable, regardless of your attempts to 'cheat' this rule.
Technology matures and can't be improved further; etc.

If you can keep up continual exponential growth in the AI field (and the signs are that you can't), you may--or may not (perhaps 'intelligence' in humans is a mature 'technology')--be able to have a functioning being more intelligent than humans. But in any case, you won't be able to keep improving that intelligence; sooner or later, you'll hit a wall.
Singularity proponents gamble that this 'wall' lies beyond the singularity--and they have no convincing arguments for that.

It's almost certain there isn't a logic fundamentally 'better' than the one known to us - meaning, we have already hit the wall in this area; you may have a being thinking faster than us (quantitatively), but not qualitatively 'better'.
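[Editor's note: Edit_XYZ's "wall" amounts to the claim that seemingly exponential curves are really logistic (S-shaped): indistinguishable from exponentials early on, saturating later. A minimal sketch, with the growth rate and carrying capacity chosen arbitrarily for illustration.]

```python
import math

def exponential(t, r=0.5):
    # Pure exponential growth at rate r, starting from 1.
    return math.exp(r * t)

def logistic(t, r=0.5, K=100.0):
    # Starts at 1, tracks the exponential early on, then
    # flattens out near the carrying capacity K (the "wall").
    return K / (1 + (K - 1) * math.exp(-r * t))

print(exponential(2), logistic(2))    # ~2.72 vs ~2.67: nearly identical
print(exponential(30), logistic(30))  # ~3.3e6 vs ~100: the wall
```

The catch, which cuts against both sides of the thread: on the rising part of the curve, the data alone cannot tell you which of the two processes you are observing.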
Old December 9 2011, 01:27 AM   #75
RAMA
Vice Admiral
 
 
Location: NJ, USA
Re: Some science fiction "firsts"

I love this: "Leaving the Opera"!
