Robert Maxwell wrote:
I'm not afraid of technology or the future, dude. I welcome them. I wish the Singularity was right around the corner--the idea of being a transhuman sounds badass.
Doesn't mean I'm willing to delude myself into believing
it's about to happen, though.
Lots of people subconsciously put the brakes on this, with knee-jerk reactions to the implications of a post-Singularity world. It's the unknown, or our potential lack of a place in the universe, that scares people. Not being able to predict anything after a certain date may scare the willies out of anyone who is into science or scientific extrapolation! Reading Moravec in the 90s put me in a dark mood for a long time, so I stored it in the back of my head, only to realize there may be ways around the dreaded "AI holocaust". So far only Star Trek (ST:TMP, possibly some episodes of ST:TNG) and The Matrix movies have explored what might happen if everybody didn't die off from clearly Singularitarian AI! Of course, The Matrix also showed the teething problems before that came to pass (an understatement, I suppose). I'd like to see some braver visual SF out there, just as Brin himself has written about the Singularity.
It's not delusional to extrapolate from the best current evidence, something people do every day on a variety of topics. Just today I read an article by a skeptic who agrees with 6 of 7 points of the Singularity, including the impossibility of knowing what happens beyond the point of human-level AI, something he also accepts as a given; he simply disagrees with Ray Kurzweil's version of it. I'd say I don't know what will happen, only that if I accept that some of the elements are likely or probable, it leads me to believe others are too. Everything else I say tries to demonstrate why I believe this way.