November 18 2012, 05:29 AM   #165
Crazy Eddie
Re: What are your top 5 technologies of the next 15 years?

RAMA wrote:
Sorry, but the issue was directly addressed. You stated simply that exponentials don't continue indefinitely, to which I reply: this is true, but they develop to the point where a new paradigm takes over. This is not fantasy; there are already five demonstrably true paradigms that have taken place, and Moore's Law is the fifth.
Apart from the fact that you have essentially conceded that Moore's law is unlikely to continue its exponential growth indefinitely, this still ignores the fact that the next paradigm may or may not have anything at all to do with computer technology. If it is a shift in, say, nanotechnology (and it probably will be), the result would be another logistic curve, this time for mass production capacity: the same industrial products could be produced faster and faster by increasingly smaller manufacturing machines. By the time that curve starts to level off toward the next paradigm shift, you start to get industrial machines the size of Skittles that can eat a pile of sawdust and spit out a kitchen table.

The new paradigm wouldn't extend Moore's law to microprocessors at all; once computer technology hits its plateau stage, it can't really be improved further (it won't get any smaller, faster, or more powerful than it already is), but under the new paradigm the same computer can be manufactured considerably faster, more easily, and in larger numbers, at far lower cost.

It is also true that exponentials are not infinite
If it's not infinite, then it is, by definition, not exponential; a curve that eventually levels off is logistic, not exponential.

More importantly, without knowing exactly when the curve will begin to flatten out at its saturation point, it's difficult to predict where the technology will end up, especially since all the other social, political, economic, and military factors are still difficult to nail down. The point of diminishing returns can sneak up on you unexpectedly if it involves factors you had previously ignored or judged unimportant just because you assumed they would eventually be mitigated.

How does this skirt the issue in any way?
Because you're assuming the paradigm shift renders the flattening curve irrelevant. That's an assumption without a basis; it's entirely possible that scientists will make a breakthrough with quantum computers in the next thirty years, after which it becomes exponentially more difficult to make any advancements at all.

So it does indeed show that the main thrust of the curve(s) still continues... but not necessarily for computers.

The third is Christopher's suggestion (supported by several software posters from this board) that software has not kept pace with this info curve, which is also demonstrably untrue based on the two articles I posted.
The articles demonstrate nothing of the kind. Software HASN'T kept up with those advances, for the specific reason that software engineers develop applications based on the end user's needs, NOT on the available processor power of the platform running them.

IOW, software isn't SUPPOSED to keep pace with processing power; processing power is a potential resource that engineers can exploit when demand for new capabilities begins to manifest, but in the end, those applications are driven by consumer demand first and foremost and technical capacity second.

Conclusion: the criticism that exponentials are not natural law, or are finite, in info tech (and by extension anything that becomes an infotech) is not valid.
Nobody made that criticism, RAMA. The criticism from the get-go was that the exponential curve described by Moore's law is unlikely to continue indefinitely, primarily because an exponential curve looks exactly like a logistic curve right up to the point where the latter starts to level off.
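
To see how deceptive that is, here's a quick Python sketch (a minimal illustration of my own; the growth rate and ceiling are made-up numbers, not data about any real technology): an exponential curve and a logistic curve with the same early behavior track each other almost perfectly, right up until the logistic one approaches its ceiling.

[code]
import numpy as np

# Toy comparison: exponential growth vs. logistic growth with the same
# initial rate. The rate r and ceiling K are arbitrary assumed values.
t = np.arange(0, 21)   # 21 "years" of growth
r = 0.5                # shared growth rate
K = 1000.0             # logistic saturation level

exponential = np.exp(r * t)
logistic = K / (1 + (K - 1) * np.exp(-r * t))  # starts at 1, levels off at K

for year in (2, 6, 10, 14, 20):
    print(f"t={year:2d}  exp={exponential[year]:10.1f}  logistic={logistic[year]:8.1f}")
[/code]

At t=2 the two curves differ by less than one percent; by t=20 the exponential value is more than twenty times the logistic one. Standing at t=6 with nothing but the data so far, you cannot tell which curve you're on.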

And there IS, in fact, an upper limit to how far microprocessors can be miniaturized or enhanced, especially once you get down to quantum computers and molecule-sized transistors.

The proof I cite for software's exponential growth comes from an industry report as well as a government report.
But you're conflating hardware and software as if they were the same thing. They are not, not even close. Hardware can be considered a virtual vessel in which to contain data and overlapping processes devoted to a specific task, which in turn enables larger and more sophisticated software applications to fill that vessel. But it is ALSO true that a larger number of smaller applications can be run simultaneously on the same hardware, which wouldn't have been possible otherwise. The exponential growth in computer power would NOT, in that case, lead directly to an exponential growth in software capability: the applications themselves could follow a more linear progression, with very small increases in capability spread out over a much larger number of applications.
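
A toy model makes the distinction concrete (every number here is invented purely to illustrate the argument): let hardware capacity double each generation while any single application's capability grows only linearly; the exponential surplus then shows up as a growing NUMBER of applications, not as exponentially better ones.

[code]
# Toy model: exponentially growing hardware absorbed by a growing
# number of modestly improving applications. All values are assumptions
# chosen for illustration, not measurements.
capacity = 1.0   # total hardware capacity, arbitrary units
per_app = 1.0    # capability of a single application

for gen in range(1, 11):
    capacity *= 2        # Moore's-law-style doubling
    per_app += 0.5       # per-app capability grows only linearly
    n_apps = int(capacity / per_app)
    print(f"gen {gen:2d}: capacity={capacity:7.1f}  per-app={per_app:4.1f}  apps={n_apps}")
[/code]

After ten generations, total capacity has grown 1024-fold while per-application capability has only sextupled; the rest of the exponential went into quantity.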

This is most obvious in the issue of digital storage. Flash memory and nonvolatile storage devices may eventually outperform hard drives by a considerable margin, but that DOES NOT mean that all future media formats will be pigeonholed into HD quality just because more systems can handle their storage and playback. Quantity as well as quality will increase, and depending on user needs, it may be the former more than the latter.

This has very serious implications for AI and therefore the singularity (see below).

It is very far from a one-dimensional development, and since some of our conversations revolved around this, I'm surprised you're even bringing it up again; or maybe you didn't realize why I was establishing those conditions allowing for the change.
I bring it up again because you failed to address, in every single case, the fact that the POTENTIAL for change in no way implies the APPROACH of change. Again, the issue here is that you are very easily impressed by pop-sci articles and have a tendency to accept (and in some cases to volunteer yourself) the most optimistic projections of those technologies based purely on a best-case scenario. You essentially live in a world where inventors never go bankrupt, where startup companies never fail, where great ideas never get pushed to the wayside, where Cisco never shut down the entire Flipcam production line just because they were bored.

The sole basis for the singularity is a projection of the future capabilities of Expert Systems. Put very simply, the Singularity is what happens when expert systems gain the capability to design improved copies of themselves without human intervention; machine intelligence becomes superior to human intelligence to the point that humans no longer control the developmental process (hence it is a Singularity, by analogy to a black hole: you cannot see beyond the event horizon represented by the Expert System, because it is impossible to make meaningful predictions about the value system or decision-making process of such a machine). Singularity theory assumes the exponential growth curve is either indefinite or will continue long enough to bring this about.

In the first place, as I and others have pointed out, this is a flawed assumption, because the exponential growth of hardware has an inherent upper limit that we may be approaching more rapidly than you think. In the second place -- and vastly more importantly -- software development is driven by user needs, NOT by hardware capabilities. As I have myself pointed out on MANY occasions, AIs and robots are capable of replacing humans in virtually any task you can think of, provided the right software and hardware specializations are developed; even the self-improving Expert System would be a more efficient software engineer than the best human in the industry. The thing is, none of these tasks would gain any benefit from machine SENTIENCE: even the Expert System doesn't need any semblance of self-awareness, self-motivation, or the ability to make abstract value judgments in order to effectively analyze the needs of end users and construct software applications accordingly. In fact, sentience would almost certainly make it LESS useful, as the ability to think beyond the scope of its task would be a distraction that would eat up a significant portion of its (admittedly huge) processing power.
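
You can even put the self-improvement claim into a few lines of Python (the gain and decay values are pure assumptions on my part, chosen only to show the shape of the curve): if each successive redesign is harder than the last, the "self-improving" system converges to a plateau instead of racing past an event horizon.

[code]
# Toy model of recursive self-improvement WITH diminishing returns.
# The initial gain (0.5) and per-generation decay (0.6) are assumed
# values; the point is the shape of the curve, not the numbers.
intelligence = 1.0
gain = 0.5    # improvement delivered by the first redesign
decay = 0.6   # each redesign is harder: the next gain shrinks

for gen in range(1, 16):
    intelligence += gain
    gain *= decay
    print(f"gen {gen:2d}: intelligence = {intelligence:.4f}")

# Geometric series: the limit is 1 + 0.5 / (1 - 0.6) = 2.25.
# A plateau, not a singularity.
[/code]

Change the decay to anything greater than 1 and you get the runaway curve singularity proponents describe; the whole argument turns on which assumption holds, and nobody has measured it.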

My overall point is that your projections of singularity theory are basically a combination of jubilant optimism about all things technical and a habit of reading way too much sensationalist literature without thinking critically about how that process would actually take place.

As part of this change in info availability, I don't just have to stick with magazines that are months out of date; I get multiple feeds of info, especially on technological change, right to my smartphone: literally thousands of articles through apps, email, etc.
We noticed.