Crazy Eddie
Re: What are your top 5 technologies of the next 15 years?

RAMA wrote:
I've seen the numbers about the upper limits you mention (I have them in book form, I'll try and find a link), and they are higher than you think, not lower.
Unlikely, especially since you don't know how high I think they are.

The 6th paradigm will continue the curve already established
Unlikely, since you do not actually know what the next paradigm IS.

Moore's law is the 5th paradigm, and the various technologies to extend it have already appeared; the 6th generation ones are either in development or, in some cases, already exist, but not in fully finished form. The fact that there is more than one should tell you something, and the fact that I can post breakthroughs on them almost every month is also telling.
First, if you can post monthly breakthroughs on them, then they're still part of the CURRENT paradigm, not the next one. They may extend the digital paradigm somewhat or help it take shape, or, alternately, hasten the approach of its limiting factors. But they will not lead to a transition to a NEW paradigm without a fundamental shift in their most basic applications, after which the patterns of the old paradigm cease to be meaningful.

This would be easier for you to understand if you compared the current (5th) paradigm with the previous two.

The paradigm shift rate (i.e., the overall rate of technical progress) is currently doubling (approximately) every decade
I'm beginning to wonder if you actually know what a "paradigm" is.

It’s obvious what the sixth paradigm will be after Moore’s Law runs out of steam during the second decade of this century.
Indeed. Which is why the next paradigm is unlikely to have anything whatsoever to do with Moore's law or microprocessors in general. Even 3D circuitry and quantum computing are only going to extend the present paradigm to a limited extent, and even then they may be part of the plateau stage, where adding power/complexity to three-dimensional integrated circuits is considerably more expensive than it was with 2D circuits. Once you reach the limits of 3D circuits, further advances run into that diminishing-returns problem; the paradigm shifts to something OTHER than microprocessor technologies, and no further improvement can be made except over unbelievably long timescales, for almost superficial gains.

Thus the (double) exponential growth of computing is broader than Moore’s Law, which refers to only one of its paradigms.
Yep. You clearly DON'T know what a "paradigm" is; your anticipation of a paradigm shift is just another rhetorical device you're using to avoid taking the problem seriously.

Observers are quick to criticize extrapolations of an exponential trend on the basis that the trend is bound to run out of “resources.”
Resources have nothing to do with it. The logistic curve is a function built around a saturation point: rapid progress builds on prior progress in what looks like an exponential curve, until you reach that saturation point, where the system approaches maturity and the curve flattens out.

In this case, even if you had an infinite quantity of resources, that does not imply infinite growth potential; when microprocessors reach a point at which transistors cannot be further reduced and logic circuits cannot be further enhanced, then that's that, there's no more room for growth (at least, not any amount of growth that could be justified for the expense it would take).
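
To put the point in concrete terms, here's a minimal sketch (my own toy parameters, nothing taken from your sources): for small t the logistic curve is practically indistinguishable from an exponential, but it flattens out at its saturation point K no matter how long you let it run.

[code]
import math

def exponential(t, x0=1.0, r=0.5):
    """Unbounded exponential growth: x0 * e^(r*t)."""
    return x0 * math.exp(r * t)

def logistic(t, K=1000.0, x0=1.0, r=0.5):
    """Logistic growth: same early behavior, but it flattens at the saturation point K."""
    return K / (1.0 + ((K - x0) / x0) * math.exp(-r * t))

for t in range(0, 31, 5):
    # Early on the two curves track each other; later the logistic saturates at K.
    print(f"t={t:2d}  exponential={exponential(t):12.1f}  logistic={logistic(t):8.1f}")
[/code]

The growth rate and the value of K are arbitrary here; the shape of the comparison is the point.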

The classical example is when a species happens upon a new habitat (e.g., rabbits in Australia), the species’ numbers will grow exponentially for a time, but then hit a limit when resources such as food and space run out.
Which ultimately has less to do with the resources available and more to do with the equilibrium point of reproductive rates vs. attrition rates. The limited resources (e.g. food) provide the saturation point, and therefore the curve flattens at the point where there are so many rabbits on the continent that the number that die from starvation is approximately equal to the number of live births.
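
Here's a toy version of that rabbit example (a sketch with assumed birth and death rates, not ecological data): growth flattens exactly where the rising death rate meets the birth rate, which is the saturation point the food supply imposes.

[code]
# Toy rabbit model: the death rate climbs as the population presses against
# the food supply, so growth flattens where births and deaths balance,
# that is, at the carrying capacity.
def step(pop, birth_rate=0.5, base_death_rate=0.1, capacity=10_000):
    births = birth_rate * pop
    death_rate = base_death_rate + (birth_rate - base_death_rate) * (pop / capacity)
    deaths = death_rate * pop
    return pop + births - deaths

pop = 10.0
for year in range(60):
    pop = step(pop)
print(f"population after 60 steps: {pop:.0f}")  # settles at ~10,000, the carrying capacity
[/code]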

You cannot cry "paradigm shift!" as an escape hatch for that, because an upper limit to microprocessor technology DOES exist, even accounting for innovative new forms of it. There is not even THEORETICALLY infinite growth potential there; even atomic-scale computers would eventually reach a point where they cannot be improved further. And so far, there is no reason to assume that the most radical theoretical limits are even applicable, since PRACTICAL limitations -- e.g. politics, consumer demand, economics, military pressures, and ordinary dumb luck -- constrain things well before the physics does.
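
For a sense of just how close that ceiling is, here's a back-of-the-envelope sketch (my own assumed 2012-era numbers, not figures from anyone's study): even the most generous reading of "atomic scale" leaves only a handful of feature-size halvings on the table, and the practical limits above arrive before that.

[code]
import math

# Assumed figures: ~22 nm minimum feature size (roughly where fabrication is
# in 2012) and ~0.5 nm for a silicon lattice spacing. The exact values are
# rough; the point is the order of magnitude.
feature_nm = 22.0
atomic_nm = 0.5

halvings_left = math.log2(feature_nm / atomic_nm)
print(f"roughly {halvings_left:.1f} more halvings of feature size before single-atom scale")
[/code]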

The study by the government proves software keeps up with hardware development; in some cases, it mentions, software surpasses it.
In this context, the software we're talking about is artificial intelligence, NOT storage capacity, NOT video or sound quality, NOT digital bandwidth and throughput. We're discussing the efficacy of computers not only as expert systems, but as self-examining thinking machines capable of taking on roles traditionally performed by expert humans.

By nearly all accounts, the HARDWARE requirement for this was surpassed over a decade ago (even Kurzweil would admit this, which is why several of his 1990s predictions totally failed to pan out). Simply put, the software element of Strong AI just hasn't materialized at all, and in fact is lagging so far behind that the "bottom-up" AI theorists have spent the last couple of years lording it over everyone else with a collective "I told you so." That's why even Kurzweil is now talking about developing computer architectures that mimic the functioning of a human brain: it's now obvious to EVERYONE that the problem isn't going to be solved in software alone.

Software is important because it's the missing link between the higher processing speed and potential human level AGI.
But it isn't, though. Even in the highly unlikely event you could get a computer to model an existing human brain, it's still only a predictive simulation of that brain based on fixed parameters, not a genuine consciousness.

Of vastly greater import is the fact that, outside of laboratory curiosity, there's virtually zero market demand for conscious machine labor. UNCONSCIOUS labor is considerably easier to accomplish, especially since the few remaining tasks that require conscious labor can be performed by increasingly less intelligent, lower-paid wage slaves.