RAMA
Re: What are your top 5 technologies of the next 15 years?

newtype_alpha wrote:
RAMA wrote:
Sorry, but the issue was directly addressed. You stated simply that exponentials don't continue indefinitely, to which I replied that this is true, but they develop to the point where a new paradigm takes over. This is not fantasy: there have already been five demonstrable paradigms, and Moore's Law is the 5th.
Apart from the fact that you have essentially conceded that Moore's law is unlikely to continue exponential growth indefinitely, this still ignores the fact that the next paradigm may or may not have anything at all to do with computer technology. If it is a shift in, say, nanotechnology (and it probably will be) the result would be another logistic curve, this time for mass production capacity; the same industrial products could be produced faster and faster by increasingly smaller and smaller manufacturing machines; by the time the curve starts to level off for the next paradigm shift, you start to get industrial machines the size of skittles that can eat a pile of sawdust and spit out a kitchen table.

The new paradigm wouldn't extend Moore's law to microprocessors at all; once computer technology hits its plateau stage, it cannot really be improved further (it won't get any smaller or faster or more powerful than it already is), but in the new paradigm the same computer can be manufactured considerably faster/easier/in larger numbers and for far smaller expense.

It is also true that exponentials are not infinite
If it's not infinite then it is, by definition, not exponential.

More importantly, without knowing exactly when the curve will begin to flatten out at saturation point, it's difficult to predict exactly where the technology will end up, especially since all other social/political/economic/military factors are still difficult to nail down. The point of diminishing returns has the potential to sneak up on you unexpectedly if it involves factors you had previously ignored or judged unimportant just because you assumed they would eventually be mitigated.

Because you're assuming the paradigm shift renders the flattening curve irrelevant. That's an assumption without a basis; it's entirely possible that scientists will make a breakthrough with quantum computers in the next thirty years, after which it becomes exponentially more difficult to make any advancements at all.

So it does indeed show the main thrust of the curve(s) still continue... but not necessarily for computers.

The articles demonstrate nothing of the kind. Software HASN'T kept up with those advances, for the specific reason that software engineers develop applications based on the end user's needs, NOT on the available processor power of the platform running it.

IOW, software isn't SUPPOSED to keep pace with processing power; processing power is a potential resource that engineers can exploit when demand for new capabilities begins to manifest, but in the end, those applications are driven by consumer demand first and foremost and technical capacity second.

Nobody made that criticism, RAMA. The criticism from the get-go was that the expanding curve embodied in Moore's law is unlikely to continue indefinitely, primarily because an exponential curve looks exactly like a logistic curve until the point where the latter starts to level off.
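
To put rough numbers on that (a minimal sketch; the growth rate r, ceiling K, and starting value x0 are arbitrary illustration values, not measurements): an exponential and a logistic curve that share the same early behavior track each other closely until the logistic nears its ceiling.

[CODE]
# Sketch: an exponential and a logistic curve with the same starting
# value and early growth rate look alike until the ceiling bites.
# r, K, x0 are arbitrary illustration values.
import math

r, K, x0 = 0.5, 1000.0, 1.0   # growth rate, ceiling, starting value

def exponential(t):
    return x0 * math.exp(r * t)

def logistic(t):
    # Logistic curve with the same starting value and early growth rate.
    return K / (1.0 + (K / x0 - 1.0) * math.exp(-r * t))

for t in range(0, 21, 2):
    e, s = exponential(t), logistic(t)
    print(f"t={t:2d}  exponential={e:9.1f}  logistic={s:7.1f}  ratio={s / e:.3f}")
[/CODE]

Up to roughly t=8 the two agree to within about five percent; the gap only becomes obvious as the logistic approaches K, which is exactly why early data can't tell them apart.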

And there IS, in fact, an upper limit to how far microprocessors can be miniaturized or enhanced, especially once you get down to quantum computers and molecule-sized transistors.

But you're conflating hardware and software as if they were the same thing. They are not, not even close. Hardware can be considered a virtual vessel in which to contain data and overlapping processes devoted to a specific task, which in turn enables larger and more sophisticated software applications to fill that vessel. But it is ALSO true that a larger number of smaller applications can be simultaneously run on the same hardware that wouldn't have been possible otherwise; the exponential growth in computer power would NOT, in that case, lead directly to an exponential growth in software capability, as the applications themselves could follow a more linear progression by very small increases in capability spread out over a much larger number of applications.
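
A toy model makes that last point concrete (all numbers invented for illustration): if total compute doubles each generation but the number of applications sharing it also grows exponentially, per-application capability advances far more slowly than the hardware curve suggests.

[CODE]
# Toy model, illustrative numbers only: exponential hardware growth
# spread across an exponentially growing number of applications.
for gen in range(8):
    capacity = 2 ** gen              # total compute: doubles every generation
    apps = int(1.5 ** gen)           # applications sharing it: also growing
    per_app = capacity / max(apps, 1)
    print(f"gen {gen}: capacity={capacity:4d}  apps={apps:3d}  per-app={per_app:6.2f}")
[/CODE]

Here total capacity grows 128-fold over eight generations while per-application capability grows only about 7-fold.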

This is most obvious in the issue of digital storage. Flash memory and nonvolatile storage devices may eventually outperform hard drives by a considerable margin, but that DOES NOT mean that all future media formats will be pigeonholed into HD quality just because more systems can handle their storage and playback. Quantity as well as quality will increase, and depending on user needs, it may be the former more than the latter.

This has very serious implications for AI and therefore the singularity (see below).

It is very far from a one-dimensional development, and since some of our conversations revolved around this, I'm surprised you're even bringing it up again; or maybe you didn't realize why I was establishing those conditions allowing for the change.
I bring it up again because you failed to address, in every single case, the fact that the POTENTIAL for change in no way implies the APPROACH of change. Again, the issue here is that you are very easily impressed by pop-sci articles and have a tendency to accept (and in some cases, to volunteer yourself) the most optimistic projections of those technologies based purely on a best-case scenario. You essentially live in a world where inventors never go bankrupt, where startup companies never fail, where great ideas never get pushed to the wayside, where Cisco never shut down the entire Flipcam production line just because they were bored.

The sole basis for the singularity is a projection on the future capabilities of Expert Systems. Put very simply, the Singularity is what happens when expert systems gain the capability to design improved copies of themselves without human intervention; machine intelligence becomes superior to human intelligence to the point that humans no longer control the developmental process (hence it is a Singularity by analogy to a Black Hole: you cannot see beyond the event horizon represented by the Expert System because it is impossible to make meaningful predictions about the value system or decision-making process of such a system). Singularity theory assumes the exponential growth curve is either indefinite or will continue long enough to bring this about.
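
That definition can be written down as a toy recursion (a sketch; the speedup factor and its decay parameter are invented for illustration, not taken from any singularity literature): each generation of the system designs its successor in less time, and everything hinges on whether the per-generation improvement is sustained.

[CODE]
# Toy model of recursive self-improvement (all numbers invented).
# Each generation designs the next one faster than it was designed.
def run(speedup_decay):
    # time_for_next: time (arbitrary units) to design the next generation
    time_for_next, elapsed, capability = 1.0, 0.0, 1.0
    speedup = 2.0  # each generation is this many times faster than its parent
    for gen in range(1, 40):
        elapsed += time_for_next
        capability *= speedup
        time_for_next /= speedup
        # Pull the speedup factor back toward 1.0 (no improvement) each step;
        # decay=1.0 means the improvement is fully sustained.
        speedup = 1.0 + (speedup - 1.0) * speedup_decay
        if gen in (1, 10, 20, 39):
            print(f"  gen {gen:2d}: elapsed={elapsed:6.3f}  capability={capability:.3g}")

print("Sustained improvement (decay=1.0):")
run(1.0)
print("Decaying improvement (decay=0.7):")
run(0.7)
[/CODE]

With sustained improvement the total design time converges while capability explodes -- a finite-time "singularity"; with decaying improvement the same recursion quietly plateaus. Singularity arguments implicitly assume the first regime.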

In the first place, as I and others have pointed out, this is a flawed assumption because the exponential growth of hardware has an inherent upper limit that we may be approaching more rapidly than you think. In the second place -- and vastly more importantly -- software development is driven by user needs, NOT by hardware capabilities. As I have myself pointed out on MANY occasions, AIs and robots are capable of replacing humans in virtually any task you can think of, provided the right software and hardware specializations are developed; even the self-improving Expert System would be a more efficient software engineer than the best human in the industry. The thing is, none of these tasks would gain any benefit from machine SENTIENCE, as even the Expert System doesn't need any semblance of self-awareness, self-motivation or the ability to make abstract value judgements in order to effectively analyze the needs of end users and construct software applications accordingly. In fact, sentience would almost certainly make it LESS useful, as the ability to think beyond the scope of its task would be a distraction that would eat up a significant portion of its (admittedly huge) processing power.

My overall point is that your projections of singularity theory are basically a combination of jubilant optimism of all things technical, combined with reading way too much sensationalist literature without thinking critically about how that process would actually take place.

As part of this info availability change, I don't just have to stick with magazines that are months out of date; I get multiple feeds of info, especially on technological change, right to my smartphone, literally thousands of articles through apps, email, etc.
We noticed.
I've already seen some of the arguments against exponentials, and aside from the counters I posted (which are accurate), I've seen the numbers about the upper limits you mention (I have them in book form; I'll try to find a link), and they are higher than you think, not lower. While not infinite, they do allow for the necessary power for a Singularity. The 6th paradigm will continue the curve already established, so your assumption that it will not is incorrect.

The maximum potential of matter and energy to contain intelligent processes is a valid issue. But according to my models, we won’t approach those limits during this century (but this will become an issue within a couple of centuries).
We also need to distinguish between the “S” curve (an “S” stretched to the right, comprising very slow, virtually unnoticeable growth–followed by very rapid growth–followed by a flattening out as the process approaches an asymptote) that is characteristic of any specific technological paradigm and the continuing exponential growth that is characteristic of the ongoing evolutionary process of technology.

Specific paradigms, such as Moore’s Law, do ultimately reach levels at which exponential growth is no longer feasible. Thus Moore’s Law is an S curve. But the growth of computation is an ongoing exponential (at least until we “saturate” the Universe with the intelligence of our human-machine civilization, but that will not be a limit in this coming century). In accordance with the law of accelerating returns, paradigm shift, also called innovation, turns the S curve of any specific paradigm into a continuing exponential. A new paradigm (e.g., three-dimensional circuits) takes over when the old paradigm approaches its natural limit. This has already happened at least four times in the history of computation.

This difference also distinguishes the tool making of non-human species, in which the mastery of a tool-making (or using) skill by each animal is characterized by an abruptly ending S shaped learning curve, versus human-created technology, which has followed an exponential pattern of growth and acceleration since its inception.
A specific paradigm (a method or approach to solving a problem, e.g., shrinking transistors on an integrated circuit as an approach to making more powerful computers) provides exponential growth until the method exhausts its potential. When this happens, a paradigm shift (i.e., a fundamental change in the approach) occurs, which enables exponential growth to continue.
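
The stacked-S-curve picture described above is easy to check numerically (a sketch with invented parameters: five logistic paradigms, each saturating ten times higher than the last, midpoints evenly spaced):

[CODE]
# Sketch of the quoted picture: successive S-curve paradigms summing
# to an overall exponential. All parameters are invented for illustration.
import math

def logistic(t, ceiling, rate, midpoint):
    # One paradigm: an S-curve that saturates at `ceiling`.
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

# Five paradigms: ceilings 10^1 .. 10^5, midpoints spaced 10 time-units apart.
paradigms = [(10.0 ** (k + 1), 1.0, 10.0 * k) for k in range(5)]

for t in range(0, 50, 5):
    total = sum(logistic(t, c, r, m) for c, r, m in paradigms)
    print(f"t={t:2d}  total={total:10.1f}  log10(total)={math.log10(total):.2f}")
[/CODE]

On a log scale the summed curve climbs roughly linearly -- an overall exponential -- with visible ripples where one paradigm hands off to the next.
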
Moore's Law is the 5th paradigm, and the various technologies to extend it have already appeared; the 6th-paradigm ones are either in development or, in some cases, already exist, though not in fully finished form. The fact that there is more than one candidate should tell you something; the fact that I can post breakthroughs on them almost every month is also telling.

The paradigm shift rate (i.e., the overall rate of technical progress) is currently doubling (approximately) every decade; that is, paradigm shift times are halving every decade (and the rate of acceleration is itself growing exponentially). So, the technological progress in the twenty-first century will be equivalent to what would require (in the linear view) on the order of 200 centuries. In contrast, the twentieth century saw only about 25 years of progress (again at today’s rate of progress) since we have been speeding up to current rates. So the twenty-first century will see almost a thousand times greater technological change than its predecessor.
It’s obvious what the sixth paradigm will be after Moore’s Law runs out of steam during the second decade of this century. Chips today are flat (although it does require up to 20 layers of material to produce one layer of circuitry). Our brain, in contrast, is organized in three dimensions. We live in a three dimensional world, why not use the third dimension? The human brain actually uses a very inefficient electrochemical digital controlled analog computational process. The bulk of the calculations are done in the interneuronal connections at a speed of only about 200 calculations per second (in each connection), which is about ten million times slower than contemporary electronic circuits. But the brain gains its prodigious powers from its extremely parallel organization in three dimensions. There are many technologies in the wings that build circuitry in three dimensions. Nanotubes, for example, which are already working in laboratories, build circuits from hexagonal arrays of carbon atoms. One cubic inch of nanotube circuitry would be a million times more powerful than the human brain. There are more than enough new computing technologies now being researched, including three-dimensional silicon chips, optical computing, crystalline computing, DNA computing, and quantum computing, to keep the law of accelerating returns as applied to computation going for a long time.
Thus the (double) exponential growth of computing is broader than Moore’s Law, which refers to only one of its paradigms. And this accelerating growth of computing is, in turn, part of the yet broader phenomenon of the accelerating pace of any evolutionary process. Observers are quick to criticize extrapolations of an exponential trend on the basis that the trend is bound to run out of “resources.” The classical example is when a species happens upon a new habitat (e.g., rabbits in Australia), the species’ numbers will grow exponentially for a time, but then hit a limit when resources such as food and space run out.
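
The quoted figures can be sanity-checked from the stated assumption alone -- a rate of progress that doubles every decade, normalized so the year-2000 rate equals one "year of progress per year" (the normalization is mine, for illustration):

[CODE]
# Back-of-the-envelope check of the quoted figures, using only the
# stated assumption: the rate of progress doubles every decade, with
# the year-2000 rate normalized to 1 year of progress per year.
import math

def progress(t0, t1):
    # Integral of rate(t) = 2**(t/10) from t0 to t1 (t in years; t=0 is 2000).
    return (10.0 / math.log(2.0)) * (2.0 ** (t1 / 10.0) - 2.0 ** (t0 / 10.0))

c21 = progress(0, 100)    # the 21st century
c20 = progress(-100, 0)   # the 20th century
print(f"21st century: ~{c21:,.0f} years of progress at the year-2000 rate")
print(f"20th century: ~{c20:,.1f} years of progress at the year-2000 rate")
print(f"ratio: ~{c21 / c20:,.0f}x")
[/CODE]

This crude integral gives roughly 14,800 years for the 21st century versus about 14 for the 20th -- the same order as the quoted "200 centuries" and "25 years" (the exact quoted numbers come from a model where the doubling time itself changes over time) -- and the ratio of almost exactly 2^10 ≈ 1000 is what "almost a thousand times greater" refers to.
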
The study by the government proves software keeps up with hardware development; in some cases, it is mentioned, software even surpasses it. I don't know what other proof you want. I'll take my proof over your claims any day. Software is important because it's the missing link between higher processing speed and potential human-level AGI.

Yes, companies go bankrupt, countries pass stupid laws, there are depressions and recessions and wars, and yet the upward curve has never stopped.