> 1) Autonomous cars. Maybe not fully like we saw in Demolition Man (and countless other movies) but we are already headed there.
> 2) Flexible, foldable display screens/touch interfaces. Your iPhone will be a pen and you'll unroll the screen out of it.
> 3) Your phone will be your wallet. All of your cards, everything, will be digitally stored on your phone and accessed via RFID.
> 4) You will control your iTV by talking to it.
> 5) Your computer will be able to satisfy you sexually...
1. No thanks
2. Touch screen interfaces are stupid
3. No thanks, I'd rather pay with cash if I'm buying something in person
4. No I won't
5. It already does, if internet porn counts?
Also, smart phones and tablets are stupid.
^ Actually, smartphones and tablets would be pretty neat if they weren't... well, smartphones and tablets.
I'm using a MacBook Air right now, as it is the only computer I own; I also have an iPad my dad gave me for Christmas last year and a five-year-old iPod touch. That's three devices where there should only be one.
If I could merge the MacBook and the iPad, it would be absolutely perfect; say, a touchscreen for when you need a tablet, and also have a wireless keyboard in the case for when you need a laptop.
Just seems to me tablets would be a lot more useful if developers gave you the option of using them as regular computers if that's what you really need, or switching seamlessly into "mobile mode" or something.
> So you'd rather go to the bank and withdraw hundreds or thousands of pounds to pay for things like a new TV, new car etc... Using RFID technology in your mobile would be similar to contactless payment, or a card.

For large purchases, cards are preferable, obviously, but for small purchases, why should I make it easy for my transactions to be tracked? Privacy, etc...
True, the biggest issue is security. At least with chip-and-PIN technology in cards, even if you lose your wallet or have it stolen, a person would have to know your PIN to use your card.
> As for automated cars, they could massively improve capacity on roads: instead of having to keep two-plus seconds behind a car, a computer would be able to run cars virtually bumper to bumper. It would also be able to adapt the speed to the conditions, so it could potentially go faster or slower depending on conditions. The biggest task is not so much the technological problems, which can be overcome, but the human element.

Might make the roads safer with fewer stupid people behind the wheel, but as somebody with a triple-digit IQ and a good sense of car control, I'd rather drive properly, without any electronic nannying, let alone without any input whatsoever.
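For a sense of scale on the capacity claim quoted above, here's a minimal back-of-the-envelope sketch in Python. The 4.5 m car length and the steady 100 km/h are my own illustrative numbers, not figures from the thread; throughput per lane is roughly the inverse of the time each vehicle-plus-gap takes to pass a fixed point.

```python
# Rough lane-capacity sketch for the headway claim above.
# Assumes a fixed speed, identical vehicles, and no lane changes.
def lane_capacity_per_hour(speed_kmh, gap_seconds, car_length_m=4.5):
    speed_ms = speed_kmh / 3.6
    # Time for one vehicle plus its following gap to pass a fixed point.
    headway_s = gap_seconds + car_length_m / speed_ms
    return 3600 / headway_s

for gap in (2.0, 0.5, 0.1):  # human two-second rule vs. tighter automated gaps
    print(f"{gap:>4} s gap @ 100 km/h: "
          f"{lane_capacity_per_hour(100, gap):,.0f} vehicles/hour/lane")
```

Even on this crude model, cutting the gap from the two-second rule to a few tenths of a second multiplies lane capacity several times over, which is the point being made; it says nothing about whether the control and liability problems are actually solvable.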
Not sure. You know, I'm sure there is this new tablet that's been released that you can attach a keyboard to, the Surface by Microsoft.
Honestly, iPads seem to be a great advancement for mankind. We could end up just using them as our full-time OS.
The iPad is probably undervalued as an extension of our intelligence: it stores information, expanding on human memory; lets you connect wirelessly; and, used properly, does the kinds of things we expect of well-rounded, intelligent human beings, such as reading, listening to music, etc., all while fitting in your hand. But lots of things I talk about will immerse you in such information, and likely change not just thought but patterns of thought... so think of the iPad as an early wifi brain interface.
I posted this before..
Hawking's iBrain:
http://www.devicemag.com/2012/06/25/ibrain-to-hack-into-stephen-hawkings-brain/
So here's my unscientific (though researched) view of what's possible:
Singularity, the actual moment when machine AGI is smarter than us, before 2050, based on mathematics and models: 100%
Singularity before 2050, as a paradigm shift for all humanity as described by proponents: 75%
Computer passing the Turing test before 2030: 95%
Brain uploading before 2040: 100%
Other transhuman tech common before 2040, including brain "downloading" (as in The Matrix): 100%
Nanotech assemblers common before 2045: 90%
Nanotech materials becoming common before 2030: 100%
Foglet technology before 2050: 80%
Possibility a singularity leads to a takeover by machine AGI: 50%
Possibility of an artilect war: 40%
Untethered human-acting robots or androids before 2040: 100%
Renewable energy technology taking the lead over traditional energy resources by 2040: 75%
Solar power satellites before 2040: 30%
Fusion power: 10 fusion plants by 2050: 90%
Pollution control through biotech and nanotech at a high level before 2040: 90%
Genetically customized drugs common before 2025: 90%
Water scarcity post 2020: 0%
Farming technologies ending hunger before 2040: 90%
Mission to Mars before 2035: 50%
Asteroid mining before 2040: 40%
Mission to another star by 2100, most likely by von Neumann machines: 90%
RAMA
BTW write these down folks, especially you younger people.
> Sorry but the issue was directly addressed. You stated simply that exponentials don't continue indefinitely, to which I reply this is true, but they develop to the point where a new paradigm takes over, and this is not fantasy; there are already 5 demonstrably true levels of paradigms that have taken place, and Moore's Law is the 5th.

Apart from the fact that you have essentially conceded that Moore's law is unlikely to continue exponential growth indefinitely, this still ignores the fact that the next paradigm may or may not have anything at all to do with computer technology. If it is a shift in, say, nanotechnology (and it probably will be) the result would be another logistic curve, this time for mass production capacity; the same industrial products could be produced faster and faster by increasingly smaller manufacturing machines; by the time the curve starts to level off for the next paradigm shift, you start to get industrial machines the size of Skittles that can eat a pile of sawdust and spit out a kitchen table.
The new paradigm wouldn't extend Moore's law to microprocessors at all; once computer technology hits its plateau stage, it cannot really be improved further (it won't get any smaller or faster or more powerful than it already is), but in the new paradigm the same computer can be manufactured considerably faster/easier/in larger numbers and for far smaller expense.
> It is also true that exponentials are not infinite.

If it's not infinite then it is, by definition, not exponential.
More importantly, without knowing exactly when the curve will begin to flatten out at saturation point, it's difficult to predict exactly where the technology will end up, especially since all other social/political/economic/military factors are still difficult to nail down. The point of diminishing returns has potential to sneak up on you unexpectedly if it involves factors you had previously ignored or judged unimportant just because you assumed they would be eventually mitigated.
> How does this skirt the issue in any way?

Because you're assuming the paradigm shift renders the flattening curve irrelevant. That's an assumption without a basis; it's entirely possible that scientists will make a breakthrough with quantum computers in the next thirty years, after which it begins to become exponentially more difficult to make any advancements at all.
So it does indeed show the main thrust of the curve(s) still continue... but not necessarily for computers.
> The third is Christopher's suggestion (supported by several software posters from this board) that software has not kept pace with this info curve, which is also demonstrably untrue based on the two articles I posted.

The articles demonstrate nothing of the kind. Software HASN'T kept up with those advances, for the specific reason that software engineers develop applications based on the end user's needs, NOT on the available processor power of the platform running it.
IOW, software isn't SUPPOSED to keep pace with processing power; processing power is a potential resource that engineers can exploit when demand for new capabilities begins to manifest, but in the end, those applications are driven by consumer demand first and foremost and technical capacity second.
> Conclusion: the criticism that exponentials are not a natural law, or are finite, in info tech (and by extension anything that becomes an infotech) is not valid.

Nobody made that criticism, RAMA. The criticism from the get go was that the expanding curve engendered in Moore's law is unlikely to continue indefinitely, primarily because the exponential curve looks exactly like a logistic curve until the point where it starts to level off.
And there IS, in fact, an upper limit to how far microprocessors can be miniaturized or enhanced, especially once you get down to quantum computers and molecule-sized transistors.
> The proof I cite for software's exponential comes from an industry report as well as a government report.

But you're conflating hardware and software as if they were the same thing. They are not, not even close. Hardware can be considered a virtual vessel in which to contain data and overlapping processes devoted to a specific task, which in turn enables larger and more sophisticated software applications to fill that vessel. But it is ALSO true that a larger number of smaller applications can be simultaneously run on the same hardware than would have been possible otherwise; the exponential growth in computer power would NOT, in that case, lead directly to an exponential growth in software capability, as the applications themselves could follow a more linear progression of very small increases in capability spread out over a much larger number of applications.
This is most obvious in the issue of digital storage. Flash memory and nonvolatile storage devices may eventually outperform hard drives by a considerable margin, but that DOES NOT mean that all future media formats will be pigeonholed into HD quality just because more systems can handle their storage and playback. Quantity as well as quality will increase, and depending on user needs, it may be the former more than the latter.
This has very serious implications for AI and therefore the singularity (see below).
> It is very far from a one dimensional development, and as some of our conversations revolved around this, I'm surprised you're even bringing this up again or maybe you didn't realize why I was establishing those conditions allowing for the change.

I bring it up again because you failed to address, in every single case, the fact that the POTENTIAL for change in no way implies the APPROACH of change. Again, the issue here is that you are very easily impressed by pop-sci articles and have a tendency to accept (and in some cases, to volunteer yourself) the most optimistic projections of those technologies based purely on a best-case scenario. You essentially live in a world where inventors never go bankrupt, where startup companies never fail, where great ideas never get pushed to the wayside, where Cisco never shut down the entire Flipcam production line just because they were bored.
The sole basis for the singularity is a projection on the future capabilities of Expert Systems. Put very simply, the Singularity is what happens when expert systems gain the capability to design improved copies of themselves without human intervention; machine intelligence becomes superior to human intelligence to the point that humans no longer control the developmental process (hence it is a Singularity by analogy to a Black Hole: you cannot see beyond the event horizon represented by the Expert System because it is impossible to make meaningful predictions about the value system or decision-making process of such a system). Singularity theory assumes the exponential growth curve is either indefinite or will continue long enough to bring this about.
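To make the feedback loop in that definition concrete, here's a deliberately crude toy model in Python. It's my own sketch of the intuition, not anyone's actual proposal, and the gain and decay parameters are arbitrary.

```python
# Toy model of recursive self-improvement: each generation designs its
# successor, and the size of the improvement scales with current capability.
# All parameters are illustrative, not predictions.
def takeoff(initial=1.0, gain=0.5, generations=10):
    capability = initial
    for _ in range(generations):
        capability += gain * capability        # constant returns: geometric growth
    return capability

def fizzle(initial=1.0, gain=0.5, decay=0.5, generations=10):
    capability = initial
    for _ in range(generations):
        capability += gain * capability
        gain *= decay                          # each improvement is harder to find
    return capability                          # converges instead of exploding

print(takeoff())   # ~57.7: the "intelligence explosion" picture
print(fizzle())    # ~2.4: the diminishing-returns picture argued below
```

Whether the real curve looks more like the first function or the second is exactly the disagreement in the posts that follow.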
In the first place, as I and others have pointed out, this is a flawed assumption because the exponential growth of hardware has an inherent upper limit that we may be approaching more rapidly than you think. In the second place -- and vastly more importantly -- software development is driven by user needs, NOT by hardware capabilities. As I have myself pointed out on MANY occasions, AIs and robots are capable of replacing humans in virtually any task you can think of, provided the right software and hardware specializations are developed; even the self-improving Expert System would be a more efficient software engineer than the best human in the industry. The thing is, none of these tasks would gain any benefit from machine SENTIENCE, as even the Expert System doesn't need to have any semblance of self-awareness, self-motivation or the ability to make abstract value judgements in order to effectively analyze the needs of end users and construct software applications accordingly. In fact, sentience would almost certainly make it LESS useful, as the ability to think beyond the scope of its task would be a distraction that would eat up a significant portion of its (admittedly huge) processing power.
My overall point is that your projections of singularity theory are basically a combination of jubilant optimism of all things technical, combined with reading way too much sensationalist literature without thinking critically about how that process would actually take place.
> As part of this info availability change, I don't just have to stick with magazines that are months out of date; I get multiple feeds of info, especially on technological change, right to my smartphone, literally thousands of articles through apps, email, etc.

We noticed.
The maximum potential of matter and energy to contain intelligent processes is a valid issue. But according to my models, we won’t approach those limits during this century (but this will become an issue within a couple of centuries).
We also need to distinguish between the “S” curve (an “S” stretched to the right, comprising very slow, virtually unnoticeable growth–followed by very rapid growth–followed by a flattening out as the process approaches an asymptote) that is characteristic of any specific technological paradigm and the continuing exponential growth that is characteristic of the ongoing evolutionary process of technology. Specific paradigms, such as Moore’s Law, do ultimately reach levels at which exponential growth is no longer feasible. Thus Moore’s Law is an S curve. But the growth of computation is an ongoing exponential (at least until we “saturate” the Universe with the intelligence of our human-machine civilization, but that will not be a limit in this coming century). In accordance with the law of accelerating returns, paradigm shift, also called innovation, turns the S curve of any specific paradigm into a continuing exponential. A new paradigm (e.g., three-dimensional circuits) takes over when the old paradigm approaches its natural limit. This has already happened at least four times in the history of computation. This difference also distinguishes the tool making of non-human species, in which the mastery of a tool-making (or using) skill by each animal is characterized by an abruptly ending S shaped learning curve, versus human-created technology, which has followed an exponential pattern of growth and acceleration since its inception.
A specific paradigm (a method or approach to solving a problem, e.g., shrinking transistors on an integrated circuit as an approach to making more powerful computers) provides exponential growth until the method exhausts its potential. When this happens, a paradigm shift (i.e., a fundamental change in the approach) occurs, which enables exponential growth to continue.
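As a rough illustration of the claim in the two paragraphs above (a sketch with made-up ceilings and timings, not Kurzweil's data), a succession of logistic "S" curves, each saturating about where the next one begins, sums to something that looks like one ongoing exponential:

```python
import math

def logistic(t, ceiling, midpoint, steepness=1.0):
    """One paradigm's S curve: slow start, rapid growth, then saturation."""
    return ceiling / (1 + math.exp(-steepness * (t - midpoint)))

def stacked_paradigms(t, n_paradigms=5):
    """Each paradigm's ceiling is 10x the last and arrives a decade later (made-up numbers)."""
    return sum(logistic(t, ceiling=10**k, midpoint=10 * k + 5) for k in range(n_paradigms))

for t in range(0, 51, 10):
    print(f"t={t:2}: capability ~ {stacked_paradigms(t):10,.1f}")  # roughly 10x per decade
```

Note what the sketch quietly assumes, which is also the critic's objection: that a new curve with a higher ceiling always shows up on schedule.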
The paradigm shift rate (i.e., the overall rate of technical progress) is currently doubling (approximately) every decade; that is, paradigm shift times are halving every decade (and the rate of acceleration is itself growing exponentially). So, the technological progress in the twenty-first century will be equivalent to what would require (in the linear view) on the order of 200 centuries. In contrast, the twentieth century saw only about 25 years of progress (again at today’s rate of progress) since we have been speeding up to current rates. So the twenty-first century will see almost a thousand times greater technological change than its predecessor.
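Taking the doubling rule in the paragraph above at face value, the "200 centuries" figure is just a geometric series. This is an order-of-magnitude check only; compounding per decade rather than continuously is my simplification, not something spelled out in the text.

```python
# Normalise the year-2000 rate of progress to 1 "year of progress per calendar year".
# If that rate doubles every decade, the 21st century accumulates:
progress_21st = sum(10 * 2**k for k in range(1, 11))   # ten decades of 10 calendar years each
print(progress_21st)   # 20,460 year-2000-equivalent years, i.e. on the order of 200 centuries
```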
It’s obvious what the sixth paradigm will be after Moore’s Law runs out of steam during the second decade of this century. Chips today are flat (although it does require up to 20 layers of material to produce one layer of circuitry). Our brain, in contrast, is organized in three dimensions. We live in a three dimensional world, why not use the third dimension? The human brain actually uses a very inefficient electrochemical digital controlled analog computational process. The bulk of the calculations are done in the interneuronal connections at a speed of only about 200 calculations per second (in each connection), which is about ten million times slower than contemporary electronic circuits. But the brain gains its prodigious powers from its extremely parallel organization in three dimensions. There are many technologies in the wings that build circuitry in three dimensions. Nanotubes, for example, which are already working in laboratories, build circuits from pentagonal arrays of carbon atoms. One cubic inch of nanotube circuitry would be a million times more powerful than the human brain. There are more than enough new computing technologies now being researched, including three-dimensional silicon chips, optical computing, crystalline computing, DNA computing, and quantum computing, to keep the law of accelerating returns as applied to computation going for a long time.
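A quick sanity check on the ratio quoted above. The ~2 GHz figure for "contemporary electronic circuits" and the ~10^14 interneuronal connections are round numbers I'm assuming for illustration, not values given in the passage.

```python
connection_rate_hz = 200      # calculations per second per connection, as quoted
circuit_rate_hz = 2e9         # assumed ~2 GHz for a contemporary electronic circuit
print(circuit_rate_hz / connection_rate_hz)    # 1e7 -> "about ten million times slower"

connections = 1e14            # assumed rough count of interneuronal connections
print(connections * connection_rate_hz)        # ~2e16 calc/s from massive parallelism
```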
Thus the (double) exponential growth of computing is broader than Moore’s Law, which refers to only one of its paradigms. And this accelerating growth of computing is, in turn, part of the yet broader phenomenon of the accelerating pace of any evolutionary process. Observers are quick to criticize extrapolations of an exponential trend on the basis that the trend is bound to run out of “resources.” The classical example is when a species happens upon a new habitat (e.g., rabbits in Australia), the species’ numbers will grow exponentially for a time, but then hit a limit when resources such as food and space run out.
Why write it down? It's just religious faith-based nonsense.
> For large purchases, cards are preferable, obviously, but for small purchases, why should I make it easy for my transactions to be tracked? Privacy, etc...
The Privacy Illusion
It has come to my attention that many of my readers in the United States believe they have the right to privacy because of something in the Constitution. That is an unsupportable view. A more accurate view is that the government divides the details of your life into two categories:
1. Stuff they don't care about.
2. Stuff they can find out if they have a reason.
Keep in mind that the government already knows the following things about you:
1. Where you live
2. Your name
3. Your income
4. Your age
5. Your family members
6. Your social security number
7. Your maiden name
8. Where you were born
9. Criminal history of your family
10. Your own criminal record
11. Your driving record
12. Your ethnicity
13. Where you work and where you used to work
14. Where you live and where you used to live
15. Names of your family members
16. The value of your home now
17. The amount you paid for your home
18. The amount you owe on your home
19. Your grades in school
20. Your weight, height, eye color, and hair color
The government doesn't know your medical history. But your doctor does, and he'll give it to the government if they produce a warrant.
The government doesn't know your spending details. But your bank and your credit card company do. And the government can subpoena bank records anytime it cares enough to do so. The government can't always watch you pay for stuff with cash, but don't expect that to last. At some point in the next twenty years, physical currency will be eliminated in favor of digital transactions.
> I've seen the numbers about the upper limits you mention (I have them in book form, I'll try and find a link), and they are higher than you think, not lower.

Unlikely, especially since you don't know how high I think they are.
> The 6th paradigm will continue the curve already established.

Unlikely, since you do not actually know what the next paradigm IS.
> Moore's law is the 5th paradigm, and the various technologies to extend it have already appeared; the 6th-generation ones are either in development or, in some cases, already exist, but not in fully finished form. The fact there is more than one will tell you something; the fact that I can post breakthroughs on them almost every month is also telling.

First, if you can post monthly breakthroughs on them, then they're still part of the CURRENT paradigm, not the next one. They may extend the digital paradigm somewhat or help it take form, or -- alternately -- hasten the approach of its limiting factors. But they will not lead to the transition to a NEW paradigm without a fundamental shift in their most basic applications, after which the patterns of the old paradigm cease to be meaningful.
> The paradigm shift rate (i.e., the overall rate of technical progress) is currently doubling (approximately) every decade.

I'm beginning to wonder if you actually know what a "paradigm" is.
> It’s obvious what the sixth paradigm will be after Moore’s Law runs out of steam during the second decade of this century.

Indeed. Which is why the next paradigm is unlikely to have anything whatsoever to do with Moore's law or microprocessors in general. Even 3D circuitry and quantum computing are only going to extend the present paradigm to a limited extent, and even then they may be part of the plateau stage, where increasing power/complexity in three-dimensional integrated circuits is considerably more expensive than it had been with 2D circuits. Once you reach the limits of 3D circuits, further advances run into that diminishing returns problem; the paradigm shifts to something OTHER than microprocessor technologies, and no new improvement can be made except over unbelievably long timescales for almost superficial levels of improvement.
> Thus the (double) exponential growth of computing is broader than Moore’s Law, which refers to only one of its paradigms.

Yep. You clearly DON'T know what a "paradigm" is; your anticipation of a paradigm shift is just another rhetorical device you're using to avoid taking the problem seriously.
> Observers are quick to criticize extrapolations of an exponential trend on the basis that the trend is bound to run out of “resources.”

Resources have nothing to do with it. The logistic curve is a function based on a saturation point, wherein rapid progress can build on further progress in what seems to be an exponential curve until you reach a saturation point, where the system approaches maturity and the curve flattens out.
> The classical example is when a species happens upon a new habitat (e.g., rabbits in Australia), the species’ numbers will grow exponentially for a time, but then hit a limit when resources such as food and space run out.

Which ultimately has less to do with the resources available and more to do with the equilibrium point of reproductive rates vs. attrition rates. The limited resources (e.g. food) provide the saturation point, and therefore the curve flattens at the point where there are so many rabbits on the continent that the number that die from starvation is approximately equal to the number of live births.
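The equilibrium being described is the fixed point of a standard logistic growth model. Here's a minimal sketch in the generic textbook form, with arbitrary parameters rather than anything from the thread:

```python
# Discrete logistic growth: births add r*N per step, starvation removes
# r*N*(N/K), so the curve flattens where births and deaths balance, i.e. N ~ K.
def rabbit_population(n0=10, r=0.5, carrying_capacity=10_000, steps=40):
    n = n0
    for _ in range(steps):
        births = r * n
        deaths = r * n * (n / carrying_capacity)
        n += births - deaths
    return n

print(round(rabbit_population()))   # settles near the carrying capacity of 10,000
```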
> The study by the government proves software keeps up with hardware development; in some cases, it is mentioned, it surpasses it.

In this context, the software we're talking about is artificial intelligence, NOT storage capacity, NOT video or sound quality, NOT digital bandwidth and throughput. We're discussing the efficacy of computers not only as expert systems, but as self-examining thinking machines capable of taking roles traditionally performed by expert humans.
> Software is important because it's the missing link between the higher processing speed and potential human level AGI.

But it isn't, though. Even in the highly unlikely event you could get a computer to model an existing human brain, it's still only a predictive simulation of that brain based on fixed parameters, not a genuine consciousness.
> Why write it down? It's just religious faith-based nonsense.

Which doesn't change the fact that it is a religious faith-based worldview. The meaningful element here is that you have already internalized your articles of faith:
> Unlike any of the various Raptures, the Singularity is a technological event, caused by ordinary humans, doing ordinary science, building ordinary technology which follows the ordinary laws of physics. It does not involve any religious or divine powers. It doesn’t involve outside intervention by superior or alien beings. And it’s completely within our control as a species: it will only happen when we go out and make it happen.