What are your top 5 technologies of the next 15 years?

Discussion in 'Science and Technology' started by RAMA, Apr 7, 2012.

  1. data909

    data909 Ensign Newbie

    Joined:
    Aug 29, 2012
    Honestly, iPads seem like a great advancement for mankind. We could end up just using them as our full-time OS.
     
  2. RAMA

    RAMA Admiral Admiral

    Joined:
    Dec 13, 1999
    Location:
    USA
    The iPad is probably undervalued as an extension of our intelligence: it stores information and expands on human memory, lets you connect wirelessly, and, used properly, does the things we expect of well-rounded, intelligent human beings, such as reading, listening to music, etc., all while fitting in your hand. But lots of the things I talk about will immerse you in such information, and likely change not just thought but patterns of thought... so think of the iPad as an early wifi brain interface.

    I posted this before..;)

    Hawking's ibrain:

    http://www.devicemag.com/2012/06/25/ibrain-to-hack-into-stephen-hawkings-brain/

    BTW write these down folks, especially you younger people.
     
  3. JoeZhang

    JoeZhang Vice Admiral Admiral

    Joined:
    Jan 9, 2008
    Why write it down? It's just religious faith-based nonsense.
     
  4. Kemaiku

    Kemaiku Admiral Admiral

    Joined:
    Dec 23, 2004
    Location:
    Northern Ireland
    Where would I get two stone tablets at this hour?
     
  5. Crazy Eddie

    Crazy Eddie Vice Admiral Admiral

    Joined:
    Apr 12, 2006
    Location:
    Your Mom
    Apart from the fact that you have essentially conceded that Moore's law is unlikely to continue exponential growth indefinitely, this still ignores the fact that the next paradigm may or may not have anything at all to do with computer technology. If it is a shift in, say, nanotechnology (and it probably will be), the result would be another logistic curve, this time for mass-production capacity; the same industrial products could be produced faster and faster by ever smaller manufacturing machines; by the time the curve starts to level off for the next paradigm shift, you start to get industrial machines the size of Skittles that can eat a pile of sawdust and spit out a kitchen table.

    The new paradigm wouldn't extend Moore's law to microprocessors at all; once computer technology hits its plateau stage, it cannot really be improved further (it won't get any smaller or faster or more powerful than it already is), but in the new paradigm the same computer can be manufactured considerably faster/easier/in larger numbers and for far smaller expense.

    If it's not infinite then it is, by definition, not exponential.

    More importantly, without knowing exactly when the curve will begin to flatten out at the saturation point, it's difficult to predict exactly where the technology will end up, especially since all the other social/political/economic/military factors are still difficult to nail down. The point of diminishing returns has the potential to sneak up on you unexpectedly if it involves factors you had previously ignored or judged unimportant just because you assumed they would eventually be mitigated.

    Because you're assuming the paradigm shift renders the flattening curve irrelevant. That's an assumption without a basis; it's entirely possible that scientists will make a breakthrough with quantum computers in the next thirty years, after which it begins to become exponentially more difficult to make any advancements at all.

    So it does indeed show the main thrust of the curve(s) still continue... but not necessarily for computers.

    The articles demonstrate nothing of the kind. Software HASN'T kept up with those advances, for the specific reason that software engineers develop applications based on the end user's needs, NOT on the available processor power of the platform running it.

    IOW, software isn't SUPPOSED to keep pace with processing power; processing power is a potential resource that engineers can exploit when demand for new capabilities begins to manifest, but in the end, those applications are driven by consumer demand first and foremost and technical capacity second.

    Nobody made that criticism, RAMA. The criticism from the get-go was that the expanding curve embodied in Moore's law is unlikely to continue indefinitely, primarily because an exponential curve looks exactly like a logistic curve until the point where the logistic curve starts to level off.
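    To put a number on that, here's a quick Python sketch (the growth rate and saturation point are arbitrary values I picked purely for illustration): an exponential and a logistic curve with the same early growth rate are numerically indistinguishable until the logistic one nears its saturation point.

    Code:
    # Compare pure exponential growth with logistic growth that shares the same
    # growth rate r and starting value N0. K is the saturation point.
    import math

    r = 0.5        # growth rate (arbitrary)
    K = 1_000_000  # saturation point / carrying capacity (arbitrary)
    N0 = 1.0       # starting value

    def exponential(t):
        return N0 * math.exp(r * t)

    def logistic(t):
        return K / (1 + ((K - N0) / N0) * math.exp(-r * t))

    for t in range(0, 41, 5):
        e, l = exponential(t), logistic(t)
        print(f"t={t:2d}  exp={e:14.1f}  logistic={l:12.1f}  ratio={l / e:.3f}")
    # The ratio stays ~1.0 for most of the run; only near K do the curves diverge,
    # which is why historical data alone can't tell you which curve you're on.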

    And there IS, in fact, an upper limit to how far microprocessors can be miniaturized or enhanced, especially once you get down to quantum computers and molecule-sized transistors.

    But you're conflating hardware and software as if they were the same thing. They are not, not even close. Hardware can be considered a virtual vessel in which to contain data and overlapping processes devoted to a specific task, which in turn enables larger and more sophisticated software applications to fill that vessel. But it is ALSO true that a larger number of smaller applications can be simultaneously run on the same hardware that wouldn't have been possible otherwise; the exponential growth in computer power would NOT, in that case, lead directly to an exponential growth in software capability, as the applications themselves could follow a more linear progression by very small increases in capability spread out over a much larger number of applications.

    This is most obvious in the issue of digital storage. Flash memory and nonvolatile storage devices may eventually outperform hard drives by a considerable margin, but that DOES NOT mean that all future media formats will be pigeonholed into HD quality just because more systems can handle their storage and playback. Quantity as well as quality will increase, and depending on user needs, it may be the former more than the latter.

    This has very serious implications for AI and therefore the singularity (see below).

    I bring it up again because you failed to address, in every single case, the fact that the POTENTIAL for change in no way implies the APPROACH of change. Again, the issue here is that you are very easily impressed by pop-sci articles and have a tendency to accept (and in some cases, to volunteer yourself) the most optimistic projections of those technologies based purely on a best-case scenario. You essentially live in a world where inventors never go bankrupt, where startup companies never fail, where great ideas never get pushed to the wayside, where Cisco never shut down the entire Flipcam production line just because they were bored.

    The sole basis for the singularity is a projection on the future capabilities of Expert Systems. Put very simply, the Singularity is what happens when expert systems gain the capability to design improved copies of themselves without human intervention; machine intelligence becomes superior to human intelligence to the point that humans no longer control the developmental process (hence it is a Singularity by analogy to a Black Hole: you cannot see beyond the event horizon represented by the Expert System because it is impossible to make meaningful predictions about the value system or decision-making process of such a system). Singularity theory assumes the exponential growth curve is either indefinite or will continue long enough to bring this about.

    In the first place, as I and others have pointed out, this is a flawed assumption because the exponential growth of hardware has an inherent upper limit that we may be approaching more rapidly than you think. In the second place -- and vastly more importantly -- software development is driven by user needs, NOT by hardware capabilities. As I have myself pointed out on MANY occasions, AIs and robots are capable of replacing humans in virtually any task you can think of, provided the right software and hardware specializations are developed; even the self-improving Expert System would be a more efficient software engineer than the best human in the industry. The thing is, none of these tasks would gain any benefit from machine SENTIENCE, as even the Expert System doesn't need to have any semblance of self-awareness, self-motivation or the ability to make abstract value judgements in order to effectively analyze the needs of end users and construct software applications accordingly. In fact, sentience would almost certainly make it LESS useful, as the ability to think beyond the scope of its task would be a distraction that would eat up a significant portion of its (admittedly huge) processing power.

    My overall point is that your projections of singularity theory are basically jubilant optimism about all things technical, combined with reading way too much sensationalist literature without thinking critically about how that process would actually take place.

    We noticed.
     
  6. Geckothan

    Geckothan Fleet Captain Fleet Captain

    Joined:
    Mar 7, 2009
    Location:
    People's Republic of Britainistan
    1. No thanks
    2. Touch screen interfaces are stupid
    3. No thanks, I'd rather pay with cash if I'm buying something in person
    4. No I won't
    5. It already does, if internet porn counts?

    Also, smart phones and tablets are stupid.
     
  7. Crazy Eddie

    Crazy Eddie Vice Admiral Admiral

    Joined:
    Apr 12, 2006
    Location:
    Your Mom
    ^ Actually, smartphones and tablets would be pretty neat if they weren't... well, smartphones and tablets.

    I'm using a MacBook Air right now, as it is the only computer I own; I also have an iPad my dad gave me for Christmas last year and a 5 year old iPod touch. That's three devices I have where there should only be one.

    If I could merge the MacBook and the iPad, it would be absolutely perfect; say, a touchscreen for when you need a tablet, and also have a wireless keyboard in the case for when you need a laptop.

    Just seems to me tablets would be a lot more useful if developers gave you the option of using them as regular computers if that's what you really need, or switching seamlessly into "mobile mode" or something.
     
  8. Geckothan

    Geckothan Fleet Captain Fleet Captain

    Joined:
    Mar 7, 2009
    Location:
    People's Republic of Britainistan
    Something like the Surface, only with a better keyboard and with better, less locked-in hardware/software, would definitely be neat, but pointless when you consider that ‘Ultrabook’-type laptops are just as portable (albeit not as convenient, since laptops can only be used when they're opened up or have a mouse/keyboard/screen plugged into them) and have more room inside to cool better hardware.

    A decent laptop and a decent phone are far better than a tablet, anyway. If you have things configured properly, you can just set up some kind of ad hoc wireless connection (with IPsec over the top of it) between the laptop and the phone and use it to keep your data synced over some kind of network filesystem/file-transfer protocol or some proprietary syncing system (screw using "cloud" services, seriously), and to get internet on the laptop without having to go through dirty, insecure public wifi.
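    The syncing part doesn't even have to be anything fancy; a simple script of your own will do. Here's a rough Python sketch of a one-way sync, assuming the phone's storage is already reachable as a locally mounted path over that ad hoc link (the paths are made up for illustration, obviously):

    Code:
    # Rough sketch of a do-it-yourself one-way sync. Assumes the phone's storage
    # is already mounted locally over the ad hoc link; paths are hypothetical.
    import os
    import shutil

    SRC = "/home/me/documents"      # laptop side (hypothetical path)
    DST = "/mnt/phone/documents"    # phone share mounted locally (hypothetical path)

    def sync(src, dst):
        for root, _dirs, files in os.walk(src):
            rel = os.path.relpath(root, src)
            target_dir = dst if rel == "." else os.path.join(dst, rel)
            os.makedirs(target_dir, exist_ok=True)
            for name in files:
                s = os.path.join(root, name)
                d = os.path.join(target_dir, name)
                # copy only files that are missing or older on the phone side
                if not os.path.exists(d) or os.path.getmtime(s) > os.path.getmtime(d):
                    shutil.copy2(s, d)  # copy2 preserves timestamps

    if __name__ == "__main__":
        sync(SRC, DST)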
     
    Last edited: Nov 18, 2012
  9. MacLeod

    MacLeod Admiral Admiral

    Joined:
    Mar 8, 2001
    Location:
    Great Britain
    So you'd rather go to the bank and withdraw hundreds or thousands of pounds to pay for things like a new TV, new car etc... Using RFID technology in your mobile would be similar to contactless payment, or a card.

    True, the biggest issue is security. At least with chip-and-PIN technology in cards, even if you lose your wallet or have it stolen, a person would have to know your PIN to use your card.

    As for automated cars, they could massively improve capacity on roads: instead of having to keep two-plus seconds behind a car, a computer would be able to run cars virtually bumper to bumper. It would also be able to adapt the speed to the conditions, so cars could potentially go faster, or slower, depending on those conditions. The biggest task is not so much the technological problems, which can be overcome, but the human element.
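    To put rough numbers on the capacity point (my own back-of-the-envelope Python, not figures from any study): lane throughput is basically the reciprocal of how long each car occupies a point on the road, so shrinking the gap from the human two-second rule to sub-second automated following multiplies capacity several times over.

    Code:
    # Back-of-the-envelope lane capacity: each car "occupies" the road for its
    # following gap plus the time its own length takes to pass a fixed point.
    # Numbers are illustrative only.
    def lane_capacity(speed_kmh, gap_seconds, car_length_m=4.5):
        speed_ms = speed_kmh / 3.6
        time_per_car = gap_seconds + car_length_m / speed_ms
        return 3600 / time_per_car  # vehicles per hour per lane

    for gap in (2.0, 1.0, 0.3):  # human two-second rule vs. tighter automated gaps
        print(f"{gap:.1f} s gap at 100 km/h -> ~{lane_capacity(100, gap):.0f} cars/hour/lane")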

    You know, I'm sure there's a new tablet that's just been released that you can attach a keyboard to: the Surface, by Microsoft.
     
  10. Geckothan

    Geckothan Fleet Captain Fleet Captain

    Joined:
    Mar 7, 2009
    Location:
    People's Republic of Britainistan
    For large purchases, cards are preferable, obviously, but for small purchases, why should I make it easy for my transactions to be tracked? Privacy, etc...

    Might make the roads safer with fewer stupid people behind the wheel, but as somebody with a triple-digit IQ and a good sense of car control, I'd rather drive properly, without any electronic nannying, let alone with no input from me at all.
     
  11. MacLeod

    MacLeod Admiral Admiral

    Joined:
    Mar 8, 2001
    Location:
    Great Britain
    Well, the thing about cards is that they can be insured against loss; you can't insure cash. The sooner we move to a cashless society the better.

    A card fits easily into a pocket and weighs a lot less than a pocket full of change.

    Many modern cars come with electronic driver aids, traction control etc... Isn't that a form of electronic nannying?

    And no driver is perfect; every driver makes a mistake now and then. True, some more than others, and when I had a field-based job and was driving tens of thousands of miles a year, I saw plenty.
     
  12. Crazy Eddie

    Crazy Eddie Vice Admiral Admiral

    Joined:
    Apr 12, 2006
    Location:
    Your Mom
    Not sure.

    Does it HAVE to be by Microsoft?
     
  13. RAMA

    RAMA Admiral Admiral

    Joined:
    Dec 13, 1999
    Location:
    USA
    http://www.kurzweilai.net/ibm-simul...lion-synapses-on-worlds-fastest-supercomputer
     
  14. RAMA

    RAMA Admiral Admiral

    Joined:
    Dec 13, 1999
    Location:
    USA
    I've already seen some of the arguments against exponentials, and aside from the counters I posted (which are accurate), I've seen the numbers about the upper limits you mention (I have them in book form; I'll try to find a link), and they are higher than you think, not lower. While not infinite, they do allow for the necessary power for a Singularity. The 6th paradigm will continue the curve already established, so your assumption that it will not is incorrect.

    Moore's law is the 5th paradigm, and the various technologies to extend it have already appeared; the 6th-generation ones are either in development or, in some cases, already exist, though not in fully finished form. The fact that there is more than one should tell you something; the fact that I can post breakthroughs on them almost every month is also telling.

    The study by the government proves software keeps up with hardware development; in some cases, it mentions, software even surpasses it. I don't know what other proof you want. I'll take my proof over your claims any day. Software is important because it's the missing link between higher processing speed and potential human-level AGI.

    Yes companies go bankrupt, countries pass stupid laws, there are depressions and recessions and war, and yet the upward curve has never stopped.
     
  15. RAMA

    RAMA Admiral Admiral

    Joined:
    Dec 13, 1999
    Location:
    USA

    Unlike any of the various Raptures, the Singularity is a technological event, caused by ordinary humans, doing ordinary science, building ordinary technology which follows the ordinary laws of physics. It does not involve any religious or divine powers. It doesn’t involve outside intervention by superior or alien beings. And it’s completely within our control as a species; it will only happen when we go out and make it happen. Your claim isn't logical.

    RAMA
     
  16. gturner

    gturner Admiral

    Joined:
    Nov 9, 2005
    Location:
    Kentucky
    Well, what is illogical is to assume humans will be first with the singularity when species with smaller brains will be surpassed by circuitry first, and gain greater benefits from shifting away from organic brains. Cats, for instance, will require a much smaller die size, and the leap they make from chasing mice to shopping online will be greater than the leap we make from shopping online to - shopping online faster.

    Almost immediately after the felingularity, instead of half the web being pictures of cats, 99% of the web will be pictures of cats. The other 1% will be pictures cats took of their primitive, organic, two-legged housemates.
     
  17. Bisz

    Bisz Rear Admiral Rear Admiral

    Joined:
    Dec 20, 1999
    Location:
    Ontario, Canada
    ...so, you're a luddite?
     
  18. Bisz

    Bisz Rear Admiral Rear Admiral

    Joined:
    Dec 20, 1999
    Location:
    Ontario, Canada
    Sorry, you have no privacy.

    Here is a wonderful blog entry by Scott Adams, the author of Dilbert.

    It goes on at length, read the rest here:
    http://dilbert.com/blog/entry/the_privacy_illusion/
     
  19. Crazy Eddie

    Crazy Eddie Vice Admiral Admiral

    Joined:
    Apr 12, 2006
    Location:
    Your Mom
    Unlikely, especially since you don't know how high I think they are.

    Unlikely, since you do not actually know what the next paradigm IS.

    First, if you can post monthly breakthroughs on them, then they're still part of the CURRENT paradigm, not the next one. They may extend the digital paradigm somewhat or help it take form, or -- alternately -- hasten the approach of its limiting factors. But they will not lead to the transition to a NEW paradigm without a fundamental shift in their most basic applications, after which the patterns of the old paradigm cease to be meaningful.

    This would be easier for you to understand if you compared the current (5th) paradigm with the previous two.

    I'm beginning to wonder if you actually know what a "paradigm" is.

    Indeed. Which is why the next paradigm is unlikely to have anything whatsoever to do with Moore's law or microprocessors in general. Even 3D circuitry and quantum computing are only going to extend the present paradigm to a limited extent, and even then they may be part of the plateau stage where increasing power/complexity in three-dimensional integrated circuits is considerably more expensive than it had been with 2D circuits. Once you reach the limits of 3D circuits, further advances run into that diminishing-returns problem; the paradigm shifts to something OTHER than microprocessor technologies, and no new improvement can be made except over unbelievably long timescales for almost superficial levels of improvement.

    Yep. You clearly DON'T know what a "paradigm" is; your anticipation of a paradigm shift is just another rhetorical device you're using to avoid taking the problem seriously.

    Resources have nothing to do with it. The logistic curve is a function built around a saturation point: rapid progress builds on further progress in what looks like an exponential curve until you reach that saturation point, where the system approaches maturity and the curve flattens out.

    In this case, even if you had an infinite quantity of resources, that does not imply infinite growth potential; when microprocessors reach a point at which transistors cannot be further reduced and logic circuits cannot be further enhanced, then that's that, there's no more room for growth (at least, not any amount of growth that could be justified for the expense it would take).

    Which ultimately has less to do with the resources available and more to do with the equilibrium point of reproductive rates vs. attrition rates. The limited resources (e.g. food) provide the saturation point, and therefore the curve flattens at the point where there are so many rabbits on the continent that the number that die from starvation is approximately equal to the number of live births.
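    For reference, the textbook form of that curve (standard logistic growth, not something specific to any of RAMA's sources), with N the population, r the growth rate, and K the carrying capacity set by the limited resources:

    \[ \frac{dN}{dt} = rN\Bigl(1 - \frac{N}{K}\Bigr), \qquad N(t) = \frac{K}{1 + \frac{K - N_0}{N_0}\,e^{-rt}} \]

    At N = K the growth term goes to zero: births and attrition balance, and the curve flattens at the saturation point no matter how large r is.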

    You cannot cry "paradigm shift!" as an escape hatch for that, because an upper limit to microprocessor technology DOES exist, even accounting for innovative new forms of it. There is not even THEORETICALLY infinite growth potential there; even atomic-scale computers would eventually reach a point where they cannot be improved further. And so far, there is no reason to assume that the most radical theoretical limits are even applicable, since PRACTICAL limitations -- e.g. politics, consumer demand, economics, military pressures, and ordinary dumb luck -- are limiting factors as well.

    In this context, the software we're talking about is artificial intelligence, NOT storage capacity, NOT video or sound quality, NOT digital bandwidth and throughput. We're discussing the efficacy of computers not only as expert systems, but as self-examining thinking machines capable of taking roles traditionally performed by expert humans.

    By nearly all accounts, the HARDWARE requirement for this was surpassed over a decade ago (even Kurzweil would admit this, which is why several of his 1990s predictions totally failed to pan out). Simply put, the software element to Strong AI just hasn't materialized at all, and in fact is lagging so far behind that the "bottom-up" AI theorists have spent the last couple of years lording it over everyone else with a collective "I told you so." That's why even Kurzweil is now talking about developing computer architectures that mimic the functioning of a human brain, because it's now obvious to EVERYONE that it isn't going to be fixed in software.

    But it isn't, though. Even in the highly unlikely event you could get a computer to model an existing human brain, it's still only a predictive simulation of that brain based on fixed parameters, not a genuine consciousness.

    Of vastly greater import is the fact that outside of laboratory curiosity there's virtually zero market demand for conscious machine labor. UNCONSCIOUS labor is considerably easier to accomplish, especially since the few remaining tasks that require conscious labor can be performed by increasingly less intelligent/lower paid wage slaves.
     
  20. Crazy Eddie

    Crazy Eddie Vice Admiral Admiral

    Joined:
    Apr 12, 2006
    Location:
    Your Mom
    Which doesn't change the fact that it is a religious faith-based worldview. The meaningful element here is that you have already internalized your articles of faith:
    - The Singularity is coming
    - The Singularity will be a good thing
    - Those who believe in the singularity will be the first to benefit from it.

    The rest of this is you RATIONALIZING what you've already decided to believe. Several times, you attempted to claim that it's not irrational because it doesn't appeal to the supernatural. That is a distinction without a difference; just because you've replaced the Book of Revelation with Ghost in the Shell doesn't make your worldview any less faith-based.