I've actually countered both those arguments multiple times.
And your counters were found to be flawed every single time. For example:
When one "S" curve finishes, it continues to the next paradigm.
Which doesn't address the fact that the NEW paradigm of exponential growth may not have anything whatsoever to do with the power of computers. It could just as easily be an exponential growth in the power density of rechargeable batteries. You keep assuming that the paradigm shift would re-establish the previous growth curve of computing power, when there is zero reason to believe it would do anything of the kind (it's unlikely that it would, actually, given that it is a PARADIGM SHIFT rather than a "momentary interruption of an orderly pattern").
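For the record, here is a minimal sketch (Python, with purely illustrative numbers and paradigm timings I made up for the example) of the stacked-S-curve claim under dispute. Summing successive logistic curves only tracks a smooth exponential if each new paradigm arrives on schedule AND measures the same quantity, which is precisely the assumption I'm rejecting:

```python
import numpy as np

def logistic(t, midpoint, ceiling, rate=1.0):
    """One S-curve: slow start, rapid growth, saturation at `ceiling`."""
    return ceiling / (1.0 + np.exp(-rate * (t - midpoint)))

t = np.linspace(0, 30, 301)

# Three successive paradigms, each saturating an order of magnitude higher.
# (Illustrative midpoints and ceilings, not real data.)
stacked = (logistic(t, 5, 1e2)
           + logistic(t, 15, 1e3)
           + logistic(t, 25, 1e4))

# A pure exponential reaching the same final value, for comparison.
exponential = np.exp(t * np.log(1e4) / 30)

# The stacked curve resembles the exponential only because every term
# in the sum measures the SAME quantity. A paradigm shift into, say,
# battery energy density would add nothing to this particular sum.
for ti in (5, 15, 25):
    i = np.searchsorted(t, ti)
    print(f"t={ti:>2}: stacked={stacked[i]:.0f}  exponential={exponential[i]:.0f}")
```

Nothing in the math forces the next S-curve to belong to computing at all; that continuity is an assumption smuggled in from outside.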
You do realize the quote you just cut and pasted doesn't contain any actual DATA, right? It's supposition based on vague generalities. Even Intel's "estimates" are exactly that, and you didn't even bother to reference them directly.
A post with the following link relevant to software issues:
A govt report from advisors on the rate of software growth, disproving the software misconception
That report found that the QUANTITY of software applications is growing at an exponential rate. The power and capabilities of those applications (not to mention their efficacy) are another question entirely.
The implication of Moore's law isn't merely that computers are getting more complex; it's that they're getting MORE POWERFUL for a given size. The government study you cited found the exact opposite is true of software: applications are growing in number and in complexity even as their capabilities remain relatively unchanged. In fact, as a lot of users have begun to suspect, newer software applications are increasingly LESS powerful than their predecessors, because their core programming -- that is, what the application was designed to do -- has become bogged down with extra features that add complexity to the overall package and draw greater resources than the task actually requires. We're fast approaching the point where you can't even run a basic word processor without high-speed internet, a cloud server, and at least 3 GB of unused RAM.
This has also been explained to you in the discussion of why we probably shouldn't be so impressed by the explosion of recorded data on storage media, given how much of that data consists of cat GIFs and Twitter posts.
We've already proved consciousness is derived from the brain, so your statement that personality can't be uploaded is supposition.
No, it's a plain statement of fact, as your own statement directly implies:
Consciousness is derived from the brain.
Consciousness is not known to be derived from computers, ergo there is no evidence that a simulation of a brain would actually achieve consciousness.
Similarly:
Alcohol is derived from fermentation.
Alcohol is not known to be derived from computers, ergo there is no evidence that a simulation of fermentation would actually produce real alcohol.
These are facts, not supposition. Furthermore, I already conceded that AIs will get better and better at emulating humans and that a high-fidelity imitation of human consciousness is perfectly achievable even with EXISTING technology. With clever enough programming, you can get an AI to imitate a human, or even for that matter a SPECIFIC human. Simulated consciousness, like simulated alcohol, is far from impossible.
But that is NOT brain-uploading. It is not, in fact, even the most efficient use for an AI, and it is unlikely to ever be anything more than a really creepy, off-putting novelty for trans-humanists.