September 13 2012, 04:51 AM   #75
Crazy Eddie
Rear Admiral
 
Re: Envisioning the world of 2100

gturner wrote:
newtype_alpha wrote:
gturner wrote:
^Even the cellular level of detail would be unnecessary once we figure out how brains encode and process information with cells (moving from the physical hardware to an abstraction of a machine, even sets of equations that describe an odd analog circuit, and finally a description of what that circuit does).
But then we wouldn't need simulations; we could get much better results by building synthetic brains that behave, on a hardware level, fairly similarly to the real McCoy.
Well, I guess that would be the point, along with perhaps mapping a human brain onto the new hardware so people think they're upgrading to a new, improved physicality.

Even a fairly rough copy would probably do fine, since people are constantly forgetting, misremembering, getting hammered, and getting in wrecks and we still have little trouble accepting the continuity of their existence. For most, it would probably be less of a behavioral change than sobering up, finding Jesus, or surviving a date with Lindsay Lohan.

One step in this process might be getting a good enough copy to run, after which the copy could be monitored with vastly more precision than something like an MRI scan of the brain. Then we could start experimenting with how thinking actually works, moving up a layer of abstraction so we can design a better brain, or understand how to just add knowledge to an existing one at the core level, instead of trying to "teach" it via external sensory inputs, leading to "I need a pilot program for a B-212 helicopter."
I imagine it would be similar to the project to map the human genome. If you find a way to quantify patterns of human thought -- some formal system of memetics, let's say -- you could probably develop a baseline for digital capture or transfer of thought patterns from one person to another.

One thing to consider, though, is that human beings have different kinds of memory that are stored in different ways. Your pilot program for the B-212 would probably be downloaded as a set of memories copied from an actual helicopter pilot; you suddenly remember taking three years of pilot training and five years flying gunships in 'Nam. But since you've never BEEN to Vietnam and you don't know what the instructor looks like, your memories will vary slightly from those of the actual pilot they were copied from; you're mapping new data on top of old, and the old data gives (wrong) context to the new.


Ryan8bit wrote:
newtype_alpha wrote:
For example, one way to calculate the square root of a number is to generate a random number between zero and the number to be square-rooted. If the square of that guess turns out to be larger than the original number, you select a new random number between zero and the previous guess. You keep doing this until you arrive at a close approximation of the square root.
And using this method in an actual calculator means the calculator will produce the square root of a number only 2% of the time. This gets even worse if you use a formula that includes a square root; it will literally NEVER get the right answer no matter how many times you punch in the formula, because all of its calculating processes are based on trial-and-error randomness.
What?

I know it varies by calculator, but the gist of what Mars says is how calculators perform square roots: they find an approximate value and then refine it to a given level of precision.
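For what it's worth, the guess-and-narrow procedure described above can be sketched in a few lines of Python. The description only ever lowers the upper bound, so this sketch also tracks a lower bound (making it a randomized bisection) as an assumption of mine; without that, an undershooting guess would never be corrected:

```python
import random

def random_sqrt(n, tol=1e-6):
    """Approximate sqrt(n) by repeated random guessing.

    A sketch of the quoted procedure: guess randomly, and shrink the
    interval the next guess is drawn from until guess**2 is within
    tol of n. The lower bound is my addition to make it terminate.
    """
    lo, hi = 0.0, max(n, 1.0)
    guess = lo + random.random() * (hi - lo)
    while abs(guess * guess - n) > tol:
        if guess * guess > n:
            hi = guess   # guess too big: lower the upper bound
        else:
            lo = guess   # guess too small: raise the lower bound
        guess = lo + random.random() * (hi - lo)
    return guess
```

Despite the randomness, this always converges, because every iteration shrinks the interval that must contain the true square root.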
Incorrect. Calculators produce their results through logical relationships hard-wired directly into their circuitry. Basically, it's a series of voltage gates that physically carry out the AND/OR/NAND/NOR/etc. logical operations. There's nothing "random" about it; it's essentially a conversion from one data type (binary/Boolean) to a more easily readable one (base-ten decimal).

Software-based calculators (JavaScript, for example) are even simpler, since they can perform logical operations on whole numbers without resorting to Boolean relationships (although, deep down, that's what computers are doing when they run JavaScript anyway).
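To illustrate how fixed AND/OR/XOR relationships can carry out arithmetic with no randomness involved, here is a small Python sketch of a ripple-carry adder. The function names (`full_adder`, `add_bits`) are mine, and real calculator chips differ in the details; this just shows the principle that arithmetic falls out of wired logic gates:

```python
def full_adder(a, b, carry_in):
    """One-bit full adder built only from AND/OR/XOR relationships."""
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return s, carry_out

def add_bits(x, y, width=8):
    """Add two non-negative integers by rippling a carry through
    one full adder per bit, the way fixed logic in hardware does."""
    carry, total = 0, 0
    for i in range(width):
        a = (x >> i) & 1
        b = (y >> i) & 1
        s, carry = full_adder(a, b, carry)
        total |= s << i
    return total
```

Every call with the same inputs follows the same gate-by-gate path and produces the same answer, which is the point: deterministic circuitry, not trial and error.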