gturner wrote:
^Even the cellular level of detail would be unnecessary once we figure out how brains encode and process information with cells (moving from the physical hardware to an abstraction of a machine, even sets of equations that describe an odd analog circuit, and finally a description of what that circuit does).

But then we wouldn't need simulations; we could get much better results by building synthetic brains that behave, on a hardware level, fairly similarly to the real McCoy.
Mars wrote:
Silvercrest wrote:
Mars wrote:
One can for instance predict the weather without simulating the path of every molecule and atom in our atmosphere ... The human brain can be simulated in similar detail.

Ah, so it can be simulated to a rough degree of accuracy, but no better? And you can guess what it will do some of the time, and other times it will behave totally freakishly and unpredictably?
Not a human being I'd want to hang around with. Keep it away from sharp implements.

Humans are unpredictable anyway; approximations will just produce different random numbers.

GIGO, remember? Garbage in, garbage out. If you feed incomplete or inaccurate data into the computer, you get an incomplete or inaccurate simulation. In a computer program, the flaw can be as little as a decimal point out of place, a value being negative when it should be positive, a process that recurses when it shouldn't, or two logical statements that contradict each other in a hard-to-see way.
Keep in mind that "incomplete and inaccurate" for a SIMULATION is different from what it would be for a human; you could end up with a simulation that does nothing but laugh uncontrollably 24 hours a day, or one that says the same three-word phrase every time you blink at it, or one that twitches from time to time in random directions but otherwise doesn't move.
Bad data means a bad simulation: garbage in, garbage out.
For example, one way to calculate the square root of a number is by random guessing: pick a random number between zero and the number to be square-rooted. If the square of your guess turns out to be larger than the original number, pick a new random number between zero and the previous guess. Keep doing this until you arrive at a close approximation of the square root.
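That guessing scheme can be sketched in a few lines of Python. (One repair is needed to make it converge: as described above, the guess range only ever narrows from above, so this sketch also keeps a lower bound that is raised whenever a guess undershoots. Everything here, names included, is a hypothetical illustration, not anyone's actual calculator code.)

```python
import random

def random_sqrt(n, eps=1e-9):
    """Approximate sqrt(n) by repeated random guessing.

    Draw a guess from the current interval; if its square overshoots
    n, the guess becomes the new upper bound, otherwise the new lower
    bound. The interval shrinks until it pins down the square root.
    """
    lo, hi = 0.0, max(n, 1.0)  # sqrt(n) always lies in [0, max(n, 1)]
    while hi - lo > eps:
        guess = random.uniform(lo, hi)
        if guess * guess > n:
            hi = guess  # overshot: narrow from above
        else:
            lo = guess  # undershot: narrow from below
    return (lo + hi) / 2
```

Note what this does and doesn't show: the method settles on an answer, but only ever an approximation, and the number of guesses it takes varies unpredictably from run to run.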

And using this method in an actual calculator means the calculator will produce the correct square root of a number only 2% of the time. It gets even worse if you use a formula that includes a square root: it will literally NEVER get the right answer no matter how many times you punch in the formula, because all of its calculations are based on trial-and-error randomness.
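The garbage-in, garbage-out point is easy to demonstrate with a toy sketch (the function and figures here are invented for illustration): a compound-interest calculation where a single misplaced decimal point in the input runs without any error message, yet yields a wildly wrong answer.

```python
def yearly_balance(principal, rate, years):
    """Compound a starting balance at a fixed annual interest rate."""
    for _ in range(years):
        principal *= 1 + rate
    return principal

# Intended input: a 5% rate, written as 0.05.
good = yearly_balance(1000, 0.05, 10)  # about 1628.89

# Garbage input: the decimal point slips one place (0.5).
# The program runs perfectly; the output is garbage.
bad = yearly_balance(1000, 0.5, 10)    # about 57665.04
```

The program itself is bug-free: nothing crashes and no warning is raised. Only the input is wrong, and the output is garbage in exact proportion.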