Ryan8bit wrote:
newtype_alpha wrote:
For example, one way to calculate the square root of a number is to generate a random number between zero and the number to be square-rooted. If you square the random number and it turns out to be larger than the original number, you select a new random number between zero and the previous guess. You keep doing this until you arrive at a close approximation of the square root of the number.

And using this method in an actual calculator means the calculator will produce the square root of a number only 2% of the time. This gets even worse if you use a formula that includes a square root; it will literally NEVER get the right answer no matter how many times you punch in the formula, because all of its calculating processes are based on trial and error randomness.

What?
I know it varies by calculator, but the gist of what Mars says is how calculators perform square roots: they find something approximate and then refine it to a specific level of precision. It's not correct only 2% of the time, and I'm not really sure where you even got that percentage from.
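For context, calculators and math libraries commonly refine an approximate guess with Newton's (Heron's) iteration rather than random guessing; each step roughly doubles the number of correct digits. A minimal sketch (the function name and tolerance are my own choices, not from any particular calculator):

```python
def heron_sqrt(n, tol=1e-10):
    # Heron's / Newton's method: start from a rough guess and
    # repeatedly average the guess with n divided by the guess.
    if n < 0:
        raise ValueError("square root of a negative number")
    if n == 0:
        return 0.0
    x = n if n >= 1 else 1.0  # crude starting guess
    while abs(x * x - n) > tol:
        x = 0.5 * (x + n / x)
    return x
```

Unlike a randomized search, this converges deterministically, which is why a calculator using it gets the right answer every time, not some percentage of the time.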

Each time the computer generates a random number, it draws from a narrower range than the previous selection. You find out whether the number is too high or too low by squaring it, i.e., multiplying it by itself. If it's too high, the number just selected becomes the upper end of the range of possible numbers; if it's too low, it becomes the lower end. A human brain is not a very good numerical calculator.
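The narrowing procedure described above is essentially a randomized bisection search. A sketch of it, assuming we stop once the bracketing interval is small enough (the function name and tolerance are illustrative):

```python
import random

def random_bisect_sqrt(n, tol=1e-10):
    # Randomized bisection: pick a guess in [lo, hi], square it,
    # and shrink the interval toward the true square root.
    lo, hi = 0.0, max(n, 1.0)
    while hi - lo > tol:
        guess = random.uniform(lo, hi)
        if guess * guess > n:
            hi = guess  # too high: guess becomes the new upper bound
        else:
            lo = guess  # too low: guess becomes the new lower bound
    return (lo + hi) / 2
```

On average each step removes half of the remaining interval, so the method converges to the square root with certainty; randomness only affects how quickly, not whether, it gets there.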