Even the cellular level of detail would be unnecessary once we figure out how brains encode and process information with cells (moving from the physical hardware to an abstraction of the machine, then to sets of equations that describe an odd analog circuit, and finally to a description of what that circuit does).
We certainly wouldn't try to simulate a computer application by modelling an Intel processor down to the charge state of its transistors (as if they were RF amplifiers). That's what we'd do only if we had no idea how a transistor worked, knew nothing of the design rules for digital circuits, and didn't understand microcode, machine language, or algorithms.
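To make the abstraction-ladder point concrete, here is a minimal sketch (my own illustration, not from the original text): the same computation expressed at the gate level, where addition is laboriously built out of NAND gates, and at the algorithmic level, where it is simply `+`. Simulating a brain at the "transistor" level would be like always running the first version when the second one captures everything we care about.

```python
def nand(a, b):
    """Lowest level we model here: a single NAND gate (1 = high, 0 = low)."""
    return 0 if (a and b) else 1

def xor(a, b):
    # XOR built purely from four NAND gates.
    n1 = nand(a, b)
    return nand(nand(a, n1), nand(b, n1))

def full_adder(a, b, carry_in):
    """One-bit full adder composed only of gate-level primitives."""
    s1 = xor(a, b)
    total = xor(s1, carry_in)
    # carry_out = (a AND b) OR (s1 AND carry_in), expressed via NANDs
    carry_out = nand(nand(a, b), nand(s1, carry_in))
    return total, carry_out

def add_gate_level(x, y, bits=8):
    """Ripple-carry addition, bit by bit, the way the hardware does it."""
    result, carry = 0, 0
    for i in range(bits):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

# The higher-level description of "what that circuit does" is just x + y:
assert add_gate_level(42, 27) == 42 + 27
```

Both versions compute the same function; the gate-level one is only worth running if you don't yet know what the circuit is for.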
Eventually we'll get there, and that will be a wonderful and frightening time, as people play around with machines that begin to show true intelligence, run simulations of copies of dead people's brains that remember and feel, and try to figure out what really goes on in a cat's brain. Then, of course, we'll take advantage of the speed of electrons compared to neurons, effectively unlimited storage capacity, and networked connectivity, and build our doomsday philosopher who says "42."