There doesn't seem to be much interest in this, but I thought I would update the thread anyway. Again, the reason I have been researching this particular topic is to form an idea of how the computer systems of the 22nd century worked, specifically those aboard the NX-01.
Anyway, I have finally come across some useful information in the last week and found one of the missing pieces of the puzzle: quads. This is a throwaway tech term used to describe the data quantities handled by the computer systems of that era. Since our modern systems are constantly changing, using the term "bytes" would have been counterproductive and could easily become dated within a few years, so this new term was invented to keep things a bit vague and maintain believability. However, I had no idea what the scale would be or how the shows actually used it. TNG had a fairly well-established set of rules for its technical terms; unfortunately, Voyager used them inconsistently, and that was throwing my research off completely. Once I figured that out, I was able to put things in order and form a baseline for comparison.
The data scale breaks down as: milliquad (1), quad (100), kiloquad (1,000), megaquad (1 million), gigaquad (1 billion), teraquad (1 trillion).
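For my own bookkeeping, here is that scale written out as a quick lookup table. Treating the milliquad as the base unit is just my reading of the list above, not anything from the show:

```python
# The quad scale exactly as listed above, expressed in milliquads.
# (Using the milliquad as the base unit is my own assumption.)
QUAD_SCALE_IN_MILLIQUADS = {
    "milliquad": 1,
    "quad": 100,
    "kiloquad": 1_000,
    "megaquad": 1_000_000,
    "gigaquad": 1_000_000_000,
    "teraquad": 1_000_000_000_000,
}

def to_milliquads(amount, unit):
    """Convert an amount in the given unit to milliquads."""
    return amount * QUAD_SCALE_IN_MILLIQUADS[unit]

print(to_milliquads(2.15, "kiloquad"))  # 2150.0 milliquads per isolinear chip
```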
The total computer data capacity of the 1701-D was 630,000 kiloquads, with each isolinear chip holding 2.15 kiloquads.
(Voyager's memory capacity, even after accounting for the massive inconsistencies, works out to roughly 75 trillion times that of the Galaxy-class ship. The good people over at Ex Astris Scientia also point out that if the same 2.15-kiloquad isolinear chips were used, the space required to hold Voyager's computer core would equal 3,875 Borg cubes!

I think some of the writers may have been very very confused.)
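For anyone who wants to check the scale of that comparison, here is the back-of-the-envelope math. The 75-trillion multiplier and the Borg cube figure come from Ex Astris Scientia; everything else below just falls out of the numbers quoted above.

```python
# Back-of-the-envelope check on the Galaxy-class / Voyager comparison.
# All input figures are the ones quoted above.

GALAXY_CORE_KILOQUADS = 630_000      # total capacity of the 1701-D's cores
ISOLINEAR_CHIP_KILOQUADS = 2.15      # capacity of a single isolinear chip
VOYAGER_MULTIPLIER = 75e12           # "75 trillion times" the Galaxy class

chips_in_galaxy_core = GALAXY_CORE_KILOQUADS / ISOLINEAR_CHIP_KILOQUADS
print(f"Chips needed for the 1701-D core: {chips_in_galaxy_core:,.0f}")
# -> roughly 293,000 chips

voyager_kiloquads = GALAXY_CORE_KILOQUADS * VOYAGER_MULTIPLIER
chips_for_voyager = voyager_kiloquads / ISOLINEAR_CHIP_KILOQUADS
print(f"Implied Voyager capacity: {voyager_kiloquads:.3e} kiloquads")
print(f"Chips needed at 2.15 kiloquads each: {chips_for_voyager:.3e}")
# -> about 2.2e19 chips, which is where the "thousands of Borg cubes" problem comes from
```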
So for the NX-01, I basically started working backwards from the figures I collected. This is what I have.
The computer core's data capacity would be 21 kiloquads, with each data card holding 1.5 quads.
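Quick sanity check on those figures, assuming a kiloquad works out to 1,000 quads (the scale above is admittedly a little fuzzy on that point):

```python
# Sanity check on the NX-01 figures, assuming 1 kiloquad = 1,000 quads.
NX_CORE_KILOQUADS = 21
DATA_CARD_QUADS = 1.5

core_quads = NX_CORE_KILOQUADS * 1_000
cards_needed = core_quads / DATA_CARD_QUADS
print(f"Data cards needed to fill the core: {cards_needed:,.0f}")
# -> 14,000 cards, which roughly lines up with the 14,364-card core mentioned below
```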
I also wanted to pin down the core's speed. In the 24th Century, a starship's core utilizes a subspace field to push the system's working capacity past the speed of light, which puts its linear calculations somewhere in the range of several billion per nanosecond. However, I would postulate that the systems aboard the NX were not superluminal and would be subject to the limitations of the speed of light. So I did some calculations and figured that this system would be able to produce about 2.5 trillion linear calculations per millisecond for the whole core, or roughly 174 billion calculations per second for each data card (the arithmetic is below).
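Here is the arithmetic behind those two numbers; note that the per-card figure comes out as calculations per second:

```python
# Working out the NX-01 core speed figures quoted above.
CORE_CALCS_PER_MS = 2.5e12      # 2.5 trillion linear calculations per millisecond
DATA_CARDS_IN_CORE = 14_364     # total data cards making up the core

core_calcs_per_second = CORE_CALCS_PER_MS * 1_000
per_card_per_second = core_calcs_per_second / DATA_CARDS_IN_CORE

print(f"Core throughput: {core_calcs_per_second:.2e} calcs/s")   # 2.5e15
print(f"Per data card:   {per_card_per_second:.2e} calcs/s")     # ~1.74e11, i.e. ~174 billion
```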
(Today's supercomputers hold a speed record of about 500 trillion FLOPS. I have postulated that this is roughly the operating capacity of one NX data card, and with 14,364 data cards making up the computer core, you have a system that is about 15,000 times more capable than what we have today. Since any system we produce right now is ultimately limited by the speed of light, our ability to build faster and faster computers is going to stall out fairly quickly, so raw speed is not really the issue with the core. Working under that assumption, I am inclined to think the need for speed would give way to the need for smaller and smaller processors, which reduce the optical lag time and increase the amount of information that can be processed in a given space. The more processors you can cram into an area, the larger the workload that can be handled. But with the inevitable breakthroughs in subspace mechanics and field theory, the speed limitations could have been broken within twenty or thirty years of the NX launch.)
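To put the optical-lag argument in concrete terms, here is how long light takes to cross a few distances inside a processor network. The distances are purely my own examples for illustration, not anything from the show:

```python
# Light-speed propagation delay across a few illustrative distances.
# The distances are made up for illustration; the physics is just t = d / c.
SPEED_OF_LIGHT_M_PER_S = 299_792_458

for label, metres in [("across a 30 cm board", 0.30),
                      ("across a 1 cm module", 0.01),
                      ("across a 1 micron gate", 1e-6)]:
    delay_s = metres / SPEED_OF_LIGHT_M_PER_S
    print(f"{label:>22}: {delay_s * 1e9:.6f} ns")
# Roughly 1 ns for 30 cm, 0.03 ns for 1 cm, and about 3.3 femtoseconds for a micron,
# which is why shrinking the processors buys you more than raw clock speed does.
```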
Processing is handled exclusively by a network of optical transtators. (Transtators are the basis of much of the technology found in the 23rd and 24th centuries; it only makes sense that their use started somewhere.)
The system uses optical data storage (to explain the large data cards seen in the show), which employs a molecular coding membrane for data storage. (I would theorize that this storage technology was later made more compact, with the coding membrane condensed into a coiled microscopic "tape". This "memory tape" is seen in use aboard 23rd Century starships, and the technology would be abandoned in the 24th century for the more efficient isolinear chip.) The processing system uses a combination of optical transmission and signaling to move information from one processor or terminal to another.
The computer system would also be decentralized to a degree. The computing systems aboard the NX-01 would work more like a “neural” network, with processors and sub-processors distributed all over the ship and the computer core acting as the central “brain”. Each computer station would have its own processing and memory, linked to a sub-processor network for its deck, and that network would in turn be monitored and controlled by the computer core, which would also store information in a large archived database. If one console or processor went down, another could take over with a quick update downloaded from the deck sub-processor or the core. The core itself would be broken down into different sections or “lobes” (24 in total), responsible for archiving information from the deck sub-processors, managing the ship's redundant systems, and processing clusters of information too big for the sub-processors to handle. (A rough sketch of this hierarchy is below.)
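If it helps, here is a rough sketch of that console / deck sub-processor / core layout in code. The class names and the failover behavior are just my interpretation of what I described above, nothing canonical:

```python
# Rough sketch of the decentralized NX-01 computer layout described above:
# consoles feed deck sub-processors, which feed the core's 24 "lobes".
# Names and behavior are my own interpretation, not canon.

class Console:
    def __init__(self, name, deck):
        self.name = name
        self.deck = deck           # the deck sub-processor this console reports to
        self.local_state = {}      # each console keeps its own processing/memory

class DeckSubProcessor:
    def __init__(self, deck_number, core):
        self.deck_number = deck_number
        self.core = core
        self.archive = {}          # snapshots of every console on this deck

    def checkpoint(self, console):
        """Archive a console's state locally and pass a copy up to the core."""
        self.archive[console.name] = dict(console.local_state)
        self.core.archive(self.deck_number, console.name, console.local_state)

    def restore(self, console):
        """Bring a replacement console up to date from the deck archive."""
        console.local_state = dict(self.archive.get(console.name, {}))

class ComputerCore:
    LOBES = 24                     # the core is split into 24 "lobes"

    def __init__(self):
        # each lobe archives data from a share of the deck sub-processors
        self.lobes = [dict() for _ in range(self.LOBES)]

    def archive(self, deck_number, console_name, state):
        lobe = self.lobes[deck_number % self.LOBES]
        lobe[(deck_number, console_name)] = dict(state)

# Example: a helm console on deck 2 checkpoints its state through its deck sub-processor.
core = ComputerCore()
deck_two = DeckSubProcessor(deck_number=2, core=core)
helm = Console("helm", deck_two)
helm.local_state["course"] = "heading 015 mark 3"
deck_two.checkpoint(helm)
```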
Any thoughts or suggestions? Anything someone with a computer background might have to offer would be greatly appreciated, and I may well have missed something from the "now" that needs to be incorporated. To be honest, I am pulling most of this out of my butt, so if anyone has another take on the subject, I would love to hear it.