
Trek Computer Systems

AstroSmurf

Vice Admiral
Recently, I have been doing some research into the capabilities of the Trek starship computer core for a project of mine. I have been looking for things like computational speeds and memory capacity, but so far I have come up with nothing… zilch. I expect the writers/producers intentionally left this information vague so it didn't hinder storytelling potential. If it was ever hinted at in a show or a movie, I have been unable to find it. Short of actually rewatching all the episodes again, I am at a dead end.

So I thought I would pop in here and see what you folks thought. Any ideas? Any guesses or hypotheses you might like to make? Feel free to pontificate on any of it: software, materials, etc. I am looking for information about any of the eras, not just the 24th century. Granted, this information will be applied to the NX-01, but I can work backwards or forwards to fill in what I need.

Thanks. :bolian:
 
In the Star Trek: Voyager episode "Concerning Flight" (season 4, episode 11), Voyager's computer core was stolen, and at one point the core's processing power was revealed.
 
^^^
One of the baddies says that it is capable of "simultaneous access to 47 million data channels, transluminal processing at 575 trillion calculations per nanosecond, operational temperature margins from 10 Kelvin to 1,790 Kelvin."
 
Technically, the thing that was stolen was a computer processor, not the actual core. So I am not sure how much information you can get from that unless they gave the actual number of processors; then you could come up with a complete figure.

575 trillion calculations per nanosecond? Does anyone else think that number is a little high?

Thanks for the information, though. It is at least a step in the right direction. :bolian:
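Just to put the quoted figure in perspective, here is a quick back-of-the-envelope in Python (taking the line at face value, and assuming a "calculation" maps loosely to one operation, which is a big assumption):

```python
# "575 trillion calculations per nanosecond", taken at face value.
calcs_per_ns = 575e12
ns_per_second = 1e9
calcs_per_second = calcs_per_ns * ns_per_second      # 5.75e26 ops/s

# A present-day supercomputer manages on the order of 5e14 FLOPS.
todays_best = 5e14
print(f"{calcs_per_second:.2e} ops/s, ~{calcs_per_second / todays_best:.1e}x today's best")
```

So yes: roughly a trillion times anything on Earth today. High, but then it is supposed to be 24th-century hardware.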
 
There doesn't seem to be much interest in this but I thought I would update the thread anyway. Again, the reason I have been researching this particular topic is to form an idea about the workings of the computer systems from the 22nd century, specifically the NX-01.


Anyway, I have finally come across some useful information in the last week and found one of the missing pieces of the puzzle: quads. This is a throwaway tech term used to describe the data ratios of the computer systems of that era. Since our modern systems are constantly changing, using the term "bytes" would have been counterproductive and could possibly have become dated within a few years. So this new term was invented to keep things a bit more vague in order to maintain believability. However, I had no idea what the scale would be or how those shows used it. TNG had a fairly established set of rules concerning the use of its technical terms; unfortunately, Voyager used them inconsistently, and that was throwing my research off completely. Once I figured that out, I was able to put things in order and form a base for comparison.


Data ratios (relative to one quad): milliquad (0.001), quad (1), kiloquad (1,000), megaquad (1 million), gigaquad (1 billion), teraquad (1 trillion)
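For anyone who wants to check my figures, here is that scale as Python constants (the prefix values are my own assumption, following SI convention; nothing on screen ever defines them):

```python
# The quad scale, normalised to 1 quad = 1. Prefix values assumed per SI.
QUAD_SCALE = {
    "milliquad": 1e-3,
    "quad": 1.0,
    "kiloquad": 1e3,
    "megaquad": 1e6,
    "gigaquad": 1e9,
    "teraquad": 1e12,
}

def to_quads(amount: float, unit: str) -> float:
    """Convert a figure like (630_000, "kiloquad") into plain quads."""
    return amount * QUAD_SCALE[unit]

print(to_quads(630_000, "kiloquad"))   # the 1701-D capacity quoted below: 6.3e8 quads
```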

The computer data capacity on the 1701-D was a total of 630,000 kiloquads, with each isolinear chip holding 2.15 kiloquads.


(Voyager's memory capacity, even taking into account the massive inconsistencies, calculates out to be 75 trillion times that of the Galaxy-class ship. The good people over at Ex Astris Scientia also point out that if they used the same 2.15-kiloquad isolinear chip in the system, the amount of space required to hold Voyager's computer core would equal 3,875 Borg cubes! :eek: I think some of the writers may have been very, very confused.)
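Here is the arithmetic behind that, if anyone wants to play with it (the 75-trillion multiplier is Ex Astris Scientia's estimate, not a canon figure):

```python
# How many 2.15-kiloquad isolinear chips would each ship need?
galaxy_capacity_kq = 630_000          # 1701-D total from above
chip_capacity_kq = 2.15               # per isolinear chip
voyager_multiplier = 75e12            # Ex Astris Scientia's estimate

galaxy_chips = galaxy_capacity_kq / chip_capacity_kq    # ~293,000 chips
voyager_chips = galaxy_chips * voyager_multiplier       # ~2.2e19 chips(!)
print(f"1701-D: {galaxy_chips:,.0f} chips; Voyager: {voyager_chips:.1e} chips")
```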


So for the NX-01, I basically started working backwards from the figures I collected. This is what I have.

The computer core's data capacity would be 21 kiloquads, with each data card holding 1.5 quads.
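Worked out in Python, that implies the following card count (note the rounding; treat all of these as approximations):

```python
# Implied number of data cards in the NX-01 core, from the figures above.
core_capacity_quads = 21 * 1_000     # 21 kiloquads
card_capacity_quads = 1.5
cards = core_capacity_quads / card_capacity_quads
print(f"{cards:,.0f} data cards")    # 14,000 -- close to the 14,364 I use below
```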


I also wanted to pin down the core's speed. In the 24th century, a starship's core utilizes a subspace field to push the system's working capacity past the speed of light, which puts linear calculations at something like several billion per nanosecond. However, I would postulate that the systems aboard the NX were not superluminal and would be subject to the limitations of the speed of light. So I did some calculations and figured that this system would be able to produce about 2.5 trillion linear calculations per millisecond for the whole core, or about 174 billion per second per data card.

(Today's supercomputers hold a speed record of about 500 trillion FLOPS. I have postulated that this is about the same operating capacity as one NX data card, and with 14,364 data cards making up the computer core, you have a system roughly 15,000 times more capable than what we have today. Since any system we produce right now is limited by the speed of light, our ability to build faster and faster computers is going to stall out quickly anyway, so raw speed is not really the issue with the core. Working under that assumption, I am inclined to think that the need for speed would give way to the need for smaller and smaller processors, which decrease the optical lag time and increase the amount of information processed in a given space. The more processors you can cram into an area, the larger the workload that can be handled. But with the inevitable breakthroughs in subspace mechanics and field theory, the speed limitations could have been broken within twenty or thirty years of the NX launch.)
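Here is that arithmetic in Python, warts and all (the card count and core speed are my own postulates from above):

```python
# Checking my own speed figures.
cards = 14_364
core_calcs_per_ms = 2.5e12
core_calcs_per_s = core_calcs_per_ms * 1_000      # 2.5e15 ops/s
per_card_per_s = core_calcs_per_s / cards         # ~1.74e11 = ~174 billion

supercomputer_flops = 500e12                      # today's record, roughly
print(f"core: {core_calcs_per_s:.1e} ops/s; per card: {per_card_per_s:.2e} ops/s")

# Caveat: treating one card as ~500 trillion FLOPS instead would give a core
# of 14,364 * 5e14 ~ 7.2e18 ops/s, far above the 2.5e15 figure above. My two
# postulates do not quite agree, so take the numbers as rough sketches.
```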

The processing capabilities are handled exclusively by a network of optical transtators. (Transtators are the basis of much of the technology found in the 23rd and 24th centuries. It only makes sense that their use started somewhere.)

The system uses optical data storage (to explain the large data cards seen in the show), which employs a molecular coding membrane for data storage. (I would theorize that this storage technology would later be made more compact, with the coding membrane condensed into a coiled microscopic "tape". This "memory tape" is seen in use on 23rd-century starships. The technology would be abandoned in the 24th century for the more efficient isolinear chip.) The processing system uses a combination of optical transmission and signaling to move information from one processor or terminal to another.

The computer system would also be decentralized to a degree. The computing systems aboard the NX-01 would be more like a "neural" network, with processors and sub-processors distributed all over the ship and the central "brain" being the computer core. Each computer station would have its own processing and memory, linked to a sub-processor network for the deck, and that network would be monitored or controlled by the computer core. The core would also store information in a large archived database. If one console or processor went down, another would be able to take over with a quick update downloaded from the deck sub-processor or the core. As for the core itself, it would be broken down into different sections or "lobes" (24 in total). These lobes are responsible for archiving information from the deck sub-processors, managing the ship's redundant systems, and processing clusters of information too big for the sub-processors.
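To make the failover idea concrete, here is a very rough sketch in Python (every class and name here is my own invention for illustration; nothing like this appears on screen):

```python
# A toy model of the decentralised hierarchy described above: console ->
# deck sub-processor -> computer core.

class Node:
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent
        self.online = True
        self.state = {}

    def sync_from_parent(self):
        """A quick update downloaded from the deck sub-processor or the core."""
        if self.parent is not None:
            self.state = dict(self.parent.state)

def take_over(failed, standby):
    """If one console or processor goes down, a standby takes over."""
    failed.online = False
    standby.sync_from_parent()
    return standby

core = Node("computer core")                       # the central "brain"
core.state = {"archive": "deck logs and sensor data"}
deck = Node("deck 4 sub-processor", parent=core)
deck.sync_from_parent()
helm = Node("helm console", parent=deck)
aux = Node("auxiliary console", parent=deck)

active = take_over(helm, aux)                      # helm fails, aux steps in
print(f"{active.name} online with state: {active.state}")
```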

Any thoughts or suggestions? Anything someone with a computer background might have to offer would be greatly appreciated. I may have missed something from the "now" that needs to be incorporated. To be honest, I am pulling most of this out of my butt, so if anyone has another take on the subject, I would love to hear about it.
 
I've always taken the use of "quad" to mean that 24th-century computers aren't binary (base 2) or even trinary (base 3) -- they're quaternary (base 4) somehow! This was, at least in part, confirmed (for me) when Janeway talks about Starling's binary computer.
 
Quads. This is a throwaway tech term used to describe the data ratios of the computer systems of that era. Since our modern systems are constantly changing, using the term "bytes" would have been counterproductive and could possibly have become dated within a few years. So this new term was invented to keep things a bit more vague in order to maintain believability.

What were the capacity/processing stats that Data quoted for himself? I seem to remember those being in bits/bytes.

We may be able to estimate the figure other ways. Consider the power used to create/run the Moriarty program, which was superior to Data. Or the space needed for the EMH, which we'd imagine would be similar to or less than Data's.
 
But also take into consideration that Voyager uses bio-neural gel packs.
This is a fairly new technology that primarily serves to process information faster. Also, Voyager was launched just prior to the destruction of the Enterprise-D, which suggests that Voyager's computer core could have undergone drastic changes from the ground up before its launch.
And the Ent-D made numerous discoveries... perhaps Starfleet was able to analyze some of the Cytherian technology and come up with a larger storage capacity for their computers in the process?
 
Hi Anti-drone.

Ok, I've been thinking about your posts here this evening.

I think it's safe to assume that a quad is not simply a multiple of binary, somewhere beyond the gigabyte or the terabyte.

@sonicranger -- A base-4 version of binary? Quaternary would at best provide only double the storage density of binary, given the same number of "digits" -- which is no real improvement over the system in use today.
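The information count makes this easy to see: each base-4 digit carries exactly log2(4) = 2 bits, no more. A quick demo:

```python
import math

# Information per digit in base b is log2(b) bits.
for base in (2, 3, 4):
    print(f"base {base}: {math.log2(base):.3f} bits per digit")

# Eight storage cells: binary holds 2**8 = 256 values, quaternary holds
# 4**8 = 65,536 -- exactly what 16 binary cells hold. Double, no more.
print(2**8, 4**8, 2**16)
```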

However... base 4 might be useful if there is a 4-state logic to use alongside quaternary that gives massive improvements over Boolean logic.

Furthermore, there's no reason to believe that a quad is a digital quantity.

One possibility is to combine digital and analogue data -- coupling a binary element (0 or 1) with an analogue element (0 to 1). Perhaps this couple is a quad? A snazzy new logic would need to be invented to use alongside these couples, featuring new unary and binary operators that we must be able to physically reproduce in microprocessors.
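As a flight of fancy, such a couple might look like this in code (every name here is invented):

```python
from dataclasses import dataclass

# Pure speculation: a "quad" as a digital/analogue couple.
@dataclass
class Quad:
    digital: int      # 0 or 1
    analogue: float   # anywhere in [0.0, 1.0]

def qand(a: Quad, b: Quad) -> Quad:
    """A made-up binary operator: AND the bits, attenuate the analogue parts."""
    return Quad(a.digital & b.digital, a.analogue * b.analogue)

print(qand(Quad(1, 0.8), Quad(1, 0.5)))   # Quad(digital=1, analogue=0.4)
```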



Another possibility is that there are data systems superior to digital.

Bit = BI-nary digi-T
Quad = QUA-ntum D-igit?

We've all no doubt heard of quantum mechanics, and of the slipstreams and teleportation of laser beams that the physics boffins do. There is a great untapped power there somewhere. Quantum mechanics is all about information: its storage, its transportation, and its verity.

Until we observe information (a quantum event), that information is uncertain. Uncertainty is an analogue quantity, not a black/white digital state.

Consider taking a string of information, encoding it into a set of uncertain quantum states (analogue), and multiplexing these into a single quantum packet, which is stored as a single (uncertain) particle.

The storage capacity lies in how precise the codec hardware is at multiplexing and retrieving, which determines how long these strings can be.
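For what it's worth, real quantum information already behaves a bit like this: describing n qubits takes 2^n amplitudes, so the bookkeeping grows exponentially. A toy illustration (standard quantum-mechanics counting, nothing Trek):

```python
# n qubits are described by 2**n complex amplitudes -- exponential growth,
# which is where the "great untapped power" would live.
for n in (1, 8, 64, 300):
    print(f"{n:>3} qubits -> 2**{n} = {2**n:.3e} amplitudes")
```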

I'm fantasising here, btw. But I hope it gives some insight: digital is not necessarily optimal or future-proof.

Quantum mechanics may allow us to split the bit just like atoms were 100 years ago. And we may find we can double that bit subdivision once every two years, just as fruitfully as we double bit volumes today. :-)


But having said that, binary still has a lot of mileage left. One ounce of silicon has about 6E+23 atoms. If each of these atoms has its nuclear magnetic axis orientated up or down to represent a bit, we have enough atoms in one ounce to allocate around 10 terabytes to each and every person on the planet, or something like that.
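The arithmetic, for anyone who wants to fiddle (the atom count comes from silicon's molar mass; the population figure is a round assumption):

```python
# One ounce of silicon: 28.35 g at 28.09 g/mol -> ~6.1e23 atoms, one bit each.
AVOGADRO = 6.022e23
bits = 28.35 / 28.09 * AVOGADRO

bits_per_tb = 8e12            # 10**12 bytes
population = 6.6e9            # rough world population
print(f"~{bits / bits_per_tb / population:.0f} TB per person")   # ~12 TB
```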

Windows would just about be happy with a ram expansion that big. :-)


TTFN - Jadzia x
 
There doesn't seem to be much interest in this but I thought I would update the thread anyway. Again, the reason I have been researching this particular topic is to form an idea about the workings of the computer systems from the 22nd century, specifically the NX-01.

In "Chosen Realm," the AOTW delete the information in the NX-01's computers about the Expanse's spheres. A display shows that 19.3 XB of data was lost. This could be a reference to exabytes, though the correct initialism for that is EB.

Memory Alpha has information on references to byte-based capacities in Trek. Same for quads.
 
Quantum computers are in their infancy but are quite real. David Deutsch calls the basic unit of computation a "qubit", but it seems reasonable that "quad" could take over sometime in the Trekverse.
 
^ Actually the Memory Alpha page about quads is where I finally figured out why I was having such problems. :rommie: (See my above post.)

As for the XB... I missed that and I have no idea what to do with it either. I liked using quads better since the term doesn't really mean anything.
 