
The original TOS Enterprise...

knightgrace

Okay, here is the problem, both in-universe and in the real world.

In universe, the Enterprise is the result of two centuries of starflight development, and two centuries plus (Nomad reference?) of independent logic. The implication is that computers capable of independent logic aren't that much more capable than a human, and most likely less capable than a qualified human professional. In other words, a human professional has a higher level of understanding than a machine, the typical example being a common human outclassing a common independent logic... Quantitatively, I am not sure just how much help is provided to the average individual human, but logic argues that it must be good enough to meet everyday common needs; otherwise, why use it for anything?

Let us say that before duotronics was around, the predecessor system could be a helpful assistant to a master of whatever field (science, law, medicine, philosophy, and so on), but inferior to that particular human. Duotronics took it up to the doctoral level.

Such that the Constitution class didn't precisely evolve as one would expect, but went from one step to another. Why wasn't it evolved? Too few steps.

But why in universe was the Star Trek Phase II Enterprise the way that it was? Too few steps, remember?

Now here is the real problem: the Enterprise-D. Too radical for the number of steps. If your computers are barely capable, then how do you get to the Enterprise-D so soon?

It isn't logical.

Now for the real world.

When Matt Jefferies was working on the design of the Enterprise, he went through, it is said, thousands of drawings. In other words, actual evolution.
 
Two hundred years and that is what they came up with?

All of those computer advances, and virtually nothing to show for it.

Matt Jefferies, in a matter of months, came up with that???

Under the direction of Gene Roddenberry et al.: forced evolution in the real-world case.

Twiddling thumbs in terms of technology in the fictional case.

Consider that they had Nomad technology for sixty years and little was done.

Now look at the advancement of the past six years... in the real world.


Major disconnect here.
 
How?

There were no personal computers in 1964; those didn't really start to appear until the '70s. Nomad was a fusion of a human probe and an alien probe that was probably more advanced. Nomad itself wouldn't have been much more advanced than the Voyager space probes, by the standards of the day it was written.

Computer technology's leaps and bounds didn't start happening until a decade or two later, so the writers wouldn't have known. That they happened on what would closely resemble early 21st-century computers for the 23rd century is mildly amazing, and somewhat dependent on inventors seeing Star Trek and going, "how do I make that?"

Sci-fi of the 1950s to 1970s had humans advancing in space design and interplanetary colonization. Instead, the computer was revolutionized and space languished after the 1986 Challenger disaster.
 
Real life isn't a game of Civilization. There isn't some technology tree that neatly leads from A to B to C to D at x number of years per step. Advances occur when people happen to get the idea, or when prerequisites and surrounding conditions call for them.

For the former, two millennia ago, a few dozen years and a few dozen miles separated two natural philosophers, one of whom speculated that disease might be caused by organisms too small for the eye to see attacking the inside of the body, and the other of whom noted that curved baubles of glass had a magnifying effect when you looked through them, which was dismissed as a useless curiosity. Now, if those people had been a bit closer, so that someone who had the idea of these tiny illness-causing animals happened to look through a curved piece of glass comparing a clean water source to a tainted one, we could've had the germ theory of disease and the fundamentals of modern sanitation in the first century B.C., something that would've radically altered human history. On the other hand, if the doctors who were offended by the concept that their dirty hands were spreading illness had their way and successfully suppressed the germ theory, we still might not have it today.

As for the latter, think about modern computing. In the middle of the last century, computing hardware was too large and cumbersome, so the dominant model was one central "mainframe" accessed by individual operators through numerous "terminals." The technology miniaturized, and we shifted to a personal computer model, where all of the computing hardware is located within the individual terminal. In the '90s, there were attempts to shift back to a mainframe model, using the internet rather than having the user and the server in the same facility, referred to as "network computing," but the technology wasn't there yet. It wasn't until the 21st century and high-speed, and more importantly, cellular internet that the network computer model became common again, except now we call it "the Cloud" and our individual "terminals" are (or can be) capable of fulfilling most computing needs locally. Except now we've got large language and visual diffusion models which take too much time or memory to run locally, so those have to be run on extremely powerful servers that are heavily subsidized by companies hoping the technology will eventually become, if not useful, then somehow profitable.
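To make that terminal-versus-mainframe trade-off concrete, here's a minimal sketch of the hybrid model we ended up with: small jobs run locally on the "terminal," big ones get shipped to the server. The names and the size threshold are invented purely for illustration.

```python
# Minimal sketch of the terminal-vs-server trade-off described above.
# All names are hypothetical; the threshold is arbitrary for illustration.

LOCAL_LIMIT = 10_000  # jobs below this size run on the "terminal"

def run_locally(job: list[int]) -> int:
    # The personal-computer model: all work done at the terminal.
    return sum(x * x for x in job)

def run_on_server(job: list[int]) -> int:
    # Stand-in for a network/"Cloud" round trip; a real client would
    # serialize the job and wait on a socket or HTTP call instead.
    print(f"shipping {len(job)} items to the mainframe...")
    return sum(x * x for x in job)

def dispatch(job: list[int]) -> int:
    # The hybrid model: local when possible, remote when the job
    # outgrows the terminal.
    return run_locally(job) if len(job) < LOCAL_LIMIT else run_on_server(job)

if __name__ == "__main__":
    print(dispatch(list(range(100))))      # handled locally
    print(dispatch(list(range(50_000))))   # offloaded to the server
```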

The behind-the-scenes info on TOS suggests they use an obsolete model of computing; Spock's station is said to be the "library computer console," suggesting that none of the other control panels on the bridge can look up information from the ship's database, and when they do, they usually carry data disks; on the other hand, we don't know what kind of constraints they're operating under. There could be important practical reasons to keep the ship's systems firewalled; PADDs with critical documents might be hand-delivered to ensure chain-of-custody and prevent man-in-the-middle or phishing attacks where false or misleading documents are delivered electronically with no confirmation that they came from where they claim.
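For a modern analogue of what that hand-delivery buys them: a minimal sketch, assuming a shared key provisioned out of band (the "hand-delivered" step), of tagging a document with a keyed digest so a tampered electronic copy gets rejected. The key, document text, and names below are all invented for illustration; only Python's standard hmac and hashlib libraries are used.

```python
import hmac
import hashlib

# Sketch of integrity-checking a delivered document. In a real system the
# key would be provisioned out of band, like a hand-carried PADD.
SHARED_KEY = b"provisioned-before-departure"  # hypothetical

def sign(document: bytes) -> str:
    return hmac.new(SHARED_KEY, document, hashlib.sha256).hexdigest()

def verify(document: bytes, tag: str) -> bool:
    # compare_digest avoids timing side channels on the comparison
    return hmac.compare_digest(sign(document), tag)

orders = b"Proceed to Starbase 11."
tag = sign(orders)

print(verify(orders, tag))                      # True: authentic
print(verify(b"Proceed to Starbase 12.", tag))  # False: man-in-the-middle edit
```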
 
Please remember that computer technology in the real world was driven by military necessity. What this meant, as mentioned by you and the next post down, was the incredible shrinkage of the computer.

The AN/FSQ-7 of the 1950s occupied some 2,000 square meters of floor space. Two of them were in each SAGE installation. Why two? In case the prime unit went down and couldn't be brought back online fast enough. Furthermore, they had young women on roller skates going up and down the aisles just to change out vacuum tubes, fast. With the total number of tubes involved, at least one was always burning out. Reliable computing technology had to be developed.
Another example: the North American A-5A Vigilante made use of the first transistorized digital computer in an aircraft for navigation, and the first ones burned out in fifteen minutes or so...
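That SAGE duplexing scheme a paragraph up is worth sketching. Assuming one active unit and one hot standby with a role swap on failure (the unit names and the per-cycle failure odds below are invented purely for illustration), it looks roughly like this:

```python
import random

# Toy model of SAGE-style duplexing: two units, one active, one hot standby.
# The per-cycle failure probability is invented purely for illustration.

class Unit:
    def __init__(self, name: str):
        self.name = name
        self.healthy = True

    def tick(self) -> None:
        # Somewhere among tens of thousands of tubes, one burns out...
        if random.random() < 0.1:
            self.healthy = False

prime, standby = Unit("FSQ-7 A"), Unit("FSQ-7 B")

for cycle in range(20):
    prime.tick()
    if not prime.healthy:
        print(f"cycle {cycle}: {prime.name} down; failing over to {standby.name}")
        prime, standby = standby, prime  # swap roles immediately
        standby.healthy = True           # failed unit gets repaired while offline
```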

Now about "Independent Logic ", I don't think that this was true Artificial Intelligence, but somewhere close to it. It is the difference between Siri and non speaking simple computers. Not quite able to make the leap into full blown A. I. But good enough to affect some change. But here's the problem where the extensions to this technology?

To put it another way, what happens when you combine independent logic and computer-aided design?


Let's look at Xerox PARC. It invented the graphical user interface, which Apple brought to light ten years later in the Apple Macintosh...

Just how good does an independent logic have to be to replace windows and GUIs completely?

Such that in the early twenty-first century...
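One way to pin down "good enough to replace the GUI": the machine has to recover, from free-form language, the same actions that buttons and menus expose directly. A toy sketch of those two paths follows; the verb vocabulary and examples are invented purely for illustration.

```python
# Toy contrast: a GUI exposes actions as widgets the user picks from;
# an "independent logic" must infer the action from plain language.
# The verb vocabulary below is invented purely for illustration.

ACTIONS = {
    "open": lambda target: f"opening {target}",
    "delete": lambda target: f"deleting {target}",
    "search": lambda target: f"searching for {target}",
}

def gui_click(action: str, target: str) -> str:
    # GUI path: the user already chose the action from a menu.
    return ACTIONS[action](target)

def parse_request(utterance: str) -> str:
    # "Independent logic" path: find a known verb, treat the rest as the target.
    words = utterance.lower().split()
    for verb, handler in ACTIONS.items():
        if verb in words:
            idx = words.index(verb)
            return handler(" ".join(words[idx + 1:]))
    return "request not understood"

print(gui_click("search", "duotronic schematics"))
print(parse_request("computer, search duotronic schematics"))
```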
 