
How does today's tech match Star Trek's?

I've been thinking about this quite a bit over the last two weeks, ever since Michio Kaku, on his show Science Fantastic, talked about the idea of some sort of faster-than-light travel being closer than we think thanks to research with the Large Hadron Collider.

I'm starting to think there are a lot of areas where we're already ahead of the curve compared to the Star Trek universe, and I'm really interested to see what you guys think.

Here are some examples:

iPads / Nooks / Kindles - I think every one of these devices has easily surpassed the PADD by now. They do everything a PADD does, and better.

Computer Interfaces - I think our computer interfaces are shortly going to start looking very different from Star Trek's LCARS UI, and maybe more like the HUI interface in Earth: Final Conflict. Think about it: we've already developed touch interfaces (iPad, Windows Phone 7) and gesture interfaces (Microsoft Kinect), and the basic design of the LCARS UI seems distinctly inferior to what we can design today.

Computer Mainframes - With quantum computers getting closer and closer to reality, we're nearing the point where our computers will be (at a base level) more powerful than what was seen at the end of Voyager. Voyager's computers were scaled on the idea that they could move a teraflop of information, with information measured in teraquads. We're already at that point at the cutting edge of computer development, and quantum computers should be capable of the most advanced operations seen in Star Trek.
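
For scale: a teraflop is 10^12 floating-point operations per second. Here's a rough back-of-envelope way to estimate what a single machine manages today (a sketch in Python; it assumes NumPy is installed, and the figure will vary wildly by machine):

[code]
import time
import numpy as np

n = 2048
a = np.random.rand(n, n)
b = np.random.rand(n, n)

start = time.perf_counter()
c = a @ b                      # dense matrix multiply: ~2 * n^3 floating-point ops
elapsed = time.perf_counter() - start

flops = 2 * n**3 / elapsed
print(f"~{flops / 1e9:.0f} GFLOPS ({flops / 1e12:.3f} teraflops)")
[/code]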

So, what do you guys think? How do we compare?


Actually, things like the communicator, PADD, and computers from Star Trek haven't been surpassed by modern technology.
We aren't even close to building any warp-like ships or probes. Teleporters are impossible, and we are nowhere near holodeck technology.
Star Trek will always be ahead of the curve.
When you look at these things they seem old, but they truly are ahead of the curve.
The small Voyager computer outmatches any computer today, seeing as it would take a room-sized computer to keep up with it.
 
^At the rate information technology is progressing, we will overtake Star Trek level computers fairly soon.
Strictly speaking, no, it doesn't. It warps space to achieve an effect faster than light. The ships never actually attain a velocity faster than light.

And the computer does the same thing. The Tech Manual says the computers make a non-propulsive subspace field to achieve data processing at FTL speeds.
So, it's in a warp bubble that is not moving. How does that "speed up" the computer? A warp bubble doesn't accelerate the passage of time.
 
You're in IT, so I can see why you would say that our computers will soon surpass Trek's. I tend to disagree: we might match the amount of information, and we might one day (not soon) match the processing power, but unless it can fit in a portable drink cooler and not require a very cool room with uninterrupted power, we won't surpass it any time soon.
Progress only depends on whether you have the means to progress, and the way the world's economy is looking, it will probably slow until all of us get back on our feet. Or at least the jump won't be made in the U.S.
 
I think I've said before, how I believe that bigger processing power and memory don't necessarily make a better computing experience.

There's an old saying, "What Intel giveth, Microsoft taketh away", which we can verify by seeing how Office 2010 running on a modern computer is no faster than Office 2000 running on a computer that's 10 years older.

A lot of modern software (and operating systems) seem to suffer from this "Inverse Moore's Law", where the hardware demands are disproportionately big when we consider what few new features the software provides.

I think we need to put a stop to that trend.
 

There are several reasons processing power is sacrificed (wasted) in modern PCs:

* More complex operating systems (Windows)
* More demanding graphics applications (games in particular)
* Widespread use of virtual machines/interpreted languages/other forms of abstraction

Once we account for the first two, most of the remaining "power drain" comes from the latter category. In the old days, everything was written in machine language. Then we got high-level languages like C, which are still pretty fast--often only marginally slower than hand-tuned machine code. But today we have a plethora of languages which are interpreted at runtime rather than compiled, and there's often a massive performance loss as a result. Many applications support scripting through languages like Python and Lua, and any scripts written in those languages are going to be slower and more resource-intensive than any pure machine language code. Likewise, you have languages like Java which don't run natively but run inside a virtual machine, and that layer of abstraction has a real cost in terms of processing power. There are also schemes like XUL, Firefox's XML-based interface definition language. I think that one is notoriously slow. Sure, you could write your user interface in C or something similarly close to the bare metal, but that shuts out the majority of users from customizing the interface or writing their own.
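
To put a rough number on the cost of interpretation, here's a toy comparison (a sketch; the exact ratio depends on your machine and Python version): summing numbers in a pure-Python loop versus handing the same work to the built-in sum(), whose loop runs in compiled C inside the interpreter.

[code]
import time

N = 10_000_000

# Pure-Python loop: every iteration is dispatched by the interpreter.
start = time.perf_counter()
total = 0
for x in range(N):
    total += x
loop_time = time.perf_counter() - start

# Built-in sum(): the same loop runs in compiled C inside the interpreter.
start = time.perf_counter()
total = sum(range(N))
builtin_time = time.perf_counter() - start

print(f"interpreted loop: {loop_time:.2f}s  built-in sum: {builtin_time:.2f}s")
print(f"slowdown from interpretation: ~{loop_time / builtin_time:.0f}x")
[/code]

On a typical machine the interpreted loop comes out several times slower, and that kind of overhead piles up across a whole application.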

This is all driven by developers wanting to reuse code and make their programs more maintainable and accessible. Developer time is expensive whereas hardware is cheap, so when it comes time to allocate development time, bean counters don't want developers to waste months reinventing the wheel when there's a perfectly adequate toolkit out there that already does most of what they want. But if that toolkit is written in Java or Python or .NET, it's going to have those performance tradeoffs built right into it.

While it's certainly possible that all software could be written in C or assembly/machine language, it would dramatically increase the development costs of most software. Developers with strong skills in those areas aren't as common and they don't come cheap, and if you're going to hire them you sure as hell don't want them to waste their time building libraries that have been created a dozen times over in other languages.

So, we end up with a software landscape where the application just has to be "fast enough" for current hardware. It makes very little business sense to eke every last cycle out of the CPU by writing things close to the bare metal, and most developers have no interest in doing that. I sure don't.
 
The problem with that philosophy is that software will continue to be 'just fast enough' on relatively new hardware. It'll never be impressive. That's a future I want us to avoid.

E.g., today's hardware is unlikely to be enough to run Office 2020. Can we expect it to need a terabyte of hard disk space, 20 gigabytes of RAM, and at least eight processor cores?


Well-written software shouldn't need to be retired, and I think the Unix philosophy had that ideal in mind with its recommendation that each application be kept simple and focus on doing one thing well.

And it feels like our attitude to hardware has been shaped by the rising demands of software.

For example, 1GB is not considered a lot of memory nowadays, but that attitude has come about only because of modern software demanding in excess of that much memory. People have got too used to seeing memory being abused, and that distorts people's idea of what memory is capable of if used properly.

To me 1GB is a lot of memory. It always has been a lot and always will be a lot. It'll be capable of doing just as much in year 2020 as it was capable of doing in year 2000.

The mathematics of it, and the algorithms that use it, are constant; it is only our attitude toward it that becomes distorted over time.


My idea of the future is having software that runs like lightning, with a GUI that responds to my clicking and typing within a millisecond.

That goal doesn't need massive improvements in hardware. What it needs most is well written software.
 
Right, good point. One problem: if we continue using modern tech and never push past it, we will never get Trek tech any time soon. A fair system would be one where games aren't so advanced that a two-year-old computer can't run them.
 

Well, it would be ideal if the various frameworks, toolkits, and runtime environments were made to be substantially faster and more efficient. We have a much greater dependence on third-party toolkits than we did in the past. Software packages used to be relatively self-contained, monolithic entities. Today, many software applications are rather "thin" and rely on other packages to provide most of the environment and functionality.

One thing I think has resulted in an enormous slowdown of software is the proliferation of XML. Text processing is slow, especially parsing hierarchical lexical trees like XML. The more applications we base around things like XML, the slower the whole shebang gets and the more processing power we need just to get back to status quo.
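
To make that concrete, here's a toy measurement (a sketch; the numbers will differ per machine): parsing a list of integers out of XML with Python's standard library versus unpacking the same values from a packed binary blob.

[code]
import struct
import time
import xml.etree.ElementTree as ET

values = list(range(100_000))

# The same data, encoded both ways.
xml_doc = "<root>" + "".join(f"<v>{x}</v>" for x in values) + "</root>"
blob = struct.pack(f"{len(values)}i", *values)

start = time.perf_counter()
from_xml = [int(e.text) for e in ET.fromstring(xml_doc)]
xml_time = time.perf_counter() - start

start = time.perf_counter()
from_bin = list(struct.unpack(f"{len(blob) // 4}i", blob))
bin_time = time.perf_counter() - start

assert from_xml == from_bin
print(f"XML parse: {xml_time * 1000:.1f} ms  binary unpack: {bin_time * 1000:.1f} ms")
[/code]

The binary path typically wins by an order of magnitude or more, because there's no lexing or tree-building to do.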

I think that once this transition is complete--or once we have substantially more efficient text processing algorithms/libraries--we will see a much greater improvement. But right now we're still kind of transitioning from the model of using binary data files for everything to using human-readable text files (e.g., XML). I'm sure there are some applications where we'll always use binary blobs (images, etc.) but there's still a lot of stuff we can migrate into the world of text.

There are advantages (and disadvantages) to this transition but I think, on the whole, it will be a good thing for both developers and end users.
 
A fair system would be one where games aren't so advanced that a two-year-old computer can't run them.

Games are a little different because as time has gone on, we've added new layers of computation to them (physics, lighting, shadows, shaders, etc), and the existing layers have increased in detail. Graphics and video take up a lot of memory, and that's unavoidable.

But a word processor doesn't change that much from one year to the next. It's not much more than a white rectangle with formatted text on it. I could do that on my Windows 95 computer with 8 MB of memory. A gigabyte is 128 times as much. Does a modern word processor really give me 128 times the functionality I had before?
 

Actually, it does. If you compare what word processors could do 20 years ago vs. what they can do now, it's insane. You could safely argue that 90% of the features are completely useless and unnecessary for the vast majority of users, though.
 

Which just accentuates the problem. Why are 90% of people buying software they don't use? If people shifted to buying software that has just the functions they need but runs faster than, say, anything by Microsoft, we might see the software industry change its focus from feature bloat to efficiency. Leave the special functions to niche software for niche users.
 
One of the biggest areas I see we have to catch up with Star Trek computers is the voice interface.

We are nowhere near their level yet.
 
In Trek, things are able to move faster than light, relative to the rest of the universe, without any time effects occurring. So saying something is traveling at "5000 c" isn't saying it would arrive somewhere before leaving, just that it's traveling, relative to the universe, really damn fast.

Trekker44747, as I already said:
As per special relativity, if an object can travel faster than light with respect to another frame of reference - such as the universe - then this object CAN travel backwards in time.

It makes no difference that, from its own POV, the object does not travel faster than light.

So saying that the computers being FTL means they'd violate causality sort of ignores the "reality" of it. The computer can just make limitlessly complex calculations in an instant, perhaps to the point where it knows the answer before it really knows the question, but it relays this information to the user in real time for the best use.

Actually, saying that computers are FTL means they have the capability to break causality.
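
For anyone who wants the math behind that claim, here's the standard special-relativity argument, sketched with the usual Lorentz transformation (nothing Trek-specific is assumed):

[code]
% A signal travels at speed u > c in frame S, covering \Delta x = u\,\Delta t
% with \Delta t > 0. In a frame S' moving at velocity v (|v| < c) relative
% to S, the Lorentz transformation gives:
\[
  \Delta t' = \gamma\!\left(\Delta t - \frac{v\,\Delta x}{c^{2}}\right)
            = \gamma\,\Delta t\left(1 - \frac{u v}{c^{2}}\right)
\]
% Since u > c, any legal v with c^2/u < v < c makes uv > c^2, so
% \Delta t' < 0: in S' the signal arrives before it is sent. Chain two such
% signals and you can receive a reply before sending the original message.
[/code]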
 
Why are 90% of people buying software they don't use?

Ah, but there are already lightweight word processors out there, most of which are free. That's why there's no money in them and why Microsoft doesn't make them. Well, they have Works, which is sort of a stripped-down Office, but it's a piece of garbage.

The world of lightweight, single-purpose software has more or less been taken over by the freeware/open source communities. It's commercial software that's gotten progressively more bloated and feature-laden.
 

One of the problems with open source is a lack of clear objectives. One consequence is that popular open source projects attract programmers with too much ambition for the project, and the software becomes over-featured.

There's also no intrinsic drive for open source to be made truly efficient, because like with proprietary software, there's this 'just-fast-enough-on-relatively-new-hardware' mindset.

Thirdly, free software often feels awkward to use. Programmers don't usually put much effort into aesthetics, leading to interfaces which feel disorganized and clunky. Consider GIMP as an example.

There are also those authors who insist on highly graphical interfaces where every button click freezes the window for five seconds while controls fade in and out. Useless additions that spoil an otherwise perfectly good piece of software.

But some freeware can feel very well made, and in my experience, most of it tends to be closed source. Consider the Pazera suite as an example.

Also, older versions of commercial software can run a lot smoother than the newer versions, and are less bloated, even though we sacrifice one or two useful features by downgrading. Consider Paint Shop Pro, for example.
 
Sometimes, during hard times, individuals can only afford things that just work. You might see the change Sojourner's post describes once we get a better economic situation.
 
Trekker44747, as I already said:
As per special relativity, if an object can travel faster than light with respect to another frame of reference - such as the universe - then this object CAN travel backwards in time.

In the real world, yes. In Trek's universe special relativity has been worked around, most likely because the object isn't really "moving faster than light" but warping space so that its slower speed translates to an "FTL" speed relative to the universe.
 
He might have been confused; saying that this is possible in the Trek universe might have prevented the argument. Just saying, as a person who gets confused most of the time.
 
One thing I think has resulted in an enormous slowdown of software is the proliferation of XML. Text processing is slow, especially parsing hierarchical lexical trees like XML. The more applications we base around things like XML, the slower the whole shebang gets and the more processing power we need just to get back to status quo.

I really dislike XML. I avoid it as much as I can. I agree with you that it probably is responsible for a lot of slowness and padding of data.

I do like flat files for configuration, and in my projects I usually use my own variation on the INI file. I won't bother describing how that works right now, but it is very versatile and easily editable with Notepad.

What software shouldn't be doing (but I fear it does do) is "think" in XML. It would be bad if data remains in memory as XML formatted text and has to be parsed/re-encoded every time that data is accessed or changed.

Software should convert data into binary where it is to be used internally. It only needs to be parsed once, and data only needs to be text encoded once in a final output, like when the user orders a file to be saved to disk... and only then if the user is expected to want to edit that data.
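
As a concrete version of that pattern, here's a minimal sketch using Python's standard configparser (the file name, sections, and keys are made up for illustration):

[code]
import configparser

# Parse the text once, at startup...
config = configparser.ConfigParser()
config.read("settings.ini")  # hypothetical file name

# ...then hold the values as native binary types from here on.
width = config.getint("window", "width", fallback=800)
volume = config.getfloat("audio", "volume", fallback=0.5)

# Later reads touch plain ints and floats in memory -- no re-parsing of text.
print(width * 2, volume)

# Text is only re-encoded when the user actually saves.
config["window"] = {"width": str(width)}
with open("settings.ini", "w") as f:
    config.write(f)
[/code]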
 