And before anyone mentions upconverting DVD players: Not the same thing. All *that* means is that the standard-def DVD image gets scaled so you don't have like two feet of black around all sides when viewing on an HDTV. It's NOT turning the image into actual HD, and it doesn't add any real detail.
I believe those machines actually use a fancy resize algorithm to scale the SD image up to HD resolution. You don't gain much from it, but it can be better than letting the TV scale the image (depending on what the TV itself would do). Some resize algorithms are much better than others.
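Just to illustrate why the algorithm matters, here's a toy Python sketch (made-up one-dimensional "scanline" values, not real DVD data) comparing the dumbest upscaler, nearest-neighbor, with plain linear interpolation. Real upscalers work in 2-D with fancier kernels like bicubic or Lanczos, but the idea is the same:

```python
def upscale_nearest(samples, factor):
    """Repeat each sample `factor` times -- blocky, stair-stepped result."""
    return [s for s in samples for _ in range(factor)]

def upscale_linear(samples, factor):
    """Linearly interpolate between neighboring samples -- smoother ramps."""
    out = []
    for i in range(len(samples) - 1):
        a, b = samples[i], samples[i + 1]
        for j in range(factor):
            t = j / factor
            out.append(a * (1 - t) + b * t)
    out.append(samples[-1])
    return out

scanline = [0, 100, 50]
print(upscale_nearest(scanline, 4))  # [0, 0, 0, 0, 100, 100, 100, 100, 50, 50, 50, 50]
print(upscale_linear(scanline, 4))   # [0.0, 25.0, 50.0, 75.0, 100.0, 87.5, 75.0, 62.5, 50]
```

Same input, same output size, but one gives you giant blocky pixels and the other gives you smooth ramps. Neither one invents detail that wasn't in the source, which is the whole point.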
Still, it's certainly not even remotely similar to going back to the film and digitizing it at much higher resolution. You can experiment with sharpening algorithms too, which is what some HDTVs do and probably what some of these upconverting DVD players do as well. At that point you start wandering into "creating something from nothing" territory with magic mathematical algorithms, and whether the result is actually better becomes subjective for each person.
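For what it's worth, the classic sharpening trick is an unsharp mask: subtract a blurred copy from the image and add the difference back. Here's a hedged little 1-D sketch (toy numbers, box blur standing in for the usual Gaussian) showing how it exaggerates edges without actually recovering any lost detail:

```python
def box_blur(samples, radius=1):
    """Simple box blur: average each sample with its neighbors."""
    n = len(samples)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(samples[lo:hi]) / (hi - lo))
    return out

def unsharp_mask(samples, amount=1.0):
    """Sharpen by adding back (signal - blurred signal). This exaggerates
    existing edges but invents no genuinely new detail."""
    blurred = box_blur(samples)
    return [s + amount * (s - b) for s, b in zip(samples, blurred)]

edge = [10, 10, 10, 200, 200, 200]
print(unsharp_mask(edge))  # overshoot/undershoot ("halos") appear around the edge
```

The values right at the edge overshoot past the original range, which is exactly the halo effect you see when a TV's sharpening is cranked too high. It *looks* crisper, but it's manufactured.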
I've been wondering what the limits of film are when it comes to digitizing. What is the scanning resolution limit of an analog source such as film? Presumably the grain itself sets the practical ceiling somewhere. Can you go nuts, scan it at some absurdly high resolution, then drop it back down to 1080p and get a nice supersampling effect?
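That supersampling idea is basically just downsampling with averaging. A rough sketch (fake numbers simulating film grain as alternating noise, box filter as the simplest possible downsampler):

```python
def downsample(samples, factor):
    """Average each block of `factor` high-res samples into one output
    sample -- a box filter. Scanning well above the target resolution
    and averaging down suppresses grain/noise and aliasing, which is
    the supersampling effect."""
    assert len(samples) % factor == 0
    return [sum(samples[i:i + factor]) / factor
            for i in range(0, len(samples), factor)]

# Fake high-res "scan": a clean value of 100 plus alternating +/-5 grain.
scan = [100 + (5 if i % 2 else -5) for i in range(8)]
print(downsample(scan, 4))  # grain averages out: [100.0, 100.0]
```

The grain cancels out in the average and you get back the clean underlying value, which is why oversampled scans tend to look smoother at 1080p than a native-resolution scan would.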
Star Trek's video-for-effects heritage is a major problem. Fancy resizing of that video from 480i to 1080p isn't going to fare well compared to the film footage. Also, dumping all of the model work for CGI would be a mixture of good and bad, IMO, because that model work is as much a part of those series as anything else. On the other hand, new shots of a CGI Enterprise-D would be awesome too.