I've been wondering what the limits of film are when it comes to digitizing. What is the scanning resolution limit of an analog source such as film? Can you go nuts and scan it at unbelievably high resolution and then drop it back down to 1080p and have a nice supersampling effect?
That is what is done for some feature films: they scan the original camera negative at 4K, then do the digital intermediate (DI) color grading at 2K and master at 2K.
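That "scan high, drop back down" idea is essentially supersampling: each output pixel is the average of many scanned pixels, so grain and scanner noise average out. A minimal sketch of the principle (a plain box filter on a grayscale pixel grid; the function name and the toy 4x4 "scan" are mine, not from the thread):

```python
# Box-filter downsample: average non-overlapping factor x factor blocks of
# source pixels into one output pixel each. This is the simplest form of
# the supersampling effect described above.

def box_downsample(pixels, factor):
    """pixels: 2D list of grayscale values; factor: integer shrink per axis."""
    h, w = len(pixels), len(pixels[0])
    out = []
    for y in range(0, h, factor):
        row = []
        for x in range(0, w, factor):
            block = [pixels[y + dy][x + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / len(block))  # average the block
        out.append(row)
    return out

# A 4x4 patch of alternating "grain" collapses to one smooth averaged pixel:
noisy = [[0, 255, 0, 255],
         [255, 0, 255, 0],
         [0, 255, 0, 255],
         [255, 0, 255, 0]]
print(box_downsample(noisy, 4))  # -> [[127.5]]
```

Real scanners and DI pipelines use better filters (Lanczos and the like) rather than a plain box average, but the noise-averaging effect is the same.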
As far as the resolution limit of 35mm motion picture film goes, it depends on the film stock and its speed (faster stocks have coarser grain).
Compared to 35mm still photography, motion picture film runs vertically through the camera with a smaller exposed frame area, so there is less film detail to capture per frame. Larger films are scanned at 4K per the Digital Cinema Initiatives (DCI) specs: a 4K master is 4096x2160 pixels.
Also, current Kodak T-Grain films retain some image detail above 5.5K. It is a small amount, but nevertheless there.
SMPTE Journal, Volume 111, Number 2, February/March 2002
So your answer is around 4-5K of resolution. Major A-list feature films shot on 35mm use an all-4K pipeline now, starting with "Spider-Man 2" in 2004: 'Spider-Man 2 – the first digital intermediate on a new Hollywood film to be done entirely at 4K resolution'.
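Rough back-of-the-envelope math behind that 4-5K figure, using a Nyquist-style rule of two pixels per resolvable line pair. The specific numbers here (a Super 35 aperture of about 24.9 mm and roughly 80 lp/mm for a fine-grained stock) are my assumptions for illustration, not figures from the SMPTE article:

```python
# Each resolvable line pair needs at least 2 pixels (Nyquist), so:
# horizontal pixels = frame width (mm) x resolving power (lp/mm) x 2

def horizontal_pixels(frame_width_mm, lp_per_mm):
    return frame_width_mm * lp_per_mm * 2

# Assumed: Super 35 aperture ~24.9 mm wide, fine-grained stock ~80 lp/mm.
print(round(horizontal_pixels(24.9, 80)))  # -> 3984, i.e. roughly 4K
```

Push the resolving power toward 100 lp/mm for the slowest stocks and you land near 5K, which is why the practical ceiling gets quoted as 4-5K.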
Technically, in 1993 'Snow White and the Seven Dwarfs' was the first film to be scanned at 4K, restored, and mastered at 4K.
New shots of a CGI Enterprise-D would be awesome too.
Eden FX's Gabriel Köerner built a new CGI LightWave model for the Enterprise-D in 2004.
swaaye, see this: that model has small design differences from the original TNG CGI NCC-1701-D. For a pretty good idea of how it looks, check out the Enterprise-D's appearance in Star Trek: Enterprise's series finale, "These Are the Voyages..." (even if only in standard def).