At this point, knowing a TNG style remaster is not on the cards, I’d settle for an upscale and remaster with improved contrast, colour, etc. What are the chances Paramount/CBS would do this?
The DVD picture is like mud. I can improve it to an extent with my HDMI upscaler and by changing the picture settings on my TV. DS9 deserves better, but it's still very much the forgotten, unloved middle child even thirty years on.
DS9 deserves the full treatment, yes. Anything less isn't the "go big or go home" approach that so many seem to want.
This isn't ENT either, where they did the quality treatment for the live-action film and cut in (decently upscaled) CGI. Even then, the same rule applies to ANY enlarging technique: the larger the source material, the more you can stretch it before its limitations show. Native 720p can be expanded and edge-tweaked to meet 1080p a lot more easily than 480i, which is two alternating 240-line fields - each one filling every other scanline - stitched together in the temporal span of one frame, at 29.97 frames per second. Regardless of source, those two fields - especially when there's movement - need extra treatment, since adjacent fields capture slightly different moments in time and don't line up, which yields its own problems. In fairness, deinterlacing technology has improved TONS over the decades, but that's just the first part of the battle.
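To make the field problem concrete, here's a minimal sketch (my own illustration, using NumPy arrays as stand-ins for video fields) of the two classic deinterlacing strategies - "weave", which stitches the fields back together and shows combing on motion, and "bob", which throws away half the vertical detail to avoid it:

```python
import numpy as np

def weave(top_field, bottom_field):
    """Weave two 240-line fields into one 480-line frame.
    Fine for static shots; moving edges show 'combing',
    because the fields were captured 1/60th of a second apart."""
    h, w = top_field.shape
    frame = np.empty((h * 2, w), dtype=top_field.dtype)
    frame[0::2] = top_field     # even scanlines from the top field
    frame[1::2] = bottom_field  # odd scanlines from the bottom field
    return frame

def bob(field):
    """Bob deinterlace: stretch a single field to full height by
    line-doubling. No combing, but half the vertical resolution."""
    return np.repeat(field, 2, axis=0)
```

Modern deinterlacers are motion-adaptive blends of these two ideas (plus interpolation across fields), which is why they've improved so much - but the underlying 240-line fields are still all the detail there is.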
AI is just an umbrella term covering numerous procedural functions, which include - for each frame - selective edge unsharp masking, contrast adjustment, artificial colour-gamut alteration, and even selective blurring - an amusing contradiction, at least with the AI tools I purchased for 2D/poster work. It's good, but it's by no means perfect. Colour detail in minute areas cannot be extrapolated, because the true colours were never picked up by the algorithms in the first place. The limited gamut leads to more blooming/overexposed areas, and there's also the potential for dark areas being crushed into black blobs. Try re-calibrating that on the TV when the source material is artificially improved yet also lessened in some ways. In fast-motion video - especially natively videotaped shows deinterlaced before any other enhancement is done - there's artifacting, and even straight vertical lines can look wavy. Is that really "HD"?
And despite it all, plenty of areas still look like mud.
For native videotape material, all of these tools do yield a genuinely better image. But the result doesn't begin to come close to actual HD resolution, and the differences can still be spotted 4 miles away.
There also needs to be more than a hobbyist's YouTube channel to really showcase the improvements, since even after setting the output to 1080p or more, the result is more often mud with sharp edges. Shrink genuine HD down to SD resolution and put it next to an SD original: you will see a phenomenal difference there that you will not see from the upscaling. Or, better yet, put a properly remastered HD piece next to an AI one. Some clips do exist. The differences are genuinely impressive in what AI can do... and what it cannot.
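That honest comparison is easy to set up yourself. A minimal sketch (my own, using Pillow; the function name is mine) that downscales the HD frame to the SD frame's resolution and pastes the two side by side so like is compared with like:

```python
from PIL import Image

def side_by_side(sd_frame, hd_frame):
    """Downscale the HD frame to the SD frame's resolution and
    paste the two next to each other. Any detail difference that
    survives at SD size is real; upscaled SD has no equivalent."""
    hd_small = hd_frame.resize(sd_frame.size, Image.LANCZOS)
    combo = Image.new("RGB", (sd_frame.width * 2, sd_frame.height))
    combo.paste(sd_frame.convert("RGB"), (0, 0))
    combo.paste(hd_small.convert("RGB"), (sd_frame.width, 0))
    return combo
```

Do the same swap with an AI-upscaled frame in the right-hand slot and the limits of the upscaler jump out immediately.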
But AI, which isn't the Venus drug, isn't a panacea. Neither is deinterlacing on its own; otherwise the streaming episodes wouldn't look as awful as they do.
Oh, one quick question: what was the point of storing all of the film negatives in the first place? They didn't do it on a whim, even for season one. They'd have saved more money by throwing them away, no? They have archived copies of the master videotapes (probably on D1 tape, whereas the BBC more often used D3, but I digress...). May as well let the negs rot away - and vinegar syndrome has been known to kick in unexpectedly, depending on the quality of the film stock itself. Preserve them and get the most out of them while you can.