
Poll: Would you purchase a legit remaster of DS9?

  • Yes — 58 votes (84.1%)
  • No — 11 votes (15.9%)

Total voters: 69
[Image: 1oDVLFi.jpg]

If only...
Oddly enough, that's the exact same plate used for Emissary, sans Bajor.
 
A high res transfer of the existing master tapes would do wonders. I wouldn't pay for a DVD/BR set, though; I don't buy DVDs in general.
 
A high res transfer of the existing master tapes would do wonders. I wouldn't pay for a DVD/BR set, though; I don't buy DVDs in general.

I wouldn't pay $1,000 for the master tapes; that's lame.
If you're a corporate nut, you can skim that 1k and pay next to nothing for an upscale, and in many cases upscales are already freely available from fans.
If you upscaled only the ship shots, as Babylon 5 did, and kept the acting in true HD, it would be worth it.
But I'm not gonna bite if it's a Blu-ray release built on an SD master.
There's literally no point in it.
Like... the format is for HD content, not SD content. The whole point is that a new Blu-ray release has to look as good as what's in "What We Left Behind".
AI isn't worth it. IMO, interlacing artifacts are part of the problem even on the best upscales. You need a progressive source.
A 4k scan of the 35mm film, with iConform assembly using upscaled master tapes to produce an EDL, is the only path forward.
Upscaling is a tool, but the final release shouldn't ever be upscaled footage alone.
It's lame.
So... I agree with you.
The master tapes are only 480i; they're not high res.
35mm film negatives?
Those are naturally high resolution.
Unless you can invent an AI that can fill in gaps better than a human visual cortex.
(We're still using algorithms, and they aren't remotely as good as a human visual cortex.)
People love saying "cutting edge" technology, but it's not cutting edge; it's automated 2000s cleanup tech that, 20 years later, can finally multitask. (The vending machine from 20 years ago could take a cashier's job and still hasn't. What a time to be alive?)
The AI circle jerk gets boring.
So the types of AI that are being talked about?
An LLM that can better read metadata and feed content to you... that's about it.
It's nothing that didn't already exist.
Also, it's a rather boring technology that most people have no use for, or that is plainly useless unless you're an insecure CEO. (Most of them seem too insecure to make money.)
 
All the people who have seen the Voyager documentary know.
They apparently had to use AI upscaling for that doc because the research division was dissolved.
Dave Zappone summarized it on Facebook, but on the DS9 documentary they went further.
To get that footage, you'd have to get a license from CBS, and then they'd go through shooting scripts.
Anyone could make such a request through CBS.
The site is still up.
It now redirects to Paramount's main site when you click the relevant link.
So, in 2019, Craig Weiss' CBS Digital moved and rebranded as CBS VFX, and sold off outdated film scanners.
As of 2025, Adam Savage did a small doc on the Paramount Archive.
They had to consolidate 3 archives, including CBS' archive, this year?
Why would they do that?
Couldn't have anything to do with Skydance and Paramount merging, could it?
So, that archive consolidation might be why they froze the DS9 remaster about a year ago (December 2024).
Now, if you were doing a bulk scan of Voyager's or DS9's film negatives, you wouldn't need shooting scripts; that step is automated via "AI" (image recognition: an upconverted tape master is used as the template, image recognition uses that tape to produce an EDL, and the new scan is automatically cut to that EDL).
That technology was mature by 2015.
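To make the "image recognition" step concrete, here's a minimal sketch of how a tape-to-scan frame matcher could work. Everything in it (the average-hash approach, the threshold, the function names) is my illustrative guess, not iConform's actual internals:

import cv2
import numpy as np

def ahash(frame, size=8):
    # Average hash: shrink to a tiny grayscale thumbnail, threshold on the mean.
    g = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    g = cv2.resize(g, (size, size), interpolation=cv2.INTER_AREA)
    return (g > g.mean()).flatten()

def match_tape_to_scan(tape_frames, scan_frames, max_dist=10):
    # For each frame of the (upscaled) tape master, find the closest-looking
    # frame in the raw film scan. Runs of consecutive matches become the cut
    # points an EDL records.
    scan_hashes = [ahash(f) for f in scan_frames]
    pairs = []
    for i, frame in enumerate(tape_frames):
        h = ahash(frame)
        dists = [np.count_nonzero(h != sh) for sh in scan_hashes]
        j = int(np.argmin(dists))
        if dists[j] <= max_dist:
            pairs.append((i, j))  # (position in episode, position in scan)
    return pairs

In production you'd match shots rather than every frame, but the principle is the same: the tape is the template, the scan is the source.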

Finally, you can scale remastered footage down to 480p and it still looks far better than the DVDs, because a digital remaster made today is going to surpass a 1990s tape.
I'm also assuming Paramount has some kind of AI upscaling algorithm it's using for streaming, but not using on Pluto.
Emissary looks like shit on Pluto, and looks like AI shit on Paramount+.

Based on the timeline, I wouldn't expect to hear about a DS9 remaster until 2028, the 35th anniversary.
DS9 and Voyager kept getting pummeled by ROI problems and corporate mergers.
Paramount remains interested, it seems.
DS9 might be finished.
But it would take 3-5 years to make the DS9 remaster.
It likely started in 2022 or 2023, got frozen again in 2024 (with a possible first freeze in 2018), and picked up, or will pick up, in 2025.
Scanning would take a little less than 18 months.
The X-Files took 18 months to complete, all 201 original episodes.
DS9 is only 7 seasons.
If you ran Voyager and DS9 concurrently, they'd be ready by the end of the decade.
If those remasters are happening, they'll be quiet; every footprint I've found stopped in 2024.
It will be just as silent as Babylon 5.
An AI upscale will be the foundation for that, though.
It's part of the process.

VFX assets survive and can be replaced.
Surviving scene files can use new assets, which can be sourced from STO, the Roddenberry Archive, or a network of modellers who'd probably be happy to work on these.
One problem may be missing motion control footage.

Another reason why they haven't seen the light of day might be old Paramount.
IMO, they were letting contracts for this content lapse around the globe, specifically for Star Trek.
Why?
Even if TNG didn't make a lot of money, Paramount wanted money, all the money.
They didn't want Netflix taking a cut of their profits.
Paramount and Netflix wanted to make a third season of Jericho... and Paramount kept demanding more money from Netflix until Netflix said "to hell with it."
David Ellison wants to share in the ecosystem.
But a year or two prior?
Paramount wanted all of the money.
Hence why I think TNG made back its money but didn't hit the excellent-sales target.
To be profitable you need 3x; Paramount probably wanted 5x.
TNG's remaster was estimated to cost $9-12 million, and DS9 and Voyager are estimated to run about the same.
By my count, 42 TV series have now been remastered, and everyone wants to trash me with "none of those have VFX like DS9."
Not true; every one of them does. You just don't notice half of them because they're not Cardassian ships.
Some of them are walls, and in the case of Mad Men, vomit machines that are painted out.
Every one of these has more VFX than you'd care to list.
I'll even argue DS9 probably has fewer VFX shots than Mad Men half the time.
Also, Mad Men is a droning and dull show that makes me contemplate suicide.
If I had to watch Mad Men again, I'd prefer Don Draper come let me die and assume my identity so he has to watch it.
I suppose the worst part about the Principal Skinner series was the fact that Don Draper never actually marketed a product called Steamed Hams.
I'd much rather watch DS9; it's fun.

Everyone wants to bitch about Mad Men's remaster, but Warner did that.
Warner is notorious for half-assing remaster projects.
They now want to fix Mad Men, or have already uploaded fixes.
Which tells me that might have been the most Mad Men thing ever: a marketing stunt to get attention.
Within 48 hours they seemingly found out they had the right masters but had uploaded the wrong ones (VFX-free ones?).
Even though upscaled 1080p for Mad Men probably would have looked fine, and it didn't yet require the full 4k treatment.
However, that's Warner's call, and as of yet they aren't part of the Skydance or Netflix family.
People on social media bemoan that DS9 wasn't remastered but Mad Men was.
LMAO.
Warner doesn't own DS9; they do own Mad Men.
Paramount remasters shows from 2017 that ran on Netflix in 4k and were shot on 35mm film.
I'd bitch about that... like it would do any good.
I might as well bitch to the President about his questionable choices; I'm sure he cares what I think.
 
I haven't seen it.

But it depends what the person is talking about when they say "AI". It's such a broad brush stroke and it could mean anything.

Are they referring to AI upscaling if that's the context?
What other context could it be? You are arguing that AI will be indistinguishable, whereas the people who have viewed To The Journey are clearly disappointed with the results to the point they feel 455 broke promises made in the crowdfunding campaign.
 
What other context could it be? You are arguing that AI will be indistinguishable, whereas the people who have viewed To The Journey are clearly disappointed with the results to the point they feel 455 broke promises made in the crowdfunding campaign.


They couldn't get the film negatives rescanned, because CBS dissolved the department that does that.
Zappone explained that.
They knew that going into this documentary, as the film scanners were sold off in 2019, when CBS Digital moved and became CBS VFX.
The only way to get remastered footage now is to wait for CBS to do an official remaster and bulk scan of the film negatives.
No clue when that's going to happen.
The doc had to use upscaled footage, as CBS didn't reach back out or offer them help.
I wouldn't argue with the AI people about fidelity; they'll take any slop served to them and call it cutting-edge tech, and then, when you find out exactly how it mauls an image (which isn't hard), they'll tell you that you haven't seen real AI yet.
IMO upscaling is good if you have a nice 2k source, but 480i isn't gonna cut it.
ATM the goal is to make a diffusion upscaler, but those suck too; they maul the image worse than a traditional upscaler, which is literally just automated edge enhancement and a few bicubic techniques.
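For reference, that "traditional" pipeline really is a few lines of OpenCV. A minimal sketch, with sharpening parameters I picked arbitrarily; this is the baseline any neural upscaler is competing against:

import cv2

def traditional_upscale(img, scale=2.0, amount=0.6, radius=1.5):
    h, w = img.shape[:2]
    # Bicubic resample to the target size.
    up = cv2.resize(img, (int(w * scale), int(h * scale)),
                    interpolation=cv2.INTER_CUBIC)
    # Unsharp mask: subtract a blurred copy to exaggerate edges.
    blur = cv2.GaussianBlur(up, (0, 0), radius)
    return cv2.addWeighted(up, 1 + amount, blur, -amount, 0)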

To get that footage now, the cost likely shot up, etc.
 
The classification of what technology the average Joe would consider "AI" on a project like this is not a clean line.

LMAO, it is: they used Topaz AI, which doesn't compare to a rescan.
That's what caused the "I Love Lucy" criticism in recent memory.

People should be able to tell what "AI" means from the subject.
No one here is saying they used an LLM; in this context it means "image enhancement techniques."
Of course, it doesn't really enhance the image; it mauls it.

Refine it further.
Did they use "Optical Flow Super Resolution," a sub-pixel CNN, a diffusion upscaler, a chroma correction tool, face enhancement, or some other neural stuff?
Or automated edge enhancement and the like?
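(For anyone wondering what a sub-pixel CNN even is: it's the ESPCN family, which runs its convolutions at low resolution and then rearranges channels into pixels. A minimal, untrained PyTorch sketch; the layer widths follow the original ESPCN paper, everything else is illustrative.)

import torch
import torch.nn as nn

class ESPCN(nn.Module):
    def __init__(self, scale=3, channels=1):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, 64, 5, padding=2), nn.Tanh(),
            nn.Conv2d(64, 32, 3, padding=1), nn.Tanh(),
            # Predict scale^2 values per low-res pixel...
            nn.Conv2d(32, channels * scale ** 2, 3, padding=1),
        )
        # ...then rearrange them: (C*r^2, H, W) -> (C, H*r, W*r).
        self.shuffle = nn.PixelShuffle(scale)

    def forward(self, x):
        return self.shuffle(self.body(x))

# ESPCN(scale=3)(torch.rand(1, 1, 480, 720)).shape -> (1, 1, 1440, 2160)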
There's a single clip of that available on a podcast somewhere, and it's got subtle neural artifacts.
It doesn't look great, and I doubt it wowed audiences like Mojo's redone FX did.

The filmmakers used either Illuminate's "SmartRez" (a dishonorable act, as it mauls footage worse than anything else on the market) or Topaz AI, which, when used conservatively, can get you something soft (formatted mostly for an HD screen, but if you pay attention to that image, you'll find a lot of things that are off).
The point of high definition is that it's high definition, not an upscale.
Upscaling is a bastardization of the technology, and the studio abusing its customers.

Those master tapes are interlaced and filled with analog bugs, and there's no software that can completely remove interlacing or checkerboarding artifacts.
Those artifacts are baked into those masters.
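A minimal numpy sketch of why. The two classic strategies both give something up, because the two fields were exposed 1/60 of a second apart; real deinterlacers are motion-adaptive blends of these, but the dilemma is the same:

import numpy as np

def weave(frame):
    # Keep both fields interleaved: full vertical detail on static shots,
    # combing artifacts anywhere there's motion between the two fields.
    return frame

def bob(frame, top_field=True):
    # Keep one field and line-double it: no combing, but half the vertical
    # resolution is gone for good. This is the information loss the EBU
    # quote further down is talking about.
    field = frame[0::2] if top_field else frame[1::2]
    return np.repeat(field, 2, axis=0)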

I can now count 42 times a television series has been remastered from film negatives, and one thing that also bugs me: people say those shows didn't have VFX...
ALL SHOWS HAVE VFX AND DIGITAL FX, IDGAF. Mad Men should be a raging clue right about now.
42 times also means that doing this is very common, as long as the show managed to get a proper series finale.
DS9 and Voyager completed their runs, got series finales, and managed a 4.0 rating.
They were successful.
I guarantee at least DS9 is more popular than Mad Men.
 
What other context could it be? You are arguing that AI will be indistinguishable, whereas the people who have viewed To The Journey are clearly disappointed with the results to the point they feel 455 broke promises made in the crowdfunding campaign.
No I'm not. AI indistinguishable from WHAT? AI and automation will be used lots in filmmaking. I doubt you know what grading work has gone on to something you see and if someone did it manually or used automated help. And they're certainly not doing it with a pencil by hand.

Disappointed with the result... in WHAT? This is my point. I haven't seen it. And now there's broken promises which is something else I don't understand.

AI is a very broad term. It's bordering on "I don't want a TV show that has involved computers."
 
No I'm not. AI indistinguishable from WHAT? AI and automation will be used lots in filmmaking. I doubt you know what grading work has gone on to something you see and if someone did it manually or used automated help. And they're certainly not doing it with a pencil by hand.

Disappointed with the result... in WHAT? This is my point. I haven't seen it. And now there's broken promises which is something else I don't understand.

AI is a very broad term. It's bordering on "I don't want a TV show that has involved computers."


If you want nuance, read on; if TL;DR, to summarize:
"Garbage in, garbage out."

I think AI color grading is okay, as well as dustbusting, scratch removal, and automated assembly.
However, again, when people invoke AI here, they're talking about enhancement of 480i, not automated corrections or automated assembly; those are two different things. (Image enhancement of a crap source: upscaling has yet to deliver on its promise, because HD is about more than pixels. It's about true image fidelity. There are multiple forms of resolution: temporal, color gradient, fine detail. It has as much to do with a quality scan as it does with a good film negative. You can't shit pixels out onto a screen and expect it to look good.)
Usually, when an error is made, what are they bitching about?
They're bitching about what amounts to a forgery. Technically it's high definition, because it meets a shallow standard for pixel resolution.
But it doesn't meet a fidelity standard. In and of itself, that makes it a forgery.
It's abuse of the tools, or a bad source that looks worse with upscaling, which corporate types are going to lie about and call excellent.
What is my thesis statement here?
Anyone on one of these boards talking about AI means image enhancement and diffusion upscaling; context matters. They aren't talking about LLMs, LAMs, NLP, or robotics. Anyone trying to "correct" them is being pedantic, and they know it: pedantic to gaslight them. Pro-AI people want to say you don't know where AI is being used, and I think that's pedantic wordplay meant to gaslight people who are against diffusion upscaling, or enhancement from crap sources like a 90s tape, be it D2 or Digibetacam.
(The pro-AI people want an image to maul, to prove they can do better when they obviously can't, and they want to shut everyone else up so they can simply promote mauling an image, gaslighting any critics into silence.)
You know exactly what you're doing there... STOP IT.
(It's an attempt to reframe the debate: people who are anti-AI are positioned as dumb, while the pro-AI are positioned as genius outsiders.)
Private equity would go with an upscale, because private equity gives zero fucks about quality or the future. If you're promoting this, you're saying private equity is right, and you don't respect the art form or the craft. Private equity is the most insider position there is, and it ruins every market it gets involved with; it's a freeloading investment strategy that wants to invest little but become extremely wealthy off of nothing. It demands a miracle, with no investment or good faith.

It's Lucasfilm shit.
The theatrical cut of Star Wars was eventually released on DVD because the market "wanted Laserdisc rips."
That's what Lucasfilm said in '04.
The market was ripping Laserdiscs because they preferred the original 1977 cut.
How far did that market go when corporate had the capacity to make money by respecting what the consumer wanted?
How did Lucasfilm undermine itself?
Laserdisc rips from fans.
DVD rips later; both are the GOUT.
Harmy's Despecialized.
Mike Verta.
4k77 and TN1.
All of these were acts of piracy to preserve what the fan community actually wanted.
Lucasfilm undermined itself through arrogance and willful stupidity.
They could have been making money off that particular cut for decades, and they refused to.
That's the one thing you don't want Paramount to do.
You can't tell me that isn't the most boneheaded corporate take on what the market was saying.
It's historic.
So, like making a wish with a Djinn, you have to be extremely specific, lest you end up with a Wishmaster nightmare.

Buffy the Vampire Slayer? Excessive DNR.
Aliens? True Lies? Excessive DNR and edge enhancement, as well as AI-enhanced faces (what they used, I couldn't tell you, but I suspect Topaz).
People are going to be more lenient on fan upscales than on a studio product. People expect a studio to DO IT RIGHT.

In other cases, people are mad at old sitcoms that were shot on tape being cropped to 16:9 and mauled by SmartRez from Illuminate.
While iConform from Illuminate is a godsend for remaster projects, SmartRez is an unholy abomination that ruins everything.

For upscaling, we have diffusion methods, object enhancement, and other models. (Even Topaz is using the same open-source variant models you can find around the internet; they just made a front end for them.)
I wouldn't be surprised if "SmartRez" is just a group at Illuminate that uses Topaz.
There's a method to get better tape signals; the Domesday Duplicator is the open-source version, but IMO studios probably had a similar method back in the 2010s.
If you're remastering from film, you don't need Domesday, just a signal good enough for image recognition, since for the majority of the output you're not going to be seeing that master tape template.

So, I've done a little more research into the technical side of how a remaster might go down.

You have 5 key programs that would be used.

iConform we've covered, although there isn't a lot of info on how it works.
Roughly, it takes an upscaled video tape and a newly scanned film or digital file, matches them with image recognition, and essentially uses the master tape as a template to cut a high-definition version of the episode.
The original master tape is used to produce a new EDL (Edit Decision List), and the 2k or 4k scan is then conformed into a high-definition version.
Babylon 5 is probably what a rough master of DS9 would look like: HD, then it cuts to SD until the new FX are composited and inserted into the final product.
It doesn't need shooting scripts or a research department, although Illuminate execs say they utilize and preserve those as a matter of protocol.
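Once the EDL exists, the conform step itself is conceptually tiny. A minimal sketch; the event fields are illustrative, and a real conform reads CMX3600 or AAF timecode rather than raw frame numbers:

from dataclasses import dataclass

@dataclass
class EdlEvent:
    scan_in: int    # first frame of the event in the new film scan
    scan_out: int   # one past the last frame
    record_in: int  # where the event lands in the assembled episode

def conform(scan_frames, events):
    # Copy frame ranges out of the raw scan into the episode's running order.
    timeline = {}
    for ev in events:
        for offset, src in enumerate(range(ev.scan_in, ev.scan_out)):
            timeline[ev.record_in + offset] = scan_frames[src]
    return [timeline[i] for i in sorted(timeline)]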

MTI's DRS system:
https://mtifilm.com/software/drs-nova/

This is a standard industry tool for scratch removal and dust cleanup.
DRS stands for Digital Restoration System. (This is the proprietary tool for scratch cleanup and dustbusting. You could consider it AI, but it's been in use since 2008, and it was used to remaster the Godfather trilogy in 4k after the OCN had to be rebuilt.)
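The core trick behind automated dustbusting is temporal: dirt lives on one frame, the picture lives on three. A minimal numpy sketch of the general idea (my illustration, not MTI's actual algorithm):

import numpy as np

def dustbust(prev, cur, nxt, threshold=40):
    # Pixels that disagree sharply with the temporal median of their
    # neighbours are treated as dust or sparkle and replaced by it.
    stack = np.stack([prev, cur, nxt]).astype(np.int16)
    median = np.median(stack, axis=0)
    mask = np.abs(cur.astype(np.int16) - median) > threshold
    out = cur.copy()
    out[mask] = median[mask].astype(cur.dtype)
    return out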

Adobe After Effects, I don't need to explain.
But this does tie into the Roddenberry Archive, which probably has a lot of remastered material from TNG-R that could be seamlessly integrated into a DS9 remaster.

Nuke, or Autodesk Flame, or After Effects with plugins would likely be used to generate new phaser fire, transporter beams, photon torpedoes, and tractor beams, and to handle recompositing.

People say DS9 had a lot of digital FX. TNG had a lot of analog FX that became digital FX by the time it was remastered. It's literally the same thing: replacing effects.

People love to tout DS9's massive use of "digital effects," which TNG did with an analog computer way back in the '80s and early '90s.
Then digital FX replaced that.
In principle, it's the same damn thing: you have a raw image plate, and you put a phaser plate over it.
It was designed to be very simple.
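"A plate over a plate" really is the whole trick. A minimal numpy sketch of the screen blend a glowing element like a phaser or torpedo plate would typically use; the function and the blend choice are my illustrative assumptions:

import numpy as np

def screen_composite(background, effect):
    # Screen blend: black in the effect plate is invisible, bright pixels
    # light up the background, and nothing clips below either source.
    bg = background.astype(np.float32) / 255.0
    fx = effect.astype(np.float32) / 255.0
    return ((1.0 - (1.0 - bg) * (1.0 - fx)) * 255.0).astype(np.uint8)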
In principle, remastering it would be just like TNG, but in 4k.
It would mean doing what was done on an analog or digital computer in the '90s, but doing it in After Effects.
With 23-45 minutes of heavy CG and perfectly recreated assets, it's not as big a deal as everyone makes it out to be.
These FX are inferior to the ones in Stargate Atlantis and should be very easy to do.
TNG was done in 3 years; DS9 could be done in 3 years, but probably needs a little padding.
Same applies to Voyager.

Finally, budget-wise: when optical effects weren't used anymore, they switched to digital effects teams. The budget for a remaster means they're going to jump from one style to the other.

Finally, most AI tools that are around today are tools that existed 20 years ago, just multiple tools chained to some kind of action model.
This will become more common as LAMs (Large Action Models) become available.
If the vending machine with a LAM, LLM, NLP, and a robotic arm were going to take a cashier's job, it only needed the robotic arm in '08 to do that; buttons and grids were common then.
You could direct the SOB without talking to it, and it worked fine.

The AI circle jerk is going to pop a bubble and cause another recession, as AI plateaued for the most part in 2010.
Dot-com crash 2.1, a firmware story.
We've had how many recessions since 9/11?
It has to be more than 4.
We now have a bunch of useless tools that get talked up, and everyone who hears the "too good to be true" promises treats AI like a magic box.
You can mount an AI into a robot and have it shoot you with an airsoft pistol, but that's also been a thing since, I don't know, drone strikes in the Iraq War.
AI is a steaming pile of marketing bullshit.
The only improvement has been ubiquity through cheaper and more powerful hardware.
I.e., hardware engineers and software engineers developed better systems.
But overall, this was a little step, not the giant leap everyone makes it out to be.
Context is key: we're not talking about surveillance, enhanced drones, or economic reconfiguration. We're talking about upscaling versus legit rescans and rebuilds (the only thing you should regard as a remaster), not digital enhancement.
There is no shame in wanting a legitimate remaster over a garbage upscale.

1080p onward was meant to showcase true fidelity, not upscales.
Anyone trying to shame you for that is pretty dumb.
Anyone trying to paint a rebuild as illegitimate is also kinda dumb.

We're not anti-tech, we're anti-bullshit.

There's a reason AI upscaling is bad.
Take Gul Dukat: watch any upscale of "Necessary Evil." Gul Dukat's brow is tilted to slightly obscure his eye.
The upscaler puts a blurry double-image thing there, because even with the best source it has little information to work with.
When you pay attention to the image (as you're supposed to do with a visual medium, because it's a fucking visual medium), your head starts to hurt; the image is unattractive, and your brain can't really tell what it's looking at.
The image has been mauled.
And I'm supposed to be ashamed for wanting a remaster, because I'm selfish and want to watch it in HD. (That is the dumbest shaming tactic I've ever read. Yeah, I want to see it in HD. Thanks, Captain Obvious. If I didn't want to watch it in HD, then I wouldn't be here.)

Andromeda's AI upscale is borderline unwatchable because of this.
The image is washed out, flat, and too soft; it hurts to watch.
Scanline CRTs had a few tricks to accommodate this, and they were tiny, so you were across the room looking at a postage stamp (a very bright one) through a shadow mask.
65-inch 4k displays are a very different beast.
They demand true fidelity. Rainbow artifacts, dot crawl, and leftover interlacing artifacts are all going to be misread by an image upscaler.

"The European Broadcasting Union argued against the use of interlaced video in production and broadcasting until the early 2010s, recommending 720p 50 fps (frames per second) as the then-current production format and working with the industry to introduce 1080p50 as a future-proof production standard, which offered higher vertical resolution, better quality at lower bitrates, and easier conversion to other formats such as 720p50 and 1080i50.[14][15] The main argument was that no matter how complex the deinterlacing algorithm may be, the artifacts in the interlaced signal cannot be eliminated because some information is lost between frames."

Yves Faroudja, the founder of Faroudja Labs and Emmy Award winner for his achievements in deinterlacing technology, stated that "interlace to progressive does not work" and advised against using interlaced signals.[2][16]

People also expect software and computers to get cheaper, and they're not.
By the year 2000, digital FX software had plateaued; it was the same software they were using in 2010 to do LOST.
BTW, they apparently weren't using GPUs for TV VFX even by 2010; they were still rendering on CPUs.

College textbooks cost an arm and a leg and are financed by federal subsidy; they're overpriced, and they don't get updated yearly so much as rearranged.
Planned obsolescence and price gouging ring a bell?
This problem and this mindset have a negative impact on more than the film industry.
It's a nationwide problem that affects food, education, and your policy makers.

You need terabytes of storage, and you need GPUs.
You need computers (obviously, and good ones).
You need film scanners.
You also need bulk software licenses that cost thousands of dollars a month or year until a project is complete, along with a support team.
It's already plateaued, and I'm sure Paramount already has the hardware, the software, and the storage needed to complete a project.
They have film scanners, computers, GPUs, render farms, bulk software licenses for MTI's DRS system and After Effects, partners (Illuminate, FotoKem, and Paramount have been joined at the hip for some time), and at least a data center to retrieve things from.
It's not going to get any cheaper than it is right now.
A high-signal-quality rip of the master tape probably requires about 600 gigs per episode anyway.
If you wanted a nice product, you wouldn't waste storage on the 480i tapes; a rough tape signal is all you'd use as a template, because you'd want to allocate storage to a high-res bulk scan.
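For a sanity check on those numbers, here's the back-of-envelope math. The formats, the 42-minute runtime, and the three-pass RF rip (my guess at how a "600 gigs per episode" tape rip pencils out) are all assumptions; the point is the order of magnitude, which says to spend the terabytes on the film scan:

def gb(bytes_per_second, minutes=42):
    # Total gigabytes for one episode at a given data rate.
    return bytes_per_second * 60 * minutes / 1e9

def video(w, h, bits_per_pixel, fps):
    # Uncompressed video data rate in bytes per second.
    return w * h * bits_per_pixel / 8 * fps

print(gb(video(720, 486, 20, 29.97)))  # ~66 GB  - 10-bit 4:2:2 tape capture
print(gb(40e6 * 2) * 3)                # ~605 GB - three passes of a 40 MSa/s, 16-bit RF rip
print(gb(video(2048, 1556, 30, 24)))   # ~723 GB - 2k 10-bit RGB film scan
print(gb(video(4096, 3112, 30, 24)))   # ~2.9 TB - 4k scan of the same episode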

Here's a list I made.


Finally... so, you had 3 separate film archives, and now you have one. You probably don't need several redundant departments for that, and reconfiguring the film archive meant some problems for third parties. Dave Zappone's bid to get remastered footage was a third-party indie project, not a studio project.
Bulk scans are different from wanting a select 20 minutes of exact footage out of 176 episodes of a TV show.
A bulk scan is the only way to get that right now.

With regard to Zappone's doc, just as with WWLB, the expanded budget was to acquire remastered footage of the actors sourced from a rescan of the 35mm film elements.
People feel like Zappone broke his promise.
He didn't have a choice: the division was dissolved, and Paramount/CBS was avoidant, because things changed after 2019, and he knew that going in.
Craig Weiss, Mike Okuda, and Dave Zappone probably all stay in touch, and have to, because they're involved with the Roddenberry Archive.
They would also be in touch with John Van Citters, as he would be the one who has to go to an executive to greenlight these projects, just like Alex Kurtzman.

This is to help the average Joe in the discourse, you know... to expand the public discourse on this topic.
There has been an entire community dedicated to bitching about bad DVD, Blu-ray, and other releases for decades.
That's a pretty large section of the Internet, and they developed tools themselves to attempt to fix these problems.
This group has existed since at least the late '90s and runs several hundred forums to this day.
Doom9 is such a forum, one of the best examples.
Many of the people there are self-taught.
This group of people isn't ignorant, even though AI dweebs pretend that they are.
You can read ad nauseam about them bitching about edge enhancement, and ad nauseam again about the first Blu-ray release of "The Dark Knight" and its abuse of edge enhancement.
Don't mention the Star Wars Special Editions to them; they'll lose their collective shit.
Calling them ignorant is the greatest projection of all time; they're far from ignorant.

If you want a true remaster of this content and you care, read everything I say, research it, own it, and repeat it.
It's the only way you're going to drill this into the industry, so more products can't be mauled by these people.
The AI upscaling approach dishonors the integrity of the source material, the brand, and the history of the studio.
Know the industry, know the tools, and fight this odd private-equity mentality; it's bad for the consumer and catastrophic for film conservation.
An AI upscale of anything shot on 35mm film is unacceptable. That's an absolute statement. PERIOD.
 
The European Broadcasting Union argued against the use of interlaced video in production and broadcasting, recommending 720p 50 fps (frames per second) as then-current production format

I remember the arguments - I had an interesting chat about it at IBC and at SMPTE. Ultimately, panel manufacturers want to sell new panels, and that means selling bigger numbers every few years. That at least has stopped now (as have gimmicks like 3D and curved screens).

In Europe, few people have a TV set up to make 4k worthwhile for the resolution. The enforced progressive scan and HDR, however, are benefits. For live broadcasts, though, latency is the killer for me. I'd rather watch 1080i25 over the TV than watch 2160p 40 seconds later over IP, hearing the neighbours cheer when the goal goes in.

If I remember rightly, much of the Queen's funeral was filmed in 4k for archive, but we transmitted 1080i25.

When you pay attention to the image

80% of Americans are on their phones while watching TV. Very few people pay attention.

How far did that market go when corporate had the capacity to make money by respecting what the consumer wanted?

Star Wars was a rare example of the artist winning out over the bean counters. Lucas didn't want the originals out there; he wanted to keep changing it to bring it "closer to his vision".
 
I remember the arguments - I had an interesting chat about it at IBC and at SMPTE. Ultimately, panel manufacturers want to sell new panels, and that means selling bigger numbers every few years. That at least has stopped now (as have gimmicks like 3D and curved screens).

In Europe, few people have a TV set up to make 4k worthwhile for the resolution. The enforced progressive scan and HDR, however, are benefits. For live broadcasts, though, latency is the killer for me. I'd rather watch 1080i25 over the TV than watch 2160p 40 seconds later over IP, hearing the neighbours cheer when the goal goes in.

If I remember rightly, much of the Queen's funeral was filmed in 4k for archive, but we transmitted 1080i25.



80% of Americans are on their phones while watching TV. Very few people pay attention.



Star Wars was a rare example of the artist winning out over the bean counters. Lucas didn't want the originals out there; he wanted to keep changing it to bring it "closer to his vision".

With all due respect....
For the most part you're right.
But the tape master also looks like shit on a smartphone, and a 4k phone screen only makes it look worse.

No. 2:


Personally, I don't care what they do on their phones; I'm there to watch a high-quality movie off a Blu-ray, not fuck around on my phone.

I also don't watch sports, so watching a movie in progressive makes sense.
Sports? IDGAF. Sports aren't a DS9 remaster, which is what matters to me and why I made the post.
1080p at 24 fps with a rescan of the original film elements, or bust.
Sports fans and gamers worry about input lag.
Film and television are what I'm interested in.
When MicroLED displays come out in the next thousand years, input lag goes away.
Until then, specialty interlacing for sports; but as I said, I'm not a sports fan anyway, so it's irrelevant to DS9.
Sans baseball, but again, a movie isn't a sports broadcast.

As for 3D, I think if you could solve the autostereoscopy problem (that was a format war) and curve the screen, it would've worked out.
Alone, those technologies are crap, and eye tracking is terrible.
IMO.
4V on a curved screen would be very nice, if they figured out a good automatic diffraction technique.
Bring on tensor displays!!!
They might even make me want to watch sports.
 
If you want a true remaster of this content and you care, read everything I say...

Cool story.
 