Instead of posting the usual technobabble, here are visual demonstrations from another TV show where both the original VT edit and a 35mm restoration exist, along with not one but two examples of AI enhancement utilities:
VT original (source: TrekCore, presumably deinterlaced from the DVD to create ersatz 480p, with the two 480i fields woven together into one progressive frame.)
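(For anyone wondering what "two 480i fields woven into one progressive frame" actually boils down to, here's a minimal Python/NumPy sketch of a naive field weave. It only covers the static-scene case and ignores the motion and telecine handling a real DVD deinterlacer does; the arrays and shapes are purely illustrative, not TrekCore's actual workflow.)
[code]
import numpy as np

def weave_fields(top_field: np.ndarray, bottom_field: np.ndarray) -> np.ndarray:
    """Weave two 240-line fields (each H/2 x W x 3) into one 480-line progressive frame."""
    half_h, w, c = top_field.shape
    frame = np.empty((half_h * 2, w, c), dtype=top_field.dtype)
    frame[0::2] = top_field      # top field supplies the even scanlines
    frame[1::2] = bottom_field   # bottom field supplies the odd scanlines
    return frame

# Example: two fake 240x720 fields from a 480i stream become one 480x720 frame.
top = np.zeros((240, 720, 3), dtype=np.uint8)
bottom = np.zeros((240, 720, 3), dtype=np.uint8)
print(weave_fields(top, bottom).shape)  # (480, 720, 3)
[/code]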
A $100 utility, GigaPixel AI, can interpolate, remap, and upscale, teasing out some plausible-looking detail through careful edge enhancement techniques.
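(To be concrete about what "interpolate and edge enhance" means mechanically - this is not GigaPixel's actual neural network, just a classical stand-in using Pillow that does the same two jobs: invent in-between pixels by resampling, then exaggerate the edges that were already there. File names and filter parameters are made up for illustration.)
[code]
from PIL import Image, ImageFilter

def classical_upscale(path_in: str, path_out: str, scale: int = 3) -> None:
    img = Image.open(path_in).convert("RGB")
    # Lanczos resampling interpolates new pixels from their neighbours; no new information appears.
    up = img.resize((img.width * scale, img.height * scale), Image.LANCZOS)
    # Unsharp masking boosts local contrast along existing edges, which reads as "found detail".
    sharpened = up.filter(ImageFilter.UnsharpMask(radius=2, percent=120, threshold=3))
    sharpened.save(path_out)

classical_upscale("vt_frame.png", "vt_frame_3x.png")  # hypothetical file names
[/code]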
Upscaled with Topaz AI. Topaz has a suite of filters, which are generally quite good. But, like every other AI tool, it can't add detail - as the following image will concretely prove:
Yon Blu-ray. Now I could have played games and color corrected to make their histograms identical, since the 35mm remasters are cooler and lack that warm glowing warming glow, but look at the actual details around Worf's head, the door frame in the corner, or the door placard.
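(That histogram game is a one-liner these days. Here's a sketch using scikit-image's match_histograms, with hypothetical file names, pushing the warmer VT-derived frame onto the cooler film frame's color distribution so only the detail differences remain.)
[code]
from skimage import io
from skimage.exposure import match_histograms

vt_upscale = io.imread("vt_upscale.png")     # hypothetical: the warmer, upscaled VT frame
film_frame = io.imread("bluray_frame.png")   # hypothetical: the cooler 35mm remaster frame

# Remap the VT frame's per-channel histograms onto the film frame's, neutralizing the color shift.
matched = match_histograms(vt_upscale, film_frame, channel_axis=-1)
io.imsave("vt_upscale_matched.png", matched.astype("uint8"))  # cast back from float for saving
[/code]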
I could shrink the 35mm image down to 480p alongside the upscaled versions and it would still retain the most inherent detail. Even with this forum's default image sizing, where every image is shrunk slightly, the difference is fairly obvious. Look at the "11" on the door label alone - not even the AfterImage-equivalent filters for cleaning motion video (some use very similar algorithms) are going to bring out that black stripe between the red ones, or the more nuanced detail within the wall and other textures. The colors too, especially the denser detailing around Worf's face.
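(A crude way to put a number on "most detail at the same size": shrink every candidate to ~480p and measure high-frequency energy with OpenCV's variance-of-Laplacian trick. It's a blunt proxy, the file names are hypothetical, and it can't tell genuine detail from hallucinated sharpening - but it illustrates the same-size comparison.)
[code]
import cv2

def detail_score(path: str, size=(720, 480)) -> float:
    """Crude fine-detail proxy: variance of the Laplacian after shrinking to ~480p."""
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    small = cv2.resize(gray, size, interpolation=cv2.INTER_AREA)
    return cv2.Laplacian(small, cv2.CV_64F).var()

# Hypothetical captures of the same frame from each source.
for name in ("bluray_frame.png", "gigapixel_frame.png", "topaz_frame.png", "vt_original.png"):
    print(name, round(detail_score(name), 1))
[/code]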
Given enough time, I'd have found a shot with the quilted tunics to point out how the modern AI methods turn everything into fuzzy wax despite the enhanced edge sharpening elsewhere... a denser image would really show how AI can't hold a candle to a film scan. It still depends on the source material...
In short, looking at the details, only one of these images comes even remotely close to genuine "HD". The technology has come a long way, but it's still clearly no contest (no pun intended), and AI is better suited to material where no higher-quality film master exists - or to sources that already have enough pixel density; 480p as a source is still impracticable next to a 1080p source, which is what most modern films are rendered in. The greater the source material's density, the further you can upscale - back in the print media days, some printers would risk stretching 300 PPI artwork down to an effective 150 PPI, since past that point the quality degradation becomes readily visible.
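(The back-of-envelope math behind that last paragraph, for the curious - DVD-ish and Blu-ray frame sizes only, ignoring anamorphic pixels and overscan.)
[code]
# How many pixels have to be invented going from SD to HD, plus the print-era analogy.
sd_w, sd_h = 720, 480
hd_w, hd_h = 1920, 1080

linear = hd_h / sd_h
print(f"480p -> 1080p is a {linear:.2f}x linear stretch, "
      f"{(hd_w * hd_h) / (sd_w * sd_h):.1f}x the total pixels.")

# Print version of the same idea: enlarging 300 PPI artwork dilutes its density.
native_ppi = 300
for stretch in (1.0, 1.5, 2.0):
    print(f"{stretch:.1f}x enlargement -> effective {native_ppi / stretch:.0f} PPI")
[/code]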
On edit - consolidating posts:
^^for best results, use a 24" or larger monitor at its native resolution (which, nowadays, I'd presume is 1920x1080). By comparison, a 6" smartphone or 10" tablet, especially one at 500 PPI to help hide any dead pixels, won't show much of a difference, since HD is meant for larger screens as a general rule - screen size being, quite literally, the elephant in the room, metaphorically speaking... (rough numbers after the link below):
https://www.rtings.com/tv/reviews/by-size/size-to-distance-relationship
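(Rough numbers behind the screen-size point, assuming a 16:9 panel viewed head-on at typical distances - the sizes, resolutions, and distances below are assumptions, not measurements. A small, dense phone screen packs so many pixels into each degree of vision that SD-vs-HD differences largely vanish.)
[code]
import math

def pixels_per_degree(diagonal_in: float, res_w: int, distance_in: float) -> float:
    """Approximate horizontal pixels covered by one degree of vision on a 16:9 panel."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # panel width from its diagonal
    ppi = res_w / width_in
    return ppi * 2 * distance_in * math.tan(math.radians(0.5))

# Assumed-but-typical setups:
print(f"24-inch 1080p monitor at 28 inches: {pixels_per_degree(24, 1920, 28):.0f} px/deg")
print(f"6-inch 1440p phone at 12 inches:    {pixels_per_degree(6, 2560, 12):.0f} px/deg")
[/code]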