"I'm very sorry, but your analogy with reference to CD is flawed scientifically, and has been disproved: The 10 Biggest Lies In Audio"

That article proves exactly nothing. Its "science" is shoddy and dubious at best. And number ten is such an utterly ridiculous position to take that it completely negates the rest of the article. Some people simply hear better than other people; they have better aural acuity.
As far as analog being better goes, you can throw any skewed "fact" or bit of science at the issue that you want; that doesn't make it true.
For one thing, the article assumes vinyl is the preferable medium for analog output. It isn't, and it never was.
But even beyond that, the fundamental reason analog is better is that sound is analog. The notes an instrument produces are analog, and what the ear interprets is analog. So any digital representation must be converted twice: analog to digital on the way in, and digital back to analog on the way out.
This is why studios have really expensive conversion equipment plugged into the sound boards and why audiophiles spend $1000s on DACs. Admittedly, the technology has come a long way in the last few years, and the cheaper ones are way better than they used to be, but that doesn't change the fact that I would much rather have spent what I paid for mine on better speakers--or even a high-end pair of cans. But I digress...
The point is, no matter how high the sample rate might be, in the end, a digital copy is still an incomplete reproduction.
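To put a rough number on what "incomplete" means here, a minimal sketch (Python with numpy; the 44.1 kHz / 16-bit figures and the 1 kHz test tone are just assumed, CD-style values) that samples and quantizes a tone and measures how far the rounded-off digital copy drifts from the original:

```python
# Minimal sketch (hypothetical values): sample a "continuous" 1 kHz tone,
# quantize it to 16 bits, and measure how far the digital copy drifts
# from the original. Requires only numpy.
import numpy as np

fs = 44_100          # CD-style sample rate, in Hz
bits = 16            # CD-style bit depth
f = 1_000            # test tone frequency, in Hz
t = np.arange(0, 0.01, 1 / fs)      # 10 ms worth of sample instants

analog = np.sin(2 * np.pi * f * t)  # stand-in for the analog waveform

# Analog -> digital: round each sample to one of 2**bits discrete levels.
levels = 2 ** (bits - 1)
digital = np.round(analog * levels) / levels

# Digital -> analog: here the "conversion back" is just reading the rounded
# values; a real DAC also filters and interpolates between samples.
error = analog - digital
print(f"max quantization error: {np.max(np.abs(error)):.2e}")
print(f"RMS error:              {np.sqrt(np.mean(error**2)):.2e}")
```

The errors it prints are tiny, which is the point people on the other side make--but tiny is still not zero, and that gap is exactly the part of the waveform the digital copy never carries.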
People can argue all they want about how those bits and pieces are inaudible--and they are--but that doesn't negate their importance. They're what add the nuance to the sound--that "warm" feeling. People can't actually hear them, but they're the proverbial glue that holds the waveform together.
The only reason people don't notice they're missing is because the brain is a powerful doohickey and does an amazing job of filling in those gaps on its own.
That doesn't change the fact that a digital representation, any digital representation, is just a facsimile.
As such, it's no different when it comes to film.
Now, in this case the input side is a bit more existential. It's simplest to just say "life" or "the world" is analog. Or rather, Hamill and Guinness are two "analogs" standing in a desert.
But the rest of the equation is the same: analog>digital>analog. And all that nuance, whether we can see it or not, gets lost.
The reasons media, and society as a whole, went digital are all economic in nature and have little to do with quality:
Digital is easy to store, as it takes little physical space and doesn't ever degrade.
It's easy to distribute and/or transport.
It can be copied infinitely without risk of quality loss.
These are all things that save companies money and, really, make things more convenient for the end-user.
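For what it's worth, the "copied infinitely without risk of quality loss" point above is easy to demonstrate. Here's a quick sketch (Python standard library only; the file names are hypothetical and the source file has to exist) that copies a file generation after generation and checks that every copy hashes to exactly the same bytes:

```python
# Rough illustration of lossless digital copying: make a copy of a copy of a
# copy and confirm the checksums never change. File names are hypothetical.
import hashlib
import shutil

def sha256_of(path):
    """Return the SHA-256 digest of a file's bytes."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

src = "master.wav"                       # hypothetical source file
copies = ["copy1.wav", "copy2.wav", "copy3.wav"]

prev = src
for dst in copies:
    shutil.copyfile(prev, dst)           # each copy is made from the last one
    prev = dst

digests = {p: sha256_of(p) for p in [src] + copies}
print(digests)
assert len(set(digests.values())) == 1, "every generation is bit-identical"
```

Try that with an analog tape dub chain and every generation gets audibly worse; that economic and practical advantage, not fidelity, is what won the argument for digital.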
*Even high definition is more a byproduct of outside technology than it is of the format specifically being digital.
Nonetheless, if someone created a device that could do all of these things while keeping the signal analog, we'd all still be using analog.