Discussion in 'TV & Media' started by tomswift2002, Mar 8, 2021.
You're the one who comes across as clueless, along with Timby and JWolf. You're the idiot.
Once again, check out Murder Among the Mormons and tell me you can't tell the difference...
For shits and giggles...
You don't really sound like someone who knows what they're talking about. The way you wrote that comes off as forced, oversaturated with jargon.
I work with 1.5Gbit HD and 12Gbit 4K; you don't get much higher bitrate than that. We do have some quad-link SDI (we live in a legacy world, and throwing SDI cables across the lot is far safer than trying to share IP between different production vehicles), but on the whole 4K is streaming: mainly ST 2110 (some 8-bit, some 10-bit, and I mainly work in 50Hz rather than 60Hz, so bitrates are a bit variable), some ST 2022-6. I'm working on our forward-looking architecture and I don't envisage any SDI routers beyond small ones (40-, 64-, maybe 128-square in large trucks, if we build any more). By the end of the decade we'll certainly have one or two A/B 2110 leaf/spine networks instead of two 1500-square SDI matrices, or we'll be even more decentralised than we are now.
The highest the spec goes for 4K Blu-ray is a measly 144 Mbit; that's still nearly 100:1 compression relative to the 12 Gbit uncompressed feed, and streaming compresses harder still.
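The arithmetic behind that ratio is simple enough to sketch. This is a back-of-the-envelope calculation using the rates quoted above (12 Gbit/s for uncompressed UHD over 12G-SDI, 144 Mbit/s as the UHD Blu-ray ceiling); the 15 Mbit/s streaming figure is just an illustrative assumption, not a number from any particular service.

```python
# Back-of-the-envelope compression ratios versus uncompressed baseband.
baseband_mbit = 12_000  # uncompressed UHD baseband (12G-SDI), Mbit/s
bluray_mbit = 144       # UHD Blu-ray spec ceiling, Mbit/s
stream_mbit = 15        # illustrative 4K streaming rate, Mbit/s (assumption)

print(f"Blu-ray vs baseband:   {baseband_mbit / bluray_mbit:.0f}:1")   # 83:1
print(f"Streaming vs baseband: {baseband_mbit / stream_mbit:.0f}:1")   # 800:1
```

So even the "high bitrate" disc format is already throwing away roughly 99% of the original data; the argument is only ever about how much more streaming throws away on top of that.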
I don't do much in the way of 4K compressed; we still have relatively few events generating it (and this year events are rather thin on the ground, although it looks like Tokyo and Amsterdam are on), but 50, maybe 100 Mbit H.265 is the ballpark of what's usually considered high quality.
The last major event streaming I did, in October, was plain simple HD at 60 Mbit H.264 (two RTP streams with a video payload of 60 Mbit, about 6 Mbit of audio I think - it was Dolby E passthrough - and some overhead, of course). We could have gone higher, but it's not necessary: even with the transmission chain, in double-blind tests people can't tell the difference.
So, streaming vs disc:
Streaming vs disc is not what determines quality, and nor, on its own, is bitrate (although bitrate is obviously linked). You'll get far better quality out of a 10 Mbit Netflix source stream with multi-pass H.265 encoding than you will from the same material on a 12 Mbit Blu-ray encoded with H.264, but you won't get better from a 20 Mbit Blu-ray stream than from a 20 Mbit Netflix stream with the same encoder and settings. It's all data; it doesn't matter how it's transported. What matters is how it's generated and decoded.
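The "it's all data" point is easy to demonstrate: the same encoded bitstream is bit-identical whether it's read off local storage or received over a network, so the decoder has exactly the same information to work with either way. A minimal sketch, using a stand-in byte string rather than a real bitstream and a local TCP socket as the "streaming" path:

```python
# Sketch: identical bytes arrive whether read "from disk" or over a socket.
import hashlib
import socket
import threading

payload = bytes(range(256)) * 1024  # stand-in for an encoded bitstream

# "Disc" path: hash the bytes directly.
disk_hash = hashlib.sha256(payload).hexdigest()

# "Streaming" path: push the same bytes through a local TCP socket.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

def send():
    conn, _ = server.accept()
    conn.sendall(payload)
    conn.close()

t = threading.Thread(target=send)
t.start()

client = socket.create_connection(("127.0.0.1", port))
received = b""
while chunk := client.recv(65536):
    received += chunk
client.close()
t.join()
server.close()

stream_hash = hashlib.sha256(received).hexdigest()
print(disk_hash == stream_hash)  # True: transport doesn't change the data
```

What differs in practice is everything upstream of transport: the encoder, its settings, and the bitrate budget the service chose.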
The level at which someone can spot artifacts varies, and I'd agree there's plausibly an impact in the window Netflix is claiming (did I see them topping out at 12 Mbit for 4K? That feels low to me, even with a specifically tuned encoder). I also think reducing the top bitrate is a shot in their own foot given their audience, the ease of measuring bitrate, and the difficulty of measuring quality. But that's not a streaming problem, that's a business decision by Netflix. There's no technical reason you can't stream at 50 or 100 Mbit; it's a business decision.
Any metric you can use to objectively measure quality (PSNR, SSIM, VMAF) can be quite trivially gamed (see the arguments at events like Demuxed), so bitrate is the only thing people have left to compare. Which is a shame, as a good H.265 encoder at 20 Mbit will trounce a poor H.265 encoder at 40 Mbit. At decent bitrates, your viewing setup matters far more anyway.
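PSNR, the simplest of those metrics, is just a log-scaled mean squared error between a reference frame and the decoded frame, which is part of why it's so gameable: it rewards pixel-exact similarity, not perceptual quality. A minimal sketch over flat lists of 8-bit pixel values (real implementations work on full frames, often per-channel):

```python
import math

def psnr(ref, test, max_val=255.0):
    """Peak signal-to-noise ratio in dB between two equal-length
    sequences of pixel values. Higher is 'better'; identical
    frames give infinity."""
    mse = sum((r - t) ** 2 for r, t in zip(ref, test)) / len(ref)
    if mse == 0:
        return math.inf
    return 10 * math.log10(max_val ** 2 / mse)

frame = [0] * 4
noisy = [16] * 4          # every pixel off by 16
print(psnr(frame, frame))  # inf
print(psnr(frame, noisy))  # ~24.0 dB
```

An encoder tuned to maximize a number like this can score well while looking worse to a human, which is why serious comparisons fall back on subjective testing.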
If you want to measure quality properly, you need large-scale, double-blind subjective experiments with a variety of sources (not just things like the EBU test tape), following ITU-R BT.500, and even then you still wouldn't win over the audiophile types (the ones who claim they can hear CD "compression").
Enjoy your Blu-rays. I know where you're coming from with complaining about "streaming", and I've seen some awful streaming in the past (the final battle of Winterfell in GoT on NOW TV was awful, about 5 Mbit in HD; I cancelled shortly after that), but ultimately the convenience and the power of streaming will prevail, and those Blu-rays will become rarer as time goes on.
You seem like you know something... so, I have to ask... can you tell the difference between VHS and "4K" streaming?
At least they can punctuate.
Not really. Bandwidth can be crap, but streaming services aren't trying to feed you a 240p image filtered through a crab net. If you're having issues, it's not the provider, it's your setup and bandwidth.
Literally the definition of first world problems.
The only thing that might explain the difference in opinions is if tomswift2002 watched streaming content from 2 feet away on an 80" screen.
@Paul Weaver what do you think of the streaming quality of Netflix, Disney+, and Amazon Prime Video?
I really like your explanation of how compressed video works.
That would make him blind... which would certainly line up with his MO.
Never had too many issues with Paramount+ when it was CBS All Access. Maybe a bit of pixelation and one freeze. But it's still better than Fox's player, which is so bad it makes TV shows look like a pixelated mess that's hardly playable.
Again, you're the idiot. Streaming is garbage and looks like VHS.
Wow. Came back three weeks later, for that?
I've been extremely busy.
But streaming doesn't compare at all to Blu-ray or 4K Blu-ray. And bitrate does play a huge part: if the information isn't there, then it isn't there, and the computer has to "guess" at what's required in that moment. The heavier the compression and the lower the bitrate, the less information the computer has to work with on playback, resulting in lower quality.
Maybe in your bumblefuck corner of Canada internet speeds are crummy enough that streaming video looks unsatisfactory, but I'm fairly certain that you are the only person on this planet who thinks the picture quality of a high-def streamed video is no better than the picture quality from a VHS tape. This is an incredibly ridiculous hill you've chosen to die on, but hey, you do you.
Maybe you'll get some time freed up soon.
Yeah, this won’t end well.