Finally today, I wanted to say a few things about the whole grain reduction issue. There's been a lot of talk on the subject lately here at The Bits and on various forums around the Net. And certainly, since the whole Patton/Longest Day controversy last year, it's been a real issue of concern for some Blu-ray fans. So I thought it might be a good time to try and introduce a little common sense on the subject. Let's give it the old college try, shall we? Here goes...
A Few Words (of Common Sense) About Grain Reduction on Blu-ray
First of all, I've heard from some in the home video industry that the issue of grain reduction is really just "a matter of taste" - that some consumers want grain while others want none of it. And I'm here to tell you that, no... it's NOT a matter of taste. The physical, photochemical film process, which has dominated filmmaking for close to a century now, has certain characteristics that define its look. One of those is the presence of grain in the image. Now, the amount of grain visible in the image depends on many factors: the type of film stock used, various camera settings, the age of the negatives and surviving prints, etc. But the bottom line is that a certain amount of grain in the image is one of the very things that makes film LOOK LIKE FILM. Saying that it's a matter of taste is like saying that whether or not to colorize a black and white film is a matter of taste. Sure, there might be some consumers who don't mind colorization, or who even like it, but I think most film enthusiasts would agree that it's just not appropriate. The whole goal of a Blu-ray presentation should be to recreate, as well as possible, the best original theatrical experience of a film in the home. If that film was a digital presentation, shot natively on HD video, then you wouldn't expect there to be grain. If it's a CG-animated production, you likewise don't expect grain. But when it's a vintage catalog title, shot on photochemical film, you expect it to LOOK like photochemical film. In other words, you EXPECT SOME GRAIN. People who say that film grain is "a matter of taste" tend to fall into two camps: studio marketing folk who don't really understand film, and younger consumers (generally under the age of 30), most of whom have NEVER seen their favorite films in a theatre, but rather grew up watching them on DVD. To both groups I say: You need to go out and educate yourselves. That grain isn't "noise". Some of it is supposed to be there.
Now, as a friend who works in mastering recently told me, some grain/noise reduction is ALWAYS done on digital, high-definition masters. That's just the nature of the process. Some films need grain reduction due to serious age issues, or because occasionally individual reels or shots have more grain than the rest of the film, and there's a need to make the presentation more uniform looking so those shots/reels don't stand out. Grain reduction is a legitimate tool of digital mastering. The issue is not grain reduction per se, but the EXCESSIVE use of it. Here's the problem when grain reduction is excessive: Very often, you're not just stripping out film grain... you're stripping out ALL KINDS of fine image detail. Fabric textures, facial pores, raindrops, the leaves on trees, etc. That goes against the whole point of the added resolution of high-definition. The result is that backgrounds start looking static and characters' faces start looking like they're molded out of clay. That's a problem, and it should be a problem for any serious fan of film. You want to see examples of really serious grain reduction offenders on catalog Blu-ray films? Go look at Fox's aforementioned Patton and The Longest Day. Now compare them to examples of great, exceptionally film-like catalog transfers, like Fox's South Pacific, MGM's The Battle of Britain or Criterion's Chungking Express - or almost ANY title from Criterion, for that matter. There is a REAL difference, and those of you who dismiss this issue owe it to yourselves to investigate.
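For those of you who like to tinker, here's a minimal Python sketch of WHY this happens - this is NOT any studio's actual tool, and the frame, grain level and filter strengths are all invented purely for illustration. It simulates a frame as fine texture plus grain, then applies a simple blur as a stand-in for naive noise reduction. Because grain and fine detail live in the same high frequencies, the blur can't remove one without eating the other:

```python
# Illustrative sketch only - invented values, not a real mastering pipeline.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(42)

# Synthetic 512x512 "frame": fine texture (think fabric or pores) plus film grain.
x, y = np.meshgrid(np.linspace(0, 40, 512), np.linspace(0, 40, 512))
detail = 0.05 * np.sin(6 * x) * np.sin(6 * y)   # fine, high-frequency texture
grain = rng.normal(0.0, 0.03, detail.shape)     # random grain, new every frame
frame = 0.5 + detail + grain

def fine_detail_energy(img):
    """Rough proxy for fine detail: variance left over after a heavy blur."""
    return float(np.var(img - gaussian_filter(img, sigma=4)))

light_dnr = gaussian_filter(frame, sigma=0.8)   # mild, targeted reduction
heavy_dnr = gaussian_filter(frame, sigma=3.0)   # excessive, "waxy" reduction

for name, img in [("original", frame), ("light DNR", light_dnr), ("heavy DNR", heavy_dnr)]:
    print(f"{name:>9s}: fine-detail energy = {fine_detail_energy(img):.6f}")
```

Run it and watch the fine-detail number collapse as the filtering gets heavier - that's the numeric version of faces turning to clay.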
One of the things I do when evaluating the quality of a Blu-ray image of an older film is to look specifically at the grain structure. I usually search for a scene with a bright background - sky, a blank wall, etc. I pause the image, and then move forward a frame at a time. You SHOULD be able to see the grain structure very slightly changing from frame to frame as you step forward. If I can see this, I next look at the overall level of image detail - it's almost always exceptional. On the other hand, if I can't see any grain at all, when I start looking for image detail I almost always find that it's been substantially reduced, such that the image looks unnaturally soft. And that's a problem.
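If you'd rather let software do the frame-stepping, here's a rough Python/OpenCV sketch of the same check. The clip name, patch coordinates and threshold are all hypothetical placeholders - point them at your own capture and a bright, flat region:

```python
import cv2
import numpy as np

# Hypothetical inputs: "clip.mp4" and the patch coordinates stand in for
# a real capture and a bright, flat region (sky, a blank wall, etc.).
cap = cv2.VideoCapture("clip.mp4")
y0, y1, x0, x1 = 100, 228, 100, 228

prev, diffs = None, []
for _ in range(48):                      # roughly two seconds at 24 fps
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
    patch = gray[y0:y1, x0:x1]
    if prev is not None:
        # Live grain makes even a flat patch change slightly every frame.
        diffs.append(float(np.mean(np.abs(patch - prev))))
    prev = patch
cap.release()

mean_diff = float(np.mean(diffs)) if diffs else 0.0
print(f"mean frame-to-frame difference in flat patch: {mean_diff:.3f}")
# The 1.0 cutoff is arbitrary - calibrate it against known-good transfers.
print("grain looks alive" if mean_diff > 1.0 else "grain may be scrubbed or frozen")
```

Live grain shows up as a small but steady frame-to-frame difference; a transfer that's been scrubbed (or had its grain frozen) reads as nearly zero.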
ON THE OTHER HAND... those of us who are Blu-ray and high-definition enthusiasts DO have to be a little more understanding of some of the difficulties faced by the studios as they work to deliver the best possible image quality on the format. Here's what I mean by that...
I'd ask you enthusiasts to think back to the early days of DVD. You might remember that when the studios started doing a lot of anamorphic transfers, a lot of mastering techs were still using too much edge-enhancement - something they'd been doing with analog video (and rightly so) for years. But when DVD started really exploding, it took some of them a little time to realize they had to dial the edge-enhancement back. The same is true today of DNR. I'm guessing excessive grain reduction was the norm with transfers done for DVD (even those mastered in HD) for years, but it just wasn't an issue then, because DVD didn't have the resolution to really show it. Now that Blu-ray DOES, these mastering guys are having to adjust, and that's a slow and uneven process. This is one of the big reasons the studios are discovering that HD transfers done even just a couple of years ago aren't up to the standards of Blu-ray release, and have to be redone. I'd bet that in a year or so, this will be a non-issue - except in cases where studios make the difficult financial decision to just reuse older "off-the-shelf" HD transfers. But my feeling is that for MOST of the new transfers done today, these guys have learned to be more careful about the excessive use of DNR.
We enthusiasts all want "perfect" every time, but looking back, some of those early DVD transfers were terrible by the quality standards of even just a few years later. What counts as state-of-the-art quality changes and improves over time with each new format. That's just how things work. The Blu-ray transfers we see in a few years will make many of today's discs look paltry by comparison. But we shouldn't let that prevent us from enjoying the improvements we already see. We should still hold the industry's feet to the fire on these issues, but we should also try to keep things in perspective and not let some of these issues become the ONLY thing we talk about or care about or take into consideration.
So yes, excessive grain reduction is a real issue here with high-definition. And the industry needs to be aware of it, and try to improve - which (by and large) I think they are. But you Blu-ray fans need to keep all this in perspective too. If a little too much grain reduction has been used on a particular title, that ALONE shouldn't be reason enough for most of you to dismiss the title completely. Consider the extras, the audio improvements, the degree to which the image - even with too much DNR - is improved over the previous DVD release.
Bottom line: Everyone needs to educate themselves, to keep things in perspective and to apply a little more common sense to the issue of grain reduction on Blu-ray. Cool? Okay. Enough said.