Blu-Ray Sucks, Film Is Better

Walk into a Best Buy or Walmart. Look at the luxurious line of flat screen television sets lined up on the racks. The settings are maxed out to be as bright and vivid as possible so they can be seen five miles away. Sit down on the comfy leather sofas with complimentary popcorn. Maybe even put on a pair of 3D glasses and become hypnotized by Avatar.

1080P, 4K, LED, OLED, resolution, sharpness. It gets very confusing, but what the industry wants you to believe is that we are making progress. To the average consumer we are living in a high definition paradise. Who even owns a VCR anymore? Why do we even need to go to the theater anymore? Who buys movie tickets? This is why I believe the consumer is being unknowingly duped by what we now refer to as "home entertainment".

Picture Quality:

In reality, "high definition" is purely a marketing term. In the greater context there is no meaningful difference between high definition and standard definition, because perceived sharpness is relative to how far away you are standing. You could watch a VHS tape on a 200 inch flat screen and it would look good if you were standing 500 feet away. This is why projection looks better to the eye.
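The distance argument is simple arithmetic on viewing angles. Here is a rough back-of-the-envelope sketch; the screen size, pixel counts, and the one-arcminute acuity figure are illustrative assumptions, not measurements:

```python
import math

def pixels_resolvable(screen_diagonal_in, horizontal_pixels, distance_ft,
                      acuity_arcmin=1.0):
    """Rough check: can a viewer with ~1 arcminute of visual acuity
    resolve individual pixels at this distance? (16:9 screen assumed)"""
    # Convert the diagonal to screen width: for 16:9, width = diag * 16/sqrt(16^2+9^2)
    width_in = screen_diagonal_in * 16 / math.sqrt(16**2 + 9**2)
    pixel_in = width_in / horizontal_pixels
    distance_in = distance_ft * 12
    # Angle one pixel subtends at the eye, in arcminutes
    angle_arcmin = math.degrees(2 * math.atan(pixel_in / (2 * distance_in))) * 60
    return angle_arcmin > acuity_arcmin

# A standard definition picture on a 65" set:
print(pixels_resolvable(65, 720, 3))    # True  -> up close, you can see the pixels
print(pixels_resolvable(65, 720, 25))   # False -> from across the room, SD looks "sharp"
```

Past a certain distance the eye can't tell the formats apart, which is the whole point of the 500-feet example above.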

To the consumer high definition means “perfect” image quality, so perfect that TV manufacturers and television and movie studios are willing to sacrifice organic sharpness for “perfection”.

So when high definition came along, it opened up an entire new industry built around "home entertainment" and "home movies".

Ask yourself: what is missing from every movie you watch on television today? To answer that, we have to understand the history of theatrical movies…

(Me Watching A 70mm Print Of "It's A Mad, Mad, Mad, Mad World")

Before high definition television sets, if you wanted to see a movie you went to the movie theater. In the 1960's and 70's, movies were shot in many different formats and projected at movie theaters across the country.

This meant that in any film you watched, there was apparent film grain and there were film lines that you don't see anymore. When film projectors were in movie theaters, nobody could digitally alter the film during playback.

When high definition television came out, it destroyed the idea of film projection. Kodak no longer produces 70mm prints, unless your name is Quentin Tarantino and you have millions of dollars to spend.

One of the consequences of "high definition" television is that it shows more detail to the viewer at home. In order to look good on a flat screen LCD, movie studios get lazy and apply an insane amount of noise reduction to movies that were shot on film, then artificially sharpen the resulting image.
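To see why noise reduction followed by sharpening is a lossy trade, here is a toy sketch: a 1-D "scanline" with synthetic grain, where a simple box blur stands in for whatever proprietary noise reduction a studio actually uses. Everything here is an illustrative assumption, not any studio's real pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# A 1-D "scanline": one hard edge (real picture content) plus fine, grain-scale texture
x = np.linspace(0, 1, 512)
coarse = (x > 0.5).astype(float)
fine = 0.05 * rng.standard_normal(512)
scan = coarse + fine

def box_blur(sig, k=9):
    """Naive box blur -- a stand-in for heavy noise reduction."""
    return np.convolve(sig, np.ones(k) / k, mode="same")

def unsharp(sig, amount=1.5, k=9):
    """Unsharp mask: re-amplify whatever high frequencies survived the blur."""
    return sig + amount * (sig - box_blur(sig, k))

denoised = box_blur(scan)       # grain scrubbed away
disc = unsharp(denoised)        # what ends up on the disc

# Grain amplitude in a flat region, before and after "cleaning":
print(np.std(scan[50:200]), np.std(denoised[50:200]))
```

The blur cuts the grain's amplitude by roughly the square root of the kernel size, and the unsharp mask can only boost edge contrast; it cannot re-create texture that is no longer there. That is the "grain is the image" argument in two function calls.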

Blu-ray Myths: Grain is a Flaw

"However, the grain is the image. Those 'specks,' if you will, hold all of the detail. 'Cleaning' the grain wipes the detail. There is no middle ground. Some stocks hold heavy grain, some you can barely see."

If you want proof of this, do a test. Pull out your favorite VHS tape and watch it on a tube set, then on a high definition flat screen set. VHS will look better on the cathode ray tube television set every time. Why? It has not been manipulated; it is unaltered by modern software.


When you watch a VHS tape on a high definition television set, you're watching the VHS grain and the dirt on the tape, and that looks really bad in 1080P. High definition television sets are not forgiving. Classic television shows in particular look better on older tube sets, especially ones built in the United States before the 1990's, when they were made to a higher standard. I've always wondered why I kept my old VCR lying around.

The same logic applies to film projectors at the movie theater. Movie theaters had to project a print in good condition, struck from the original negative, and an engineer could sit behind the projector and dial in the focus and the warm contrast from the tungsten bulb. You could really make film look sharp, and people paid good money to see it.

This Touching Documentary Explores the Dying Art of Film Projection

(Actual Scan Of 70mm Slide)

As you may have noticed, there is no engineer sitting behind your flat screen adjusting the focus and contrast of the movie you are watching. That forces studios to apply a one-size-fits-all solution in order to compete in the market. It allows them to scan mediocre, dirty film copies that may not be in the best condition and hide the defects with software that smooths over the noise.

This has created an environment where movie studios are not as concerned about working from a good quality element to begin with, and believe all film grain and film lines are bad news. Why would they waste money trying to make as accurate an archive as possible, when they can take a washed out, out of focus scan, apply a ton of artificial sharpening, and blow out the colors so that it looks pretty good to the consumer watching at home on a 32 inch flat screen TV? Then they market it as "beautifully restored".

Hollywood Attacking Film Grain For Blu-ray

There is also a nasty side effect of noise reduction called "motion smearing," which blurs fast motion. You can notice it in many releases where too much noise reduction and too many other software enhancements were applied.
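Motion smearing falls straight out of the math of temporal noise reduction: averaging a frame with its neighbors turns anything that moves into a trail. A minimal sketch, with a single bright dot standing in for a fast-moving object (the three-frame average is a deliberately crude stand-in for real temporal filters):

```python
import numpy as np

# Three "frames" of a 1-D picture: a bright dot moving one pixel per frame
frames = np.zeros((3, 10))
for t in range(3):
    frames[t, 3 + t] = 1.0   # dot sits at position 3, then 4, then 5

# Temporal noise reduction in its crudest form: average each frame with its neighbors
tnr = frames.mean(axis=0)

print(frames[1])  # the dot is a single clean pixel...
print(tnr)        # ...after temporal averaging it is smeared across three positions
```

Static grain averages away, which is why studios do this, but so does anything in motion, which is the smear you see on fast pans.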


Modern television manufacturers attempt to hide this by adding built-in motion smoothing that is turned on by default. It's called the "soap opera effect," and it makes me seasick if I try to watch it. It's another tactic used to hide digital manipulation, and the consumer believes it must mean better.

Does your TV’s picture look too real? Here’s how to zap soap opera effect

This would be painfully apparent if you took your typical Blu-ray disc, blew it up to theatrical size, and compared it side by side to a print from the original 35mm or 70mm negative. The film print would look sharper every time, despite more grain and film lines. I regularly attend movie theaters that kept their film projectors, and you haven't seen a movie shot on film until you've seen it projected from a film projector. Watching it on your home television set is no comparison to the art of film projection. Your typical Blu-ray or DVD has large amounts of digital fixes that were not possible on VHS.

There are a few exceptions where movie studios take the time to do a good job, such as Criterion. Unfortunately there are a lot of sloppy jobs out there, where film studios on a tight budget, or just trying to compete with everyone else, use an enormous amount of editing to make a film pop on the most popular medium: the flat screen television. This destroys the integrity of the film it was shot on, and not many people care enough, or notice enough, to complain about it.

The bottom line is that theatrical releases are no longer being made to look ideal on a film projector. A great example is the Star Wars trilogy on Blu-ray today. What you are watching is far removed from what fans originally watched in theaters, so much so that fans have gone to great lengths to recreate the originals by restoring the color and removing the added CGI.

Fans restore original ‘Star Wars’ for online release

Harmy’s Despecialized Edition — Wikipedia

The reason Blu-rays look sharp is that you don't see film lines, you don't see film grain, and to the average viewer it looks okay. Noise reduction, used in excessive amounts, does two things: it destroys color contrast and it destroys clarity. It's a real shame too, because home entertainment has led to a decline in shooting on film and the elimination of real projectors.

Report: 35MM Projection Could Be Gone by 2015

Aspect Ratio:

It's not only picture quality: high definition television has also changed aspect ratio. When film negatives first started being transferred to VHS, we used to see black bars. Remember those? Back when we had nearly square standard definition sets, black bars were the only way to show a film in its original aspect ratio.

Nobody wants to see black bars on their television screen anymore, so almost all movies shown on television are digitally resized to fill the 16:9 1080P frame, and without the black bars it looks acceptable to most viewers. On a film projector, the movie would be shown in its original aspect ratio.

Here are some examples:

It's A Mad, Mad, Mad, Mad World (2.76:1)

Star Wars: A New Hope (2.35:1)

The Sound Of Music (2.20:1)

My Fair Lady (2.20:1)

Top Gun (2.00:1)

The Shining (1.85:1)

Midnight Cowboy (1.85:1)

A Clockwork Orange (1.66:1)

The Wizard Of Oz (1.37:1)

As a consequence, standardized high definition television has taken away the director's freedom to shoot in any format they would like. All television networks broadcast in 16:9, and most studios realize their movies are going to be viewed at home instead of on projection, so movie theaters stopped supporting other formats. That's just the nature of the beast. It's now popular to resize every film no matter the original format it was shot in, which further reduces clarity and adds to motion distortion.
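The cost of filling a 16:9 screen from a wider frame is easy to quantify: keeping the full height, the fraction of the picture cropped away is one minus the ratio of the two aspect ratios. A quick sketch using a few titles from the list above (the helper function is mine, for illustration only):

```python
def crop_loss(source_ar, target_ar=16/9):
    """Fraction of the original picture thrown away when a wider frame
    is cropped (pan-and-scan / zoom) to fill a narrower screen."""
    if source_ar <= target_ar:
        return 0.0   # frame is narrower than the screen; no horizontal crop
    return 1.0 - target_ar / source_ar

for title, ar in [("It's A Mad, Mad, Mad, Mad World", 2.76),
                  ("Star Wars: A New Hope", 2.35),
                  ("The Wizard Of Oz", 1.37)]:
    print(f"{title}: {crop_loss(ar):.0%} of the frame lost")
```

For a 2.76:1 roadshow epic, cropping to 16:9 throws away over a third of every frame, which is why the black bars existed in the first place.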

Making Sense of New Movies That Change Aspect Ratios

The Problem With The Shining (1980) And Aspect Ratio

There are, however, a few companies out there that take great care in producing accurate archives of classic films. I find The Criterion Collection has always pulled out winners when it comes to preserving film grain and the original aspect ratio, which I highly respect. Turner Classic Movies frequently takes the time to broadcast films in their original aspect ratio. It's a niche market, though, and it is losing support.

CriterionForum.org: Midnight Cowboy Blu-ray Review

Now obviously none of this affects digital recordings shot in 16:9, but the adoption of the high definition 16:9 format has had a dramatic effect on the film industry, and most of it has not been good for the movie enthusiast.

My Take / Opinion:

I would say that this new technology has the potential to produce great results, but unfortunately 90% of the material the consumer will happily buy is just not being produced to preserve film integrity. Good enough? For most, absolutely. Most won't notice the difference.

This is not just about old movies, though. You would be surprised at the number of new titles shot on film. Film is not going away anytime soon, but image integrity, in my opinion, has. High definition sets becoming cheaper is not helping. I find this akin to taking spray paint to the Mona Lisa.

So yes, this is just a fact: any digital release you watch will have many digital "fixes". VHS was the last format that could not be digitally altered. It doesn't have to be this way, but the industry has built this facade. In the last ten years I have watched over 50 film prints projected at my local indie theatre, and I can tell you that the image is way sharper and the colors are way more vibrant. Digital fixes are designed to optimize the picture for display on small LCD screens, not projection. When you compare side by side, an unaltered film print is a great improvement. It's not so much about the resolution of film; it's about the methods by which film is projected.

The high definition format has brought joy to billions of people around the world. I just wish Hollywood took the time to do the right thing. Would anyone pay for a niche market, though? Don't even get me started on some of the terrible 5.1 remixes out there. Some movies just sound better in mono, and that is a fact. (wink) I won't even bring up an atrocity called "3D".

So I would say: keep film alive. Support your local indie theater that still owns a film projector, and support companies that make the effort to archive older films with minimal manipulation. The fate of preserving historic film is in our hands, and so is keeping analog film alive for another generation. Companies like Kodak are just about the only ones left producing 35mm and 70mm stock, and Kodak has already declared bankruptcy once. I encourage you to build a projection room in your home, and always vote with your wallet.

Independent writer outside of Boston.
