
Digital dilemmas: You want pixels with that?

Yeah, you saw "The Artist" so you know it was a big deal when sound technology took over the movies. (Except that really you don't, because "The Artist" is only interested in the arrival of talkies as an obstacle to its love story. You'll learn more about the ramifications of the transition from film to video in pornography from P.T. Anderson's "Boogie Nights" than you will about the technological and aesthetic consequences of the shift from silents to sound in "The Artist.") David Bordwell concludes his awe-inspiring, in-depth series on "Pandora's digital box: From films to files" with some observations about the myths, realities and possibilities of digital projection (something the vast majority of moviegoers have yet to notice, I'd bet, although it's having a huge effect on distribution and exhibition) and finds a fantastic quote from "hacker historian" George Dyson:

"A Pixar movie is just a very large number, sitting idle on a disc."

That's not to diss Pixar, it's just a vivid statement of digital reality.

The ongoing switch from analog to digital movie projection is indeed a big deal, but I was struck by this observation from DB:

First, let's go fussbudget. It's not digital projection vs. celluloid projection. 35mm motion picture release prints haven't had a celluloid base for about fifteen years. Release prints are on mylar, a polyester-based medium.

Mylar was originally used for audio tape and other plastic products. For release prints of movies, it's thinner than acetate but it's a lot tougher. If it gets jammed up in a projector, it's more likely to break the equipment than be torn up. It's also more heat-resistant, and so able to take the intensity of the Xenon lamps that became common in multiplexes. (Many changes in projection technology were driven by the rise of multiplexes, which demanded that one operator, or even unskilled staff, could handle several screens at once.)

Projectionists sometimes complain that mylar images aren't as good as acetate ones. In the 1940s and 1950s, they complained about acetate too, saying that nitrate was sharper and easier to focus. In the 2000s they complained about digital intermediates too. Mostly, I tend to trust projectionists' complaints.

Me, too. And this takes me back to when I used to run a movie theater (well, I booked it -- with one of my favorite people in the whole wide world, Ann Browder, who owned the joint: the single-screen Market Theater in Seattle's historic Pike Place Market). We were adamant about employing union projectionists, even as some of the major chains were trying to get rid of them and substitute low-wage teenagers to push buttons on automated equipment.


I'd say that the creation of the multiplex was a more dramatic transformation in the way most people experienced movies than the film-to-digital revolution now under way. Theaters themselves became not only smaller, but less individualized. Instead of grand movie palaces, or neighborhood theaters that had their own personalities, the multiplex was designed to offer a set of more-or-less characterless, indistinguishable viewing boxes with a white rectangle at one end and no fancy curtain to cover it when it wasn't in use. Initially, the rectangle was bathed in light between shows, then filled with ads and movie trivia questions on slides, and is now a screen for video advertising. So much for showmanship. I used to remember where I'd seen every movie; this was the beginning of the end of that feeling of a movie being associated with a physical location or destination. (I still fondly recall one of my favorite pre-movie experiences at the Guild 45th St. cinema in my old Wallingford neighborhood in Seattle. I was there to see John Sayles' great "Lone Star" and before the show, while the gold velvet curtain was closed, the auditorium was filled with classic country-western music, including Frankie Laine, Sons of the Pioneers -- and Gene Pitney singing "The Man Who Shot Liberty Valance." Talk about setting up the film and putting you in the mood!)


For all the complaints about poor projection today (especially insufficient light levels on the screen, a problem exacerbated by 3D glasses), the quality of presentation in the early multiplexes may have been the worst ever. First, you just had to accept that you were going to hear the soundtracks -- and sometimes the audience reactions -- from the movies next door. And then there was the matter of the movie in front of your face. Instead of hiring projectionists who had to stay in the booth to perform changeovers every 20 minutes or so, some chains would hire hyphenate "manager-projectionists" who would run up to the booth for as long as it took to get a show started (one button to dim the lights and start the projector, with the entire film built up as an endless loop on a platter), and then return to the lobby to sell tickets or concessions for the next movie in the next auditorium. For most of a movie's running time, there was nobody in the booth.


A single booth might have 14+ movies running simultaneously, and nobody monitoring what's actually happening in the auditoriums. When the picture would go out of focus or start fluttering (as it invariably would), or the sound would suddenly turn to moosh, people in the audience could yell at the booth all they wanted, but nobody was up there to hear. If the problem had any chance of being fixed, then somebody in the crowd had to get up and go to the lobby to inform the staff. Even then, quite often there was nobody working who knew how to correct any but the simplest glitches with the equipment. It wasn't uncommon for the chain to have just one tech on call for all the company's theaters -- somebody who'd have to drive over to one multiplex or another before problems could be resolved, and that could easily take a half-hour or more.

Of course, the most significant change of all was the one that came along just after the "overscreening of America" began, and that's the availability of movies on home video (Betamax, VHS, RCA SelectaVision disc, DVD, etc.). It's just that the shift to digital video, marked by the success of the DVD, is now going the other direction, into theaters.

DB quotes from Godfrey Cheshire's prescient 1999 essay, "The Death of Film / The Decay of Cinema":

People who want to watch serious movies that require concentration will do so at home, or perhaps in small, specialty theatres. People who want to hoot, holler, flip the bird and otherwise have a fun communal experience . . . will head down to the local enormoplex.

I think that's the situation most (current and former) moviegoers are in right now, unless they live in New York or some other population center with heavily subsidized institutional support for film exhibition. During last Sunday's Weinstein Awards broadcast, there was a lot of nostalgia for the theatrical experience, for seeing movies the old-fashioned way on a big screen. (We can no longer pretend, disingenuously, that the "big screen" is The Way Movies Were Meant To Be Seen, because we know that's simply not true. It can be thrilling to see something in 35 mm or 70 mm or IMAX or even IMAX Lite, but filmmakers and financiers know that the biggest audience will see their movie in some home video format. And we know they know that.)


The vaunted "theatrical experience" has been dying for decades, its demise perhaps hastened by home video -- partly because people have the option of watching movies at home, but also because the other people in the theater are likely to treat public screenings as if they were watching at home. You can't get the "theatrical experience" if the audience doesn't know how to behave in a theater. DB debunks some of the prevailing myths about theatrical exhibition:

First pseudo-worry: "Movies should be seen BIG." True, scale matters a lot. But (a) many people sit too far back to enjoy the big picture; and (b) in many theatres, 35mm film is projected on a very small screen. Conversely, nothing prevents digital projection from being big, especially once 4K becomes common. Indeed, one thing that delayed the finalizing of a standard was the insistence that so-called 1.3K wasn't good enough for big-screen theatrical presentation. (At least in Europe and North America: 1.3K took hold in China, India, and elsewhere, as well as on smaller or more specialized screens here.)

Second pseudo-worry: "Movies are a social experience." For some (not me), the communal experience is valuable. But nothing prevents digital screenings from being rapturous spiritual transfigurations or frenzied bacchanals. More likely, they will be just the sort of communal experiences they are now, with the usual chatting, texting, horseplay, etc.


In November 2011, Twentieth Century Fox sent a letter to exhibitors telling them that time was running out for 35 mm: "... we remind you that the date is fast approaching when Twentieth Century Fox and Fox Searchlight will adopt the digital format as the only format in which it will theatrically distribute its films. We currently expect that this date will be within the next year or two, as the costs of maintaining a dual inventory that includes 35mm prints are simply not sustainable for Fox in the long term." So, we know it's happening and it won't be long. The consequences for repertory and museum programming are murky, but DB touches on a psychological sea change that intrigues me:

Film carries more color and tone gradations than the eye can perceive. It has characteristics such as a nearly imperceptible jiggle that I suspect makes deep areas of my brain more active in interpreting it. Those characteristics somehow make the movie seem to be going on instead of simply existing.

I have always been a fan of emulsion. I like to sit up front so that the picture fills my field of vision and I can see the texture of the image (see DB's "First pseudo-worry," above). Or I could, in a 16mm, 35mm or 70mm presentation. Digital -- not so much. In the old days, lenticular movie screens were actually coated with silver or aluminum, tiny glass beads, mica platelets or other reflective substances. Whether they're made of fabric, vinyl or some other substance, modern movie screens have a grid of perforations that allows the sound from the speakers behind them to pass through. Most people sit back far enough that they don't notice these little dots, which in some ways resemble grid-like arrays of digital pixels.


To me, a pixellated image is something like looking out a window through an insect screen. If you're too close, you'll see the grid that overlays the view. But the individual squares themselves are always changing, even if their arrangement is not, because the world doesn't hold still even if the mesh through which you're viewing it does. I love the poetic language Roger and David use to describe the living, breathing, singing qualities of film, but I wonder how much of it is subjective and how much is objective. I've never seen any scientific evidence that watching video (even analog, interlaced cathode-ray video, where every other line of resolution changes 30 times per second, rather than every frame changing 24 times per second) actually affects the (subconscious?) human brain differently than watching film. If there is any such evidence -- particularly having to do with our responses to projected film and projected digital video -- I'd love to learn more about it.


Meanwhile, we know that some people strongly prefer one to the other, and I wonder how much our perceptions are conditioned by our expectations and what we're used to seeing, rather than the inherent trade-offs between digital and analog formats. When you sit as close as I do, you see everything in a 35mm print: the jitter (and sometimes flicker) of the film going through the shutter, the slight smearing or streaking that occurs when the equipment isn't properly "tuned" [comet tails!*], along with dirt, scratches, splices (and, depending on the print, the changeover marks and anti-piracy digital watermarks).


I have never been close enough to a 2K or 4K theatrical projection screen to see what the image looks like when you're standing right in front of it. (I used to be able to do that when I worked at a movie theater, but it would be rather rude to stand so close to the screen when there's a paying audience in the house.) At home, I watch a 55" Sony Bravia rear projection DLP HDTV (from about 2006) and, sitting 12 feet or so away from it, I can't see the pixel squares. I've looked at Samsung DLP rear projection screens (which also use lamps, so the image is still light projected on a screen, unlike computer monitors or plasma, LCD or LED displays) and don't recall noticing any visible pixels at all, although I was susceptible to the dreaded DLP "rainbow effect." It depends on the technology used.
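Out of curiosity, the geometry here is easy to check with a little arithmetic. A back-of-the-envelope sketch in Python (the numbers are my assumptions: a 1920x1080 panel, a 12-foot viewing distance, and the usual rule of thumb that normal vision resolves detail down to about one arcminute):

```python
import math

def pixel_arcminutes(diagonal_in, h_pixels, v_pixels, distance_ft):
    """Angular size, in arcminutes, of a single pixel as seen by the viewer."""
    aspect = h_pixels / v_pixels
    # Screen width derived from the diagonal and the aspect ratio.
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_pitch_in = width_in / h_pixels
    distance_in = distance_ft * 12
    angle_rad = 2 * math.atan(pixel_pitch_in / (2 * distance_in))
    return math.degrees(angle_rad) * 60  # degrees -> arcminutes

# Assumed numbers: a 55-inch, 1920x1080 screen viewed from 12 feet away.
print(f"{pixel_arcminutes(55, 1920, 1080, 12):.2f} arcminutes per pixel")
```

By that math, each pixel subtends roughly 0.6 arcminutes from 12 feet -- comfortably below the one-arcminute threshold -- which squares with my not being able to pick out the squares from the couch.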

[Image: ReconstructionsFromPixels.png -- the same set of pixel values reconstructed in several different ways]

Above: From Wikipedia: "A pixel is generally thought of as the smallest single component of a digital image. However, the definition is highly context-sensitive. [...] A pixel does not need to be rendered as a small square. This image shows alternative ways of reconstructing an image from a set of pixel values, using dots, lines, or smooth filtering."
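To see what that caption means in practice, here's a toy sketch (assuming NumPy and SciPy are available) that takes one tiny set of pixel values and reconstructs it two different ways:

```python
import numpy as np
from scipy import ndimage

# A tiny 4x4 "image": just sixteen pixel values, nothing more.
pixels = np.random.default_rng(0).random((4, 4))

# Reconstruction 1: every pixel rendered as a hard-edged square
# (nearest-neighbour resampling, spline order 0).
squares = ndimage.zoom(pixels, 64, order=0)

# Reconstruction 2: the same sixteen values rendered with smooth
# filtering (cubic spline interpolation, order 3).
smooth = ndimage.zoom(pixels, 64, order=3)

print(squares.shape, smooth.shape)  # both (256, 256): same data, two very different pictures
```

Same number, as George Dyson might put it; two different pictures.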

I will never forget (spoiler!) the death of Lillian Gish's character in Robert Altman's "A Wedding," because in her final close-up, the film grain itself stops moving. It seems poetically appropriate. Of course, it really doesn't -- there's still grain unique to each frame of film -- but the image itself is one frame printed repeatedly. Time itself stands still.

Looking at "The Walking Dead" in HD at home, I see something that, even in close ups, very much resembles little dancing particles of film emulsion. Turns out, according to the show's Wikipedia page, the series is shot entirely in 16mm, then converted to HD video. So, although grain can be digitally added to a video image after the fact as a post-production effect, in this case what we're seeing is real grain, rendered digitally. For someone like me, who grew up watching (mostly sub-par) 16mm nontheatrical prints in college film programs, the idea of an HD digital rendering of 16mm film is kind of mind-boggling -- like a pristine CD release of the gloriously filthy, lo-fi "Never Mind the Bollocks Here's the Sex Pistols."

[Image: LPgrooves.jpg]

Left: A magnified image of the audible pocks, dust and dirt in an LP record groove. Look at anything close enough and you'll see the imperfections.

As DB notes, there are always compromises and trade-offs with any new technology. (The 24fps standard for motion pictures was itself "a concession to the just-good-enough camp.") CDs were smaller and more portable than LPs, less expensive to manufacture, and had much greater capacity -- not to mention programmability, greater dynamic range, and the elimination of so many audible imperfections that vinyl is prone to: groove distortion, rumble, swoosh, scratches, dirt and dust, static, warping and all kinds of other surface noise. On the other hand, some think CDs sound cold and sterile next to LPs (though it largely depends on mixing, equalization and mastering, and many early CDs were made from LP mixes that were not adjusted for the new technology) and that the digital sampling rate is too low.
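The audio numbers behind that argument are easy to work out. A quick sketch using the standard CD figures (44.1 kHz sample rate, 16-bit samples):

```python
import math

cd_sample_rate = 44_100  # samples per second on an audio CD
cd_bit_depth = 16        # bits per sample

# Nyquist limit: a given sample rate can represent frequencies up to half its value.
nyquist_hz = cd_sample_rate / 2
print(f"Highest representable frequency: {nyquist_hz:.0f} Hz")  # 22050 Hz

# Rough dynamic range of linear PCM: about 6 dB per bit.
dynamic_range_db = 20 * math.log10(2 ** cd_bit_depth)
print(f"Theoretical dynamic range: {dynamic_range_db:.0f} dB")  # ~96 dB
```

That 22 kHz ceiling sits right around the upper limit of human hearing, which is the crux of the "sampling rate is too low" argument.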

David sums up the various positions he's heard regarding digital vs. film projection in theaters (with comparisons "assuming minimal competence of staff" handling each format):

1. They're not the same, just two different media. They're like oil painting and etching. Both can coexist as vehicles for artists' work.

2. They're not the same, and digital is significantly worse than film. This was common in the pre-DCI era.

3. They're not the same, and digital is significantly better than film. Expressed most vehemently by Robert Rodriguez and with some insistence by Michael Mann.

4. They're not the same, and digital is mostly worse, but it's good enough for certain purposes. Espoused by low-budget filmmakers the world over. Also embraced by exhibitors in developing countries, where even 1.3K is considered an improvement over what people have been getting.

5. They're the same. This is the view held by most audiences. But just because viewers can't detect differences doesn't mean that the two platforms are equally good. Digital boosters maintain that we now have very savvy moviegoers who appreciate quality in image and sound. In my experience, people don't notice when the picture is out of focus, when the lamp is too dim, when the surround channels aren't turned on, when speakers are broken, and when spill light from EXIT signs washes out edges of the picture....

I suppose I come down closest to #1 -- though my own feeling is that they're different, but each has its own advantages and disadvantages for accomplishing the same thing: projecting images onto a reflective screen with light. In that way, I guess the comparison is a lot like digital vs. analog methods for recording and playing music: CDs vs. LPs, or cassettes vs. DAT, or reel-to-reel vs. MP3 vs. FLAC vs. Ogg Vorbis vs. AAC...

How do you see it? Does it matter to you? If so, why? Is there any evidence that we respond differently to digital vs. film images? Are those differences, if they exist, aesthetic, psychological, neurological, or not logical at all?

- - - - -

* David McRae, a famous Seattle projectionist, technician -- and fellow Marketeer -- provided the term I was trying to think of when I was writing the above. He writes: "Hello Jim: In your article you are referring to a film projector being mistimed. The streaks you see are what we call comet-tails or rat-tails. This is caused when the projector shutter opens too soon or too late. The film is stopped and the shutter is supposed to open and then flick over the frame a second time to reduce the flicker on screen and create the illusion of movement. If the shutter opens while the film is being pulled down to the next frame the image streaks." Gracias, Dave!
