This is the end... Oh. No. It isn't...
Walter Murch rekindled a discussion about 3D, a hot-and-cold topic since "Avatar," with a letter to Roger Ebert, published on Roger's blog under the headline "Why 3D doesn't work and never will. Case closed." Ebert introduced Murch's correspondence with this, accompanied by a recitation of Murch's credits:
I received a letter that ends, as far as I am concerned, the discussion about 3D. It doesn't work with our brains and it never will.
The notion that we are asked to pay a premium to witness an inferior and inherently brain-confusing image is outrageous. The case is closed.
This, of course, generated more discussion -- much of it with an ad hominem slant, signaled by the headlines in Slate ("Two Thumbs, Two Dimensions: Roger Ebert is done talking about 3-D movies. Thank goodness.") and Boxoffice Magazine ("This Week in Cranky: Walter Murch Declares War on 3D"). At the same time, Kristin Thompson published two sequels (produced concurrently, in the modern Hollywood style!) to her August 2009 piece, "Has 3-D already failed?," assessing the argument for the commercial viability of the format, pro ("Part 1: RealDlighted") and con ("Part 2: RealDsgusted").
What Murch contributes to the debate is not substantially different from what I, Kristin Thompson and many others have been writing about since the release of "Avatar" (see my posts, "Avatar 3D headaches: Look at this! Don't look at this!" and "Avatar, the French New Wave and the morality of deep-focus (in 3D)"). The one thing he does bring to the table is that he's Walter Murch, famous sound designer and editor, who edited the Francis Ford Coppola/Michael Jackson Disneyland 3D movie attraction "Captain Eo" back in the 1980s. (That use of the technology as a theme park-style attraction is, in my view, a stage the technology still has not moved beyond. Even those who don't personally like the hallmarks of the current 3D processes -- the glasses, the flat-planed illusion of "depth," the dim picture -- admit it works just fine for animation and cartoony or CGI-enhanced live-action, where the 3D isn't meant to be "realistic.")
That Murch is Murch (and Ebert is Ebert) neither enhances nor diminishes the validity of the arguments they make. It's just that they are Murch and Ebert and people pay attention to them. (Me, I'm just another schmuck with a blog.)
Murch addresses the issue of "convergence/focus" -- basically, that 3D fools your eyes into perceiving distance/depth when, really, they are just focusing on a single plane at a fixed distance: the screen. (I wonder: Could you create a 3D "Vertigo" effect if you sat in a wheelchair and rolled up to the screen while the camera was zooming out, or vice-versa?) The illusion is not unlike those "Magic Eye" computer paintings that fool your eyes into seeing depth on a flat surface -- but at the movies, the glasses and the projected images do a lot of that work for you. Say the audience looks at a screen that is 80 feet away (who sits that far back?!?!). Murch writes:
But their eyes must converge at perhaps 10 feet away, then 60 feet, then 120 feet, and so on, depending on what the illusion is. So 3D films require us to focus at one distance and converge at another. And 600 million years of evolution has never presented this problem before. All living things with eyes have always focussed and converged at the same point.
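Murch's numbers can be roughed out with a little schoolbook trigonometry. This is a back-of-the-envelope sketch of my own (nothing from Murch's letter), assuming a typical interpupillary distance of about 6.3 cm: the vergence angle between the two lines of sight changes with each shift in the illusion, while the focal distance stays pinned to the screen.

```python
import math

IPD_CM = 6.3      # assumed typical interpupillary distance
FOOT_CM = 30.48   # centimeters per foot

def vergence_deg(distance_ft):
    """Angle between the two eyes' lines of sight for a point at this distance."""
    d_cm = distance_ft * FOOT_CM
    return math.degrees(2 * math.atan((IPD_CM / 2) / d_cm))

screen_ft = 80  # the plane the eyes must actually keep in focus
for apparent_ft in (10, 60, 120):
    print(f"illusion at {apparent_ft:>3} ft: eyes converge at "
          f"{vergence_deg(apparent_ft):.3f} deg, while focus stays on the "
          f"{screen_ft}-ft screen ({vergence_deg(screen_ft):.3f} deg)")
```

The point of the arithmetic is only that convergence and accommodation, normally locked together, get yoked to two different distances.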
OK, but clearly some evolved people can make this adjustment more easily than others. (I know people who've never been able to see the "Magic Eye" stuff -- they just get eyestrain from crossing their peepers.) That much is determined by the individual viewer's physiology. Either you can watch 3D without getting headaches and irritated eyes or you can't. The counter-argument some have made, that 2D editing creates a similar demand on the eyes and the brain, is simply bogus: Looking at a 2D image is no different from looking at a wall. It does not require your eyes to adjust convergence and focus like a 3D image does. Yes, some people do get nauseated from watching regular movies and TV -- especially when bombarded with quick-cutting and shakycam images. Again, those are not anything like what Murch is talking about. (Hey, I hear "How to Train Your Dragon" in 3D is a great movie to see on Dramamine! Like "Bourne Ultimatum" in 2D!)
The problem is complicated when the camera adds another layer of confusion, using selective focus within the 3D frame to tell your eyes (which theoretically ought to be able to focus at any time on anything at any depth in the frame) that they can't bring something into focus, because the camera lens has already made the choice for them. (This also happens in 2D, but the effort to look at something out of focus on a flat plane does not cause the stereoscopic eyestrain of the "convergence/focus" sort.) Here's what I wrote upon returning from my initial viewing of "Avatar" (with a headache) in December 2009:
Finally, let me say a few words about why "3D" does not work in movies any better than it does on tacky postcards of tourist landmarks, waterfalls, or Jesus. The technology has not developed significantly since the creation in 1939 of the View-Master -- a plastic binocular-like toy I loved to play with, but quickly tired of, when I visited my grandparents. The polarized double-lens process does not render images in three dimensions, it simply separates them into multiple planes. The effect is not much different than the early Disney animated features, filmed through layers of glass on which backgrounds, foregrounds and characters were painted for a more dimensional effect.
But your eyes and your brain do not interpret these different layers the way they do actual space. Instead (to my eyes, anyway), each layer looks flat, stacked in front of or behind some other layer. So people, for example, look like cardboard cutouts rather than rounded figures. What's worse, if the camera's depth of field holds something out of focus in the foreground or background, you can't do anything about it. If you look at something that's closer or farther away, your eyes have a natural tendency to bring it into focus. 3D camerawork frustrates that instinct. Regular old 2D imagery, on the other hand, does not trick your eyes into trying to focus on something they can't, because both eyes are always looking at the same plane. All around, fewer headaches.
Whether you or I get headaches or don't get headaches matters only inasmuch as some of us do and some of us don't -- and some of us like to put a hyphen in "3-D" and some of us don't. Naturally, those of us who find the technology cumbersome, uncomfortable and unpleasant (even painful) aren't particularly impressed with the great strides in quality we're told 3D has made (according to high-powered evangelists like Jeffrey Katzenberg), because we don't detect an appreciable difference. (Maybe if we could watch "Captain Eo" side-by-side with "Avatar" in IMAX 3D?) But there are facts (how the technology creates its perceptual effects -- and movies themselves are based on perceptual effects at a standard rate of 24 frames per second), and there are opinions (whether the latest go-around promises aesthetic dividends we did not see in the fads of the 1950s and 1980s). All I can say is that the current level of technology does not impress me, and I have no desire to change or reinforce your opinion -- even though Katzenberg certainly does. (Show me a new technology, like true holographic cinema, and I will be glad to reconsider. Meanwhile, do all these parenthetical asides give you headaches?)
I don't know what the future holds for what we now call "3D." All of this reminds me a bit of the talking-past-each-other "debate" over whether video games are art. People interpreted the question in so many different ways: What is the nature of art? Is it to be found in the efforts of the artists (animators, composers, designers, cinematographers) and technicians who create the work (not unlike filmmakers do) or in the experience of the game-player? If I feel something, if I am engaged and entertained and maybe even learn something, isn't that art? Until you know which questions you're addressing, your answers have no context in which they could mean something.
Take this, from the piece in Slate.com:
If that means Ebert will stop complaining about the medium, so much the better. For years now, the venerable critic has been griping that 3-D cinema is dim, distracting, and useless. And I mean for years: Even at the age of 10, young Ebert turned up his nose at Arch Oboler's stereo jungle adventure, "Bwana Devil." (Deeply unmoved, was he, by the hails of spears.) That was back in 1952; more than a half-century later, he's still shaking his fist at the silver screen--I hate 3-D and you should, too! Professional obligations notwithstanding, Ebert doesn't want to see another movie in three dimensions. Ever.
I've had enough of this persnickety crusade, marching, as it does, under the banner of pseudoscience. "Our ancestors on the prehistoric savannah developed an acute alertness to motion," Ebert writes, in an attempt to explain why movies like "Clash of the Titans" totally suck...
Let's start with the obvious: "I've had enough..." (or "I'm sick and tired of...") aren't arguments. Complaining about the fact that Roger Ebert, who reviews 3D movies week in and week out, has consistently "complained" about dim pictures and lackluster effects (with a few notable exceptions), is a logical non-starter, too. A critic's job is to report as well as to interpret. You would prefer that he arbitrarily change his standards from time to time? Or you would prefer he just stop writing about it? (Scanners readers know I get the same kind of thing whenever I elaborate on what I find lacking in blockbusters like "The Dark Knight," "Avatar" or "Inception." Forget the individual points I'm making; all that matters to some is that my overall tone is not laudatory! It's just a smear campaign -- an orchestrated effort fiendishly designed to make you doubt experiences and feelings that you've already had! Well, let's make a deal: If popular culture will stop bringing up these movies again and again, so will I.)
What I have gleaned from Roger Ebert's writings (as his web editor) is that he has regularly raised the following objections: 1) He finds wearing polarized 3D glasses not only awkward and uncomfortable (a problem for many who already wear prescription glasses) but also dimming -- the lenses reduce the amount of light getting through to the eyeballs, so you're basically wearing sunglasses in a dark theater; 2) the 3D effects do not appreciably enhance his experience of the movies (especially in live-action pictures that have been "retrofitted" for 3D in post-production, rather than being shot in 3D in the first place); and 3) therefore, the primary appeal of the process for studios and exhibitors (as in the 1950s and the 1980s) has been as a marketing gimmick -- a way of enticing moviegoers to pay more for tickets to 3D presentations.
You can't really argue with #3, because 3D is featured in the ad campaigns, and tickets to the 3D showings do cost more. If you want to make a case that the financiers and filmmakers are making 3D pictures primarily to expand the aesthetic possibilities of the cinema (along with glorious Technicolor, breathtaking CinemaScope and stereophonic sound), go ahead. But there are still legitimate debates to be had about what and how "3D" (in past and future films) has added to the entertainment or artistic value of individual movies (which is where #2 comes in). And I addressed #1 above: Saying you don't get headaches (or don't get them anymore because you have adjusted to currently available 3D processes) is anecdotal, but doesn't change the fact of anyone else's experience.
Which brings us to this, from Boxoffice Magazine:
While I think [Murch's] dismissal of 3D as less dimensional than a "good story" is unfortunate, petty sniping (even if thus far it's been largely accurate in terms of stories told in 3D films), I think the most damning component of Murch's argument is that it was presented to, and then by Roger Ebert. Ebert has repeatedly and vociferously critiqued 3D presentation as an irrelevant, distracting gimmick to jack up ticket prices, and while he may or may not ultimately have a point (prices notwithstanding, I don't think he does), any argument he puts forth--even one perceived as coldly logical, or even in Murch's case, biological--is essentially a screed to reinforce his opinion, rather than an honest assessment of the merits of this evolving and still largely unproven way of producing and exhibiting films. Or at least, it seems that way; there are only so many times that you can say you hate something until even a real, rational argument why that hate is justified is no longer a legitimate, objective analysis of a subject.
Overall I wouldn't purport to predict the inevitable success or failure of 3D technology as a viable format for production and exhibition, but as long as there are creative people who are interested in merging art and technology, I do feel like there are enormous untapped opportunities with the technology that we have yet to experience.
I can get behind that second paragraph, the conclusions of which ought to be self-evident. (We can't predict the future.) And if somebody values a form of technology over a "good story," that's their call to make. Whatever floats your boat. But the rest of that first paragraph is a myopic blur of assertions, backpedaling and off-the-wall opinions presented as a statement crafted to appear to endorse its own antithesis: "a legitimate, objective analysis of a subject."
Let's look at that paragraph again: The "most damning component" of Murch's argument (which the writer mischaracterizes in earlier paragraphs -- either because he misunderstands it or because he questions Murch's qualifications to venture an opinion) is that it was repeated by Ebert? How does that work? Is he really saying that the quality of the evidence itself is, most damningly, determined by who presents it, and by whatever opinion the writer has already formed of that individual? That's a form of personal abuse, and a common logical fallacy we know as ad hominem.
Italics are mine below:
"... while he may or may not have a point (prices notwithstanding, I don't think he does)..." (One might well ask: What is the point he may or may not have, and why do you off-handedly assert that you think he doesn't have it?)
"... any argument he puts forth--even one perceived as coldly logical, or even in Murch's case, biological--is essentially a screed to reinforce his opinion, rather than an honest assessment of the merits of this evolving and still largely unproven way of producing and exhibiting films." (Again, one might be compelled to ask: Does the dismissal of someone's evidence and logic as "essentially a screed to reinforce his opinion" -- regardless of what he actually says -- constitute an "honest, objective analysis" of the merits of the subject?)
I'm not just playing word games here -- though the evidence suggests that these writers are, by failing to engage the actual arguments and focusing on their feelings about Ebert and Murch instead. What bothers me is that this stuff reads like the kind of balderdash we're accustomed to hearing from the US Congress or cable TV talking heads, where the primary goal is to stake out a position by attacking the opposition, not to reason or explain one's own judgment.
Fortunately, on the other hand, we have Kristin Thompson, who quotes James Cameron (in a 2010 New York Times interview) talking about selecting a company to retrofit "Titanic" in 3D. It's not, as Kristin observes, an exact science:
"It really boils down to a human, in the loop, sitting and watching a screen, saying, 'O.K., this guy is closer than that guy, this table is in front of that chair.' "
For his 3-D "Titanic" rerelease, Mr. Cameron said he had approached seven companies about working on the film, testing each by asking it to convert about a minute of movie footage before he chose the best two or three efforts.
"All seven of the vendors came back with a different idea of where they thought things were, spatially," he said. "So it's very subjective."
In other words, you may think you know what is supposed to go where, but (in the case of these 2D-to-3D conversions, anyway) someone else is actually going to be making that decision for you.
Kristin examines the upside(s) and the downside(s) of 3D technology -- past, present and future -- with an emphasis on its commercial viability (for producers, exhibitors and consumers) in theaters and on home screens.
Industry commentators don't seem to mention the novelty effect of 3D much any more. Surely they never really thought that audiences will be dazzled forever. I think we reached the ho-hum point some time last year. I've mentioned that people began to resent the $3+ price hikes and to pick and choose more carefully among 3D releases, wanting the movie to be good enough to warrant paying more. But others perhaps decided 3D in general wasn't worth it and that they would rather see a film the old fashioned way, seeing a flat image undimmed by glasses.
For me it was "Toy Story 3." In 2009, David and I saw "Up" in 3D and enjoyed it. But we enjoyed it because it was another great Pixar film. As I said in my 2009 entry, I have remembered the film in 2D. We went to "Toy Story 3" in 2D and enjoyed it. [Many others made the same decision: "Toy Story 3" in 2D outgrossed the 3D version.] I have yet to see a film in both 3D and 2D to make a comparison, but my suspicion is that I would usually prefer the 2D version. I suppose the basic problem is that if the 3D is used for flashy depth effects with things flying out at the audience, it becomes too distracting and obtrusive. But if it's used simply to make, say, jungle plants look closer to the viewer than Carl and Russell, then it's unobtrusive--and hence not very interesting. Given that we have other mental tools besides binocular vision for grasping the spatial relations in an image, the jungle plants look closer in 2D as well.
A final thought on disaffected audiences. Currently there is a sector of the moviegoing public that loves 3D, will pay extra to see almost anything in 3D, and hopes the process expands. That part of the public is probably as big as it's going to get. (Yes, new kids will grow up, but others will mature out of their adolescent obsessions with such things.) In the U.S. at any rate, right now there aren't a lot of people suddenly discovering the joys of this wonderful new format. (It's really just getting going in the major Asian markets.) But the proportion of those getting fed up by the process' drawbacks--its higher cost, the growing numbers of mediocre and bad films in 3D, the glasses--is probably growing.
I don't have any answers. I just know the difference between subjective and objective. Like "Inception" (there I go again), I find 3D more interesting to write and think about than to actually see.
- - - -
ADDENDUM (1/28/11): At Andrew Sullivan's blog, readers say Murch got his biology wrong:
If you ever looked in one of those old view-master stereoscope you can see that not only was the object 3D but it was in focus as well. How does this Jibe with Murch then? Well Murch is mistaking two different things called a convergence zone. Your eye balls don't do the 3D processing, your brain does. It compares the two slightly different images to infer the 3D. It is your brain that is inventing the concept of a convergence zone not the eyeballs. Now for objects very very close to you, your brain will also notice that your eyeball muscle are pointing the eyes in a non-parallel way. And your brain may use this info as well. This is the physical eyeball sight line convergence that Murch is worried about, not the one that comes from the brain processing. And again for any object farther away than 10 times the distance between your eyes then your eyes are pointing nearly parallel and there is no sightline convergence info for the brain to consider. All that remains is the cognitive inference of depth and that works fine because that is EXACTLY how your brain was built to work.
Thus to the extent that Murch has a point it is these two: 1) don't sit in the front row. 2) film makers should not project the 3D objects out of the plane any closer than 10 feet from my nose for long periods of time. Other than that Murch is wrong.
This message came to me from a reader named Peter Svensland. He and a fr...
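The reader's "10 times the distance between your eyes" rule of thumb is easy to check with the same sort of basic geometry. A rough sketch of my own, again assuming roughly 6.3 cm between pupils (my number, not the reader's): the angle between the two sightlines collapses quickly with distance, which is the dropoff in vergence information the reader is describing.

```python
import math

IPD_CM = 6.3  # assumed typical distance between pupils

def sightline_angle_deg(distance_cm):
    """Angle between the two eyes' sightlines for a point straight ahead."""
    return math.degrees(2 * math.atan((IPD_CM / 2) / distance_cm))

# The reader's rule of thumb: past ~10x the interocular distance the
# sightlines are close to parallel, so vergence carries little depth signal.
for multiple in (1, 10, 50, 100):
    d = multiple * IPD_CM
    print(f"{multiple:>3}x IPD ({d:6.1f} cm): {sightline_angle_deg(d):6.2f} degrees")
```

Whether a few degrees at arm's length counts as "nearly parallel" is arguable, but the direction of the claim -- that vergence matters mainly for objects thrust close to the viewer -- holds up in the numbers.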