Misinterpreting the Tomatometer

Last week Slate ran a story about the "Hollywood Career-o-Matic," which claimed to use data from Rotten Tomatoes to chart the trajectories of Hollywood careers. It's an interactive feature: just enter the name of an actor or director and it will instantly generate a graph showing that person's critical ups and downs.

For example, here's one for M. Night Shyamalan, with each dot representing the Tomatometer score for the features he has directed:

[Career-o-Matic chart: Tomatometer scores for each of M. Night Shyamalan's features]

Slate concludes that, according to Rotten Tomatoes data, the Best Actor in movies is Daniel Auteuil, with John Ratzenberger the best American actor, since he's voiced a character in every Pixar movie. Best Actress: Arsinée Khanjian. Worst Actress: Jennifer Love Hewitt. Best Director: Mike Leigh. Worst Director: Dennis Dugan (veteran of Adam Sandler movies such as "Happy Gilmore," "I Now Pronounce You Chuck and Larry," "You Don't Mess With the Zohan" and "Grown Ups").

Yes, this is all so silly that the mind boggles, but let's start with the premise itself: What is the correlation between reviews and careers in Hollywood? Adam Sandler and Michael Bay wouldn't look much more impressive than Shyamalan if you looked only at reviews. And the Slate piece is riddled with misconceptions about the Tomatometer:

What makes the data from Rotten Tomatoes so brutal is that they depict not just one person's opinion of Shyamalan but the collective assessment of all our cultural critics. [...]

The site even aggregates archived reviews from films that were released in the 1990s and before. Whether you're looking at contemporary cinema or the classics, the Tomatometer can serve as shorthand for a film's critical reception, if not its box office success.

"All our cultural critics"? "Contemporary cinema or the classics"? Well, at least it gets this part right:

The Rotten Tomatoes website, created in 1999, aggregates reviews from hundreds of newspapers and websites across the country, converts each review into a thumbs-up ("fresh") or thumbs-down ("rotten") rating, and then combines those assessments into a single "Tomatometer" rating that gives the percentage of positive reviews.

And that's the source of another inequity. Who determines what constitutes a positive or negative review? That depends. If a critic uses some kind of ratings system (stars, a numerical score, letter grades), then anything above 50 percent is counted as "fresh." (And at least 60 percent of a film's reviews must be positive for the film itself to keep a "fresh" rating.) Which means that, on a four-star scale like the one Roger Ebert uses (much to his chagrin), 0 to 2 stars is "rotten" and 2.5 to 4 stars is "fresh."
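
For the arithmetically inclined, here is a rough sketch of that rule in Python, using made-up star ratings and hypothetical function names; it illustrates the thresholds described above and is not Rotten Tomatoes' actual code:

    # Illustrative sketch of the Tomatometer arithmetic (not RT's actual code).

    def is_fresh(stars, scale=4.0):
        """A review counts as 'fresh' if its rating is above the midpoint of its scale."""
        return (stars / scale) > 0.5

    def tomatometer(star_ratings, scale=4.0):
        """Percentage of reviews that count as fresh."""
        fresh = sum(1 for s in star_ratings if is_fresh(s, scale))
        return 100.0 * fresh / len(star_ratings)

    # A 2.5-star near-miss counts exactly the same as a 4-star rave ("fresh"),
    # and a 0-star pan counts exactly the same as a 2-star shrug ("rotten").
    reviews = [2.5, 4.0, 3.0, 2.0, 0.5, 3.5]
    score = tomatometer(reviews)      # 66.7
    film_stays_fresh = score >= 60    # the film keeps its "fresh" rating at 60 percent or better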

At Metacritic, another review-aggregating site, at least an attempt is made to assign each review a numerical score between 0 and 100. So, on that same four-star scale, 2 stars would be 50 and 3 stars would be 75. The "Metascore" is then an average (weighted, by Metacritic's own account) of all those numbers. (Also, RT includes many unknown bloggers and online outlets, while MC "distills the opinions of the most respected critics writing online and in print...".)
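
A Metascore-style average, by contrast, preserves those gradations. Here is an equally rough sketch with the same made-up numbers (Metacritic's exact conversion and weighting aren't public, so this is only an illustration):

    # Illustrative sketch of a Metacritic-style average (not MC's actual code).

    def to_hundred_scale(stars, scale=4.0):
        """Map a star rating onto 0-100: 2 of 4 stars becomes 50, 3 of 4 becomes 75."""
        return 100.0 * stars / scale

    reviews = [2.5, 4.0, 3.0, 2.0, 0.5, 3.5]
    metascore = sum(to_hundred_scale(s) for s in reviews) / len(reviews)
    # Roughly 65 here; a lukewarm 2.5-star review now counts for less than a rave.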

But both Rotten Tomatoes and Metacritic have to assign values (fresh or rotten at RT; 0-100 at MC) to reviews that aren't accompanied by measurable ratings, and that's where things get even slipperier. Let's take "Super 8," for example. As I write this it has an 83 percent rating from 193 reviews (160 fresh and 33 rotten) on RT (and a 72 on MC). Select the All Critics tab (sorted by Most Recent, which is the default) and the first three reviews are from BrandonFibbs.com, Mountain Xpress and Q Network Film Desk. Selecting the Top Critics tab instead puts Tom Long of the Detroit News at the top, followed by Lisa Kennedy of the Denver Post and James Berardinelli of ReelViews. (Dana Stevens of Slate is next, which reminds me: Why isn't Stephanie Zacharek, formerly of Salon, now of Movieline, also considered a "Top Critic"?)

Peter Howell of the Toronto Star gave "Super 8" 2.5 stars out of 4 and is quoted on the RT site: "Abrams fashions 'Super 8' in such a calculating manner, with every element weighed both for maximum nostalgia value and ironic hipster cred, that it has an artificial feel to it. It's like a birthday cake made of spun glass." This appears as a "fresh" review, though it may not sound like one. So does Rick Groen's 2.5-star review in the Toronto Globe and Mail, represented by the quote: "It's the child actors, heroic indeed, who rescue 'Super 8' from the blockbuster grip of its adult makers."

The New York Times doesn't use a scale, but A.O. Scott's decidedly mixed review is considered "fresh": "Attempts the difficult feat of balancing self-consciousness about the olden days with wide-eyed, headlong, present-tense fun. For about an hour it succeeds marvelously. The modest letdown that follows exposes the limitations of Mr. Abrams's imagination."

You get the idea. A review that trashes the film gets the same rotten rating as one that finds it a near-miss. A rave is just as "fresh" as a lukewarm approval.

I find the Rotten Tomatoes site (and its predecessor, Movie Review Query Engine, or MRQE) useful in helping me to find reviews and information about individual titles in one spot. I can glance at a page and see who has reviewed a movie and then click over to read the entire review. (Google still can't sort search results for movie reviews that finely!) But none of this really has anything to do with monitoring or comparing Hollywood career trajectories. For that we still have palm readers and TMZ. The Tomatometer is a reliable way of measuring Tomatometer ratings. That's about it.
