DC vs. Marvel “fanboy wars” are common anywhere people discuss comic book movies. But far too often these discussions quickly turn away from an honest debate about the merits of the work and toward an all-out assault to force the other side to adopt a particular point of view. One common weapon is to point to some external measure (e.g., Rotten Tomatoes) as proof of a film’s quality, a rhetorical technique known as the “appeal to authority.” Instead of defending a position with your own reasoning and opinions, you simply take a three-step approach to win the argument: 1) establish that a particular authority is the final word on the truth of an assertion, 2) report that authority’s conclusion, and 3) declare the conclusion therefore true. But how do we prove a particular authority is the definitive source of truth? Such is the problem facing those who make this type of argument.
One of the most common appeals to authority is how well a movie does at the box office. The argument is straightforward enough: people know what is good, and loads of people went to see the movie; therefore the movie is great. The problem is that the first assumption is spurious. People have notoriously bad taste in movies. Spider-Man 3 out-grossed Spider-Man 2 even though it is widely considered the nadir of the series, while Spider-Man 2 is widely regarded as one of the best comic book movies ever made. Check any weekend’s top-grossing movies and you will see that while there may be some good films in the mix, the majority are average or below average. Mediocre and bad films sell tickets weekend after weekend.
And while it is mostly true that a movie must be of at least a certain quality to make the “highest grossing film” list, that is by no means a guarantee of greatness. If you want to play the authority game, a quick survey of the top 25 highest-grossing movies of all time shows an average rating of 7.0 using Rotten Tomatoes’ average critics score, hardly indicative of the best movies of all time. And none – zero, zilch, nada – of the top 25 grossing movies appear on AMC’s or AFI’s lists of the 25 greatest movies. If box office returns were such a good proxy for quality, wouldn’t you expect at least one film to be on both the top-grossing list and any of the well-known and respected “best” lists?
The next most common appeal to authority is critic aggregation sites like Rotten Tomatoes or Metacritic. Just as with box office returns, the strength of this argument rests on how reliable these sites are as a gauge of quality. Rotten Tomatoes offers a “tomatometer” score and an “average rating,” while Metacritic assigns an average critics score, or “metascore.” Generally, individuals with the appeal-to-authority mindset will cherry-pick whichever score best fits their narrative and dismiss the others. Either way, each score has its flaws as a measure of quality.
Both sites’ average scores share one basic problem: the imperfect nature of assigning a value to each review. Both sites assign each review a percentage rating, either based on the critic’s own rating or according to the site’s own, mysterious methodology. In those instances where the critic gives a rating without an exact numerical score (e.g., a C- or a “thumbs up”) or no rating at all, subjectivity and imprecision are the rule. Irrespective of how these black-box determinations are made, it is clear that they are imperfect measures at best. Even the very critics whose views make up the meat of these rankings question their validity. Wikipedia quotes Armond White, chairman of the New York Film Critics Circle, as saying that “[critic aggregation websites take] revenge on individual expression … by … dumping reviewers onto one website and assigning spurious percentage-enthusiasm points to … reviews.”
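To illustrate how much that conversion step alone can move the numbers, here is a small hypothetical sketch: two made-up letter-grade-to-percentage tables (neither reflects Rotten Tomatoes’ or Metacritic’s actual, unpublished methodology) applied to the same four reviews produce noticeably different averages.

```python
# Hypothetical illustration only: two made-up tables for converting letter
# grades to percentages. Neither reflects Rotten Tomatoes' or Metacritic's
# actual (unpublished) conversion rules.

MAPPING_1 = {"A": 95, "A-": 90, "B+": 85, "B": 80, "B-": 75,
             "C+": 70, "C": 60, "C-": 50, "D": 35, "F": 10}
MAPPING_2 = {"A": 100, "A-": 91, "B+": 83, "B": 75, "B-": 67,
             "C+": 58, "C": 50, "C-": 42, "D": 25, "F": 0}

reviews = ["B+", "C-", "A-", "C"]  # the same four letter-grade reviews

for name, table in [("Mapping 1", MAPPING_1), ("Mapping 2", MAPPING_2)]:
    scores = [table[grade] for grade in reviews]
    print(f"{name}: average score {sum(scores) / len(scores):.2f}")
```

The same four reviews land roughly five points apart (71.25 vs. 66.50) depending on which table you choose, and that spread is entirely an artifact of the conversion, not of anything the critics wrote.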
But these average scores have another problem: the pool of critics is suspect. For Rotten Tomatoes, most big films draw on upward of 250 critical opinions, sourced from just about any online publication or newspaper. The reviews on www.comicbookmovie.com are likely just as good as, if not better than, almost any sample you can find on Rotten Tomatoes (apart from perhaps the best-known critics). With so many critics in the mix, the line between critic and fan becomes blurred, and what you are left with is essentially popular opinion. And lest you think Metacritic’s metascore is our salvation, that site has the same problem, only in reverse: it aggregates reviews from only a small number of generally well-known critics. For some this is a plus, but for others it does not provide a broad enough sampling. Which approach is better is probably a coin flip, but neither is very precise.
Finally there is the Rotten Tomatoes tomatometer, perhaps the most flawed of all despite being the most relied upon. Once a review is assigned a value, the tomatometer simply tracks the share of positive and negative reviews, with 60% or more positive reviews earning a “fresh” rating and 75% or more earning a “certified fresh” rating. But the score does not tell us how much a critic liked the movie (or anything else about the movie, for that matter). A simple illustration might help demonstrate just how imperfect a gauge the tomatometer is. Assume film A has four reviews: 6/10, 7/10, 6/10, and 1/10, giving it a “certified fresh” rating of 75%. This is true despite the fact that three of the four reviewers just barely liked the movie and one reviewer gave it the worst possible rating. On the other hand, assume film B also has four reviews: 5/10, 5/10, 8/10, and 10/10, giving it a “rotten” rating of 50%, even though the two “negative” reviews were actually split about the movie and the two positive reviews gave the film very high marks. In this case the tomatometer clearly gives a misleading impression of these films, considering that film B’s average score (7/10) is actually higher than film A’s (5/10).
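To see the arithmetic plainly, here is a minimal sketch in Python using the numbers above. It assumes a simplified rule in which any review of 6/10 or higher counts as positive, which is a stand-in for, not a reproduction of, Rotten Tomatoes’ actual classification.

```python
# Minimal sketch: a percent-positive ("tomatometer"-style) score vs. a plain
# average, using the film A and film B reviews from the example above.
# Assumption: any review of 6/10 or higher counts as "positive"; the real
# Rotten Tomatoes classification is more involved than this.

def percent_positive(ratings, threshold=6):
    """Share of reviews at or above the threshold, as a percentage."""
    positive = sum(1 for r in ratings if r >= threshold)
    return 100 * positive / len(ratings)

def average_rating(ratings):
    """Plain arithmetic mean of the review scores (out of 10)."""
    return sum(ratings) / len(ratings)

film_a = [6, 7, 6, 1]    # three lukewarm reviews and one pan
film_b = [5, 5, 8, 10]   # two split reviews and two raves

for name, ratings in [("Film A", film_a), ("Film B", film_b)]:
    print(f"{name}: {percent_positive(ratings):.0f}% positive, "
          f"average {average_rating(ratings):.1f}/10")
```

Under that simplified rule, film A comes out 75% positive with a 5.0 average while film B comes out 50% positive with a 7.0 average: exactly the inversion described above.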
In the end, appeals to authority as a way to measure the quality of a movie are dubious at best. Filmmaking is an art form, and as such it is difficult to measure numerically. Box office returns, critic aggregation sites, and even “best of” lists are some of the many tools you could use to help gauge whether a movie is liked by critics or fans. But they are not meant to be authoritative measures that take the place of well-thought-out opinions.