Certified Rotten: Appeals to Authority in the Comic Book Movie Debates

What can box office figures and critic aggregation websites tell us about a film's quality? Answer: very little.

Editorial Opinion
By brewtownpsych - Jul 27, 2012 04:07 PM EST
Filed Under: Other

DC vs. Marvel “fanboy wars” are common anywhere people discuss comic book movies. But far too often these discussions quickly turn away from an honest debate about the merits of the work and toward an all-out assault to force the other side to adopt a particular point of view. One common weapon is to point to some external measure (e.g., Rotten Tomatoes) as proof of a film’s quality, a rhetorical technique known as the “appeal to authority.” Instead of arguing from your own reasoning and opinions, you take a three-step approach to win: 1) establish that a particular authority is the final word on the truth of an assertion, 2) cite that authority’s conclusion, and 3) deduce that the conclusion is therefore true. But how do we establish that a particular authority is the definitive source of truth? Such is the problem facing those who make this type of argument.

One of the most common appeals to authority is how well a movie does at the box office. The argument is straightforward enough: people know what is good, and loads of people went to see the movie; therefore the movie is great. The problem is that the first assumption is spurious. People have notoriously bad taste in movies. Spider-Man 3 out-grossed Spider-Man 2 even though it is widely considered the nadir of the series, while Spider-Man 2 is widely considered one of the best comic book movies ever. Check any weekend’s top-grossing movies and you will see that, while there may be some good movies in the mix, the majority are average or below average. Mediocre and bad films sell tickets weekend after weekend.

And while it is mostly true that, to make the “highest grossing film” list, a movie must be of at least a certain quality, that is by no means a guarantee of greatness. If you want to play the authority game, a quick survey of the top 25 highest-grossing movies of all time shows an average rating of 7.0 using Rotten Tomatoes’ average critics score, hardly indicative of the best movies of all time. And none – zero, zilch, nada – of the top 25 grossing movies appear on AMC’s or AFI’s list of the top 25 greatest movies. If box office returns were such a good proxy for quality, wouldn’t you expect at least one film to appear on both the top-grossing list and any of the well-known and respected “best” lists?
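If you wanted to run that check yourself, it boils down to one average and one set intersection. Here is a minimal sketch in Python; the titles and scores are placeholders for illustration, not the actual 2012 lists.

```python
# Sketch of the box-office check described above: average the critic
# scores of the top-grossing films, then see whether any of them also
# appear on a "greatest films" list. Titles and scores are
# placeholders, not the real 2012 data.

top_grossing = {           # film -> average critic score out of 10
    "Blockbuster A": 7.4,
    "Blockbuster B": 6.1,
    "Blockbuster C": 7.5,
}
greatest_films = {"Classic X", "Classic Y", "Classic Z"}

average = sum(top_grossing.values()) / len(top_grossing)
overlap = set(top_grossing) & greatest_films

print(f"Average critic score of top grossers: {average:.1f}/10")
print(f"Films on both lists: {', '.join(overlap) or 'none'}")
```

With these placeholder numbers the average works out to 7.0/10 and the overlap is empty, which is the shape of the result described above.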

The next most common appeal to authority is critic aggregation sites like Rotten Tomatoes and Metacritic. Just as with box office returns, the strength of this argument depends on how reliable these sites are as a gauge of quality. Rotten Tomatoes offers a “tomatometer” score and an “average rating,” while Metacritic assigns an average critics score, or “metascore.” Generally, individuals with the appeal-to-authority mindset will cherry-pick whichever score best fits their narrative and dismiss the others. Either way, each score has its flaws as a measure of quality.

Both sites’ average scores share one basic problem: the imperfect nature of assigning a numerical value to a review. Each site assigns every review a percentage rating, either based on the critic’s own rating or according to the site’s own, mysterious methodology. In those instances where the critic gives a rating with no exact numerical equivalent (e.g., a C- or a “thumbs up”) or no rating at all, subjectivity and imprecision are the rule. Irrespective of how these black-box determinations are made, it is clear that they are imperfect measures at best. Even the very critics whose views make up the meat of these rankings question their validity. Wikipedia quotes Armond White, chairman of the New York Film Critics Circle, as saying that “[critic aggregation websites take] revenge on individual expression … by … dumping reviewers onto one website and assigning spurious percentage-enthusiasm points to … reviews.”
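To see why that normalization is inherently lossy, consider a rough sketch of the conversion. The mappings below are invented for illustration; neither site publishes its exact rules.

```python
# Hypothetical sketch of normalizing mixed review formats to a 0-100
# scale. The mappings are invented for illustration; neither Rotten
# Tomatoes nor Metacritic publishes its exact conversion rules.

LETTER_GRADES = {
    "A": 95, "A-": 90, "B+": 85, "B": 80, "B-": 75,
    "C+": 70, "C": 65, "C-": 60, "D": 40, "F": 10,
}

def normalize(review):
    """Map a review in one of several formats to a 0-100 value."""
    kind, value = review
    if kind == "stars":          # e.g. 3 out of 4 stars
        return 100 * value / 4
    if kind == "letter":         # e.g. "C-"
        return LETTER_GRADES[value]
    if kind == "thumbs":         # binary verdict, forced onto a scale
        return 80 if value == "up" else 30
    raise ValueError(f"unknown review format: {kind}")

reviews = [("stars", 3), ("letter", "C-"), ("thumbs", "up")]
print([normalize(r) for r in reviews])  # [75.0, 60, 80]
```

Whether a “thumbs up” is worth 60, 80, or 95 is a judgment call, and each choice nudges the aggregate score in a different direction.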

But these average scores have another problem: the pool of critics is suspect. For Rotten Tomatoes, most big films draw upward of 250 critical opinions, including just about any online publication or newspaper. The reviews on www.comicbookmovie.com are likely just as good, if not better, than just about any sample you can find on Rotten Tomatoes (apart from perhaps the most well-known critics). With so many critics in the mix, the line between critic and fan blurs, and what you are left with is essentially popular opinion. And lest you think Metacritic’s metascore is our salvation, that site has the same problem, only in reverse: Metacritic aggregates reviews from only a small number of generally well-known critics. For some this is a plus, but for others it does not provide a broad enough sample. Which is better is probably a coin flip, but neither is very precise.

Finally there is the Rotten Tomatoes tomatometer, perhaps the most flawed measure of all despite being the most relied upon. Once each review is assigned a positive or negative value, the tomatometer tracks the tally, with 60% or more positive reviews earning a “fresh” rating and 75% or more a “certified fresh” rating. But the score does not tell us how much a critic liked the movie (or anything else about the movie, for that matter). A simple illustration might help demonstrate just how imperfect a gauge the tomatometer is. Assume film A has four reviews: 6/10, 7/10, 6/10, and 1/10, giving it a “certified fresh” rating of 75%. This is true despite the fact that three of the four reviewers just barely liked the movie and one reviewer gave it the worst possible rating. On the other hand, assume film B also has four reviews: 5/10, 5/10, 8/10, and 10/10, giving it a “rotten” rating of 50%, despite the fact that the two “negative” reviews were actually split on the movie and the two positive reviews gave the film very high marks. The tomatometer’s rankings clearly give a misguided impression of these films, considering film B’s average score (7/10) is actually higher than film A’s (5/10).
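To make that arithmetic concrete, here is a minimal sketch contrasting a tomatometer-style tally with a plain average, using the two hypothetical films above and a simplified 60% threshold for “fresh” (the real methodology has additional rules).

```python
# Minimal sketch: contrast a tomatometer-style percent-positive tally
# with a plain average score, using the hypothetical films above.
# Assumes a review of 6/10 or higher counts as "positive" and that a
# 60% share of positive reviews counts as "fresh"; this is a
# simplification of the real methodology.

def percent_positive(reviews, positive_cutoff=6):
    """Share of reviews at or above the cutoff, as a percentage."""
    positive = sum(1 for r in reviews if r >= positive_cutoff)
    return 100 * positive / len(reviews)

def average_score(reviews):
    """Plain arithmetic mean of the review scores (out of 10)."""
    return sum(reviews) / len(reviews)

film_a = [6, 7, 6, 1]   # three lukewarm positives, one pan
film_b = [5, 5, 8, 10]  # two split "negatives", two raves

for name, reviews in [("Film A", film_a), ("Film B", film_b)]:
    pct = percent_positive(reviews)
    label = "fresh" if pct >= 60 else "rotten"
    print(f"{name}: {pct:.0f}% positive ({label}), "
          f"average {average_score(reviews):.1f}/10")

# Output:
# Film A: 75% positive (fresh), average 5.0/10
# Film B: 50% positive (rotten), average 7.0/10
```

Film B loses the tally even though its average runs two full points higher, which is exactly the distortion described above.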

In the end, appeals to authority as a way to measure the quality of a movie are dubious at best. Filmmaking is an art form, and as such it is difficult to measure numerically. Box office returns, critic aggregation sites, and even “best of” lists are some of the many tools you could use to help decide whether a movie is liked by critics or fans. But they are not meant to be authoritative measures that take the place of well-thought-out opinions.


brewtownpsych
brewtownpsych - 7/27/2012, 5:00 PM
@SteveRogers - I would argue yes, these kinds of stats are generally irrelevant.

@vegakingblade - actually, I am arguing that none of these figures means very much, so by no means am I cherry-picking one over the other.
bagadoosh
bagadoosh - 7/27/2012, 5:15 PM
kinda like pulling quotes from the bible to make a point (I'm not trying to start a religious war)
marvel72
marvel72 - 7/27/2012, 5:16 PM
I honestly can't see a problem with it; both sets of fans can use it.

It's fine when you, say, compare two comic book movies against each other; it gets stupid when you start comparing films from other genres.

There Will Be Blood 91% vs. The Avengers 92%

Of course There Will Be Blood is the better film; as I said, the two of them shouldn't be compared against each other.
Ranger14
Ranger14 - 7/27/2012, 9:12 PM
You seem to leave out the fact that RT also posts the overall average critics' score along with the fresh/rotten tomatometer scale. For example, the overall rating of TASM by the critics listed on RT is 6.8/10, while TDKR is at 8/10.
brewtownpsych
brewtownpsych - 7/27/2012, 9:25 PM
@ranger -- I discuss the average score for both RT and Metacritic.
Ranger14
Ranger14 - 7/27/2012, 9:47 PM
Sorry about that, I guess you did put it in there. The thing is, for most of the more popular films, the actual tomatometer is typically higher than the actual average score.
scmittydude
scmittydude - 7/27/2012, 11:59 PM
Answer? Absolutely nothing.
You decide what you like, not someone else.
Tainted87
Tainted87 - 7/28/2012, 7:28 AM
The word "real" is derived from Anglo-French "roial", which later evolved to be "royal". What is "real" is what we define as the consensus of an authority.

As such, we find truth in the gathered opinions of critics, over a dozen of them. To suggest otherwise is to accuse the critics of falsifying their reviews to make a movie look bad so that it will make less money and the other features will attract more audiences - one theory I have found particularly distasteful, but not difficult to believe.
SpiderFan35
SpiderFan35 - 7/30/2012, 10:55 AM
Yeah, what Yoss said.