Editorial: Why Rotten Tomatoes Scores Are An Awful Metric

Here is why Rotten Tomatoes scores should be ignored.

Editorial Opinion
By HeyWait - Apr 07, 2016 11:04 AM EST
Filed Under: Batman vs. Superman
Hi everyone, this is my first article, so please forgive the mistakes and the English (it is not my first language). My favorite comic book character is Superman, but I love many characters from DC, Marvel, Image and Vertigo (I don't consider it DC) as well. Usually I am not very vocal; I would rather read the fights than fight them myself (I am a pacifist, a pussy if you will), but the obsession many users on this site have with Rotten Tomatoes Scores (RTS from now on) is driving me crazy, so I need to jump in.
 
For me, it all began with the BvS thread. Was this movie as successful as expected, and did it live up to the excitement leading up to it? Resoundingly, NO. Was it the piece of sh*t many users on this site make it out to be? Also, resoundingly, NO. I guess I am one of the few people who seem to have enjoyed the movie, but I still took it for what it was: an entertaining, visual movie with an average story.
 
Let's assume we are all nice and we all wanted the movie to be a great success, critically and at the box office. Instead, we got what we got, and now many of us are disappointed with the movie. Up to here, no problem. My problem is when disappointed viewers use the RTS as validation of their frustration, to sh*t on the movie and on the people who liked it while they are down. Why does it bother me? From a personal point of view, because I see this site as a community and I wish we were all more civil and less warlike toward each other; and from an objective point of view, because statistically speaking the RTS is crap. Let's dig in.
 
RT collects scores from hundreds of critics and normalizes them to fit a 0 to 100 scale; for example, if a critic gave a movie 2.5 stars out of 5, his/her RT score is 50 out of 100 (50%). RT then takes all the critics' scores and aggregates them into a single metric, the RTS. Once at least 80 critics' scores have been aggregated, RT publishes a consensus RTS for critics, fans and users all over the world to see. An RTS equal to or greater than 60% means the movie is "Fresh"; otherwise it is "Rotten". Overall, this process is fine, unless you are a critic and do not like to see your nuanced opinion reduced to an arbitrary number, but that is their problem, not ours. My issue is not really that we are aggregating critics' scores into the RTS, but how we are doing it. The way RT scores are aggregated implicitly dismisses middle-ground opinions in favor of the extremes.
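
A quick illustration of that normalization step, just to make it concrete (the normalize helper is a hypothetical name of mine, not anything RT exposes):

def normalize(stars, out_of=5):
    """Convert a star rating to the 0-100 scale, e.g. normalize(2.5) -> 50.0."""
    return 100 * stars / out_of

print(normalize(2.5))       # 50.0
print(normalize(3, 4))      # 75.0 (a 3-out-of-4 review)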
 
This is how the RTS works. Before any reviews have come out for a movie, two counters are set to zero: Movie Positive Reviews = 0 and Movie Total Reviews = 0. RT considers that a critic likes the movie if his/her score is greater than or equal to 60%, and counts that review as positive. For every positive review, both the Movie Positive Reviews counter and the Movie Total Reviews counter are increased by one:
 
Movie Positive Reviews = Movie Positive Reviews + 1
Movie Total Reviews = Movie Total Reviews + 1
 
Note how a score of 60 and a score of 100 are weighted equally in the Positive Reviews counter.
 
If a critic does not like the movie, i.e. he/she gave the movie a score < 60%, only the Total Reviews counter is increased by one:
 
Movie Positive Reviews = Movie Positive Reviews
Movie Total Reviews = Movie Total Reviews + 1
 
Note how a score of 0 and a score of 59 have the same effect on the Positive Reviews counter (none).
 
The final RTS is calculated as the ratio between Movie Positive Reviews and Movie Total Reviews:
 
RTS = (Movie Positive Reviews)/(Movie Total Reviews) × 100%.
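
To make the counting concrete, here is a minimal sketch in Python of the scheme described above (the function name and the 60% cutoff just mirror my description; this is not RT's actual code):

def rotten_tomatoes_score(scores):
    """Aggregate critic scores already normalized to 0-100 as described above:
    any score >= 60 counts as exactly one positive review."""
    positive = sum(1 for s in scores if s >= 60)   # Movie Positive Reviews
    total = len(scores)                            # Movie Total Reviews
    return 100 * positive / total                  # RTS as a percentage

# A 59 and a 0 contribute exactly the same thing to the score: nothing.
print(rotten_tomatoes_score([59, 59, 0, 100]))     # 25.0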
 
This is an awful metric for movie quality or anything else - 
 
Imagine 100 critics watch two movies. They all agree that neither was great and neither was crap, but that the second one was slightly better. 50 critics gave movie 1 a score of 57 and 50 critics gave it a score of 59, while 50 critics gave movie 2 a score of 60 and 50 critics gave it a score of 62. If we use the RTS to evaluate the critics' opinion of the movies, we would see that movie 1 has an RTS of 0 (all critics gave it a score < 60), while movie 2 has an RTS of 100 (all critics gave it a score of at least 60). Conclusion: critics think that movie 1 is the worst movie ever produced and movie 2 the best movie ever produced. Nothing could be further from the truth. An honest metric would show us the average opinion of the critics together with the variation in their opinions; in statistics, these are the mean and standard deviation of a measurement. In the case of movie 1 the score would be 58 +- 1, and in the case of movie 2 the score would be 61 +- 1. These measurements show that the critics still consistently liked movie 2 slightly better than movie 1, that critical opinion is not divided, and that critics do not think movie 1 was crap or movie 2 the best movie ever made. These scores reflect reality much better than the RTS (0 vs 100), and they would be equally easy for RT to calculate.
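
If you want to check those numbers yourself, here is a small Python sketch of that hypothetical example (the review scores are invented for the thought experiment, not real data):

from statistics import mean, pstdev

movie_1 = [57] * 50 + [59] * 50   # 100 invented critic scores per movie
movie_2 = [60] * 50 + [62] * 50

for name, scores in [("movie 1", movie_1), ("movie 2", movie_2)]:
    rts = 100 * sum(s >= 60 for s in scores) / len(scores)
    print(f"{name}: RTS = {rts:.0f}%, "
          f"mean +- stdev = {mean(scores):.0f} +- {pstdev(scores):.0f}")

# movie 1: RTS = 0%, mean +- stdev = 58 +- 1
# movie 2: RTS = 100%, mean +- stdev = 61 +- 1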
 
We could include only the mean of the critic scores as a metric, but that would only partially reflect the critical reception of a movie. Including the standard deviation together with the mean tells us not only what the consensus on a movie was, but also how divisive the movie was. Think of a movie like Man of Steel: its RTS is 56% (its mean score is 62%, which would have made it Fresh, although barely). RT does not provide the standard deviation of Man of Steel's mean score, and I am not willing to go through 291 reviews to calculate it. However, it will not stretch anyone's imagination to assume that the 162 positive reviews were very high, 88% for example, and the 129 negative reviews were quite low, 30% for example. I chose these numbers because they average to the actual mean score of Man of Steel ([162*88 + 129*30]/291 = 62), while they also reflect how divisive the movie was. Calculated with these oversimplified numbers, Man of Steel's score would be 62% +- 29%. This score indicates that on average people thought the movie was OK, but it also shows that the movie was very divisive, with people strongly liking or disliking it. From these numbers you could infer that if you don't like Snyder, or DC, or darker takes on superheroes, you may want to skip the movie; however, if you happen to like DC, Snyder or darker superhero movies, you may infer that this is the movie for you. All of these are things we already knew, but it is still a nice thought exercise. At least I think so!
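
The same kind of sketch checks that back-of-the-envelope Man of Steel calculation (remember, the 88% and 30% figures are my assumption, not RT data):

from statistics import mean, pstdev

scores = [88] * 162 + [30] * 129   # 291 simplified "reviews"

rts = 100 * sum(s >= 60 for s in scores) / len(scores)
print(f"RTS = {rts:.0f}%, "
      f"mean +- stdev = {mean(scores):.0f}% +- {pstdev(scores):.0f}%")
# RTS = 56%, mean +- stdev = 62% +- 29%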
 
To finish: I think the RTS is an intellectually dishonest score. Using the mean +- stdev of the critics' and the general audience's scores would reflect much better the average sentiment critics and the general audience have toward a movie, while also providing information on how divisive it was. As with any score, there are flaws; there is the danger of outliers, i.e. fanboys/haters giving a movie very high or very low scores to help/destroy its score before they have even seen it. This can be addressed by eliminating the top and bottom 5% of the scores (i.e. all the 100s and 0s that come in before the movie is even out) when calculating the movie's final score metric, as in the sketch below.
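
As a sketch of that outlier trimming (my own suggestion, not anything RT actually does), dropping the top and bottom 5% of scores before computing mean +- stdev could look like this:

from statistics import mean, pstdev

def trimmed_summary(scores, trim=0.05):
    """Mean and stdev after dropping the highest and lowest `trim` fraction."""
    ordered = sorted(scores)
    cut = int(len(ordered) * trim)                 # scores to drop at each end
    kept = ordered[cut:len(ordered) - cut] if cut else ordered
    return mean(kept), pstdev(kept)

# Ten pre-release 0s and 100s barely move the trimmed summary.
audience = [0] * 5 + [100] * 5 + [55, 60, 65, 70, 75] * 18
m, s = trimmed_summary(audience)
print(f"{m:.0f} +- {s:.0f}")                       # 65 +- 7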
 
Would you rather see RT use this instead of the RTS? Do you agree with me that mean +- stdev is a fairer and more informative way to score a movie?
 
Thanks for reading : )

PS - the mean score of BvS is 49%, still not great, but far above 29%.

ossie85 - 4/7/2016, 3:47 PM
There is no perfect metric.

Movies are art, art is subjective.

Rotten Tomatoes doesn't pretend to be anything more than it is - the percentage of critics who reviewed the movie positively.

But imdb, boxoffice, etc are also flawed.

For me, I find RT to be the most reliable. But I do disagree with it at times. Because I like what I like.
HeyWait - 4/7/2016, 3:53 PM
@ossie85 - First thanks for reading. I appreciate you taking the time.
I agree there is no perfect metric. What bothers me about RT is that middle scores get pushed to the extremes, making things look either awesome or bad, and if people don't pay attention to how the score is calculated, that is very misleading.
01928401 - 4/7/2016, 4:47 PM
@ossie85 - I disagree with your perception of film. It is art, but there is also a standard.

Basically, a movie's level of success revolves solely around its intention. If a movie was trying to be a silly, goofball, fart joke comedy, and it succeeds at doing those things in its own way (in addition to the imperatives of cohesion, coherence, and the few other distinctions), then it is a good movie.

Films must meet a certain standard. If a film does not, it is objectively rated lower. The subjective part is whether someone likes it or not. If you like Adam Sandler's Jack and Jill, great. Enjoy the shit out of it. But your opinion does not affect the movie's level of being good or bad.
ossie85 - 4/7/2016, 8:44 PM
@PietroJaximoff - ^ that I agree with
Donovan - 4/8/2016, 7:41 PM
@PietroJaximoff - & that's how the industry works, sometimes apart from what a film is as an art form, in this particular industry.
Yaf - 4/7/2016, 4:37 PM
I personally prefer Metacritic; it gives a more balanced picture. But either way, I have critics whom I trust, and I read their reviews of things, be it a video game or a movie, rather than these aggregation sites.
HeyWait - 4/7/2016, 6:56 PM
@Yaf - I agree, better to get recommendations from someone you trust than an aggregated score. Still, a lot of people use RT as a guide without realizing how skewed the scores are.

Take care
MikeJulz87 - 4/7/2016, 11:22 PM
I think they should have a rating in between. Someone wrote an article about how most of those critics on RT who gave a Rotten actually gave a 3/5 or a C+, which is not a bad rating at all.
Maybe they should have two ratings in between, NEAR ROTTEN and NEAR TOMATO.
HeyWait - 4/8/2016, 9:34 AM
@MikeJulz87 - I agree the scoring doesn't work when a C+ counts for the same as an E or an F. Middle scores would work. But the average +- stdev of the scores would clearly show where the movie stands.

Thanks for reading and take care.
nibs - 4/8/2016, 5:11 AM
RT actually has more problems with it than just that.

But you're right, some people see 29% and think the movie is a 2.9/10, when really it just means 29% of reviewers liked it. Some of us agree with the 29%, some of us don't.

It's also important to remember that many critics are Hollywood obsessed douchebags like Bruce Vilanch, and no real person would ever share the same opinion as them, ever.

And then, as much as many people might not agree with me, if Disney were to slip someone like Anaid Ramírez of Time Out Mexico a couple of bucks to write a bad review, they could really hurt the competition for very little money.

The important part is to watch movies yourself and form your own opinion. If you want to tell me why a movie is good or bad, don't show me the RT score. Tell me using your own words.
huckfinnisher - 4/8/2016, 8:59 AM
Great editorial, I wish they would take your advice! It would make RT a site I actually listen to.
HeyWait - 4/8/2016, 9:11 AM
@huckfinnisher - thanks for reading it and also for the kind comment :)
Take care
Donovan - 4/8/2016, 7:45 PM
I would like to think that, thanks to the controversy caused by how BvS has been treated in many ways, we could now watch movies in a different, probably better way, instead of trusting personal opinions, critics, or review systems... it's hard to keep it free of subjective views.
HeyWait - 4/8/2016, 8:06 PM
@Donovan - I agree; in the end, all reviews are subjective unless they focus on measurable technical qualities of a movie. But unless we are willing/able to watch all the movies that spark our interest, people will trust a friend, a critic or a website to prioritize, and that's why I think we should make those filters as honest and informative as possible. In the end, a well-done RTS (not the way they currently do it) for critics or the GA would give you information on the consensus of what a large group of people thinks about a movie, which could be a useful tool to assess movie quality or likability. Regrettably, the way it's done now does not provide the nuanced information you need to make the RTS useful.

Take care