The popular film review site Rotten Tomatoes gives its users two different scores to help them judge the quality of a movie. There's the Tomatometer score, which tells us what percentage of critics gave the film a generally positive review. And there's the Average Rating score, which tells us the average score critics gave the film on a one-to-ten scale. These two numbers seldom agree.
Broadly speaking, the value that we, the viewing public, truly want to know is the Average Rating score. Yet advertisers prefer to use the Tomatometer score. I suspect this is because the Tomatometer counts a mildly positive review the same as a rave, which makes "okay" films look better than they really are. But that's not what I wanted to investigate. I wanted to see if there are movies out there that break Rotten Tomatoes. That is, I wanted to see whether there was a solid relationship between the Tomatometer scores and the more valuable Average Rating scores, and whether any movies broke that trend.
I compiled a list of nearly 275 films along with their Tomatometer and Average Rating scores, rescaling the Average Ratings to a 0-to-100 scale so the two numbers are directly comparable. This is what the relationship between the two looks like:
I was surprised to find that there wasn't a ton of noise in the data. I plotted each movie as a blue dot, and there was a pretty clear trend between the Tomatometer score and the Average Rating score (plotted as the red line in the figure above). This meant that with just a little math, I could predict how good a movie was supposed to be based on its Tomatometer score alone. But there was something odd in the data: some films' Average Rating scores weren't anywhere close to what the trend predicted. What was going on here?
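If you want to try something like this yourself, the fit only takes a few lines of Python. This is a sketch rather than my exact analysis: the file name and column names are placeholders for my spreadsheet, and I've used a simple straight-line fit to stand in for the trend line.

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# Placeholder dataset: one row per film, with columns
# "title", "tomatometer", and "avg_rating" (rating rescaled to 0-100).
movies = pd.read_csv("rt_scores.csv")
x = movies["tomatometer"].to_numpy(dtype=float)
y = movies["avg_rating"].to_numpy(dtype=float)

# Least-squares straight-line fit: avg_rating ~ slope * tomatometer + intercept
slope, intercept = np.polyfit(x, y, deg=1)
predicted = slope * x + intercept

# Scatter each movie as a blue dot, with the fitted trend as a red line.
plt.scatter(x, y, color="blue", s=15)
xs = np.linspace(x.min(), x.max(), 100)
plt.plot(xs, slope * xs + intercept, color="red")
plt.xlabel("Tomatometer score")
plt.ylabel("Average Rating score (0-100)")
plt.show()
```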
From my list of movies, I isolated every film whose Average Rating score on Rotten Tomatoes was more than two standard deviations away from where it was expected to be. Only two movies vastly outperformed expectations: the 1994 hit Pulp Fiction (94 Average Rating score, 83 predicted) and the 2014 Nicolas Cage vehicle Dying of the Light (39 Average Rating score, 31 predicted). Not a lot to go on. But the films that vastly underperformed expectations showed real promise.
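Concretely, the two-standard-deviation cut looks something like the sketch below, which repeats the same placeholder fit from above. The cutoff is measured against the spread of the residuals, i.e., how far each film's actual rating lands from the trend line.

```python
import numpy as np
import pandas as pd

movies = pd.read_csv("rt_scores.csv")       # placeholder file name, as before
x = movies["tomatometer"].to_numpy(dtype=float)
y = movies["avg_rating"].to_numpy(dtype=float)

slope, intercept = np.polyfit(x, y, deg=1)  # same straight-line fit as above
residuals = y - (slope * x + intercept)     # actual rating minus predicted rating
cutoff = 2 * residuals.std()                # two standard deviations of the residuals

# Films rated far better than the trend predicts (e.g., Pulp Fiction)...
overperformers = movies[residuals > cutoff]
# ...and films rated far worse than their Tomatometer score suggests.
underperformers = movies[residuals < -cutoff]

print(underperformers[["title", "tomatometer", "avg_rating"]])
```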
Saying that these films vastly underperformed expectations might be a little misleading: what I mean is that, given their Average Ratings, these films received a higher percentage of positive reviews than expected. There were eight films whose Average Rating scores were more than two standard deviations lower than predicted:
| Movie | Tomatometer Score | Avg. Rating Score | Predicted Score | Difference (AR - Predicted) |
| --- | --- | --- | --- | --- |
| Showgirls | 19 | 31 | 39.37 | -8.37 |
| Four Weddings and a Funeral | 95 | 76 | 84.49 | -8.49 |
| The Crawling Eye | 63 | 51 | 59.54 | -8.54 |
| Plan 9 From Outer Space | 67 | 50 | 61.51 | -11.51 |
| Caligula | 24 | 31 | 42.57 | -11.57 |
| Youngblood | 38 | 36 | 49.45 | -13.45 |
| The Room | 32 | 33 | 46.79 | -13.79 |
| Manos: The Hands of Fate | 7 | 11 | 29.34 | -18.34 |
The clear outlier on this list is 1994's Four Weddings and a Funeral, a hit romantic comedy with generally favorable reviews (a 95% Tomatometer score). All of the remaining films were considered bad, with Average Ratings of 51 or below. But that wasn't the only thing connecting these movies.
Fans of the cult-hit TV show Mystery Science Theater 3000 might notice that many of these films appeared on that show, most (in)famously Manos: The Hands of Fate. From this, I surmised that the movies that break Rotten Tomatoes tend to be films that are widely considered bad but receive abnormally positive reviews thanks to their hapless charm and unintended comedic potential.
In other words, the movies that break Rotten Tomatoes are the ones we commonly refer to as “so bad, they’re good.”