This is, in part, a response to a delightfully written piece by my friend Peter Hall. If you haven’t read his column yet, do yourself a favor and check it out here: http://www.hallbrothersfilm.com/columns2/noteonstandards
I read Peter’s column a few weeks ago and I thought to myself, “What an interesting piece! I should write an article responding to and interacting with it.” Then I did something else (probably played NBA 2K16) and I didn’t come back to it until today. But the concept of quantifying cinema like we quantify football is so simple and so intriguing that I kept coming back to it, and only partially because Peter’s fantasy football team is about to lay the smackdown on my own.
I’ve never fully trusted Rotten Tomatoes. Film aficionados seem to go there for reference more than any other ratings site, but it is an inherently superficial rating system. Because it gathers a large selection of general impressions from dozens of critics of varying repute, its numbers aren’t always an accurate reflection of the quality of a film. The Tomatometer assumes a pre-existing knowledge of a film, and basically addresses whether or not the film meets what we expect it to be. There’s nothing wrong with that, per se, but people tend to look at those percentages and assume they are the be-all and end-all. On the surface, Mission: Impossible Rogue Nation received a glowing 92%, but many of the “fresh” reviews say things like, “mindless, escapist fun,” “visual wit of a Looney Tunes cartoon,” and “don’t try to make sense of it.” A little less lofty, no?
I’ve always personally preferred the IMDB rating system. It’s done by fan vote, and it usually fluctuates when a film is first released, but I generally agree much more with the hard numbers once they even out. Let’s take an example that Peter gives: The Avengers is rated a shade higher on Rotten Tomatoes than The Dark Knight Rises, and he theorizes that this is in part because The Dark Knight Rises carried far higher expectations. On IMDB, The Avengers has a very good rating of 8.1, but TDKR has an even better 8.5. Of note, the latest Mission Impossible has a reasonable 7.7. Doesn’t that all feel about right? No numeric system is ever going to be perfect, but it feels like we have one that at least judges based more on merit than on expectation.
So, with all that said, I have a proposal. Let’s make our own rating system, or Fantasy Cinema if you will. Let’s take Rotten Tomatoes (expectations), IMDB (user ratings), and even throw in box office success for good measure. Here is my proposed (rudimentary) formula:
1 × (RT score out of ten) + 2 × (IMDB score) + 0.5 × (opening weekend, in units of $10 million) = total score
So for Mission: Impossible Rogue Nation we have:
1(9.2) + 2(7.7) + .5(5.5) = 27.35
For The Dark Knight Rises, we get:
1(8.7) + 2(8.5) + .5(10.6) = 31
Now, we’ll take the Indie film Me and Earl and the Dying Girl:
1(8.1) + 2(8.0) + .5(.02) = 24.11
And the total flop Aloha:
1(1.9) + 2(5.5) + .5(.9) = 13.35
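For anyone who wants to play along at home, the formula above is easy to turn into a few lines of Python. This is just a sketch of the scoring as I’ve described it; the function name and the example inputs (RT percentage, IMDB rating, opening weekend in millions of dollars) are my own framing.

```python
def fantasy_score(rt_percent, imdb_rating, opening_millions):
    """Fantasy Cinema score:
    RT score scaled to ten, plus double the IMDB rating,
    plus half a point for every $10 million of opening weekend."""
    return (rt_percent / 10) + 2 * imdb_rating + 0.5 * (opening_millions / 10)

# The four examples from this article:
print(fantasy_score(92, 7.7, 55))    # Mission: Impossible Rogue Nation -> 27.35
print(fantasy_score(87, 8.5, 106))   # The Dark Knight Rises -> 31.0
print(fantasy_score(81, 8.0, 0.2))   # Me and Earl and the Dying Girl -> 24.11
print(fantasy_score(19, 5.5, 9))     # Aloha -> 13.35
```

One nice side effect of writing it out this way: tweaking the weights for a future version of the league (say, punishing indies less for a small opening) is a one-line change.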
Maybe it’s not perfect, but it’s something to build on, isn’t it? The two well-reviewed money makers did great, the indie darling put up points but was punished for its lack of scope, and the flop, well, flopped. I’m open to suggestions, but it seems like a good place to start.
So, I propose we do a beta Fantasy Cinema league using this formula (or some variation). We’ll have a Fantasy Cinema Draft, where we choose movies set to be released over the next, say, three months, and whoever has the highest cumulative score when the allotted time period is over wins.
What do you say?