Yes, you read that correctly: this blog is not about East Asian geopolitics or the debt-to-GDP ratio of OECD members, this blog is about Pop Culture. I’ve been overwhelmed by the deluge of bad/terrible news relating to the economy and the state of world affairs lately. I don’t really want to know what’s going on in US/world politics right now. So instead I’ve decided to write a blog about something that mildly irritated me during the last Academy Awards. I have no qualms with The King’s Speech getting the award, but I felt like its winning betrayed the mainstream and critical acclaim The Social Network had received.

I felt like there was a generational gap in the two films’ importance. For me, and I think for younger people generally, Facebook’s cultural importance is ubiquitous. When the film came out, its tagline, “You can’t make 500 million friends without making a few enemies,” seemed cool and underscored the enormous importance of Facebook. The film reminded me of Pirates of Silicon Valley, or other films that helped to characterize the rise of companies that represented a sea change in society, not just a new product that consumers bought. But this blog isn’t about my weird opinions about movies, it’s actually about the award itself.

The Academy Awards are one of the few non-sports TV events that I watch consistently. I’ve always liked that in order to win an Oscar, a film is not judged on its sales or its popularity, but on its merit. Recently I finally bothered to look into the rules for selection and voting in the Academy Awards. According to Wikipedia there were 6,000 Academy Awards voters in 2007. For most award categories, voters vote for awards that relate to their own branch (actors vote for actors, etc.), but a critical exception is Best Picture, for which all voters are eligible. While I have no problem with the film industry being involved, I think the vote would be fairer if film critics were given a share of the vote as well. I feel this way in part because my other TV-watching addiction is College Football, where human polls play a more important role than in any other sport. For years people have criticized the human element of College Football polls for being biased and sometimes error-prone. Often there are ballots in the Coaches Poll where one or two teams in the top 5 don’t appear at all, and coaches almost always rank their own teams higher than average.

I can imagine voters for the Oscars are similarly prone to personal bias, though the large number of voters probably mitigates this risk. If it weren’t for two websites, I probably wouldn’t question the Oscars’ outcomes at all; most of the time I don’t see all of the nominated films anyway. But thanks to Metacritic and Rotten Tomatoes, there are now more refined ways of determining how well received a film is by critics.

Rotten Tomatoes has been in existence for some time (I can recall reading it in high school) and its method has stayed simple and consistent: select from a list of critics (a much wider list than Metacritic’s), determine whether each review is positive or negative, then report the percentage of all reviews that are positive. So, for example, if you wanted to know how many people thought The Sweetest Thing was utterly without merit (it is), you just look it up on RT and, lo and behold, only 26% of critics wrote a positive review for the film. This was based on 107 reviews from sources ranging from Richard Roeper of Ebert & Roeper to Christine Blosdale of BeatBoxBetty.com, a website that looks trapped in the 90s. Having a broad pool of reviews isn’t necessarily a bad thing; it helps to limit the impact of one bitter film critic rating a film “0” and devaluing an otherwise popular film.

On the other hand, Metacritic uses a very selective list of critics who mostly write for large publications (Rolling Stone, The New York Times) and averages their scores for each film on a 0–100 scale. Perhaps in response to this, Rotten Tomatoes now offers a “top critics” tab that narrows ratings to roughly 45 of the best reviewers. But this doesn’t change a fundamental difference between the two websites: one answers “how good do most critics think this film was?” while the other answers “how many critics thought this film was good?”. I think both have their merits, but Metacritic’s strategy is more useful for determining which movies ought to win awards.
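The difference between the two approaches is easy to see with a toy example. Here’s a minimal sketch in Python using made-up review scores (not real Metacritic data); the 60-point cutoff for a “positive” review is my own assumption for illustration, not either site’s actual rule:

```python
# Hypothetical 0-100 review scores for two imaginary films.
# Film A is broadly liked but rarely loved; Film B is loved by
# most critics and panned by a couple.
film_a = [70, 72, 75, 68, 71, 74, 69, 73]
film_b = [95, 90, 40, 92, 88, 35, 94, 91]

def metacritic_style(scores):
    """Average score on a 0-100 scale, like Metacritic's metascore."""
    return sum(scores) / len(scores)

def rotten_tomatoes_style(scores, threshold=60):
    """Percent of reviews counted as positive, like RT's percentage.
    The threshold of 60 is an assumption for this sketch."""
    positive = sum(1 for s in scores if s >= threshold)
    return 100 * positive / len(scores)

print(metacritic_style(film_a), rotten_tomatoes_style(film_a))  # 71.5, 100.0
print(metacritic_style(film_b), rotten_tomatoes_style(film_b))  # 78.125, 75.0
```

Film A “wins” on the percent-positive method with a perfect 100%, while Film B, despite two hostile reviews, wins on the averaging method because the critics who liked it liked it far more. That’s exactly the Juno vs. There Will Be Blood situation described below.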

I’ve created bar graphs comparing the films nominated for “best picture” at the past four Oscars, ranked by their Metacritic score (underlining the film that actually won). I also listed Rotten Tomatoes’ percentages using the “top critics” option from the site:

Starting with 2007:

I think this year best summarizes the differences between the two methods: I liked Juno, and I bet most filmgoers liked Juno, but that doesn’t mean most people thought it was exceptionally good. I liked There Will Be Blood too, but I think it represented a deeper meaning, a heavier and more complicated reality than Juno did, and simply asking critics whether they liked a movie will never show this distinction. I wouldn’t draw too much from No Country for Old Men losing out, as the difference was likely down to how much coffee Ebert drank when he wrote the review.

2008 shows more variation:

An interesting advantage of using Metacritic scores over Oscar outcomes is that you can actually compare how good a year in film was relative to the average year. 2008 looks like a weaker year on Metacritic. I agree that Slumdog Millionaire was the best film of 2008, but I don’t think it would have won in a different year. It’s also interesting to see how poorly The Curious Case of Benjamin Button and The Reader fare; perhaps I should note that The Dark Knight scored higher than the bottom 3 nominees with an 82 on Metacritic : – )

2009 introduced a 10 nominee process:

The suspense of the Oscars race between The Hurt Locker and Avatar does not bear out on Metacritic, as the gap is wide. Avatar’s strong chances at the Oscars could have been related to its reputation as a revolutionary step in film production, and in special effects in particular. Needless to say, critics were less impressed by this.

And now my favorite year, 2010:

I think 2010 was the best year in film in the past decade by far. Individual films from other years might compete on merit, but 2010 had so many good films. I personally loved 127 Hours, Inception, and The Social Network for their own particular directions, and thought/read mostly positive things about The King’s Speech, Toy Story 3, The Fighter, and The Kids Are All Right. I didn’t watch (or read much about) True Grit or Winter’s Bone. The only film I actually disliked was Black Swan, which I found to be too formulaic and predictable.

Finally, which of these films was the best overall? My final graph:

This is how metacritic views the nominees for “best picture” over the past four years. What do you think?