Business Analytics Blog



Is predicting the Oscars a mug’s game?

Category : Entertainment

The annual season of film awards ceremonies is coming to a close. Most of the results are in, but the one most eagerly awaited, the Oscars, closes off the season on 24 February 2013. The speculation and near hysteria reach fever pitch around this time, as pundits the world over try to predict the outcome of the Academy Awards. But in this new age of analytics, is there anything to be learnt from historical data to give you an edge?

One thing we have learnt from the US presidential election is that the application of sophisticated forecasting models can give the pundits a real run for their money. In fact, when it came to the 2012 Presidential election, the models of Nate Silver at the New York Times, and of Simon Jackman and Mark Blumenthal at the Huffington Post, predicted the correct outcome in all 50 states, wiping the smiles off the faces of commentators who had insisted it was too close to call.

Do we have any data?

There are a lot of film awards, so that ought to mean a lot of data to work from. However, each body giving out its gongs is made up of a different group of people with different agendas and preferences, pandering to different audiences. Then again, people being people the world over, a good film or a good performance should be recognised as such by a large enough population (a sort of “wisdom of crowds” effect?). So maybe box office takings could figure as a predictor, but then since when has the public ever known what actually makes a good film?

The Academy of Motion Picture Arts and Sciences voting panel presents its own challenges for prediction. It is made up of approximately 6,000 professionals from the film industry, which makes it one of the largest voting bodies, but also one of the oldest demographically; it is also predominantly made up of Caucasian males. How does the panel respond to media hype, or to other awards? Does it tend towards safe, recognised film makers rather than indie or foreign ones? Well, its members are under strict orders not to reveal any information about their voting, so there is not much to go on there.

So why is predicting the winners so difficult?

Well, at its heart it is about people making subjective decisions about which of the submitted candidates they consider the best from their perspective. The list is different every year, and cinema tastes change over time, so maybe there are just too many unknowns. Some say it is more about studio marketing than quality, or that popular blockbusters are given greater weight whether or not they have the required quality. Others say any attempt to forecast the Oscars is flawed from the start.

Who thinks they know the Academy’s mind?

Well, one place to start is simply: are other awards good predictors of Oscar success? Or, more accurately, are other award bodies thinking in a similar enough way about the films on offer to make decisions that resemble the Academy’s? If so, it would be a correlation on the definition of quality used that year, rather than some ability to influence the Oscar voters.

The vast majority of awarding bodies are industry awards, critics’ awards, or peer group awards. We will look at the four main award categories – best film, director, actress and actor – and only consider results since 2000. Taken as a whole there is too much variation, with different bodies getting it right some of the time, so it is difficult to discern any combined patterns; but what about individually?

In the industry corner we have selected the Golden Globes and the BAFTAs, and in the peer group corner the Directors Guild of America, the Producers Guild of America and the Screen Actors Guild. In the time available we have also included one representative from the critics: the LA Film Critics. A comparison of how many times each agreed with the Academy over the last 12 seasons reveals:

[Figure: Analysis of 2012 Film Awards]

So, on the best film front, there is generally not a good showing, except for the Producers Guild of America (correct for the last five years straight!). In general, the critics and the industry bodies generate a fairly unremarkable hit rate. The Golden Globes have the additional advantage of separate drama and comedy/musical awards for film, actor and actress, but despite those two bites at the apple, they still don’t fare well.

However, the peer group awards in general perform a lot better – maybe it is no surprise that those actually in the business know what good looks like. The Producers Guild for film, and the Directors Guild for director, almost always pick the winner that subsequently takes the Oscar: the Screen Actors Guild have matched the best actor eight years in a row from 2004, and the Producers Guild the best film for the last five years in a row.
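For the curious, here is a minimal sketch of how an agreement table like the one above could be computed. This is our own illustration: the winners in the data are hypothetical placeholders, not the actual awards history.

```python
# Minimal sketch: how often did each award body name the same winner
# as the Academy? All winners below are hypothetical placeholders.

# {year: {body: winner}} for a single category (e.g. best film)
history = {
    2009: {"Oscars": "Film A", "PGA": "Film A", "BAFTA": "Film B"},
    2010: {"Oscars": "Film C", "PGA": "Film C", "BAFTA": "Film C"},
    2011: {"Oscars": "Film D", "PGA": "Film D", "BAFTA": "Film E"},
    2012: {"Oscars": "Film F", "PGA": "Film F", "BAFTA": "Film F"},
}

def agreement_rate(history, body):
    """Fraction of seasons in which `body` agreed with the Academy."""
    matches = sum(awards[body] == awards["Oscars"] for awards in history.values())
    return matches / len(history)

for body in ("PGA", "BAFTA"):
    print(f"{body}: {agreement_rate(history, body):.0%} agreement with the Oscars")
```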

What about a more analytical approach?

The Huffington Post Oscars Prediction Dashboard aims to apply an analytical approach to predicting the outcome of each category on Awards night. It crunches a series of metrics including the major prediction markets (Intrade and the Hollywood Stock Exchange), plus historical data on box office success, critical and popular ratings, and success in other awards. In this approach, each possible input is evaluated against its ability to predict an Oscar win, and the inputs are then combined as a set of rules. Initial analysis has produced some distinctly unexpected findings. For example, in the best film category, popular ratings are better indicators than critical ones, and films showing a jump in box office revenue between weeks four and five after release tend to do better. Now that one is obscure, and make of it what you will, but it does show that statistical analysis will produce results like this; the real question is whether such a finding represents a real correlation with an explainable underlying cause, and is therefore a valid predictor.
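In spirit, that is a weighted vote: score each nominee on every signal, weight each signal by its historical hit rate, and back the highest total. The sketch below is our own illustration of the idea, not the Huffington Post’s actual model – every weight, score and nominee name in it is invented.

```python
# Illustrative weighted-vote predictor: combine several signals, each
# weighted by its (assumed) historical accuracy. All numbers invented.

# Historical hit rate of each signal, used as its weight
weights = {
    "prediction_markets": 0.80,
    "popular_ratings": 0.65,
    "other_awards": 0.70,
}

# Per-nominee scores from each signal, normalised to 0..1 (hypothetical)
signals = {
    "Nominee A": {"prediction_markets": 0.9, "popular_ratings": 0.6, "other_awards": 1.0},
    "Nominee B": {"prediction_markets": 0.4, "popular_ratings": 0.8, "other_awards": 0.5},
    "Nominee C": {"prediction_markets": 0.2, "popular_ratings": 0.7, "other_awards": 0.0},
}

def combined_score(nominee_signals):
    """Weighted sum of signal scores; higher means a stronger favourite."""
    return sum(weights[name] * score for name, score in nominee_signals.items())

ranked = sorted(signals, key=lambda n: combined_score(signals[n]), reverse=True)
print("Predicted winner:", ranked[0])
```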

So what does this all mean for this year?

So who’s going to win on the night? To see all the film nominations and their success so far this season across the main awards, see the following helpful visualisation.

For best actor and actress, the Screen Actors Guild have honoured Daniel Day-Lewis and Jennifer Lawrence, the Producers Guild gave best picture to Argo, and the Directors Guild thought Ben Affleck deserved their best director award. All good so far, but Ben Affleck hasn’t been nominated for a director’s Oscar. The Huffington Post’s predictor confidently picks Steven Spielberg for the director’s Oscar, with all the others as above.

So what do we think?

We have not had the time to develop a forecasting model, so we are applying a heuristic approach based on the historical accuracy of peer group awards, with inputs from the GoldDerby and Huffington Post Predictor forecasts (a sketch of this rule-of-thumb logic follows the list of picks below).

Best Picture: Argo

Best Director: Steven Spielberg

Best Actress: Jennifer Lawrence

Best Actor: Daniel Day-Lewis

Best Original Screenplay: Django Unchained

Best Adapted Screenplay: Lincoln
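As promised, here is a rough sketch of the rule-of-thumb logic behind those picks: trust the relevant peer group award first, and fall back to another forecast (here, the Huffington Post’s) when the peer group’s choice isn’t even nominated. The nominee lists are abbreviated and the fallback table is purely illustrative.

```python
# Rule-of-thumb Oscar predictor: take the peer group winner if nominated,
# otherwise fall back to another forecast. Nominee lists are abbreviated.

peer_group_winners = {
    "Best Picture": "Argo",              # Producers Guild
    "Best Director": "Ben Affleck",      # Directors Guild
    "Best Actor": "Daniel Day-Lewis",    # Screen Actors Guild
    "Best Actress": "Jennifer Lawrence", # Screen Actors Guild
}

oscar_nominees = {  # partial lists, for illustration
    "Best Picture": {"Argo", "Lincoln", "Life of Pi"},
    "Best Director": {"Steven Spielberg", "Ang Lee", "David O. Russell"},
    "Best Actor": {"Daniel Day-Lewis", "Hugh Jackman", "Bradley Cooper"},
    "Best Actress": {"Jennifer Lawrence", "Jessica Chastain", "Emmanuelle Riva"},
}

fallback_forecasts = {"Best Director": "Steven Spielberg"}  # e.g. Huffington Post

def predict(category):
    """Peer group pick if it is Oscar-nominated, else the fallback forecast."""
    pick = peer_group_winners.get(category)
    if pick in oscar_nominees[category]:
        return pick
    return fallback_forecasts.get(category, "no confident pick")

for category in oscar_nominees:
    print(f"{category}: {predict(category)}")
```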

A word of warning

A lot depends on whether there is a recognised front runner in everyone’s eyes. When there is, the Oscars are usually true to form. But they can surprise: for three years in a row (2004–2006) almost none of our fine predictors above got the film right!

About the author

James Lally
James Lally has 10 years’ experience in analytical consulting across a range of projects in the public and private sectors. James spent the first part of his career in an internal business consulting and strategy group at Network Rail, working as an adviser to the Board and Executive Directors and leading high-profile projects to restructure the business, support executive decision-making and drive significant performance improvement. James now works as a Managing Consultant in Capgemini Consulting’s Business Analytics team. James leads the team’s Value Management service offerings and is also focussed on clients within the Utilities sector.
