Calibrating the game review system

· by Steve · Read in about 4 min · (683 Words)

I’ve said for a while that I don’t think individual game review scores are particularly useful, because individual tastes vary so much. It’s always important to read the detail of a review rather than take the score at face value; a decent reviewer will (hopefully) explain the reasoning behind the aspects they did and didn’t like, so you can judge how much those apply to you. Even this is a hit-and-miss affair, though: English being the rich and imprecise language it is, all sorts of emphasis creeps in based on the reviewer’s own opinions, which may or may not match yours. That bias is easier to detect and cancel out when you read the text rather than just look at the score, but even so, there’s no getting away from the fact that a review is far from an empirical measurement.

I do tend to find Metacritic useful - despite my mistrust of individual review scores, when taken in aggregate the natural statistical process tends to smooth out the anomalies and produce a reasonably good guide. Again, though, you have to account for your genre preferences - Halo 3 and Oblivion scored consistently highly, but personally I’m not keen on either of them; that’s down to my taste rather than the quality of the games themselves. When restricted to a genre (and often restricting to subgenre is required - Halo 3, Gears of War and Bioshock should all be in separate subgenres, IMO), Metacritic’s results do seem to align with my overall opinion: rating Rock Band and Guitar Hero 2 as the leaders in the music performance/imitation genre, for example, and Geometry Wars 2 and Rez HD in the arcade shooter genre. I don’t think the overall numerical rankings are at all useful, but within subgenres the relative ranking seems pretty sound. So while you might want to filter out your non-preferred game types, if you’re looking for a good game in subgenre X, Metacritic is quite a good way to find one - provided you can reliably identify those subgenres (again, reading the full review text should help here).

However, Metacritic scores can take a little time to settle down, and inevitably purchases are made based on the earliest two or three web reviews (or sometimes even previews). In those cases you need to evaluate the reviewer as much as the game, IMO. While most sites credit the author of a review, few make it easy to find out what other games that reviewer did or did not like, which is vital information. Some people categorise whole sites as reliable or not, but I don’t buy that - you can’t say you trust 1UP or Eurogamer universally; it’s all about the individual reviewers. I tend to read Eurogamer for the humour, but whether I agree with their reviewers varies wildly. I often agree with Kieron Gillen (a veteran of several PC games mags I used to read), but I’ve strongly disagreed with their Keza MacDonald, who for example said that Guitar Hero III was ‘in every conceivable way, a better product than its predecessors’, which is pure, unadulterated tosh. It’s partly her fault I bought GHIII, only to abandon it in disgust within two weeks, so I discount any opinion she has on music games now.

So ideally, in online game reviews I’d like to see a box-out summarising a few other games in the same genre that the reviewer has judged, so I can figure out how much weight to give their opinions. Any chance of that, Eurogamer/1UP/IGN et al? I also think games reviewers should put their own face and personality out there more, so we can identify the ones we do and don’t tend to agree with - it’s the kind of thing some 8-bit mags used to do in the ’80s, with several people pitching in and each identifying themselves with a little picture or something. Why is everything so impersonal on the web?