I’m not sure if I’ve mentioned this in the past, but I’m quite a fan of numbers.
Numbers, lists, spreadsheets, calculations etc, all play a fairly significant role in my gaming, as I try to keep track of what I have and haven’t played.
Something I’ve generally paid less attention to in the past is the rankings games receive on Board Game Geek – I might casually note a number, or whether a game appears in “The Hotness” (i.e. lots of people are talking about it right now), but it’s never been very systematic. September was when I decided to change that.
It started with Mansions of Madness. As you may know, this is a game that was rebooted in 2016, and made quite a splash as it did – I was interested in how this translated over on the Geek.
Anyone with an account on BGG can rate a game, and when you go to a game’s page, you will see the rating it has – a simple mean average.
Not unreasonably, though, BGG itself doesn’t actually put much stock in the average rating; instead it uses a dark and mysterious scale called the Geek Rating. Some folk have claimed to possess mystic insights into the arcane ways of the Geek Rating, but as far as I know, the exact algorithm has never been made public.
What I do know with a fair amount of certainty is that a game’s Geek Rating is influenced both by its average rating and by the number of ratings that average is based upon. This means that you can’t get the top-rated game on BGG by creating something no-one has heard of and having three mates all give it a 10.
To put things into a real-world example, in early September the then-unreleased Arkham Horror LCG had a phenomenal 9.2 out of 10 Average Rating. However, this translated into a Geek Rating of only 5.6 (which clocked in at number 4,300 and something), simply because that 9.2 average was based on only 55 ratings.
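BGG has never published the formula, but behaviour like this is usually explained as a Bayesian average: the real ratings are diluted with a large pool of imaginary “dummy” ratings at a neutral score, so a small sample can’t stray far from the middle of the scale. Here is a minimal Python sketch – the 5.5 prior and the 2,000 dummy ratings are popular community guesses, not figures BGG has confirmed, and the real algorithm almost certainly differs in detail:

```python
def geek_rating(avg, num_ratings, prior=5.5, dummies=2000):
    """Shrink a raw average towards `prior`, weighted by how many real
    ratings it is based on. Both default parameters are guesses."""
    return (avg * num_ratings + prior * dummies) / (num_ratings + dummies)

# Arkham Horror LCG in early September: a 9.2 average from just 55 ratings
print(round(geek_rating(9.2, 55), 1))   # ~5.6 with these guessed parameters

# Same 9.2 average, ten times the ratings - the pull towards 5.5 weakens
print(round(geek_rating(9.2, 550), 1))  # ~6.3
```

With these (guessed) numbers, 55 ratings at 9.2 land close to the 5.6 Geek Rating mentioned above, and the same model shows why piling on more ratings at the same average drags the Geek Rating upwards.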
Looking at Mansions of Madness 2nd Edition specifically, at the end of August the game was sitting at 358th in the overall rankings. The 971 people who had rated it gave it an average of 8.56, but with fewer than a thousand people rating it, that made for a Geek Rating of only 6.945.
By 12th September, the average rating had actually dropped ever-so-slightly, to 8.55, but the extra nearly 300 people who had contributed to that rating were enough to drag the Geek Rating up to 7.176, and bring the game into the top 200.
The overall number of ratings continued to rise steadily throughout the month, and although the average rating dropped slightly, the Geek Rating rose until, at the end of the first week in October, it made it into the Top 100. Now that we have reached early December, the average has fallen further still, to 8.42, but the game sits firmly established as an all-time great: 47th in the overall rankings, with a highly respectable Geek Rating of 7.630.
I’ve had this article on the back burner since September, as I tracked the various numbers. My suspicion is that Mansions of Madness is nearing its peak – it might climb a bit higher, but ultimately, I think it will start falling. Only 3 games above it in the rankings have a higher average score, and as time goes on, there is bound to be a cooling off. Of course, this could happen for many reasons: people will decide that they don’t like it or have grown tired of it (limited replayability is one of the big complaints filed against this game), or its rise will bring it into direct comparison with other games whose fans will down-vote it. The longer it spends high in the rankings, the more likely it is to attract people who’ve heard a lot of hype, play it, and feel underwhelmed, or people who aren’t that interested in the theme or mechanics but like to play all the “top board games” – and both types are likely to give it a low score.
I also wanted to take a moment to think about another game, Pandemic: Reign of Cthulhu. This game was announced earlier this year, and it gathered a lot of hype, spending months and months in “The Hotness.” It rather successfully smashes together the Pandemic name and basic mechanic, which is pretty much the biggest thing in Board Gaming right now, with the perennially popular Cthulhu Mythos theme (an IP that’s having a particularly good year).
So then, you’d expect Pandemic Cthulhu to be riding high in the rankings, right?
Well no, apparently not. The game currently has an average rating of 7.7, which certainly isn’t too shabby (I tend to work on the basis that anything above 7 or 7.5 is worth looking at), but by early October all that chatter had apparently only translated into 500 or so ratings, for a Geek Rating of 6.29 – not enough to even put it into the top thousand in the overall rankings. Again, this is a game that’s still on the rise: with nearly 1,300 ratings it had made it up to 545th by early December, but that still felt surprisingly low to me, all things considered.
One of the issues with the way the Geek Ratings are worked out is that a lot of people will gravitate towards the first title in a series. As such, a refined later edition or instalment may attract the almost universal opinion that it is ‘better’, yet wind up with a lower ranking because it doesn’t get as many ratings.
Obviously, one of the positives of the Geek Rating system is that it works to correct for expansion bias. Put simply, loads of people with diverse gaming backgrounds might try a new game, but only the ones who liked it will stick around for the expansion or the next iteration. If that expansion can be played as a standalone game (thus earning it its own entry in the rankings) then it will likely get a much higher average score, because the people who gave the original low ratings won’t have bothered looking at this version, playing it, or (crucially) rating it.
Dice Masters is a good example of some of the issues with this: for a while, all the discussions were happening on the page for the original version of the game, Avengers vs. X-Men. As such, it has been rated by nearly 4,500 people, and has the highest Geek Rating of any title in the series.
The average ratings are rather different: Green Arrow and the Flash briefly boasted an impressive 8.2 average. However, as that was based on the opinion of only 27 people, it didn’t even have a Geek Rating. Most people just don’t really bother rating a new Dice Masters set separately these days, it seems: by now, they expect most people to have made up their minds on the game one way or another.
What’s in a Number?
Obviously, any ratings system of this kind is, ultimately, subjective. Some people will love games that others hate, and so forth.
Around the time I first started writing this article, a quiz popped up on Facebook: “How many of the top 100 games on Board Game Geek have you played?” I was quite startled to find that I had only played 23.
Of the ones I hadn’t played, there was a mixture of different types. Blood Rage only took me a few days to add to the “played” list (this was already planned, not just a reaction), offering a timely reminder that these things are always in flux. By the time Mansions nudged its way into the top 100, I could tick off a quarter of the list.
Of the remaining games in the top 100 that I haven’t played, Descent is very high on my want-to-play list, and others, like Legendary Encounters: Alien, are fairly close parallels to games I have played (Legendary Encounters: Firefly).
That still leaves a vast number that I’m steering well clear of on cost grounds (X-Wing, Imperial Assault), or simply know nothing about.
Working as a Board Games reviewer means that I’m keeping a far closer eye on new releases than I ever used to, and I’m a lot more likely to have played 2016’s new releases than anything else to come out over the past few years. Hopefully over time, I’ll be able to play enough of the top 100 to feel better informed next time one of these quizzes rolls around.
Thematic? Family? Customisable?
Aside from the overall rating, Board Game Geek also ranks games by category. There are some obvious benefits to this – a game might score fairly low overall in the rankings, but actually be considered the best game of its type (Arkham Horror LCG is still only 297th overall, but already the best “customisable” game by ranking). Equally, a game might seem to rate quite highly, but recognising that it falls within a very popular category can indicate that there are still plenty of better alternatives out there.
These Categories also provide a convenient way of filtering things out. A lot of games that get ranked highly on Board Game Geek are long, complex, heavily strategic, head-to-head games. Whilst neither strategy nor complexity are a major issue for me, I know that (realistically), for a game to do well in my house, it has to be a good experience for 2 players, where one is a hardcore gamer with a brain that likes abstract puzzles, and the other is a casual gamer who likes theme and creativity. In practice, that means that 2-player games for the most part have to be either cooperative, or pretty quick.
With a bit of poking around on Board Game Geek, I managed to work out how to list all the Cooperative games in rank order. Whilst I didn’t expect to have played all the overall Board Game Geek heavyweights, I was really surprised at how few of the top co-ops I’d played. 8 of the top 20 isn’t too horrendous, but that doesn’t include any of the top 5, and as you look further down, I can only tick off 14 of the top 50.
The top 5 Co-operative games, according to Board Game Geek, are Pandemic Legacy, Mage Knight, Robinson Crusoe, T.I.M.E. Stories, and Dead of Winter. Everything I’ve ever heard about Pandemic Legacy suggests that the gameplay is really good, but the disposability scares me. Mage Knight only came onto my radar very recently, but it certainly intrigues me. TIME and Dead of Winter are both things I’ve had my eye on for a little while, but the probability of having a traitor in the ranks for Dead of Winter keeps it at arm’s length.
Ultimately, the most important thing when playing games is to have a good time. If ratings, rankings, or categories help people with that, then great – but it’s still important to keep a perspective on things. It’s pretty rare these days that I actually go out and buy a brand new game (as opposed to buying new sets for existing games, or getting new games sent to me for review), but when I do, I’ll make a point of reading several reviews, including at least one negative one, and maybe even watching a video or two – often these will highlight why even the most highly-rated game may not be for me.
What do other people think? Do rankings and ratings matter to you? Or is there something else driving your game-buying decisions?