Mind Over Meta

Posted on Friday, July 14 @ 12:14:15 Eastern by Joe_Dodson

It turns out most of the emails complaining about a GR score have something to do with Metacritic, the meta review site. For some reason, they skew our grades preposterously low, beyond the realm of any normal human calculus. Instead, you need drunken calculus, and we have that in spades.

So we took Metacritic to task, first skimming through their online FAQ. Which, alas, is FUCed…and they can prove it.

Metacritic’s letter grade-to-number conversion scale is insane. It starts off okay with the A’s all landing in the 91-100 range - a respectable 10 point spread - but things quickly go bonkers when you give a game a B+, which, according to Metacritic, converts to an 83. What happened to 84-90? Apparently those digits are being used to prop up the rest of the B's, which run from 83 (ugh) to 67 (double ugh). That’s a 19 point range, holmes, for three grades. The C’s suddenly dip an extra 9 points, starting at a 58 for the C+ and ending with a C- at 42. That might be the answer to life itself, but it only made us more confused. And it just gets messier as you plow down towards the F, which they laughably break down into an F+, F and F-.
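Metacritic only shows a few anchor points of its scale, so here's a minimal sketch comparing the numbers quoted above to a conventional ten-point school scale. The school-scale values, and Metacritic's B- (inferred from the Galactic Civilizations II example later in this piece), are our assumptions, not anything Metacritic publishes.

```python
# Metacritic anchors quoted above: the A's bottom out at 91, a B+
# becomes an 83, the B's run down to 67, C+ is 58, C- is 42. The B-
# value is inferred from the GalCiv II example, not a published table.
metacritic = {"A-": 91, "B+": 83, "B-": 67, "C+": 58, "C-": 42}

# A conventional US school scale (each letter spans ten points, with
# the minus grade at the bottom of its band) -- our guess at what a
# sane conversion looks like, not Metacritic's.
school = {"A-": 90, "B+": 88, "B-": 80, "C+": 78, "C-": 70}

for grade in metacritic:
    gap = school[grade] - metacritic[grade]
    print(f"{grade}: school {school[grade]}, Metacritic {metacritic[grade]}, gap {gap}")
```

The B- row is the 13-point shortfall that comes back to bite us a few paragraphs down.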

While this system is obviously loony, we think we see where they went wrong, and so can you because it's plain as day in their very own aggregate breakdown chart. It seems Metacritic uses two standards: one for games, and one for everything else. But there is only one letter grade-to-number conversion scale, and it doesn’t match either breakdown. It represents a third standard, one of stark, raving madness.

And since no one wants to be crazy, they cover it up by not showing original grades on their score pages. Why? Who knows, but readers look at these pages and think we’re irresponsible critics, when in fact Metacritic is loco. We give Metacritic’s funhouse mirror conversion scheme an F+.

Or maybe an E- for lack of effort, because we know they aren’t trying to make us look like jerks. We suppose we should be consoled by the fact that Play magazine and Gameshark get the same treatment. Then again, those pubs rarely give grades below A’s, so the wacky conversion doesn’t affect them as much as it does cranky old GR. Jerks.

Now, you’d think one screwy grade wouldn’t matter on an aggregate site; after all, it’s just a drop in the bucket, right? Tell that to irritable Stardock chief Brad Wardell, who went ballistic over our review of Galactic Civilizations II. Though we don't often air our dirty laundry, we just had to swipe this snippet from Brad’s email barrage:

“…your review isn't just the lowest review of the game but REALLY far down.  As you point out, on Metacritic, it's a 67. The next lowest review is 80.”

We gave Galactic Civilizations II a B-. According to our own standard (the same one used by America's public education system), a B- converts to - you guessed it - an 80. We should have been right there next to the other low grade, not 13 points below it. Brad claims, as a result, that:

"[Game Revolution’s] current score ensures we'll never break the overall 9.0 average on there.  That may sound silly, but for us, it's important to us because so much of our lives have been wrapped up making the game.”

Far be it from us to point out that our lives are wrapped up in reviewing games, not fixing Metacritic (current article notwithstanding).

Metacritic's top-secret founder?

But maybe Brad is right. Maybe our score did screw his game out of a 9.0 average. We’ll never know for sure because Metacritic individually weighs its sources, meaning they decide how much or how little each site influences the aggregate score. What are the criteria for this forbidden formula? Who knows? It's their secret sauce and they keep it under tight wraps; there’s just no telling what their aggregate scores are actually based on. As a result, we can only speculate as to whether or not scores are aggregated from more than a handful of heavily-weighted sites.
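Since the weights are secret, all we can illustrate is the mechanism. Here's a toy sketch in which every score and weight is invented for the example; the only number taken from this article is the converted 67.

```python
# Toy weighted aggregation -- the scores and weights are made up,
# since Metacritic publishes neither its weights nor its math.
def weighted_average(reviews):
    """reviews: list of (score, weight) pairs."""
    total = sum(score * weight for score, weight in reviews)
    return total / sum(weight for _, weight in reviews)

# Nine glowing reviews plus one converted B- (the 67 from above),
# all counted equally.
print(weighted_average([(93, 1.0)] * 9 + [(67, 1.0)]))  # 90.4 -- clears 9.0

# Same ten reviews, but the 67 quietly gets triple weight.
print(weighted_average([(93, 1.0)] * 9 + [(67, 3.0)]))  # 86.5 -- no 9.0
```

Same reviews, same scores, and whether the game ever breaks a 9.0 average turns entirely on a weight nobody outside Metacritic can see.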

Don't believe us? According to Marc Doyle, a Metacritic founder:

“We don’t think that a review from, say, Roger Ebert or Game Informer, for example, should be given the same weight as a review from a small regional newspaper or a brand new game critic.”

Yes, he did just bring up Roger “Games Aren’t Art” Ebert and Game “GameStop” Informer in the same sentence as examples of important, weighty critics, who by virtue of their very rightness make smaller, newer critical entities less correct. We get the idea, but oh, the wrongness.

Metacritic’s desire to base their aggregate scores on their own interpretation of Who’s Who in the video game industry completely undermines all the hard work they do to provide you with an otherwise great service. Even though they claim to be an aggregate site and have a list of accepted publications a mile long, it’s entirely possible that only ten of them count. You have no way of knowing with Metacritic. They’re asking you to trust their judgment while simultaneously asking you to believe their inane science.

The kookiest part, though, is that it sort-of works, and their aggregate scores are usually in line with what seems to be the critical consensus for any given game. You can look at Metacritic and reliably see how well a game has been received. They even color code their scores and include blurbs from each review so you can get an idea of the tone and focus without having to go anywhere. Yet they break our grades and their own, then cover their shoddy conversion by omitting the actual review scores and concealing the scheme they use to weigh their sources.

Because they obviously graduated from the Underpants Gnome School of Business (Underpants + ??? = Profit!), we give them a final grade of Two Underpants. Two Underpants out of how many, you ask? Sorry, that’s our Secret Underpants Formula.
