Mind Over Meta

Posted on Friday, July 14 @ 12:14:15 Eastern by Joe_Dodson


It turns out most of the emails complaining about a GR score on a meta review site have something to do with Metacritic.com. For some reason, they skew our grades preposterously low, beyond the realm of any normal human calculus. Instead, you need drunken calculus, and we have that in spades.

So we took Metacritic to task, first skimming through their online FAQ. Which, alas, is FUCed…and they can prove it.

Metacritic’s letter grade-to-number conversion scale is insane. It starts off okay with the A’s all landing in the 91-100 range - a respectable 10 point spread - but things quickly go bonkers when you give a game a B+, which, according to Metacritic, converts to an 83. What happened to 84-90? Apparently those digits are being used to prop up the rest of the B’s, which run from 83 (ugh) to 67 (double ugh). That’s a 17 point range, holmes, for three grades. The C’s suddenly dip an extra 9 points, starting at a 58 for the C+ and ending with a C- at 42. That might be the answer to life itself, but it only made us more confused. And it just gets messier as you plow down towards the F, which they laughably break down into an F+, F and F-.
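If you want to stare at the lopsidedness directly, here it is sketched in a few lines of Python. Only the conversion points actually cited in this piece are filled in; the rest of Metacritic’s breakdown isn’t quoted here, so those entries are left blank rather than guessed at.

```python
# Metacritic's letter-to-number scale, as far as this piece reconstructs it.
# None means "not cited above" -- a blank, not a guess.
METACRITIC_SCALE = {
    "A+": None, "A": None, "A-": None,  # the A's all land somewhere in 91-100
    "B+": 83,
    "B": None,
    "B-": 67,   # the bottom of the B range
    "C+": 58,
    "C": None,
    "C-": 42,
    # the D's and the F+/F/F- breakdown aren't given numbers in this piece
}

def band_width(hi, lo):
    """Inclusive width of a grade band: 91 through 100 covers 10 numbers."""
    return hi - lo + 1

print(band_width(100, 91))  # A band: 10 numbers (91-100)
print(band_width(83, 67))   # B band: 17 numbers (83 down to 67) for three grades
print(band_width(58, 42))   # C band: 17 numbers, after the 9-point drop from B- (67) to C+ (58)
```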

While this system is obviously loony, we think we see where they went wrong, and so can you because it's plain as day in their very own aggregate breakdown chart. It seems Metacritic uses two standards: one for games, and one for everything else. But there is only one letter grade-to-number conversion scale, and it doesn’t match either breakdown. It represents a third standard, one of stark, raving madness.

And since no one wants to be crazy, they cover it up by not showing original grades on their score pages. Why? Who knows, but readers look at these pages and think we’re irresponsible critics, when in fact Metacritic is loco. We give Metacritic’s funhouse mirror conversion scheme an F+.

Or maybe an E- for lack of effort, because we know they aren’t trying to make us look like jerks. We suppose we should be consoled by the fact that Play magazine and Gameshark get the same treatment. Then again, those pubs rarely give grades below A’s, so the wacky conversion doesn’t affect them as much as it does cranky old GR. Jerks.

Now, you’d think one screwy grade wouldn’t matter on an aggregate site; after all, it’s just a drop in the bucket, right? Tell that to irritable Stardock chief Brad Wardell, who went ballistic over our review of Galactic Civilizations II. Though we don’t often air our dirty laundry, we just had to swipe this snippet from Brad’s email barrage:

“…your review isn't just the lowest review of the game but REALLY far down.  As you point out, on Metacritic, it's a 67. The next lowest review is 80.”

We gave Galactic Civilizations II a B-. According to our own standard (the same one used by America’s public education system), a B- converts to - you guessed it - an 80. We should have been right there next to the other low grade, not 13 points below it. Brad claims, as a result, that:

"[Game Revolution’s] current score ensures we'll never break the overall 9.0 average on there.  That may sound silly, but for us, it's important to us because so much of our lives have been wrapped up making the game.”

Far be it from us to point out that our lives are wrapped up in reviewing games, not fixing Metacritic (current article notwithstanding).
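For the record, the arithmetic behind Brad’s complaint fits in a couple of lines; the 80 and the 67 are the figures discussed above.

```python
school_b_minus = 80      # the standard US school conversion for a B-, as noted above
metacritic_b_minus = 67  # what Metacritic posted for the very same B-

print(school_b_minus - metacritic_b_minus)  # 13 -- the gap created by one converted grade
```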

Metacritic's top-secret founder?

But maybe Brad is right. Maybe our score did screw his game out of a 9.0 average. We’ll never know for sure because Metacritic individually weighs its sources, meaning they decide how much or how little each site influences the aggregate score. What are the criteria for this forbidden formula? Who knows? It’s their secret sauce and they keep it under tight wraps; there’s just no telling what their aggregate scores are actually based on. As a result, we can only speculate as to whether or not scores are aggregated from more than a handful of heavily-weighted sites.
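To see why the secret weighting matters, here is a toy Python example. Every outlet name and weight below is invented, since Metacritic doesn’t share its real ones; the point is only that the same five reviews can land at noticeably different aggregates depending on who gets counted triple and who gets counted half.

```python
# Toy example only: Metacritic's real source weights are secret, so every name
# and weight here is invented for illustration.
reviews = {
    "Big Outlet A": 90,
    "Big Outlet B": 88,
    "Mid-Sized Site": 85,
    "Small Site": 82,
    "GR (B- posted as 67)": 67,
}

def aggregate(scores, weights):
    """Weighted average of review scores."""
    total = sum(weights[name] * score for name, score in scores.items())
    return total / sum(weights.values())

equal_weights = {name: 1.0 for name in reviews}
skewed_weights = {
    "Big Outlet A": 3.0,
    "Big Outlet B": 3.0,
    "Mid-Sized Site": 1.0,
    "Small Site": 1.0,
    "GR (B- posted as 67)": 0.5,
}

print(round(aggregate(reviews, equal_weights), 1))   # 82.4 with everyone counted equally
print(round(aggregate(reviews, skewed_weights), 1))  # 86.4 once the big outlets are upweighted
```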

Don't believe us? According to Marc Doyle, a Metacritic founder:

“We don’t think that a review from, say, Roger Ebert or Game Informer, for example, should be given the same weight as a review from a small regional newspaper or a brand new game critic.”

Yes, he did just bring up Roger “Games Aren’t Art” Ebert and Game “GameStop” Informer in the same sentence as examples of important, weighty critics, who by virtue of their very rightness make smaller, newer critical entities less correct. We get the idea, but oh, the wrongness.

Metacritic’s desire to base their aggregate scores on their own interpretation of Who’s Who in the video game industry completely undermines all the hard work they do to provide you with an otherwise great service. Even though they claim to be an aggregate site and have a list of accepted publications a mile long, it’s entirely possible that only ten of them count. You have no way of knowing with Metacritic. They’re asking you to trust their judgment while simultaneously asking you to believe their inane science.

The kookiest part, though, is that it sort of works, and their aggregate scores are usually in line with what seems to be the critical consensus for any given game. You can look at Metacritic and reliably see how well a game has been received. They even color-code their scores and include blurbs from each review so you can get an idea of the tone and focus without having to go anywhere. Yet they break our grades and their own, then cover their shoddy conversion by omitting the actual review scores and concealing the scheme they use to weigh their sources.

Because they obviously graduated from the Underpants Gnome School of Business (Underpants + ??? = Profit!), we give them a final grade of Two Underpants. Two Underpants out of how many, you ask? Sorry, that’s our Secret Underpants Formula.

 

