Mind Over Meta

Posted on Friday, July 14 @ 12:14:15 Eastern by Joe_Dodson

Metacritic.com

It turns out most of the emails complaining about a GR score on a meta review site have something to do with Metacritic.com. For some reason, they skew our grades preposterously low, beyond the realm of any normal human calculus. Instead, you need drunken calculus, and we have that in spades.

So we took Metacritic to task, first skimming through their online FAQ. Which, alas, is FUCed…and they can prove it.

Metacritic’s letter grade-to-number conversion scale is insane. It starts off okay with the A’s all landing in the 91-100 range - a respectable 10-point spread - but things quickly go bonkers when you give a game a B+, which, according to Metacritic, converts to an 83. What happened to 84-90? Apparently those digits are being used to prop up the rest of the B’s, which run from 83 (ugh) down to 67 (double ugh). That’s a 17-point range, holmes, for three grades. The C’s suddenly dip an extra 9 points, starting at a 58 for the C+ and ending with a C- at 42. That might be the answer to life itself, but it only made us more confused. And it just gets messier as you plow down towards the F, which they laughably break down into an F+, F, and F-.
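For the spreadsheet-inclined, here’s a minimal sketch of that scale in Python, using only the conversion points quoted above; grades the FAQ doesn’t pin down are left as None, and the A’s simply land somewhere in 91-100.

```python
# Metacritic's stated letter-to-number conversion points, as quoted above.
# Grades this piece doesn't pin down are left as None.
METACRITIC_POINTS = {
    "B+": 83,
    "B":  None,
    "B-": 67,
    "C+": 58,
    "C":  None,
    "C-": 42,
}

print("B band spread:", 83 - 67 + 1)  # 17 points, covering three grades
print("B- to C+ drop:", 67 - 58)      # the extra 9-point dip into the C's
print("C band spread:", 58 - 42 + 1)  # another 17 points
```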

While this system is obviously loony, we think we see where they went wrong, and so can you because it's plain as day in their very own aggregate breakdown chart. It seems Metacritic uses two standards: one for games, and one for everything else. But there is only one letter grade-to-number conversion scale, and it doesn’t match either breakdown. It represents a third standard, one of stark, raving madness.

And since no one wants to be crazy, they cover it up by not showing original grades on their score pages. Why? Who knows, but readers look at these pages and think we’re irresponsible critics, when in fact Metacritic is loco. We give Metacritic’s funhouse mirror conversion scheme an F+.

Or maybe an E- for lack of effort, because we know they aren’t trying to make us look like jerks. We suppose we should be consoled by the fact that Play magazine and Gameshark get the same treatment. Then again, those pubs rarely give grades below an A, so the wacky conversion doesn’t affect them as much as it does cranky old GR. Jerks.

Now, you’d think one screwy grade wouldn’t matter on an aggregate site; after all, it’s just a drop in the bucket, right? Tell that to irritable Stardock chief Brad Wardell, who went ballistic over our review of Galactic Civilizations II. Though we don’t often air our dirty laundry, we just had to swipe this snippet from Brad’s email barrage:

“…your review isn't just the lowest review of the game but REALLY far down.  As you point out, on Metacritic, it's a 67. The next lowest review is 80.”

We gave Galactic Civilizations II a B-. According to our own standard (the same one used by America's public education system), a B- converts to - you guessed it - an 80. We should have been right there next to the other low grade, not 13 points below it. Brad claims, as a result, that:

"[Game Revolution’s] current score ensures we'll never break the overall 9.0 average on there.  That may sound silly, but for us, it's important to us because so much of our lives have been wrapped up making the game.”

Far be it from us to point out that our lives are wrapped up in reviewing games, not fixing Metacritic (current article notwithstanding).
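To put a rough number on how much one converted score can drag down a plain average, here’s a quick sketch. The peer scores below are invented purely for illustration - we are not reproducing the game’s actual review list - and only the 80-versus-67 figures come from the exchange above.

```python
# Hypothetical peer scores, invented purely for illustration.
peers = [95, 92, 90, 88, 85, 80]

def avg_with(gr_score):
    """Unweighted average of the made-up peer scores plus GR's score."""
    scores = peers + [gr_score]
    return sum(scores) / len(scores)

print(round(avg_with(80), 1))  # GR's B- taken as an 80 -> 87.1
print(round(avg_with(67), 1))  # GR's B- converted to a 67 -> 85.3
```

Even unweighted, the converted 67 shaves a couple of points off the average; whether that’s enough to sink a 9.0 depends on numbers we can’t see.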

Metacritic's top-secret founder?

But maybe Brad is right. Maybe our score did screw his game out of a 9.0 average. We’ll never know for sure, because Metacritic weights its sources individually, meaning they decide how much or how little each site influences the aggregate score. What are the criteria for this forbidden formula? Who knows? It’s their secret sauce and they keep it under tight wraps; there’s just no telling what their aggregate scores are actually based on. As a result, we can only speculate as to whether or not scores are effectively aggregated from more than a handful of heavily weighted sites.
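We don’t know the weights, but the mechanics are simple enough. Here’s a minimal sketch of weighted aggregation; the sources, scores, and weights are all made up for illustration, standing in for whatever Metacritic actually uses.

```python
# A minimal sketch of weighted aggregation. Every number here is invented;
# Metacritic's real per-source weights are the secret sauce.
reviews = [
    ("Big Outlet A",   92, 1.5),  # (source, score, hypothetical weight)
    ("Big Outlet B",   88, 1.5),
    ("Mid-size Site",  85, 1.0),
    ("GR (converted)", 67, 1.0),
]

weighted = sum(score * w for _, score, w in reviews) / sum(w for _, _, w in reviews)
unweighted = sum(score for _, score, _ in reviews) / len(reviews)

print(round(weighted, 1))    # 84.4 with these made-up weights
print(round(unweighted, 1))  # 83.0 with everyone counted equally
```

Nudge the weights and the aggregate moves - which is exactly why an undisclosed weighting scheme makes the final number impossible to audit from the outside.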

Don't believe us? According to Marc Doyle, a Metacritic founder:

“We don’t think that a review from, say, Roger Ebert or Game Informer, for example, should be given the same weight as a review from a small regional newspaper or a brand new game critic.”

Yes, he did just bring up Roger “Games Aren’t Art” Ebert and Game “GameStop” Informer in the same sentence as examples of important, weighty critics, who by virtue of their very rightness make smaller, newer critical entities less correct. We get the idea, but oh, the wrongness.

Metacritic’s desire to base their aggregate scores on their own interpretation of Who’s Who in the video game industry completely undermines all the hard work they do to provide you with an otherwise great service. Even though they claim to be an aggregate site and have a list of accepted publications a mile long, it’s entirely possible that only ten of them count. You have no way of knowing with Metacritic. They’re asking you to trust their judgment while simultaneously asking you to believe their inane science.

The kookiest part, though, is that it sort of works, and their aggregate scores are usually in line with what seems to be the critical consensus for any given game. You can look at Metacritic and reliably see how well a game has been received. They even color-code their scores and include blurbs from each review, so you can get an idea of the tone and focus without having to go anywhere else. Yet they break our grades and their own, then cover their shoddy conversion by omitting the actual review scores and concealing the scheme they use to weight their sources.

Because they obviously graduated from the Underpants Gnome School of Business (Underpants + ??? = Profit!), we give them a final grade of Two Underpants. Two Underpants out of how many, you ask? Sorry, that’s our Secret Underpants Formula.

 

