Metacritic's weighting system revealed

dmr87

Member
Highest (1.5) -- Game Informer
Highest (1.5) -- GameTrailers
Highest (1.5) -- IGN
Highest (1.5) -- IGN AU
Highest (1.5) -- IGN UK

Shocker.

Also, please nuke Metacritic from orbit.
 

sixghost

Member
I don't know whether these are qualitative rankings or not (it doesn't seem like it), but if you were to try to do it in a statistically unbiased manner, it would basically be a matter of weighting the reviews by how much statistical information they provide. As an example, they could be weighted based on the following three criteria:

- Total number of reviews from publication
- Granularity of scores
- Distribution of scores (i.e. how large a standard deviation their score distribution has)

Publications with lots of reviews, very granular scores and a wide range of scores (relative to that granularity) will, statistically speaking, give you more precise information to base your metascore on than publications that have reviewed only a handful of games, operate on a 5-point scoring system and have very tightly clustered scores. It would therefore make sense to weight the former more heavily than the latter, which might be what they're doing.
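As a rough, purely illustrative sketch of what such an information-based weighting could look like (the formula and all numbers here are my own toy example, not anything Metacritic has confirmed):

```python
import statistics

def info_weight(scores, granularity):
    """Toy weight: more reviews, finer granularity, and a wider spread
    all mean a publication's scores carry more statistical information.
    `scores` are on a 0-100 scale; `granularity` is the number of
    distinct score steps the outlet can award (e.g. 5, 10, 100)."""
    spread = statistics.pstdev(scores)
    step = 100 / granularity  # coarseness of one score step
    # Spread only counts insofar as it exceeds the scale's coarseness.
    return len(scores) * max(spread - step, 0)

# A big outlet with 100-point scores and a wide spread of opinions...
big = info_weight([40, 55, 62, 70, 78, 85, 91, 95], granularity=100)
# ...versus a small outlet scoring out of 5, tightly clustered around 4/5.
small = info_weight([60, 80, 80, 80, 100], granularity=5)
print(big > small)  # True: the former carries more precise information
```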

Another possibility is that they model the predictive value each publication's scores have (i.e. how accurately that publication's scores predict every other publication's scores). Basically, this would mean that the publications with scores closest to the norm would be weighted more highly.
If they determine the weight based on a site's standard deviation over time, I wonder if someone could game the system by publishing reviews a week or two late with scores 1 or 2 points off the current metascore.

The list is not comprehensive, but the *** sites are ones banned on NeoGAF. The linked version has no sites censored.

my mistake then
 
Metacritic does all that. I know on DarkZero, when a new review goes up we don't do anything; Metacritic just pulls it themselves and puts it on the site.

Ah okay; always assumed there was some database tbh
-------------------------

Metacritic should let you create a weighting preference; it would probably bring in more users and make it more like decision analysis (not really, but kind of) than just some number.
 
What an odd list.

Game Almighty looks like a decade-old blog. It isn't even the top result when you search "Game Almighty" on Google; not second or third either, but seventh.
 

Thraktor

Member
If they determine the weight based on a site's standard deviation over time, I wonder if someone could game the system by publishing reviews a week or two late with scores 1 or 2 points off the current metascore.

I would imagine they'd only recalculate weightings every year or so, and even then it would be on the entire corpus of reviews. Well, that's unless there's a strong trend over time, but trying to model that would get pretty messy.
 

RotBot

Member
They also fucked it up relative to movie scores: as I recall, if a game got a 7 or lower it was marked as a "rotten" score, leaving only 8-10 as scores that mattered. As much as people talk about the skewed review scale, it wasn't THAT skewed on average from what I saw, and as I recall plenty of pretty good games got ravaged there as a result.
What's a good threshold then? If you set it at 50, 95% of games would be fresh. 70 probably gives a better match to freshness ratios of movies.
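That threshold claim is easy to sanity-check against any score distribution; a quick sketch with invented Metascores (the numbers are made up purely for illustration):

```python
def fresh_ratio(metascores, threshold):
    """Fraction of games at or above the 'fresh' cutoff."""
    return sum(s >= threshold for s in metascores) / len(metascores)

# Hypothetical Metascores, skewed toward the 70s the way game scores tend to be.
metascores = [52, 58, 63, 66, 68, 71, 73, 74, 76, 78, 79, 81, 83, 85, 88, 91]
for cutoff in (50, 70, 80):
    print(cutoff, fresh_ratio(metascores, cutoff))
# With a skew like this, a cutoff of 50 marks everything fresh,
# while 70 and 80 actually start to discriminate.
```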
 

jono51

Banned
These are no-name sites, and they have the highest ranking. Just look at Firing Squad. The site is a fucking abomination.

Hahaha. It looks like they built the site 10 years ago, then forgot how to change it a few years ago. The CPU overclocking database at the top? Hasn't been updated since ~2008. "Deals" goes nowhere except a 404 page that fucks up the site design. "Features" has been empty since 2011. Etc, etc. Metacritic were wise to weight them so highly. :rolleyes:
 
High (1.25) -- Da GameBoyz


The first review I randomly clicked on. A sample:

Conclusion Like its predecessors, New Super Mario Bros features unbridled, uncomplicated, old-school gameplay, but with some modern twists to keep it fresh. I have heard more than one person refer to their addiction to this game since it came out. I think that is because of all the great platforming moments that New Super Mario Bros. offers on it brings back gaming memories from one’s youth (that is if you aren’t currently a youth right now). With all the good comes a sliver of bad. This game does not take advantage of the advanced features of the DS, but the game is not really meant to, and doesn’t need to. So in many ways this point is negated.

Clearly wordsmiths up there with EDGE, and spitting down on Eurogamer and Giant Bomb.
 

Omikaru

Member

Weightings definitely not derived from stature/quality/traffic, then.
 

Dali

Member
Damn, that's basically written instructions on who to spend your "marketing" bucks on if you're looking for a higher Metascore.
 

JABEE

Member
This is why, when people say that scores don't matter, they are uninformed. Scores impact who gets coverage, as Jeff said on the PAX Giant Bomb podcasts.

I think all sites should drop numbers if they gave a damn about not getting in the hype machine or getting the page views Metacritic provides, but they won't.

It's about selling your eyes first, content second.
 
As if publishers did not already have a version of this list.

Anyways, if you check the OP, some researchers reverse-engineered the rankings to arrive at these figures. It doesn't appear to be the real list, but a close approximation.

I am sure the marketing departments at the pubs have made an approximation like this and use it to take care of small things - making sure heavily weighted publications get their review copies on time, making follow-up phone calls, etc.
 
Metacritic seems busted. We all know the only way to accurately gauge a game's score is to take all the GAF opinions and find the number that represents them as a whole.
 

Orayn

Member
This wouldn't bother me at all if everything were weighted at 1.0. Like, I'd look at that site, laugh at the design, and move on. But reading that content and realizing that it somehow counts for more than Play UK or something just makes me want to drain a whole bottle of scotch in my mouth.



My bottle is ready.
 
uhhhh guys?

http://www.facebook.com/Metacritic/posts/501424766586647
Metacritic Facebook post said:
Today, the website Gamasutra "revealed" the weights that we assign to each gaming publication (for the purpose of calculating our Metascores), based on a presentation given at the Game Developers Conference this morning. There's just one major problem with that: neither that site, nor the person giving the presentation, got those weights from us; rather, they are simply their best guesses based on research (the Gamasutra headline is misleading in this respect).

And here's the most important thing: their guesses are wildly, wholly inaccurate. Among other things:

* We use far fewer tiers than listed in the article.

* The disparity between tiers listed in the article is far more extreme than what we actually use on Metacritic. For example, they suggest that the highest-weighted publications have their scores counted six times as much as the lowest-weighted publications in our Metascore formula. That isn't anywhere close to reality; our publication weights are much closer together and have much less of an impact on the score calculation.

* Last but definitely not least: Our placement of publications in each tier differs from what is displayed in the article. The article overvalues some publications and undervalues others (while ignoring others altogether), sometimes comically so. (In addition, our weights are periodically adjusted as needed if, over time, a publication demonstrates an increase or decrease in overall quality.)
 
MetaCritic said:
Today, the website Gamasutra "revealed" the weights that we assign to each gaming publication (for the purpose of calculating our Metascores), based on a presentation given at the Game Developers Conference this morning. There's just one major problem with that: neither that site, nor the person giving the presentation, got those weights from us; rather, they are simply their best guesses based on research (the Gamasutra headline is misleading in this respect).

Hmm.
 
Metacritic is a joke. People shouldn't even listen to review scores, period; there are so many instances where they're way off. What's more important is what is said and not said in a review. I find it's very hard to actually get the information you're looking for from a review. Plus, something a reviewer is annoyed by, you might not care about, or they don't care about something you find annoying and never mention it. The best way to go about it is to look at videos and to read and ask questions on forums.

Same with movies etc. I went to read some reviews of Titanic 3D to see how good the 3D conversion was, and mostly no one even mentioned the 3D, choosing only to review the movie itself. If it's a 3D movie, you should comment on the 3D itself as well as the actual movie. Plus, it being an old movie, the 3D conversion should be the main focus of the review, not the old movie that was already reviewed when it came out!
 

Eusis

Member
What's a good threshold then? If you set it at 50, 95% of games would be fresh. 70 probably gives a better match to freshness ratios of movies.
7 is probably fine, with the caveat that all scores on a 5-point scale bump a 3 up to "like", for the purposes of accurately representing the reviewer's view. Hell, 6 as a baseline might not be TOO bad, but the problem is there are a lot of games that get a 6 where the review is only lukewarm at best.

8 is ridiculous though; there are tons of more niche but very enjoyable games in the 7 range, and many sites even treat that as good. Never mind that it throws things way off when you get a bunch of 8s and 9s, only for a third to be 7.5 or whatever and give it that bad-looking score of 66%, even though the 7 guys liked the game. I do believe they changed it, though, then either abandoned or failed to keep advertising game scores. Plus there does seem to be a fundamentally different view of reviewing movies versus reviewing games: movies are far more a work of art, whereas games have a technical side that must be factored in; you don't exactly have to worry about stiff controls in a movie, after all.
 

PaulLFC

Member
Nah, it's not. It's actually a pretty useful site. If you want to blame someone, blame the developers for tying bonuses to such a back-of-the-envelope calculation. Metacritic is cool. They're not out to get anyone.
Maybe they're not out to specifically "get" anyone, but the site is a joke. The entire idea behind it is. When you have GamesTM and Giant Bomb ranked lower than some no-name website about three people have heard of, and Eurogamer ranked lower than one of its newer regional sites, something is wrong.

That something is the fact that there's even a "ranking" system in the first place; it's ridiculous. Just use an average instead of introducing arbitrary ratings decided by someone who may have a completely different opinion from someone else on what they should be.
 
Good to know that as a former TeamXbox reviewer I had a big part in shaping the future of our industry, and played a vital role in forming the public consciousness.

So take that Dungeon Fighter LIVE.
 

newjeruse

Member
Metacritic is a joke. People shouldn't even listen to review scores, period; there are so many instances where they're way off. What's more important is what is said and not said in a review.
We live in a world where Metacritic can exist and people can still read the contents of a review.
 

Gilgamesh

Member
This would be bad enough on its own but the fact that publishers use these scores to evaluate and compensate developers makes it fucking disgusting.
 

Omikaru

Member


To be fair, the article in the OP states this:

The course director and his students then set about modelling the weightings based on data pulled from the site. Finally, after six months of work, the researchers compared their modeled scores to the actual scores and discovered that across the 188 publications that feed into Metacritic's video game score work, their findings were almost entirely accurate.

In other words, they applied their derived weightings to the aggregated scores, and got accurate Metascores.

Plus, you know, if Metacritic wants to rubbish their research they should be more transparent.
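That kind of reverse-engineering amounts to solving for the publication weights that best reproduce the published Metascores from the individual review scores. A minimal least-squares sketch with made-up data (the researchers' actual method was surely more involved; here the "hidden" weights are planted so the fit can recover them):

```python
# Rows are games, columns are publications; all numbers are invented.
reviews = [
    [90, 85, 70],
    [60, 65, 80],
    [75, 70, 72],
    [88, 90, 60],
]
# Published scores generated from hidden weights [0.5, 0.3, 0.2].
metascores = [84.5, 65.5, 72.9, 83.0]

def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting for a square system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[r][c]:
                f = M[r][c] / M[c][c]
                M[r] = [x - f * y for x, y in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

# Least squares via the normal equations: (S^T S) w = S^T m.
k = len(reviews[0])
StS = [[sum(row[i] * row[j] for row in reviews) for j in range(k)]
       for i in range(k)]
Stm = [sum(row[i] * m for row, m in zip(reviews, metascores))
       for i in range(k)]
weights = solve(StS, Stm)
print([round(w, 3) for w in weights])  # recovers [0.5, 0.3, 0.2]
```

With real data the system is overdetermined and noisy, so the fit only approximates the true weights, which is presumably why the researchers needed months and 188 publications' worth of reviews.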
 


My bottle is ready.



Solidarity, brother. Although thinking about it, I wanted rum instead.

To be fair, the article in the OP states this:

In other words, they applied their derived weightings to the aggregated scores, and got accurate Metascores.

Plus, you know, if Metacritic wants to rubbish their research they should be more transparent.

I'd suggest someone reverse-engineer the math here to see if it all checks out, but I guess that's what started this article in the first place. I certainly hope what Metacritic is saying is true, but I'd feel a lot better if they shone at least a little light on what their actual system is. They don't have to blow the whole thing wide open if it's a trade secret or whatever, but some transparency would be nice.
 