No, it is based on quality. If this list is correct, it represents Metacritic's beliefs as to what the best review sites are.
Metascore is a weighted average in that we assign more importance, or weight, to some critics and publications than others, based on their quality and overall stature. In addition, for music and movies, we also normalize the resulting scores (akin to "grading on a curve" in college), which prevents scores from clumping together.
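For anyone unsure what "weighted average" means in practice, here's a minimal Python sketch. The weights below are completely invented; Metacritic doesn't publish its actual weights:

```python
# Rough sketch of a weighted average as Metacritic describes it.
# The publication weights are hypothetical, purely for illustration.
reviews = {
    "Publication A": (90, 1.5),  # (score out of 100, hypothetical weight)
    "Publication B": (70, 1.0),
    "Publication C": (60, 0.5),
}

weighted_sum = sum(score * weight for score, weight in reviews.values())
total_weight = sum(weight for _, weight in reviews.values())
metascore = weighted_sum / total_weight

print(round(metascore))  # 78, versus a plain (unweighted) average of 73
```

The point is that the heavily weighted publication drags the result toward its own score, which is exactly why the leaked weight list matters.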
I actually had no idea that there was such a huge disparity in the way they weight things. I'd love to see similar data on how they handle film reviews, too. At least Rotten Tomatoes is fairly upfront about categorizing "cream of the crop" publications.
I propose the creation of a new rankings website where, instead of weightings, the distribution of scores given by each website is normalised against that site's own mean review score and then spread in a normal distribution about 50% (see the sketch below).
You could also go finer-grained and do it per individual reviewer, but that might introduce bias. For instance, some sites like to have reviewers who focus on AAA titles, which typically get higher scores, and freelancers who review low-scoring shovelware.
Obviously, the need for such a rankings system is questionable, but if there must be one, it can be done more fairly than Metacritic does it.
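Something like this, maybe. A quick Python sketch of the idea: standardise each site's scores against its own mean and standard deviation, then re-centre on 50. The target spread of 15 points is an arbitrary choice on my part:

```python
import statistics

TARGET_MEAN = 50  # centre every site's distribution here
TARGET_SD = 15    # arbitrary target spread, my own assumption

def normalise_site(scores):
    """Map one site's raw scores onto a common distribution."""
    mean = statistics.mean(scores)
    sd = statistics.stdev(scores)
    return [TARGET_MEAN + TARGET_SD * (s - mean) / sd for s in scores]

# A generous site and a harsh site expressing the "same" spread of opinions:
generous_site = [95, 90, 85, 80, 75]
harsh_site = [75, 70, 65, 60, 55]

print(normalise_site(generous_site))  # [68.97..., 59.48..., 50.0, 40.51..., 31.02...]
print(normalise_site(harsh_site))     # identical output after normalisation
```

The nice property is that a site that scores everything 70-100 and a site that scores everything 50-80 end up directly comparable, without anyone having to decide whose opinion "weighs" more.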
Yeah, it seems really unnecessary. Either use a particular critic's/site's reviews or ignore them altogether. That said, I still think Metacritic is a pretty good general reference for overall opinion. Most of the time user/critic average scores don't differ too wildly. (I know there are exceptions.)
Well, it kinda makes sense, as you can have real sleazy sites that give shitty ass reviews for absolutely everything, and you want some way to protect against that. At the same time it's bullshit, because good quality publications get ignored (e.g. Giant Bomb) in favour of mainstream sites, and we don't exactly know why. Is it because of readership size? Or did they pay CNET? Or is it based on their history?
This was revealed by Adams Greenwood-Ericksen of Full Sail University at a talk titled 'A Scientific Assessment of the Validity and Value of Metacritic' delivered at the Game Developers Conference in San Francisco this afternoon.
That's what they say. But looking at the list, I don't think many people would agree with what they determine as quality and stature. So either:
1) They have an incredibly skewed sense of quality; some sites are unusually high, some are unusually low.
2) They're lying, and they actually weight publications by different criteria.
I don't want to say for sure it's 2, because I have no evidence to suggest they're lying (and because of that, it's easier to believe they just have a poor barometer of quality), but something's definitely up. Whether that's a farce or something more sinister, I don't know!