Highest (1.5) -- Game Informer
Highest (1.5) -- GameTrailers
Highest (1.5) -- IGN
Highest (1.5) -- IGN AU
Highest (1.5) -- IGN UK
Shocker.
Also, please nuke Metacritic from orbit.
I don't know whether these are qualitative rankings or not (it doesn't seem like it), but if you were to try to do it in a statistically unbiased manner, it would basically be a matter of weighting the reviews by how much statistical information they provide. As an example, they could be weighted based on the following three criteria:
- Total number of reviews from the publication
- Granularity of scores
- Distribution of scores (i.e., how large a standard deviation their score distribution has)
Publications with lots of reviews, very granular scores, and a wide range of scores (relative to that granularity) will, statistically speaking, give you more precise information to base your metascore on than publications that have reviewed only a handful of games, operate on a 5-point scoring system, and have very tightly clustered scores. It would therefore make sense to weight the former more heavily than the latter, which might be what they're doing.
Another possibility is that they model the predictive value each publication's scores have (i.e., how accurately that publication's scores predict every other publication's scores). Basically, this would mean that the publications with scores closest to the norm would be weighted more highly.
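To make that concrete, here's a minimal sketch of both ideas; the publication stats, the scaling, and the function forms are my own assumptions, not anything Metacritic has confirmed:

```python
import math
import statistics

# Hypothetical per-publication stats: (number of reviews, distinct score
# values the outlet actually uses, standard deviation of its scores on a
# 0-100 scale). None of these figures come from Metacritic.
publications = {
    "BigOutlet":   (2400, 41, 14.0),  # many reviews, 0.5 steps on a 10 scale
    "FivePointer": (150,   5,  6.0),  # few reviews, 5-point scale, clustered
}

def information_weight(n_reviews, granularity, stddev):
    """Crude 'statistical information' weight: more reviews, finer score
    granularity, and a wider spread of scores all push the weight up."""
    return math.log10(n_reviews) * math.log2(granularity) * (stddev / 10.0)

def predictive_weight(pub_scores, consensus_scores):
    """The second idea: weight an outlet by how well its scores track the
    consensus (Pearson correlation, floored at zero). Needs Python 3.10+."""
    return max(statistics.correlation(pub_scores, consensus_scores), 0.0)

for name, stats in publications.items():
    print(f"{name}: weight = {information_weight(*stats):.2f}")

print("tracking weight:",
      round(predictive_weight([9, 7, 8, 5], [90, 72, 85, 50]), 2))
```

With numbers like these, the big, spread-out outlet ends up weighted far above the small 5-point one, which matches the intuition in the post.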
The list is not comprehensive, and the ***'d-out sites are ones banned on NeoGAF. The linked article has them uncensored.
Metacritic does all that. I know on DarkZero, when a new review goes up we don't do anything; Metacritic just pulls it themselves and puts it on the site.
So if you're a freelancer, your opinion has more validity depending on who you're writing for?
If they determine the weight based on a site's standard deviation over time, I wonder if someone could game the system by publishing reviews a week or two late with scores 1 or 2 points off the current metascore.
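A quick toy simulation shows why that exploit would work under a deviation-based weighting; all the numbers here are made up:

```python
import random

random.seed(1)

# Toy model of the exploit: if weights reward low deviation from the final
# Metascore, a "copycat" outlet that just echoes the running average will
# out-score an honest reviewer.
games = [random.gauss(75, 10) for _ in range(200)]   # "true" consensus scores

honest  = [g + random.gauss(0, 8) for g in games]    # real opinions, some spread
copycat = [g + random.uniform(-2, 2) for g in games] # metascore +/- 2, a week late

def mean_abs_dev(scores):
    return sum(abs(s - g) for s, g in zip(scores, games)) / len(games)

print("honest outlet deviation: ", round(mean_abs_dev(honest), 1))
print("copycat outlet deviation:", round(mean_abs_dev(copycat), 1))
```

The copycat's deviation comes out several times smaller, so any scheme that rewards agreement with the final score would steadily bump its weight up.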
They also fucked it up relative to movie scores: as I recall, if a game got a 7 or lower it was marked as a "rotten" score, leaving only 8-10 mattering. As much as people talk about the skewed review scale, it wasn't THAT skewed on average from what I saw, and as I recall we had plenty of pretty good games get ravaged there as a result.

What's a good threshold then? If you set it at 50, 95% of games would be fresh. 70 probably gives a better match to the freshness ratios of movies.
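That 95% claim is easy to sanity-check against any assumed score distribution. A minimal sketch, using a made-up, skewed-high sample of Metascores rather than real Metacritic data:

```python
import random

random.seed(0)

# Hypothetical Metascore distribution, skewed high the way game reviews
# famously are (not real Metacritic data).
scores = [min(100, max(0, int(random.gauss(72, 12)))) for _ in range(5000)]

for threshold in (50, 60, 70, 80):
    fresh = sum(s >= threshold for s in scores) / len(scores)
    print(f"threshold {threshold}: {fresh:.0%} of games 'fresh'")
```

Under that assumed distribution a 50 cutoff does mark nearly everything fresh, while 70 lands somewhere near the roughly half-fresh ratio you see with movies.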
These are no-name sites, and they have the highest ranking. Just look at FiringSquad. The site is a fucking abomination.
High (1.25) -- Da GameBoyz
From the Da GameBoyz review: Conclusion: Like its predecessors, New Super Mario Bros. features unbridled, uncomplicated, old-school gameplay, but with some modern twists to keep it fresh. I have heard more than one person refer to their addiction to this game since it came out. I think that is because all the great platforming moments that New Super Mario Bros. offers bring back gaming memories from one's youth (that is, if you aren't currently a youth right now). With all the good comes a sliver of bad. This game does not take advantage of the advanced features of the DS, but the game is not really meant to, and doesn't need to. So in many ways this point is negated.
Looks right to me.

Edge not in the top bracket? lmao.

And goodness at Giant Bomb.
The fuck is this?
Because publishers did not have a version of this list.
Anyway, if you check the OP, some researchers reverse-engineered the rankings to get to these figures. It doesn't appear to be the real list, but a close approximation.
This wouldn't bother me at all if everything was weighted at 1.0. Like I'd look at that site, laugh at the design, and move on. But reading that content and realizing that it somehow counts more than Play UK or something just makes me want to drain a whole bottle of scotch in my mouth.
Metacritic Facebook post said:Today, the website Gamasutra "revealed" the weights that we assign to each gaming publication (for the purpose of calculating our Metascores), based on a presentation given at the Game Developers Conference this morning. There's just one major problem with that: neither that site, nor the person giving the presentation, got those weights from us; rather, they are simply their best guesses based on research (the Gamasutra headline is misleading in this respect).
And here's the most important thing: their guesses are wildly, wholly inaccurate. Among other things:
* We use far fewer tiers than listed in the article.
* The disparity between tiers listed in the article is far more extreme than what we actually use on Metacritic. For example, they suggest that the highest-weighted publications have their scores counted six times as much as the lowest-weighted publications in our Metascore formula. That isn't anywhere close to reality; our publication weights are much closer together and have much less of an impact on the score calculation.
* Last but definitely not least: Our placement of publications in each tier differs from what is displayed in the article. The article overvalues some publications and undervalues others (while ignoring others altogether), sometimes comically so. (In addition, our weights are periodically adjusted as needed if, over time, a publication demonstrates an increase or decrease in overall quality.)
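For what it's worth, the "six times" figure is easy to see in a toy weighted average; the review scores and both weight sets below are invented for illustration (1.5 down to 0.25 is the 6x spread the Gamasutra table implies, and the "tight" set is one guess at what "much closer together" might mean):

```python
# A Metascore is (at heart) a weighted average of review scores. Toy example
# showing how much the weight spread matters; scores and weights are made up.
reviews = [95, 90, 85, 70, 40]  # five review scores for one game

def metascore(scores, weights):
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

wide  = [1.5, 1.25, 1.0, 0.5, 0.25]  # ~6x between top and bottom tier
tight = [1.1, 1.05, 1.0, 0.95, 0.9]  # "much closer together"

print("unweighted: ", metascore(reviews, [1] * 5))        # 76.0
print("wide tiers: ", round(metascore(reviews, wide), 1))  # ~85.6
print("tight tiers:", round(metascore(reviews, tight), 1)) # ~77.3
```

With the wide tiers the same five reviews swing the score by almost ten points; with tight tiers the weighting barely moves it, which is exactly the distinction Metacritic is drawing.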
God Tier (50.0) -- Giant Bomb
Metacritic said: (In addition, our weights are periodically adjusted as needed if, over time, a publication demonstrates an increase or decrease in overall quality.)

Da GameBoyz is clearly doing something right here. Keep up the good work, fellas.
7 is probably fine, with the caveat that on a 5-point scale a 3 should get bumped up to "like" for the purposes of accurately representing the outlet's view. Hell, 6 as a baseline might not be TOO bad, but the problem is there are a lot of games that get a 6 where the review is only lukewarm at best.
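If anyone wanted to apply that rule mechanically, it's a couple of lines; the 3-out-of-5 carve-out is the caveat just described, and the 70% cut is the threshold from the exchange above, not anything actually published:

```python
def is_fresh(score, scale_max=10):
    """Rotten Tomatoes-style cut at 7/10 (i.e. 70%), with the caveat above:
    on a 5-point scale a 3 already means 'like', so it counts as fresh."""
    if scale_max == 5:
        return score >= 3
    return score / scale_max >= 0.7

print(is_fresh(7))      # True: 7/10 makes the cut
print(is_fresh(3, 5))   # True: the 3-out-of-5 carve-out
print(is_fresh(6))      # False: the lukewarm 6/10 stays rotten
```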
Nah, it's not. It's a pretty useful site, actually. If you want to blame someone, blame the developers for tying bonuses to such a back-of-the-envelope calculation. Metacritic is cool. They're not out to get anyone.

Maybe they're not out to specifically "get" anyone, but the site is a joke. The entire idea behind it is. When you have GamesTM and Giant Bomb ranked lower than some no-name website about 3 people have heard of, and Eurogamer ranked lower than one of its newer regional sites, something is wrong.
Metacritic is a joke. People shouldn't even listen to review scores, period; there are so many instances where they're way off. What's more important is what is said and not said in a review.

We live in a world where Metacritic can exist and people can still read the contents of a review.
God Tier (50.0) -- Giant Bomb
Hmm.
The course director and his students then set about modelling the weightings based on data pulled from the site. Finally, after six months of work, the researchers compared their modelled scores to the actual scores and discovered that, across the 188 publications that feed into Metacritic's video game scores, their findings were almost entirely accurate.
Their forum over at Da GameBoyz is such a waste: 17 forum sections and only 27 posts.
My bottle is ready.
To be fair, the article in the OP states this: "the researchers compared their modelled scores to the actual scores and discovered that across the 188 publications that feed into Metacritic's video game scores, their findings were almost entirely accurate."

In other words, they applied their derived weightings to the aggregated scores and got accurate Metascores.
Plus, you know, if Metacritic wants to rubbish their research, they should be more transparent.
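For the curious, here's a rough guess at the shape of such a reverse-engineering step (not the researchers' actual method): treat each game's Metascore as a weighted average of its review scores and solve for the per-publication weights by least squares.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy setup: 3 publications with hidden weights, 500 games. In the real
# problem you'd have ~188 publications' review scores plus the Metascores.
true_w = np.array([1.5, 1.0, 0.5])
scores = rng.uniform(40, 95, size=(500, 3))    # per-game review scores
metascores = scores @ true_w / true_w.sum()    # weighted-average Metascore

# A weighted average is scale-invariant in w, so pin the scale by solving
# scores @ w = metascores * k for an arbitrary k (here k = 3); only the
# ratios between the recovered weights are meaningful.
w, *_ = np.linalg.lstsq(scores, metascores * 3, rcond=None)
print("recovered weights:", np.round(w, 3))    # ~[1.5, 1.0, 0.5]
```

With enough games per publication the ratios come back cleanly, which is consistent with "applied their derived weightings and got accurate Metascores"; real data would of course be noisier and need the missing-review games handled.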