Wow O_O I write for DarkZero, which is a volunteer-based site (we all have other jobs or are students, etc.)
I have no idea how we ended up in the highest category. O_O I'm shocked. We don't get paid or anything, we're just a group of guys who like gaming. O_O
Importance based on which websites will play ball: confirmed.
That list doesn't look like it was made by a human. Why give EG Italy/Spain more weight than EG main?
Uuuhhm, why are there several (official) PlayStation and Xbox publications in the highest two categories, but no Nintendo publications whatsoever?
This is obviously not the biggest question here, but it stood out to me.
Yeah, a critic can always affect someone's job, but it shouldn't be so DIRECT. If a game sells lower because of that review and they fail to meet sales-based bonuses, fine, but it seems kinda crap to do it based on those scoring averages, as that can put pressure on some people to score a bit higher just to protect someone's livelihood.

And that's the thing, I don't think aggregation is a bad thing. It's just that it sort of loses a lot of meaning when you skew numbers so (seemingly) randomly, not to mention crazy stuff like the way it's affected bonuses, budgets, etc.
Wow. I don't understand why they would do that. Publishers are going to be throwing money at the sites in the highest tier.
Seeing how this is seemingly an arbitrary weighting decision, I have to wonder if certain sites with low scoring weights could file antitrust suits against Metacritic if their lack of influence cost them lucrative ad revenue opportunities with various publishers.
I wonder what New Vegas's score would be if their grades were "unskewed."
Wow. This really damages my opinion of Metacritic.
$$$$$$$$$$$$$$$$$$$$$$
Most likely Sony/Microsoft pay to have their official magazine reviews matter more. It's bullshit, and Official Playstation/Xbox/Nintendo magazines need to be ignored completely.
Who do you think is making that assumption?
Also seeing Giant Bomb at a .5 is a goddamn tragedy.
Would adopting RT's like/dislike system be better overall? Review sites could keep their own grading scales, and it would certainly beat averaging all the scores, let alone weighting them like this.
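Purely to illustrate the idea, here's a minimal sketch of an RT-style aggregate. The 7/10 cutoff for a "positive" review and the example scores are made up, not how Rotten Tomatoes actually classifies reviews:

```python
# Hypothetical sketch: aggregating reviews Rotten Tomatoes-style.
# Each review is reduced to a binary "recommended" verdict (assuming a
# cutoff of 7/10, purely for illustration), and the aggregate is the
# share of positive verdicts rather than a weighted average.

def rt_style_aggregate(scores, cutoff=7.0):
    """Return the percentage of reviews at or above the cutoff."""
    positive = sum(1 for s in scores if s >= cutoff)
    return round(100 * positive / len(scores))

# Example with made-up scores on a 10-point scale:
reviews = [9.0, 8.5, 6.0, 7.0, 4.5, 8.0]
print(rt_style_aggregate(reviews))  # 4 of 6 reviews >= 7 -> 67
```

The appeal is that each outlet's internal scale stops mattering; only whether the review leans positive or negative does.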
So sites that don't follow the review score hivemind don't skew the results too much? That makes sense, given Metacritic's intentions. Where's Tom Chick on that list? Maybe he's off the chart at 0.01.
While its weaknesses are clear, Greenwood-Ericksen and team wanted to determine if there is value to the Metacritic rating in spite of them. There's no reliable way to translate quality into quantitative data, so instead they measured a game's economic value by comparing sales to scores.
The results? It turns out that the Metacritic score is a great indicator of financial success. Sales rise sharply for games listed as 83 or higher on Metacritic. Greenwood-Ericksen determined there was a .72 correlation between the Metacritic score and sales, which is considered extraordinarily high — though he emphasized that this means the two are closely related, not that one directly causes the other.
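For readers curious what a .72 correlation actually measures, here's a minimal sketch of the Pearson coefficient such a study would typically use. The score and sales numbers below are made-up placeholders, not the study's data:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative placeholder data: Metacritic scores vs. sales (thousands)
scores = [68, 74, 83, 88, 91]
sales = [120, 150, 400, 520, 700]
print(round(pearson_r(scores, sales), 2))
```

A coefficient near 1 means the two move together almost in lockstep; .72 is strong, but as the article notes, it says nothing about which one causes the other.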
Let's just get rid of the 1st Amendment altogether while we're at it.
I don't know whether these are qualitative rankings or not (it doesn't seem like it), but if you were to try to do it in a statistically unbiased manner, it would basically be a matter of weighting the reviews by how much statistical information they provide. As an example, they could be weighted based on the following three criteria:
- Total number of reviews from publication
- Granularity of scores
- Distribution of scores (i.e. how large a standard deviation their score distribution has)
Publications with lots of reviews, very granular scores and a wide range of scores (relative to that granularity) will, statistically speaking, give you more precise information to base your metascore on than publications that have reviewed only a handful of games, operate on a 5-point scoring system and have very tightly clustered scores. It would therefore make sense to weight the former more heavily than the latter, which might be what they're doing.
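A minimal sketch of how such an information-based weight could be computed. The formula combining the three criteria (a simple product) is pure guesswork for illustration, not anything Metacritic has confirmed:

```python
import statistics

# Hypothetical sketch of the statistical-information weighting described
# above. The combining formula is an assumption, chosen only to show
# how the three criteria could interact.

def information_weight(scores, granularity):
    """Weight a publication by review count, score granularity, and spread.

    scores      -- all scores the publication has given (0-100 scale)
    granularity -- number of distinct score steps it uses (e.g. 5 for a
                   5-point scale, 101 for percentage scores)
    """
    n = len(scores)
    spread = statistics.pstdev(scores)
    # More reviews, a finer scale, and a wider spread all raise the weight.
    return n * granularity * spread

outlet_a = information_weight([40, 55, 62, 70, 78, 85, 91, 95], 101)
outlet_b = information_weight([80, 80, 100, 80], 5)
print(outlet_a > outlet_b)  # the granular, varied outlet weighs more
```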
Another possibility is that they model the predictive value each publication's scores have (i.e. how accurately that publication's scores predict every other publication's scores). Basically, this would mean that the publications with scores closest to the norm would be weighted more highly.
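A sketch of that second idea, with invented outlets and scores; weighting by correlation with the consensus of everyone else is just one of several ways it could be modelled:

```python
# Hypothetical sketch of "predictive value" weighting: score each outlet
# by how closely its reviews track the consensus of all other outlets.
# Outlet names and scores are invented for illustration.

def _pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def consensus_weight(outlet, all_scores):
    """Correlate one outlet's scores with the mean of every other outlet.

    all_scores maps outlet name -> scores for the same games, same order.
    """
    own = all_scores[outlet]
    others = [name for name in all_scores if name != outlet]
    consensus = [sum(all_scores[o][i] for o in others) / len(others)
                 for i in range(len(own))]
    return _pearson(own, consensus)

scores = {
    "OutletA": [90, 70, 80, 60],  # tracks the group closely
    "OutletB": [88, 72, 78, 62],
    "OutletC": [85, 68, 82, 58],
    "OutletD": [60, 90, 50, 85],  # frequently disagrees
}
print(consensus_weight("OutletA", scores) > consensus_weight("OutletD", scores))
```

Note the side effect people are complaining about: this scheme rewards conformity, so an outlet that deliberately uses the whole scale would end up down-weighted.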
GiantBomb is a lower? Well dang.
There really, really needs to be a RottenTomatoes of gaming.
So proud to work for a website that doesn't use scores or get listed on Metacritic.
There is a Gaming Age forum? From what I understand, NeoGAF is NeoGAF because it parted ways with Gaming Age.

I'm surprised Gaming Age is so high up... I only visit its forum.
This might lead to confusion as scores constantly change.

I propose the creation of a new rankings website where, instead of weightings, the distribution of scores given by websites is normalised based on their mean review score and then spread in a normal distribution about 50%.
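A minimal sketch of that normalisation, assuming each score is converted to a z-score against the outlet's own scoring history and re-centred on 50 with an arbitrary 15-point spread (both parameters are illustrative):

```python
import statistics

# Hypothetical sketch of the proposed normalisation: each outlet's
# scores are standardised against that outlet's own history, then
# re-centred on 50 with an assumed spread of 15 points.

def normalise(score, outlet_history, centre=50, spread=15):
    """Map a raw score onto a common distribution centred on 50."""
    mean = statistics.mean(outlet_history)
    sd = statistics.pstdev(outlet_history)
    z = (score - mean) / sd
    return centre + spread * z

# A generous outlet (history clustered around 85) giving an 80 lands
# below 50 on the common scale; a stingy outlet giving 80 lands above.
generous = [95, 90, 85, 80, 75]
stingy = [70, 60, 55, 50, 40]
print(normalise(80, generous) < 50 < normalise(80, stingy))  # True
```

The upside is that an 80 from a harsh outlet and an 80 from a generous one stop meaning the same thing; the downside, as noted above, is that every normalised score shifts whenever the outlet's history grows.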
Seriously. I'd trust their opinions over everyone in the top tiers.

Holy fuck at how low Giant Bomb is.
Imagine a world without scores, guys.
It doesn't seem like that list is comprehensive. There are two publications with the name *****'d out; one may be Quarter to Three. Also, I doubt Metacritic would list a review if it didn't actually factor into the aggregate score.

This effectively destroys all the stupid but incessant conspiracy theories that Tom Chick purposefully lowers games to below average (i.e. using the whole scale properly) to skew the Metacritic score.
Keep on trucking, Tom Chick.
Love Kotaku's review scoring; actually pressures you lot to tell us why you say YES or NO
Do you know if Metacritic picks the reviews themselves, or if they're 'sent' to them by the sites?
Why are people assuming the publishers didn't already have a version of this list?
The list is not comprehensive, but the ***'d sites are banned on NeoGAF. The link has no censored sites.
They also fucked it up relative to movie scores: as I recall, if a game got a 7 or lower it was marked as a "rotten" score, leaving only 8-10 mattering. As much as people talk about the skewed review scale, it wasn't THAT skewed on average from what I saw, and as I recall we had plenty of pretty good games get ravaged there as a result.

Rotten Tomatoes had a gaming section, it's just that nobody cared for it for some reason. Of course that was back before Metacritic became "a thing".