
Metacritic's weighting system revealed

jtb

Banned
I don't know why they would do it this way. Wouldn't it be better to lower the importance of outliers? Sometimes a game gets all 7s and 7.5s and suddenly there's a 2 thrown in, or a perfect 10. At least then you don't have people at Metacritic deciding which sites deserve more weight.

Why shouldn't outliers matter or be included? This isn't a scientific, objective study. These are opinions. Just because someone's opinion dissents from the majority doesn't make it any less valid.
 

Toma

Let me show you through these halls, my friend, where treasures of indie gaming await...
Nobody has any idea how the weightings are given so there's no point raging about it. Could be part reputation, hits, visitor demographic (age, country). Could be pseudo-random.

Who says this list is official anyways? I might have missed it when reading through on my phone. I have a hard time believing they just made it public.

It's not an official list, but something that's been researched over months. MC already came out and did NOT deny that they're doing it, but simply said they're doing it "differently" without giving any hints about what they're actually doing. And with that, the point still stands: weighting results, with seemingly random changes in weighting importance at that, is just not something a score aggregation website should do.
 
Integrity of MC.

Admittedly, WE aren't the average person, but just pushing that to the front page of GAF changes MC's perception in the eyes of some people. It always starts small, and I actually hope some other gaming websites pick up on it and inform their readers accordingly.

It's tricky though. Because some of those sites really are shit, and sites that really are shit should not be given the same weight. I agree it isn't fair, and I also think they currently have the weights wrong. But I guess it comes down to how the weightings are determined. If they're getting paid under the table by sites to increase their site's weight, then that's definitely a scam. But since we have 0 transparency on how they determine the weights, it's hard to really know.
 

jtb

Banned
It's tricky though. Because some of those sites really are shit, and sites that really are shit should not be given the same weight. I agree it isn't fair, and I also think they currently have the weights wrong. But I guess it comes down to how the weightings are determined. If they're getting paid under the table by sites to increase their site's weight, then that's definitely a scam. But since we have 0 transparency on how they determine the weights, it's hard to really know.

If the site's shit, then they should just not aggregate their scores. Obviously they don't include every gaming enthusiast under the sun's blog so just cut it off there, imo. No need to bother with weighting within that.
 

Varth

Member
It's not an official list, but something that's been researched over months. MC already came out and did NOT deny that they're doing it, but simply said they're doing it "differently" without giving any hints about what they're actually doing. And with that, the point still stands: weighting results, with seemingly random changes in weighting importance at that, is just not something a score aggregation website should do.

My point exactly. Weighting different opinions that cover different aspects of the same game is already pretty nonsensical to me, but hey, people like it, so...
If you add the choice of which opinions to cover, it gets crazier.
Add to all that a weighting system, and, to top it off, one decided "behind closed doors" by a single guy, and well, you could end up with any given number for any given game, basically.
 

Toma

Let me show you through these halls, my friend, where treasures of indie gaming await...
It's tricky though. Because some of those sites really are shit, and sites that really are shit should not be given the same weight. I agree it isn't fair, and I also think they currently have the weights wrong. But I guess it comes down to how the weightings are determined. If they're getting paid under the table by sites to increase their site's weight, then that's definitely a scam. But since we have 0 transparency on how they determine the weights, it's hard to really know.

True, and that missing transparency is part of the issue and why I smell an incoming shitstorm for MC here. Without that transparency it will be outright impossible to convince the people who know about this issue that no one is being paid for any ranking (not saying ALL websites did that, but that it might be one factor).

It is indeed tricky to figure out how to make this kind of review aggregation website work properly, but this doesn't feel like the right way to do it.
 

DEADEVIL

Member
Heh. I emailed Metacritic about this just to say we'd appreciate transparency and whatever (because that's what I do, I email people). They literally just responded with a "thanks for your opinion on the matter" and that was it. Responded pretty quickly though. Suggest more of you do just so I'm not a lonely man shouting at (read: emailing politely at) a cloud (read: a probably disgruntled editorial team).

Contact us page: http://www.metacritic.com/contact-us

Editorial Inquiries: editorial@metacritic.com
User Support: support@metacritic.com

You may end up getting stickied in the OP.

I always said that that site became so powerful in the gaming industry that one pop of the integrity bubble and devs would stop losing their jobs over that site's ratings.

I can see Activision, EA, and others spitting their drinks out after seeing which sites were on top.

I get the weighting premise, since we know the bias is strong in some of the smaller sites, but nowadays it comes across there as well.

I think we all just trusted that Meta had the best system out there, since GameRankings got steamrolled.

The hype that some causes get, causes that are nothing compared to this, makes me believe this issue has some sticking power.
 
Nobody has even considered the possibility that supposed high ranking "nobody" sites are given a high weighting to offset high weighted popular sites, creating a balance. Also explains why some favourites are extremely low and accompanied by other nobodies.

So, aside from the drama, has anyone actually had a problem with metacritic? For my personal needs, the averages have more or less been fairly consistent with how I feel about a game overall.

With that in mind, even if their heuristic is bat-shit crazy looking... maybe it works?
 
You may end up getting stickied in the OP.

I always said that that site became so powerful in the gaming industry that one pop of the integrity bubble and devs would stop losing their jobs over that site's ratings.

I can see Activision, EA, and others spitting their drinks out after seeing which sites were on top.

I get the weighting premise, since we know the bias is strong in some of the smaller sites, but nowadays it comes across there as well.

I think we all just trusted that Meta had the best system out there, since GameRankings got steamrolled.

The hype that some causes get, causes that are nothing compared to this, makes me believe this issue has some sticking power.

Your post just made me realize now, that this lets game publishers know exactly who to pay off to get a high Metacritic score in the future. We already know that publishers are buying reviews, but now they can do it with even greater efficiency and effectiveness.
 

Cat Party

Member
This thread is bonkers even for GAF. Metacritic has no reason or obligation to post its weighting formula. There is quite frankly no good reason why they should, or why we should need to see it. They're not going to tell you what their formula is, because then people can copy it.
 
This thread is bonkers even for GAF. Metacritic has no reason or obligation to post its weighting formula. There is quite frankly no good reason why they should, or why we should need to see it. They're not going to tell you what their formula is, because then people can copy it.

I'm not sure if you read the thread or not, but the entire point is that they don't need to tell anyone their formula, it's right here. Even if their formula works differently, according to the researchers, it accurately models the score of almost every game on their site. So it doesn't matter if it's not exactly Metacritic's formula, almost equivalent is good enough.

Edit: And with that, now I've answered my own question earlier about whether or not the reviewers' bottom lines will be affected by their ranking. They will, because the people who are ranked low will get paid less (or not at all) by publishers to boost their review scores, and the people who are scored higher will get paid more for bought reviews.
 
Are people really that surprised about GB reviews? I love me some GB, but their reviews I hardly give a shit about. Podcasts and Quick Looks and TNTs give me more info about a game than any written thing. And let's face it, I don't feel any of the GB staff writes particularly well. They are great on mic and camera, though.
 
Why shouldn't outliers matter or be included? This isn't a scientific, objective study. These are opinions. Just because someone's opinion dissents from the majority doesn't make it any less valid.
I agree that their opinion isn't any less valid, but that's why we have written reviews. We're talking about statistics. One reviewer lowering or raising the average score doesn't make it more accurate as to how good the game is.

It's not an official list, but something that's been researched over months. MC already came out and did NOT deny that they're doing it, but simply said they're doing it "differently" without giving any hints about what they're actually doing. And with that, the point still stands: weighting results, with seemingly random changes in weighting importance at that, is just not something a score aggregation website should do.
They're actually proud of it. It's on their about page:
Metacritic has evolved over the last decade to reflect their experience distilling many critics' voices into the single Metascore, a weighted average of the most respected critics writing reviews online and in print.
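For anyone unfamiliar with how a weighted average differs from a plain one, here's a minimal sketch. The outlets, scores, and weights below are made up for illustration (using the 1.5/1.0/0.5 tiers from the leaked list); nobody outside Metacritic knows the real values:

```python
# Hypothetical review scores (0-100) with illustrative tier weights.
# Outlets and weights are assumptions, not Metacritic's real data.
reviews = [
    ("Big Outlet A", 90, 1.5),
    ("Mid Outlet B", 80, 1.0),
    ("Small Outlet C", 40, 0.5),
]

# Plain mean: every outlet counts the same.
plain_average = sum(score for _, score, _ in reviews) / len(reviews)

# Weighted mean: each score is multiplied by its outlet's weight,
# then normalised by the total weight.
weighted_average = (
    sum(score * weight for _, score, weight in reviews)
    / sum(weight for _, _, weight in reviews)
)

print(round(plain_average))     # 70
print(round(weighted_average))  # 78
```

The low outlier from the 0.5-tier site drags the plain mean down much harder than the weighted one, which is exactly the kind of shift people in this thread are arguing about.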
 

hammster

Archbishop of Canterburny
This thread is bonkers even for GAF. Metacritic has no reason or obligation to post its weighting formula. There is quite frankly no good reason why they should, or why we should need to see it. They're not going to tell you what their formula is, because then people can copy it.

What is Metacritic's role? Review aggregation. Why? To serve as a guide for a consumer. Now, wouldn't you say that to be an effective guide to a consumer that it is important for the consumer to know how you landed on your score? They absolutely need to be transparent about this or their site lacks legitimacy.
 

jayu26

Member
Are people really that surprised about GB reviews? I love me some GB, but their reviews I hardly give a shit about. Podcasts and Quick Looks and TNTs give me more info about a game than any written thing. And let's face it, I don't feel any of the GB staff writes particularly well. They are great on mic and camera, though.

Maybe, but that does not make their opinion any less relevant than, say, IGN's.
 

Buft

Neo Member
Yahoo Games and official magazines getting a higher rating while decent websites roll with a lower aggregate modifier; in no way is that consumer friendly.
 

Cat Party

Member
What is Metacritic's role? Review aggregation. Why? To serve as a guide for a consumer. Now, wouldn't you say that to be an effective guide to a consumer that it is important for the consumer to know how you landed on your score? They absolutely need to be transparent about this or their site lacks legitimacy.

It's always been disclosed by Metacritic that they weigh the scores, and the raw data is there for everyone to see. You are not being misled or kept in the dark. Metacritic has no obligation or incentive to tell you more.
 

hammster

Archbishop of Canterburny
It's always been disclosed by Metacritic that they weigh the scores, and the raw data is there for everyone to see. You are not being misled or kept in the dark. Metacritic has no obligation or incentive to tell you more.

We are being kept in the dark. They haven't disclosed what they actually do with that raw data, and it could make a world of difference. Knowing how sites are weighted is absolutely essential to knowing how trustworthy the scores are, as the outrage over this supposedly wrong list demonstrates.
 
It's always been disclosed by Metacritic that they weigh the scores, and the raw data is there for everyone to see. You are not being misled or kept in the dark. Metacritic has no obligation or incentive to tell you more.
I completely agree. If metacritic was actually bad nobody would use it. The fact that people trust their averages means that they are doing something right. Knowing how they average it doesn't matter, particularly if the weightings shift regularly, which they probably do (regardless of what this firm is reporting).
 

Toma

Let me show you through these halls, my friend, where treasures of indie gaming await...
I completely agree. If metacritic was actually bad nobody would use it. The fact that people trust their averages means that they are doing something right. Knowing how they average it doesn't matter, particularly if the weightings shift regularly, which they probably do (regardless of what this firm is reporting).

You are basing your comments on completely pointless assumptions. Trust does not need to be in direct correlation with the quality of a service. In fact, it doesn't even need to be "trust". Setting up their business as a monopoly and blatantly out-advertising any competitor has the same effect: more people using it because they don't know better. And this is exactly why more people should be made aware of this and why it's wrong. Just because a majority of people THINKS something is good/right doesn't make it good/right.
 

Ahasverus

Member
The Gamasutra guys found a model that works. That's all that matters. Whether it's accurate to the real one or not (it pretty much is), it works, so it's not far off. The difference between them is astounding.
 
The fact is the study is practically useless. There's no chance that Metacritic keeps or has kept their weightings the same. There are multiple factors to consider regarding any given outlet's weight at any given time.
 
You are basing your comments on completely pointless assumptions. Trust does not need to be in direct correlation with the quality of a service. In fact, it doesn't even need to be "trust". Setting up their business as a monopoly and blatantly out-advertising any competitor has the same effect: more people using it because they don't know better. And this is exactly why more people should be made aware of this and why it's wrong. Just because a majority of people THINKS something is good/right doesn't make it good/right.
I take it you would have issue with Google search as well. Results for their own products are probably filtered higher, but nobody cares. Their service works.

Metacritic has competitors. I even remember the Activision/Bungie deal specifically mentioning GameRankings as a benchmark score for the studio receiving bonuses.

I am failing to see any problem with metacritic. I find it far more likely that their rankings do not work as described in the OP.

You don't see Google releasing their search algorithms. Same concept.
 

Pikma

Banned
Highest (1.5) -- Planet Xbox 360
Highest (1.5) -- PlayStation Official Magazine UK
Highest (1.5) -- PlayStation Official Magazine US
Highest (1.5) -- TotalPlayStation
Highest (1.5) -- Xboxic
This is so damn wrong, Company-exclusive magazines/websites shouldn't be up there.
 
I LOVE how people think I'm crazy when I say that Playstation exclusives seem to have inflated metascores as opposed to other exclusives.

I'm right, HAHAHA, I'm right!

Nah looks like you are both crazy and wrong. :p

Edit: Why are people still believing that article despite Metacritic's response?
 

wildfire

Banned
I'm not sure if you read the thread or not, but the entire point is that they don't need to tell anyone their formula, it's right here. Even if their formula works differently, according to the researchers, it accurately models the score of almost every game on their site. So it doesn't matter if it's not exactly Metacritic's formula, almost equivalent is good enough.

Edit: And with that, now I've answered my own question earlier about whether or not the reviewers' bottom lines will be affected by their ranking. They will, because the people who are ranked low will get paid less (or not at all) by publishers to boost their review scores, and the people who are scored higher will get paid more for bought reviews.

Keep in mind that Metacritic mentioned they update their weights. That greatly impacts the usefulness of these findings if they do it too frequently. We don't know how often they do it, but over the next six months we could tell how well these research results hold up against the actual scores and what has changed.
 

mclem

Member
I say let's kick up a massive fuss and act like it's legit until Metacritic goes public. Transparency is for the best.

They don't talk about WiiU enough.
I actually think Jeff in particular uses the line "the reality of modern game development" a bit too much.

That's a necessary sin for the reality of modern games journalism.
 

Toma

Let me show you through these halls, my friend, where treasures of indie gaming await...
Nah looks like you are both crazy and wrong. :p

Edit: Why are people still believing that article despite Metacritic's response?

Because MC isn't denying anything:
http://www.neogaf.com/forum/showpost.php?p=51752982&postcount=403
http://www.neogaf.com/forum/showpost.php?p=51755148&postcount=417

I take it you would have issue with Google search as well. Results for their own products are probably filtered higher, but nobody cares. Their service works.

Aaaah, so that's why the Google result for "buy tablet" brings up the iPad first instead of the Google Nexus. And why "free email" brings up Yahoo and Hotmail before Gmail.

...no, wait.

Seriously, stop making pointless assumptions.
 

Camwi

Member
Somehow I knew there would be a lot of complaining in this thread due to GAF's massive erection for GiantBomb. I'm glad it's lower. Never understood the obsession with that site.
 

Oersted

Member
Nah looks like you are both crazy and wrong. :p

Edit: Why are people still believing that article despite Metacritic's response?

Why are you trusting them without them showing how they rank? What makes you believe a company more than journalists who investigated this for more than six months?
 

Stinkles

Clothed, sober, cooperative
Why shouldn't outliers matter or be included? This isn't a scientific, objective study. These are opinions. Just because someone's opinion dissents from the majority doesn't make it any less valid.

but the value of the score is the fact that it's an aggregate of typical review "tones" so heavy weighting of aberrant opinions would defeat the purpose unless the outlier was a hugely popular and influential organ.
 
Lower (0.5) -- Armchair Empire
Lower (0.5) -- Cheat Code Central
Lower (0.5) -- Game Over Online
Lower (0.5) -- Game Positive
Lower (0.5) -- Gamer's Hell
Lower (0.5) -- Gamereactor Sweden
Lower (0.5) -- Gamers.at
Lower (0.5) -- Giant Bomb
Lower (0.5) -- PS3bloggen.se
Lower (0.5) -- RPGamer
Lower (0.5) -- Vandal Online

I'm left scratching my head over this list.

First, as co-founder and head writer of The Armchair Empire* (almost 13 years old now), I'm kind of terrified that publishers will look at this list and cut AE out of distribution of review copies. That move has the potential to completely gut my site. I'm not making enough money with ads to buy games -- usually it's enough to pay server bills, mailing costs, beer, etc. -- so if the games stop coming because we're part of the 0.5 then I might as well pack it in because I've only got so many joke articles in me.

Second, how the hell is AE on the "same level" as Giant Bomb? Admittedly, we're not even on the same level: they have major corporate backing, a legion of fans, and experienced writers... how does that compute? We don't even use a 5-star scale!

I can't wrap my brain around this, especially in light of the sites sitting in the 1.5 and 1.0 groups. I will admit that I do like the hits even if our "weight" isn't very high.

* Never heard of it? I don't blame you, even if we've been around for longer than a decade; but if you're familiar with the GFW Radio Reunion at PAX '09, that was a lot of my handiwork. (Keep your fingers crossed for PAX Prime this year!)
 

Brashnir

Member
but the value of the score is the fact that it's an aggregate of typical review "tones" so heavy weighting of aberrant opinions would defeat the purpose unless the outlier was a hugely popular and influential organ.

By weighing publications higher when they hit the "norm" more often, you're encouraging outlets to try to meet that norm rather than give an honest opinion.

It's yet another influence on outlets to stick to the 7.5-9.5 scale for everything. Stay in the safe zone, and your voice matters more.
 

I thought it was commonly known that they weighted each review differently. I was actually wondering why people were still quoting that list, which Metacritic already refuted.

Why are you trusting them without them showing how they rank? What makes you believe a company more than journalists which investigated more than six months on this?

Well, are there people actually believing that Firing Squad, Da GameBoyz, Yahoo Games and other lesser-known sites with atrocious writing in their reviews have more weight than Eurogamer, Destructoid, Giant Bomb and other websites with a high reputation? The hierarchy in that list is why I don't believe Greenwood's list is accurate, and it surprises me that some people do.
 

Jhriad

Member
CTRL + F "Giantbomb"

Lower (0.5) -- Giant Bomb

jesus_haha_no_answer_3_xlarge.jpg
 
I'm still confused why anyone interested in a meta score would use MetaCritic as opposed to GameRankings.

GameRankings is better in every way
 

sono

Member
This is just plain wrong.

No wonder I always end up agreeing a lot more with the rankings when I sort by user reviews. I will always do this from now on.
 

Codeblue

Member
CTRL + F "Giantbomb"



jesus_haha_no_answer_3_xlarge.jpg

Makes sense. Giant Bomb uses five stars that don't necessarily correlate to what IGN would give a 6/10. When you only grade in increments of 20, that doesn't help Metacritic's retarded mission of assigning each game's worth a numerical value.
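To put the "increments of 20" point in concrete terms, here's a tiny sketch of the kind of linear star-to-score conversion an aggregator would have to apply. The mapping itself is an assumption; Metacritic hasn't published how it actually normalises star ratings:

```python
def stars_to_score(stars: int, max_stars: int = 5) -> int:
    # Assumed linear mapping: each star is worth 100 / max_stars points.
    return int(stars * 100 / max_stars)

# A 5-star outlet can only ever land on five of the hundred possible values:
print([stars_to_score(s) for s in range(1, 6)])  # [20, 40, 60, 80, 100]
```

So a 3-star Giant Bomb review becomes a flat 60, even if the written text reads closer to a 65 or a 55, which is the mismatch Codeblue is describing.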
 