
The recent backlash against review scores is misguided.

I think some "skewed" review scores are natural, since we are not all equal, but if the Metacritic average is clearly wrong (i.e. it differs from what the majority of gamers think), then it means that most of the reviews' scores are incorrect.

There's no such thing as the Metacritic average being wrong.

Even in this dumb scenario you've concocted: if 10 people review a game and give it a 4/10, and then 20 other people play it and say it's 10/10, the 10 people weren't wrong. The 10 people who got paid for their opinion got "counted" by Metacritic, but they weren't wrong. That's asinine.
 
It has enough steps to give a decently nuanced take on the game's worth without adding a bunch of extra scale that is never used.

Basically:

5/A - Exceptional
4/B - Above average
3/C - Average
2/D - Below average
1/F - Terrible

If a reviewer is dead set on attaching a score to a review then it's the most efficient way to do so.


so.. nothing between "above average" and "exceptional"? ok then. makes zero sense to me but that might just be me. i think a 1-10 or even 1-100 scale makes more sense. just me? ok, i'll go hide in a corner.
 
Oddly enough, this behavior is in my experience mostly isolated to video games as a medium. It's one of the only places where you see people building up massive senses of investment in something prior to experiencing it and then struggling to accept when it doesn't match expectations; in some cases going as far as developing and claiming vast conspiracies rather than admit to some simple cognitive dissonance. Far too often on GAF, for example, you'll see people with the game in question as their avatar balking at review scores both before and after release, seemingly unaware of how biased they appear, and most likely are.

Reminds me of when Star Wars: Episode 1 came out.
 
so.. nothing between "above average" and "exceptional"? ok then. makes zero sense to me but that might just be me. i think a 1-10 or even 1-100 scale makes more sense. just me? ok, i'll go hide in a corner.

Any room in that corner? I think 1 to 5 is so incredibly simplistic that it does a complete disservice to developers who want honest, detailed feedback on their games. It also assumes that the consumer is incredibly simplistic or too stupid to understand a more detailed review scale.
 
Consumer Reports, a very well regarded American publication, which takes no advertising to prevent accusations of bias, uses numerical scores.

Sort of. They come up with a score, but it is based on a detailed breakdown of various features of a product with standardized testing that results in a qualitative score. Those qualitative scores get compiled to come up with the numbers.

Game reviews work nothing like this. There is no standard pick-up-and-play test. No two-hour playthrough. No newcomer-to-the-series tests. If they had such standard tests, then maybe the number, or even a qualitative series of Harvey balls, would make sense. Until that day, they're slapping a number on a review that makes no attempt to constrain (not eliminate) reviewer bias.

 
I sympathize with Eurogamer and Joystiq's reasoning with regards to the big AAA games that people go into a tizzy about being very fluid experiences these days. Review code is not day one code, which itself will be different from third month code. Never mind that reviewers don't have time to sink dozens of hours into the multiplayer, which won't be in a representative state pre-release anyway, and experience how its long term unlock grinding works out in practice. Then there's constantly evolving multiplayer-only games like the entire MOBA genre, as well as more and more open beta and early access games that people want coverage of, but which don't lend themselves well to being assigned a single static score for all time.

Review scores only work well under the assumption that the product is a fixed, unchanging thing, and games are more complex than that now.
 
so.. nothing between "above average" and "exceptional"? ok then. makes zero sense to me but that might just be me. i think a 1-10 or even 1-100 scale makes more sense. just me? ok, i'll go hide in a corner.


above average/good/great, then


what's the difference between a 6.6 and a 6.7

or on the other side, what's the difference between a 20/100 or 18/100?

At some point, when you've got too many numbers, they don't actually mean anything. You know a 15/100 is probably worse than a 30/100, which is probably worse than a 50/100, but in between it's borderline useless.

Any room in that corner? I think 1 to 5 is so incredibly simplistic that it does a complete disservice to developers who want honest, detailed feedback on their games. It also assumes that the consumer is incredibly simplistic or too stupid to understand a more detailed review scale.

that's why they're usually accompanied with the dang words, yo

if 2 reviews are basically similar and one guy gives it a 6.5 and the other guy gives it a 6.3, is a developer going to learn anything from the .2 that made one guy score it higher, or is he going to read the words explaining why his game was scored so low?
 
I agree. I'm just saying the heat should go on the publishers for their insistence on tying bonuses to Metascore. I don't think it's a strong incentive for making high-quality games.

I don't think incentives work in art in general, but that's a different discussion.

The problem with bonuses tied to review scores is it gives reviewers too much power over the final product. Suddenly any stupid thing they want becomes required in every game.

"No online in this single player RPG is unforgivable."

So then devs have to plop in an online mode. Even though it makes no sense.
 
They can't give you your opinion on a game. They can only give you their opinions.

People seem to forget that there is, and should be, a difference between professionals and non-professionals. A war correspondent cannot look at explosions and say that they look like fireworks, even if he really thinks so; he should report the facts as completely and objectively as possible so people can understand.

The same should apply to reviewers: just because games are not a serious matter like war doesn't mean they don't deserve serious treatment.
 
As someone who has done reviews a few times in my career, I often find that the score is actually the hardest part of a review when I write one. I find the rest flows easily when I've spent a protracted period with the game, and I always segment it into an overview/background, critique of graphics, critique of sound, critique of how long it'll last and whether it's replayable, notable issues, and then a conclusion.

With the score, it's hard to try and get everything into a specific value. I've always tried to maintain the 0-10 with 5 being average, 7 being good, 9 being awesome and 10 being amazing, but you will always have people who think 7 is average and everything below sucks.

Review scores are a necessary thing. Movie reviews have them. Book reviews have them. Hotel reviews have them. Why not games? Yes, the Metacritic bonuses developers/publishers impose suck, but we shouldn't scrap a metric that people find vital just because it's abused.

It's why I want to use a rating out of 28 for reviews :p (Obviously not serious.) Seriously, a five-star system should be standard. No decimal places, just five stars.
 
*edit* I am a fan of Eurogamer's non-interval four-point scale. It is still a score, there is just no expectation of uniform distribution. I don't like overly specific 10- or 100-point scales. Buy/rent/pass scales are problematic because they assume loads about the reader's financial status.

I thought it was a 3 point scale? Can't remember now
 
In my opinion, the criticism is misguided. The problem is not the scores; a score is just another piece of information added to a review. If someone jumps straight to the score, it is the same as someone who just reads the headline, or the last paragraph of a review, or the OP of a forum topic. There is nothing wrong with scores as a device for summarizing a review.

The problem always lies in the usage of scores. Paying bonuses according to a score average, or marketing with scores, is the problem, because it leads people who are not the original target of reviews not only to care but to try to interfere. Getting rid of the scores won't change that, because I can see them creating a new system to aggregate review results and other press opinions quickly, and as long as they try to use reviews for marketing or even for evaluating workers' performance, the manipulation won't stop.
 
Personally, I don't see how a 5-point scale is any better or worse than a 10-point scale. Okay, let's say a game gets a 3 out of 5. Is that any different from giving it a 6 out of 10? It's the same value. It may seem more objective, but it's the same thing. If I learned anything from statistics, it's that the more data you have, the more accurately you can represent the population. Shrinking the range of scores only leads to a less accurate representation. I think a 10-point scale works well enough.

The one thing I do think is crazy is the decimals. A 9.75 vs. a 9.9 and so on. That's just ridiculous.
 
Consumer Reports, a very well regarded American publication, which takes no advertising to prevent accusations of bias, uses numerical scores.

To be clear, Consumer Reports is reviewing products on fairly hard metrics. It's seriously about product quality, durability, etc. It's a very different kind of system than reviewing/critiquing a piece of entertainment or art.
 
Any room in that corner? I think 1 to 5 is so incredibly simplistic that it does a complete disservice to developers who want honest, detailed feedback on their games. It also assumes that the consumer is incredibly simplistic or too stupid to understand a more detailed review scale.

Honest, detailed feedback comes from the text of a review. If you're looking for nuance from an almost arbitrary number you're looking in the wrong place.
 
Review scores only serve to ensure that the actual drivel (review) that's been written to justify the arbitrary number goes unread.
 
That's the thing: other mediums aren't having this debate. There's a diversity of options. But, for some reason, in gaming, it's all or nothing for many.

I like scores but I also like sites that don't do scores as long as the writing is sound.

Heck, my favorite reviews are post mortem reviews, looking at games a year or more later. Those hardly ever have scores.

Other mediums have been around for longer and started when journalism was at its peak, so naturally those standards flowed into their reviews. Gaming journalism started at the decline of journalism, and in most cases its source of income is entirely dependent on the people it is reviewing, so it is different.

Not all mediums are equal.
 
I have been on the 'no scores' train for a while. I see no reason to get off it. It's not a recent feeling. It's one I had even when I wrote game reviews, because they were so arbitrary and so often taken without the context of the accompanying text.

Scores are dumb. I hope we continue moving away from them.

But isn't it up to the outlet/review to come up with a method for readers to quickly analyze and determine your thoughts on the game without having to read 3 pages of text?

I think all reviews need to be able to concisely summarize their points, and scores--whether numerical or symbolic--were basically the key to this. I personally think summaries work best with a score AND text breakdown, but honestly I think the responsibility lies within the reviewer to list a concise summary.
 
I'm a big fan of the yes/no scale. That way nobody gets mad about how this game got 4 stars over that game that got 3. Either the reviewer thinks it's worth your time and money (and explains in text of the review why) or they don't (and explains why not). This can also be handled by making sure this is conveyed in the review without actually writing yes/no in a box at the beginning or end of the article.

It also helps with the problem of the aging of scores, where people get into arguments about how a 9 now relates to a 9 from five years ago. Metacritic could adjust to give a summary of all review sites as a yes/no, reporting whichever there are more of, so that a person can see a larger sample of opinions as well as their go-to sources for reading full reviews.
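The yes/no aggregation idea above can be sketched in a few lines. This is an illustrative sketch, not Metacritic's actual method; the "above the midpoint counts as yes" threshold and the example scores are assumptions for the demo.

```python
# Illustrative sketch (not Metacritic's actual method): collapse scores
# from different scales into a single yes/no recommendation by majority
# vote. The midpoint threshold is an assumption for the example.

def to_recommendation(score, scale_max):
    """Treat anything above the scale's midpoint as a 'yes'."""
    return score > scale_max / 2

def aggregate(reviews):
    """reviews: list of (score, scale_max) pairs from different outlets."""
    yes_votes = sum(1 for score, scale_max in reviews
                    if to_recommendation(score, scale_max))
    return "yes" if yes_votes > len(reviews) / 2 else "no"

reviews = [(4, 5), (7, 10), (80, 100), (2, 5)]  # hypothetical outlet scores
print(aggregate(reviews))  # three of four recommend -> "yes"
```

The advantage is that a 4/5 and a 7/10 count identically as recommendations, sidestepping the "is a 3/5 the same as a 6/10" argument entirely.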
 
I'm a big fan of the yes/no scale. That way nobody gets mad about how this game got 4 stars over that game that got 3. Either the reviewer thinks it's worth your time and money (and explains in text of the review why) or they don't (and explains why not). This can also be handled by making sure this is conveyed in the review without actually writing yes/no in a box at the beginning or end of the article.

It also helps with the problem of the aging of scores, where people get into arguments about how a 9 now relates to a 9 from five years ago. Metacritic could adjust to give a summary of all review sites as a yes/no, reporting whichever there are more of, so that a person can see a larger sample of opinions as well as their go-to sources for reading full reviews.
But that's a problem with the readers, not the reviewers.
 
Give me a few bullets before the written piece, that's all the information I need to make a quick assessment. Most of the time I'll read a review just to learn a bit more.

I feel like the score is an opinion, where the review is a bit more in depth. That's just an opinion too, of course, so take it for what you will.
 
Personally, I don't see how a 5-point scale is any better or worse than a 10-point scale. Okay, let's say a game gets a 3 out of 5. Is that any different from giving it a 6 out of 10? It's the same value. It may seem more objective, but it's the same thing. If I learned anything from statistics, it's that the more data you have, the more accurately you can represent the population. Shrinking the range of scores only leads to a less accurate representation. I think a 10-point scale works well enough.

The one thing I do think is crazy is the decimals. A 9.75 vs. a 9.9 and so on. That's just ridiculous.

You should have also learned that you shouldn't represent values as significant when they are a product of noise.
 
Okay, let's say a game gets a 3 out of 5. Is that any different from giving it a 6 out of 10? It's the same value.

Numerically, when you break them into a 100-point scale and compare that to the "industry standards for reviews," then yeah, it's the same.

But because 5-point scales will most likely use a 3 for an average game, while somewhere like Game Informer will use a 7.5 for an average game, it means nothing.

There is a difference in tone/quality/expectations between Giant Bomb giving something a 3/5 and IGN giving a game a 6/10.

There's not really a problem with a 5- or 10-point scale in either direction. It's when they're all pulled into a 100-point scale that they become useless, even if mathematically it works.
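The normalization problem described above can be shown with a quick sketch. The outlet anchors below (a 5-point site calling 3/5 average, a 10-point site calling 7.5/10 average) are hypothetical examples, not any outlet's real editorial policy; the arithmetic just shows why the normalized numbers aren't comparable.

```python
# Illustrative sketch: naive normalization maps every scale onto 100
# points, but outlets anchor "average" at different raw scores, so the
# normalized numbers aren't comparable. The anchors are hypothetical.

def normalize(score, scale_max):
    """Convert a raw score to a 100-point scale, as aggregators do."""
    return 100 * score / scale_max

# (raw score that means "average", scale max) per hypothetical outlet
outlet_average = {
    "five_point_site": (3, 5),    # calls a 3/5 "average"
    "ten_point_site": (7.5, 10),  # calls a 7.5/10 "average"
}

for name, (avg, scale) in outlet_average.items():
    print(f"{name}: {normalize(avg, scale)}")
# The same verdict ("average") lands at 60.0 and 75.0 on the
# aggregated scale, 15 points apart.
```

So the math of converting scales is trivial; it's the differing anchors behind the raw numbers that make the aggregated figure misleading.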
 
People seem to forget that there is, and should be, a difference between professionals and non-professionals. A war correspondent cannot look at explosions and say that they look like fireworks, even if he really thinks so; he should report the facts as completely and objectively as possible so people can understand.

The same should apply to reviewers: just because games are not a serious matter like war doesn't mean they don't deserve serious treatment.

So where does a score even fit into this? Sounds like you want reviews to be clinical descriptions of a game's features and mechanics. A fixed number of points for things like a game's length, price, resolution, and frame rate?
 
A five point scale is the only one that works.

I agree with this, but also think review scores are unnecessary. Scores are purely for those who want a big shiny number, they have almost no bearing on whether you will enjoy a game or not. I think listing the pros and cons of the title would help do that better than any score.
 
It has enough steps to give a decently nuanced take on the game's worth without adding a bunch of extra scale that is never used.

Basically:

5/A - Exceptional
4/B - Above average
3/C - Average
2/D - Below average
1/F - Terrible

If a reviewer is dead set on attaching a score to a review then it's the most efficient way to do so.

Always liked the 5-star system; it basically mirrors our grading system in the U.S., where from kindergarten through college you get A-F as a grade. 1-5 always makes much more sense to me. I don't mind the "play it or don't play it" some reviewers are doing either, but I like the 5-star system, and it can even be broken into intervals of half-stars.
 
People seem to forget that there is, and should be, a difference between professionals and non-professionals. A war correspondent cannot look at explosions and say that they look like fireworks, even if he really thinks so; he should report the facts as completely and objectively as possible so people can understand.

The same should apply to reviewers: just because games are not a serious matter like war doesn't mean they don't deserve serious treatment.

I'll repeat. It is impossible for any reviewer to tell you how you personally are going to feel about a game.

If you don't care what the reviewer thinks about it, you shouldn't read their review because that is just about the only useful information they can give you beyond if the game is broken.

Also, if a reporter thinks an explosion sounded like fireworks, they are perfectly free to say so. Obviously, they shouldn't report that someone shot off fireworks when in actuality a bomb was detonated, but there is absolutely no parallel to be drawn from this to video game reviews. There is no truth or fact to be had from a video game review. The best you can do is find someone who generally has the same opinions as you, but that still won't be objective reporting, even if you think they're always right.
 
The way I read the 5 point scale

1 - Some ol Bullshit
2 - Rental
3 - Buy it, just not at full price
4 - Worthy of your $60
5 - A crowning achievement in gaming
 
Personally, I don't see how a 5-point scale is any better or worse than a 10-point scale. Okay, let's say a game gets a 3 out of 5. Is that any different from giving it a 6 out of 10? It's the same value. It may seem more objective, but it's the same thing. If I learned anything from statistics, it's that the more data you have, the more accurately you can represent the population. Shrinking the range of scores only leads to a less accurate representation. I think a 10-point scale works well enough.

The one thing I do think is crazy is the decimals. A 9.75 vs. a 9.9 and so on. That's just ridiculous.
You might be missing the point, and some others here as well.

I don't think the user who originally stated that meant (I don't want to speak for him; this is my personal opinion) that the 5-point system is better or more accurate. I don't think it is at all. But IMO, it is the best way to deliver a critical analysis of a game that's easily understandable. A 1-10 scale is of course more accurate, but the bottom half of that scale is never used. At the very least, most games that get greenlit are functional, working titles, so hardly anything ever gets a 3/10 or a 4.5/10. Really, for the majority of titles, you only need to differentiate amazing/good/average/bad. The more accurate systems aren't actually needed and do more harm than good from my perspective.
 
That the thing, other mediums aren't having this debate. There's a diversity of options. But, for some reason, in gaming, it's all or nothing for many.

I like scores but I also like sites that don't do scores as long as the writing is sound.

Heck, my favorite reviews are post mortem reviews, looking at games a year or more later. Those hardly ever have scores.

I get where you're coming from, I just personally think that it's been so one-sided for so long (with everything being score based) that people are really tired of scores and the bullshit that can sometimes follow them, so that's what they want to push for. Worst case, this might kill off scores for a while, and I don't even think that'll happen.
 
I sympathize with Eurogamer and Joystiq's reasoning with regards to the big AAA games that people go into a tizzy about being very fluid experiences these days. Review code is not day one code, which itself will be different from third month code. Never mind that reviewers don't have time to sink dozens of hours into the multiplayer, which won't be in a representative state pre-release anyway, and experience how its long term unlock grinding works out in practice. Then there's constantly evolving multiplayer-only games like the entire MOBA genre, as well as more and more open beta and early access games that people want coverage of, but which don't lend themselves well to being assigned a single static score for all time.

Review scores only work well under the assumption that the product is a fixed, unchanging thing, and games are more complex than that now.

That's a good point, but couldn't they just have reviews on the aspects of the game that won't fluctuate? Like piecemeal reviews instead of the whole meal at once?

Of course that means having to revisit the part of the game they didn't review and score it a week or so down the line.

I feel like there's a solution here, and it's up to the outlets to find a creative one, but just abolishing review scores seems like a lazy catch-all approach to the whole thing.
 
I sympathize with Eurogamer and Joystiq's reasoning with regards to the big AAA games that people go into a tizzy about being very fluid experiences these days. Review code is not day one code, which itself will be different from third month code. Never mind that reviewers don't have time to sink dozens of hours into the multiplayer, which won't be in a representative state pre-release anyway, and experience how its long term unlock grinding works out in practice. Then there's constantly evolving multiplayer-only games like the entire MOBA genre, as well as more and more open beta and early access games that people want coverage of, but which don't lend themselves well to being assigned a single static score for all time.

Review scores only work well under the assumption that the product is a fixed, unchanging thing, and games are more complex than that now.

Reviewers do themselves and their readers a disservice by forcing a review score out the door the day the embargo lifts, too. Gotta have them clicks.
 
People seem to forget that there is, and should be, a difference between professionals and non-professionals. A war correspondent cannot look at explosions and say that they look like fireworks, even if he really thinks so; he should report the facts as completely and objectively as possible so people can understand.

The same should apply to reviewers: just because games are not a serious matter like war doesn't mean they don't deserve serious treatment.

this is an awful comparison. why would you compare a professional critic to a news correspondent when there are so many other meaningful comparisons? for instance a professional critic in another medium. news, by its very nature, calls for objectivity. a critique, by its very nature, calls for subjectivity.
 
Reviewers do themselves and their readers a disservice by forcing a review score out the door the day the embargo lifts, too. Gotta have them clicks.

Well, you're assuming that, A) it's "forced out the door", and B) that there isn't a legitimate reason for consumers to want reviews of products before they're released, especially in an industry where preorders and playing it on week one is so ridiculously prioritized.
 
In a 5 point vs a 10 point scale, scores such as 9/10, 7/10, 5/10, 3/10, and 1/10 wouldn't be represented. Those kinds of scores are noise?

It's my opinion that someone who gives a score of 8/10 would be just as likely to give a 7/10 or 9/10 if they reviewed the game on an empty stomach or after meeting the love of their life (extreme examples). So I think a 10-point scale is noisy, yes.
 
The problem here is not the scores, but the power game publishers have over gaming journalism. 'Give my game a bad score and I won't send you review copies anymore.' That's just wrong. I don't understand why this is a thing. Music is basically dominated by 3 companies and this doesn't happen. Why does it happen in gaming?

And the problem with Metacritic is "You don't have a review score, we're not going to send you our game early, because you don't contribute to Metacritic"

or

"Your review scale does not provide enough flexibility on Metacritic. 4/5 is too harsh when you could mean 9/10, but not 10/10. Please adjust your scoring system if you wish to receive special access."

PR then does its job of casually bribing the press with lavish review events, personal attention, and swag to push that number up, because it's the difference between keeping your job or not, earning a bonus or not.

Review scores often decide the long term potential of New IPs.
 
I think The Order is somewhat boned in the long run, because single-player games that are not long can get cannibalized by GameStop. But the first month of sales will be interesting for one reason: Watch Dogs, Destiny, and Dying Light topped their debut months with middling to poor review averages. Meanwhile, games the press loved have never entered the top ten. I think we are entering an era where YouTube, Twitch, and social media matter more to the success of a product. Throw in the requisite marketing campaign for good measure.

I think the Eurogamer/Kotaku approach is at least logically consistent in wanting people to focus on the actual content of a review, instead of how it relates to or impacts a review aggregate. If the audience and the outlets are far apart on where the consumer dollars should go, this method at least gives people more things to consider as opposed to arguing over an arbitrary numerical score. I think some sites invite the argument though, since it can drive traffic.
 
Reviewers do themselves and their readers a disservice by forcing a review score out the door the day the embargo lifts, too. Gotta have them clicks.

100% agree. It's all about catching that release-day rush of hits rather than actually digesting the meal.

When you marathon run a game to finish a review as fast as possible, it really messes with your ability to actually subjectively analyze it. Sometimes you get jaded, other times you get tired of having to keep playing the same game.

It's not a natural element to the process. This is the real problem. Now doing away with the review scores just makes it easier for the reviewers not to have to compromise day-one rush hits for a meaningful review.

It's a detrimental cycle that's eroding the soul of the industry. This is the real problem, I think, not scores in general.
 
What does "Average" even mean? Who actually plays enough appropriately-sampled games (i.e. without the giant bias of playing games from major publishers, or games that have positive word of mouth) to have a good impression of what average is?

Average as in middle of the road. So, a game good enough to play, but not exceptional in any way. I don't think average in this case means games that are enough alike to all be considered average.
 
I'll repeat. It is impossible for any reviewer to tell you how you personally are going to feel about a game.

If you don't care what the reviewer thinks about it, you shouldn't read their review because that is just about the only useful information they can give you beyond if the game is broken.

Also, if a reporter thinks an explosion sounded like fireworks, they are perfectly free to say so. Obviously, they shouldn't report that someone shot off fireworks when in actuality a bomb was detonated, but there is absolutely no parallel to be drawn from this to video game reviews. There is no truth or fact to be had from a video game review. The best you can do is find someone who generally has the same opinions as you, but that still won't be objective reporting, even if you think they're always right.

this is an awful comparison. why would you compare a professional critic to a news correspondent when there are so many other meaningful comparisons? for instance a professional critic in another medium. news, by its very nature, calls for objectivity. a critique, by its very nature, calls for subjectivity.

So where does a score even fit into this? Sounds like you want reviews to be clinical descriptions of a game's features and mechanics. A fixed number of points for things like a game's length, price, resolution, and frame rate?

The score is usually the summary of the review, so an unfair review usually gets an unfair score, for example.

But let's restart from the beginning and see if I can help you understand what I mean.

I personally don't care about reviews and reviewers; I prefer trailers and forums, and the rare times I read reviews I do it just for the description of the game. In my mind I collect all the data and decide if a game is worth buying or not.

That's me, but many (most?) people keep deciding blindly based on reviews, because they see reviewers as people whose words count somehow. But if reviews are not meant for people, and reviewers just write whatever passes through their minds, then what's the difference between reviewers and non-professionals like us, or even worse, between them and fanboys? What about the people who rely on them?

Maybe people here on GAF are conscious that reviewers write exactly what they want, but more casual people think that reviewers' words count because reviewers are professionals, and reviewers know that people listen to what they say.
 
You might be missing the point, and some others here as well.

I don't think the user who originally stated that meant (I don't want to speak for him; this is my personal opinion) that the 5-point system is better or more accurate. I don't think it is at all. But IMO, it is the best way to deliver a critical analysis of a game that's easily understandable. A 1-10 scale is of course more accurate, but the bottom half of that scale is never used. At the very least, most games that get greenlit are functional, working titles, so hardly anything ever gets a 3/10 or a 4.5/10. Really, for the majority of titles, you only need to differentiate amazing/good/average/bad. The more accurate systems aren't actually needed and do more harm than good from my perspective.
But even with a 5-point scale, the same would happen. A 1/5 (20%) would practically be reserved for the most broken, unplayable messes. Is a functional game worth a 1/5? So it would almost never be used, and you'd have a scale of 2-5. How often would a 2/5 (essentially a 4/10) be used? Probably as often as games are rated 4 out of 10 now. So you'd be left with most games getting a 3, 4, or 5: average, good, excellent. Nothing changes.
 
A five point scale is the only one that works.

I prefer a 15 point scale, myself. One of the biggest problems with review scores is that people tend to operate on the same scale they learned in school; 90-100 is an A, 80-90 is a B, etc. 5 and 10 point scales are too easy to mentally convert to a 100 point scale; a 15 point scale requires a little more mental arithmetic.
 