
I decided to create a new, better review aggregator

Status
Not open for further replies.
Because if you have a different opinion from the majority you can't be trusted!
If you have some valid point about why you think a game isn't worth a 10, then you can be trusted.
The problem is that those criticizing Jim for his review haven't read the full review. Stop acting like the game is perfect and start reading the review, not just the score.
 
Yeah I don't get it
The choice of 'trusted sites' is a bit bizarre - I can understand only wanting to include professional sites rather than individual YouTubers, but why only those 7? If you're going to be using a mean, incorporating more reviews would give you a more representative & useful result. Also, surely there are going to be a lot of games that haven't been reviewed by some or all of those sites.

Removing outliers is a good idea in principle, but automatically removing the top and bottom score doesn't make sense. If the lowest score is an 88 and the rest are 89-95, then it isn't really an outlier. Using more review scores would help with identifying outliers, but the cut-off point is always going to be arbitrary (also, the more scores you have, the less influence an outlier has on the overall result)
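To make "outlier" less arbitrary than "always drop the extremes", you could test each score against the panel's typical spread instead. A quick Python sketch of that idea (the threshold `k` and all panel scores are made up):

```python
import statistics

def is_outlier(score, scores, k=3.0):
    """Flag a score only if it sits far from the panel median,
    measured in units of the median absolute deviation (MAD)."""
    med = statistics.median(scores)
    mad = statistics.median(abs(s - med) for s in scores)
    if mad == 0:  # every other score is identical
        return score != med
    return abs(score - med) / mad > k

panel = [88, 89, 90, 91, 92, 95]
print(is_outlier(88, panel))         # False: 88 is close to the pack
print(is_outlier(40, panel + [40]))  # True: a 40 here clearly is one
```

Under this rule the 88-among-89-to-95 case keeps its lowest score, while a genuinely detached score still gets flagged.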

Kind of bold to claim this is a better aggregation system than what's already out there, to say the least
 
If you have some valid point about why you think a game isn't worth a 10, then you can be trusted.
The problem is that those criticizing Jim for his review haven't read the full review. Stop acting like the game is perfect and start reading the review, not just the score.

Did I really need to put a '/s' at the end of that post? Hint: it was sarcasm.
 
Just try to ignore (or at least not care about) the Metacritic score, for god's sake. Be happy if it's high, but don't despair if the score goes lower, and especially not to the point of posting a thread such as this.

This may be weird coming from me, a guy who was 'clearly' affected by Jim's score, but honestly the obsession with score is a roadblock for both viewing games seriously and maturing this immature community. A better review aggregator, to me, would not try to aggregate arbitrary, subjective scores but to aggregate consensus, just like Rotten Tomatoes really. To the average consumer I would bet decent money that "100% of reviewers liked this game" would be a better single determining factor of quality than "this game's mean score... with some reviewers more heavily weighted than others... based on completely arbitrary figures... is 98."
 
Giant Bomb isn't granular enough
Easy Allies I would maybe consider, but I don't really know about YouTubers.
No single-person review sites.

How much granularity are you looking for? The sites you mentioned have 10 point, 20 point, 100 point and 1000 point review systems. Seems arbitrary to make the cut off at 5 while allowing huge ranges like that.

Easy Allies is literally Game Trailers. It's the same exact people doing the same thing without an ad revenue team.

Laura Kate Dale reviews for Jimquisition soo...
 
How much granularity are you looking for? The sites you mentioned have 10 point, 20 point, 100 point and 1000 point review systems. Seems arbitrary to make the cut off at 5 while allowing huge ranges like that.

Easy Allies is literally Game Trailers. It's the same exact people doing the same thing without an ad revenue team.

Laura Kate Dale reviews for Jimquisition soo...

I feel like going below a 10-point system gets too imprecise for an aggregator. You get situations where an 80 can cover the whole range from 70-90, and that's just too much.

Sounds good, they're definitely up for consideration. Will probably be added soon, actually. As soon as I get some sleep.
 
"I decided to create my own imaginary biased aggregate system just to make Breath of the Wild look like its the closest thing to a perfect 10 in existence by ignoring all non 10/10 reviews"

You're a goddamn genius OP
 
Op your idea is bad. It's a bad idea. It's OK man, everyone shits their pants every once in a while; ya just pull up your pants, apologize for the lingering smell and the stains, and promptly leave after cleaning the couch with baby wipes and leather polish.
Oh, and stop whining so much about dissenting review scores for games; it's sad and weird.
 
Why shouldn't a negative review be allowed to shift the aggregate?
It's not incentive compatible. Using the mean gives people the capacity to game the system by misreporting their true preferences. This is half of why Rotten Tomatoes is generally very reliable while Metacritic is generally not.
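A toy illustration of why the mean is easier to game than a rank-based statistic like the median (all scores are hypothetical):

```python
from statistics import mean, median

honest = [85, 87, 88, 90, 92]
# one reviewer tanks their score to drag the aggregate down
strategic = [85, 87, 88, 90, 20]

print(mean(honest), mean(strategic))      # 88.4 vs 74.0: the mean moves a lot
print(median(honest), median(strategic))  # 88 vs 87: the median barely moves
```

With the mean, a single misreported score shifts the aggregate by over 14 points; with the median, by one.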
 
Wait so just because a review doesn't get on board with the majority of other scores it shouldn't be counted? Why should a more negative review not be counted just because there are a ton of reviews that lean more positively? This is 100% because of the ridiculous outrage at the 7/10 Zelda review. Just stop... This is embarrassing at this point. Reviewers are allowed to have varied opinions. No matter how different they are from your own. You can't just snuff out certain journalistic outlets because you don't agree with them.
 
Let's just agree to care less about reviews unless you take the perspective that reviews are harmful to the industry as opposed to somewhat useful, subjective measures of quality. There are very few reviews I read/watch and they're typically from people whose videogame tastes somewhat align with my own but I find previews and hands ons much more influential and informative than reviews.
 
https://docs.google.com/spreadsheets/d/1hO2d_BHZYT-A4JR-IQyuDoEcWGx1PevBgQsQPzh17b4/edit?usp=sharing

Explanation: This is a draft for a possible review aggregation system. If people like it I will continue to work on it.

The idea is that I only take the review scores from a few major, well-trusted gaming websites, and these form a "panel" from which a score is calculated.

The panel consists of:
EDGE
Polygon
GameSpot
IGN
EGM
Game Informer
GamesRadar+

For any particular game, the highest and lowest scores are ignored and the rest are averaged to give the final score.
What do you think?
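The rule as described fits in a few lines of Python; a sketch with made-up scores:

```python
def panel_score(scores):
    """Drop the single highest and single lowest score,
    then average the rest. Needs at least 3 panel reviews."""
    if len(scores) < 3:
        raise ValueError("need at least 3 panel reviews")
    trimmed = sorted(scores)[1:-1]
    return sum(trimmed) / len(trimmed)

print(panel_score([92, 88, 95, 90, 70]))  # → 90.0
```

Note that with only 7 panel sites, any game reviewed by fewer than 3 of them can't be scored at all under this rule.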

Who decides which outlets are authoritative enough to make up the panel? Why are other outlets' opinions worth less?

And worse, why would you chop off the highest and lowest scores? How is that principled? If a game received 10, 10, 10, 10, 7, 5 from various outlets, you'd drop one of the 10s and the 5, so the 5 counts for nothing while the 7 still drags the average down. Or conversely, if a game received 4, 4, 4, 4, 4, 4, 4, 7, 8, the 8 gets thrown out as noise while the lone dissenting 7 stays in. Why are those the right scores to ignore?
 
Averages can get skewed too much by outliers. Judged sporting events generally drop the highest and lowest marks for exactly that reason, so it's not unreasonable to want to counter it.

For example, I think a game that scores 9,8,9,8,9,9,1 should score above one that scores 8,8,8,7,8,8,6

Though having lots of reviewers also works well for limiting the impact of outliers, which is one reason I don't think limiting the "panel" so severely is a good move.

Video game scores are also more vulnerable to negative outliers because "average" is so high on the scale. This makes it easier to shift the score with a strongly negative review than a strongly positive one.
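Concretely, the two score lists above happen to have the same plain mean, which is exactly the case a trimmed mean separates:

```python
from statistics import mean

a = [9, 8, 9, 8, 9, 9, 1]   # strong consensus plus one harsh outlier
b = [8, 8, 8, 7, 8, 8, 6]   # uniformly decent

print(mean(a), mean(b))  # both ≈ 7.57: the plain mean ties them

def trimmed_mean(scores):
    s = sorted(scores)[1:-1]  # drop one highest and one lowest
    return sum(s) / len(s)

print(trimmed_mean(a), trimmed_mean(b))  # 8.6 vs 7.8: a now ranks above b
```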
Just because an outlier exists doesn't justify its exclusion. Additionally, once you remove the outlier from a distribution, another one takes its place.
 
It's not incentive compatible. Using the mean gives people the capacity to game the system by misreporting their true preferences. This is half of why Rotten Tomatoes is generally very reliable while Metacritic is generally not.
You think reviewers care about a game's metacritic score when they're scoring a game?

What are the incentives to game the system? It's not like a low score will earn them a spot on a trailer or give them a bonus
 
I'm late to this thread, but I'd like to contribute and say how idiotic this idea is.

Good luck building your giant echo chamber of self validating opinions.
 
You know, something I *would* like is a way to customize the reviews that are aggregated on Metacritic. You login, you select the outlets you trust the most (or trust the least) and you get customized review scores calculated per your preferences.

Why not?
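Something like this is just a user-supplied weighted average. A rough sketch of the idea (outlet names and weights are placeholders):

```python
def personal_score(reviews, weights, default=1.0):
    """Weighted average where each user supplies their own
    per-outlet trust weights (0 = ignore that outlet entirely)."""
    total = wsum = 0.0
    for outlet, score in reviews.items():
        w = weights.get(outlet, default)
        total += w * score
        wsum += w
    return total / wsum if wsum else None

reviews = {"IGN": 90, "Polygon": 70, "EDGE": 80}
my_weights = {"Polygon": 0.0, "EDGE": 2.0}  # distrust one, trust another
print(personal_score(reviews, my_weights))  # ≈ 83.3 instead of the flat 80
```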
 
You know, something I *would* like is a way to customize the reviews that are aggregated on Metacritic. You login, you select the outlets you trust the most (or trust the least) and you get customized review scores calculated per your preferences.

Why not?

Use OpenCritic instead. This is a feature there.

It's also better than MC generally anyway.
 
Let's just agree to care less about reviews unless you take the perspective that reviews are harmful to the industry as opposed to somewhat useful, subjective measures of quality. There are very few reviews I read/watch and they're typically from people whose videogame tastes somewhat align with my own but I find previews and hands ons much more influential and informative than reviews.

Previews and hands-ons are based on controlled demos or a vertical slice of the game, whereas reviews are based on the whole game. How is the former more informative?
 
You know what would solve all of this? Just fucking copy Netflix before they threw out their suggestion algorithms.

You go in you type in ratings for like 20 games you played, more if you want.

Loved It
Liked It
Meh
Didn't like it
Hated it

No numbers bullshit nothing.

Then it matches you up with people who have similar tastes and gives you suggestions based on games you didn't rate but that people with similar tastes liked. Hell, then you could get marketers on board by shooting out ads to specific groups.

It wouldn't matter if you vote 5 stars or 1 star to fuck over the average, since you'd only be hurting your own suggestions.

Someone steal my idea and give me 20% of the residuals.
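The matching step is basic collaborative filtering. A deliberately tiny sketch of the idea, with invented users and ratings:

```python
# Map the five labels to numbers internally, find the most similar
# user, and suggest games they liked that you haven't rated.
LABELS = {"loved": 2, "liked": 1, "meh": 0, "disliked": -1, "hated": -2}

def similarity(a, b):
    """Fraction of exact agreements over games both users rated."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    return sum(1 for g in shared if a[g] == b[g]) / len(shared)

def recommend(me, others):
    best = max(others.values(), key=lambda u: similarity(me, u))
    return [g for g, r in best.items() if g not in me and r >= LABELS["liked"]]

me = {"Zelda": 2, "Mario": 2, "Halo": -1}
others = {
    "user_a": {"Zelda": 2, "Mario": 2, "Halo": -1, "Metroid": 2},
    "user_b": {"Zelda": -2, "Mario": -1, "Halo": 2, "Gears": 2},
}
print(recommend(me, others))  # user_a matches my tastes → ['Metroid']
```

A real system would use many neighbors and a smarter similarity measure, but the principle is the same: your extreme votes only reshape your own neighborhood, not a public average.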
 
You know what would solve all of this? Just fucking copy Netflix before they threw out their suggestion algorithms.

You go in you type in ratings for like 20 games you played, more if you want.

Loved It
Liked It
Meh
Didn't like it
Hated it

No numbers bullshit nothing.

Then it matches you up with people who have similar tastes and gives you suggestions based on games you didn't rate but that people with similar tastes liked. Hell, then you could get marketers on board by shooting out ads to specific groups.

It wouldn't matter if you vote 5 stars or 1 star to fuck over the average, since you'd only be hurting your own suggestions.

Someone steal my idea and give me 20% of the residuals.
Someone stealing it kind of removes you from getting anything from it. ;)
 
You know what would solve all of this? Just fucking copy Netflix before they threw out their suggestion algorithms.

You go in you type in ratings for like 20 games you played, more if you want.

Loved It
Liked It
Meh
Didn't like it
Hated it

No numbers bullshit nothing.
How is that any different from a number scale? It's a 5 point ranking with 1/5 being the lowest and 5/5 being the highest. That's all a number score is. It's not a value, it's a label
 
I think something more akin to Rotten Tomatoes, where something is fresh/liked vs. rotten/disliked by critics, would be best, rather than trying to go with numbers, since those always end up going bad.
 
The way I've always considered buying a game after a review is: buy at full price (i.e. now), buy at a discount, don't buy. I don't think that translates to a review of the qualities of a game, but helps me to differentiate a truly great game from a good game to a bad game.

If there was a site that reviewed that way, they'd get my readership.
 
You think reviewers care about a game's metacritic score when they're scoring a game?

What are the incentives to game the system? It's not like a low score will earn them a spot on a trailer or give them a bonus
One reason would be that they think the aggregate is too low or too high and want to swing it closer to where they think it "should" be.

There is no shortage of cases where this happens in film and hall of fame voting for example, so ex ante it's not clear to me why gaming would be uniquely immune from this phenomenon.

That said, I really don't know or care whether people do this, but it's still a basic principle of good preference aggregation. If you believe that no one would try to game the system to make the aggregate more closely reflect their preferences, then more power to you and your faith in people's integrity, but it's still a bad system.
 
How is that any different from a number scale? It's a 5 point ranking with 1/5 being the lowest and 5/5 being the highest. That's all a number score is. It's not a value, it's a label
No, because everyone complains that a 1 or a 2 or a 3 means something different to everyone else. This way it doesn't even matter if you think a game is a 6 or a 2 or a 10: you liked it or you didn't.

Also, this would remove the need to judge a game by its "quality", turning 'best of' lists into more representative 'favorites' lists.

Aggregate ratings would be hidden from individuals, in the same way Netflix's red-star ratings show what it predicts you would rate a title, not the average rating.
 
Why shouldn't a negative review be allowed to shift the aggregate?

It should shift the aggregate, but not disproportionately so. What's proportionate is essentially subjective and depends what the purpose of your system is, but there's a decent argument that simply taking a mean effectively gives too much weight to extreme scores.

If you want your review scores to reflect the consensus, then trimming off outliers can make sense. If 7 people thought the game was mediocre and 1 person thought it was the best game ever, then the consensus is that it's mediocre, not that it's above average.
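Putting numbers on that example (7 "mediocre" scores and one rave, values invented):

```python
from statistics import mean

scores = [6, 6, 6, 6, 6, 6, 6, 10]  # 7 "mediocre", 1 "best game ever"
print(mean(scores))                  # 6.5: pulled above the consensus
print(mean(sorted(scores)[1:-1]))    # 6.0: the trimmed mean stays put
```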
 
I think something more akin to Rotten Tomatoes, where something is fresh/liked vs. rotten/disliked by critics, would be best, rather than trying to go with numbers, since those always end up going bad.

I don't think that model would work well for the videogame industry because you'd frequently get most games at above 90%, 95% positive reception or so. There are very few legitimately negative reviews for any given game or truly mixed reactions to a title. For the movie industry it works because there's both a wider range of scores and critics.
 
I think the aggregation sites are fine but articles or videos from somebody who regularly engages in trolling should not be allowed to be posted on NeoGaf. These posts and videos are toxic and do nothing but feed the troll.

When somebody comes out with an article like "Why X Game Is Blatantly Better Than Y Game", he is looking for an angered response from a certain fan base. He is looking to get DDoSed so that he makes the news and people start sharing his "work." Suddenly, his "work" bubbles to the top of search engines and YouTube.

He is looking to become the victim so people can support him on Patreon. He says things like "I don't monetize my videos or website" and people drink the Kool-Aid that he is one of them when he is working against them. Is he really doing God's work? If he didn't get paid to be a troll, he wouldn't be trolling. If nobody fed the troll, he would just go away. You don't need to remove "critics" from Metacritic. You just need to stop talking about them.

Jim Sterling has you shaken.
 
Completely useless. You can already choose to ignore some reviews on OpenCritic to suit your own bias.

It's just a number, people. If you enjoy the game, who cares? (Besides OP)
 
It's hilarious how many people not in the industry, with nothing riding on review scores at all, are suddenly up in arms.
 
Previews and hands-ons are based on controlled demos or a vertical slice of the game, whereas reviews are based on the whole game. How is the former more informative?
Watching people actually play a game or discuss their experiences tells me more than reviews that are 1 or 2 pages long or 5 to 8 minutes in length. I like hands-on videos and discussions. They give me a better idea of what the game is like and whether or not I will enjoy it. For example, watching Game Attack play Snipperclips showed me much more about what is fun and maybe not so fun about the game than listening to someone's scripted monologue about it. Strictly controlled demos, less so.
 
How is that any different from a number scale? It's a 5 point ranking with 1/5 being the lowest and 5/5 being the highest. That's all a number score is. It's not a value, it's a label
A better way to put it is that most people's favorite games are rarely, if ever, the "best" games. So best-games lists are not necessarily a good way to determine whether or not you would actually enjoy playing a game.

For example: I don't like The Last of Us at all. I will acknowledge it is a great game, I just don't like it. Now I see that game all over the place as being the best ever, but it's not the best ever to me, so a 10/10 means nothing to me. Whereas I think Kingdom Hearts 2 Final Mix is my favorite game, and that's essentially a solid 8 at best.
 