
IGN: Is Metacritic Ruining The Games Industry?

Didn't someone just write an article on this a few weeks back, or was it a post on GAF?

I swear I just read something on the same topic here recently... or was it in some shitty game mag I don't know why I have a sub to?

IGN is ripping off someone's idea here.
 
Didn't someone just write an article on this a few weeks back, or was it a post on GAF?

I swear I just read something on the same topic here recently... or was it in some shitty game mag I don't know why I have a sub to?

IGN is ripping off someone's idea here.
Thank God I am not the only one that notices this trend. I thought I was being delusional, but no! Someone has awakened!
 
So because publishers use an aggregate site like this as a metric for financial bonuses or contract obligations, we should blame the aggregate site?

Why not put the blame on the publishers? They seem to be in control of these scores to some degree from what we have seen. It seems pretty shady to blame the aggregator rather than the people pulling the strings. IGN, grow some balls and bite the hand that feeds rather than going after some middleman, or just stfu and go back to licking portable game consoles.
 
I would say GameRankings > Metacritic, but that's just my opinion.

Does anyone know the aggregate user review scores of those respective websites?

User scores are sometimes even worse: fanboy kids giving a major exclusive from the other camp low scores, or people rating a game 0 over minor changes, using it as a means to express how discontented they are. The ME3 scores, for example.
 
I always use GameRankings.

My gaming time is limited. When I buy a game, I want to keep the chances of buying bargain-bin material as low as possible, and while GR is no guarantee, it's still the best tool for that.
 
mclem: Are Sensationalist Headlines Ruining the Games Journalism Industry?


The headline implies that the fault here lies with Metacritic, which I don't agree with; it's simply a service. The text, however, is much more reasonable - but I still don't think it gets to the root cause: budgets.

Budgets are high. Publishers need to make a profit on a high budget, so they've got to appeal to everyone. Metacritic *does* strike me as a reasonable measure of 'appeals to everyone' in principle (individual tastes can skew that somewhat, admittedly).

If publishers could make a reasonable profit on fewer sales, they wouldn't have to ensure they appeal to everyone. And to make a reasonable profit on fewer sales, you've gotta spend less in the first place.
 
Budgets are high. Publishers need to make a profit on a high budget, so they've got to appeal to everyone. Metacritic *does* strike me as a reasonable measure of 'appeals to everyone' in principle (individual tastes can skew that somewhat, admittedly).

Makes sense. And I wonder what will happen this gen, with the possibility of blowing even more money on complex art assets. The only saving grace is that online distribution has higher margins than retail, which may provide publishers with more revenue...
 
If publishers could make a reasonable profit on fewer sales, they wouldn't have to ensure they appeal to everyone. And to make a reasonable profit on fewer sales, you've gotta spend less in the first place.

There is a school of thought out there that the biggest publishers are deliberately pushing the bar on costs as high as possible - even to unprofitable levels - to push weaker competitors out of the market.
 
I think the problem was never Metacritic itself, but the publishers that use money/trips/gifts/exclusives to manipulate the gaming press. It's the gaming press, pretty much in bed with the publishers, that's the problem. IGN is guilty of this; the ME3 review page was a damn disgrace. Also publishers that give out bonuses based on Metacritic scores. Why blame a tool when the core of the problem lies elsewhere?
 
IGN reads like a tabloid.

It only points out the problem, and the solutions it suggests are the easy way out without much actual thought (e.g. culling only the symptoms rather than the cause, i.e. reviews that are obviously paid for). Then again, that wouldn't be surprising, since they practice it themselves.
 
I'm pretty confident that most gamers no longer care about the traditional review. With all the podcasts, word of mouth, forum sites like GAF, demos, etc., it's much easier than it used to be to get an idea of a game and whether you will like it. If you are still trusting numbers on Metacritic, you are doing it wrong.
 
As long as sites like IGN have a numerical score this problem isn't going to go away.

It also doesn't help that mainstream video game sites act more like PR dumps than journalists.
 
There is a school of thought out there that the biggest publishers are deliberately pushing the bar on costs as high as possible - even to unprofitable levels - to push weaker competitors out of the market.

I've heard that too, but it's contingent on said weaker competitors being willing to play that game in the first place. Then again, we do see quite a lot of studios releasing games well beyond their realistic hopes of recouping costs, so maybe there's a little truth to it.
 
No grades and we're done. Or a video review, but no score at the end. Anyone should be able to make their own decision from the reviewer's explanations. It would still be a review, of course, with good and bad points, but you'd arrive at your own grade by weighing the aspects you're most interested in.
 
The site itself isn't ruining anything, the problems as I see it are:

-An over-reliance on numerical ratings in the games industry, which exist for no purpose other than letting the reader skim to the end and look at the all-defining number. The decimal-point system that some sites use is comical.

-A lack of real diversity in review scores. Everybody seems to have the same metrics, opinions converge and offer samey insight; there are about a hundred major gaming outlets, yet very few offer much genuine analytical and critical thinking. Not to mention the quality of the writing is more often than not incredibly poor.

-Gamers seem to think that a compendium of subjective reviews results in an objective truth. Just because more people say something doesn't mean it's necessarily correct, nor does it discount those who disagree with the consensus.

-Publishers see the effect it has on gamers and treat it as the be-all and end-all of a game's quality. So they set some arbitrary cut-off whereby a game needs to score over X on Metacritic. It's frankly ridiculous, not only to put so much stake in Metacritic but also to suggest there's an ounce of difference between an 84 and an 85. These numbers are next to meaningless.
 
Chû Totoro;39959224 said:
No grades and we're done. Or a video review, but no score at the end. Anyone should be able to make their own decision from the reviewer's explanations. It would still be a review, of course, with good and bad points, but you'd arrive at your own grade by weighing the aspects you're most interested in.

Some time ago - I'm thinking over ten years - Edge did a feature *on* game reviewing, and to accompany it they didn't actually put the score for any of the games reviewed on the review page - they were all tucked on a back page somewhere. The intent was to encourage people to actually read the review... and if they read the feature, they'd be *told* where to find the review scores.

It had mixed success. I think there were some complaints that the review scores just didn't exist, but others lauded what they were trying. I'd like to see them try that again sometime. Particularly in light of the fact that it's an excellent time to revisit that accompanying feature.


-Publishers see the effect it has on gamers and treat it as the be-all and end-all of a game's quality. So they set some arbitrary cut-off whereby a game needs to score over X on Metacritic. It's frankly ridiculous, not only to put so much stake in Metacritic but also to suggest there's an ounce of difference between an 84 and an 85. These numbers are next to meaningless.
I wonder if that's a quirk of our nature as gamers - it's instinctive for us to think of things in terms of 'winning' and 'losing', and so this gives a foundation to base that premise on, misguided though it is in this context. You don't really hear about movie fans reacting in similar ways to reviews, after all, but movies aren't (generally!) inherently confrontational.
 
This argument has been used by developers and publishers for years. They want game reviews to be treated like book reviews, which is a fair request, and arguably deserved, but it will never happen: the typical person looking for a video game review wants to consume the same type of review as a movie review.
 
I don't think scores are always a wrong thing to have, but the way they are used on most video game sites is stupid. There is no way the scores can be as accurate as a 100-point or even a 10-point scale suggests.

The reviews are often far too subjective for that.
 
In those cases, the royalties would have been, for all intents and purposes, out of the hands of reviewers, and this article would have had no point. As is, aren't they fully within their rights to complain about the system if it involves them?

They're welcome to complain about the system, but generally a complaint has to be mindful of alternatives even if it doesn't provide a solution.

Here are the choices:
- Do not offer bonuses based on game quality; instead, offer no bonuses or offer bonuses based on some other metric (presumably sales)
- Offer bonuses based on game quality; calculate it using some other metric than Metacritic (such as some sort of internal, publisher-determined quality metric; focus groups or expert panels; a cherry-picked subset of reviews; etc)
- Offer bonuses based on game quality; calculate it using Metacritic.

In all three cases, the bonus condition could be widely missed, narrowly missed, or achieved. Narrow misses are very frustrating for all people involved. It sucks when you take a test that has a 75 pass mark and you get a 74. It sucks when you take a course that has a 50 pass mark and you get a 49. It sucks when you need $1,000 to get through the month and you get $980.

I don't think Metacritic is a cure-all solution, at least in part because as I mentioned earlier, the same publishers who use Metacritic as feedback also use marketing to try to influence Metacritic. But I think Metacritic is a better solution than more narrowly tailored options.

There's a structural problem in management in general when a manager uses an accountability metric on an employee, but the manager's decisions also affect the employee's ability to meet the metric. We see this also with No Child Left Behind, where underperforming schools are defunded leading to further underperformance. Konami pegging a bonus to one of their games getting an 85 Metacritic would be unfortunate, because Konami is not a good publisher and the support they give developers is clearly not sufficient to get an 85 Metacritic, regardless of the developer's talents. But I don't think the problem is the metric, I think the problem is the management and employee relationship.

Can we just abolish the numerical scale and just go with this:

Recommended

Limited Recommendation

Not Recommended

I think Dead Space is a stronger game than Resident Evil 5 (within the genre, and generally as a game), but I would recommend both without limitations. I'd give RE5 a B and DS an A, or so. Maybe an 8 and a 9 or a 7.5 and an 8.5 on the more exhaustive scales. *shrugs*
 
Metacritic puts reviewers together. I hope IGN realise they're basically saying they're the problem.

The real issue is that if a game:
a) sticks to a formula and doesn't try to do much new, it'll probably be safe for a good score if it's competent;
b) tries something new and succeeds but maybe isn't the perfect game, then any flaws that haven't been battered down over DECADES, the way the FPS genre's have, get talked about way too much.

When Resident Evil: Revelations' main complaint is that it's not on a console, and GoldenEye Wii is only reviewed with the Classic Controller, you can see the problems with reviewers and games journalism in general.

The fact is the entire industry has dreadful news sources. Metacritic takes all these incompetents and puts them in a single 'number' with no real explanation of how it got that number.


Rotten Tomatoes is a better idea, imo. It just looks at 'positive' or 'negative'. If you want the 'score' of the game, read the reviews. It's also generally a friendlier interface and gives customer reviews more space.

But blaming 'Metacritic' is just idiotic; it's the reviews behind it that are the big, big issue. I don't pay much attention to them except for information now; I usually use GAF as a thermometer on the action.
 
If IGN really wanted to take a stand against Metacritic, they would stop having their review scores be included in the MC aggregate. (And perhaps even stop scoring their reviews altogether.) The chances of IGN actually taking that step? Zero.

That's what's so strange about this article. They are in a perfect position to actually do something about it.
 
I still haven't seen any definite correlation between Metacritic ratings and sales. While the tendency is that high-scoring games sell well and low-scoring games do not, there are plenty of examples that buck that trend, nor does it really follow that, say, ME3 sold well because of its high Metacritic rating. I still think marketing trumps ratings any day of the week.

EDIT: I think using Metacritic is fine if you at least look at the individual reviews instead of just the aggregate.
 
They're welcome to complain about the system, but generally a complaint has to be mindful of alternatives even if it doesn't provide a solution.

Here are the choices:
- Do not offer bonuses based on game quality; instead, offer no bonuses or offer bonuses based on some other metric (presumably sales)
- Offer bonuses based on game quality; calculate it using some other metric than Metacritic (such as some sort of internal, publisher-determined quality metric; focus groups or expert panels; a cherry-picked subset of reviews; etc)
- Offer bonuses based on game quality; calculate it using Metacritic.

In all three cases, the bonus condition could be widely missed, narrowly missed, or achieved. Narrow misses are very frustrating for all people involved. It sucks when you take a test that has a 75 pass mark and you get a 74. It sucks when you take a course that has a 50 pass mark and you get a 49. It sucks when you need $1,000 to get through the month and you get $980.

I don't think Metacritic is a cure-all solution, at least in part because as I mentioned earlier, the same publishers who use Metacritic as feedback also use marketing to try to influence Metacritic. But I think Metacritic is a better solution than more narrowly tailored options.

There's a structural problem in management in general when a manager uses an accountability metric on an employee, but the manager's decisions also affect the employee's ability to meet the metric. We see this also with No Child Left Behind, where underperforming schools are defunded leading to further underperformance. Konami pegging a bonus to one of their games getting an 85 Metacritic would be unfortunate, because Konami is not a good publisher and the support they give developers is clearly not sufficient to get an 85 Metacritic, regardless of the developer's talents. But I don't think the problem is the metric, I think the problem is the management and employee relationship.
You're obviously right about how other metrics would be used in place of Metacritic, and I'm assuming other metrics were used before Metacritic was popular, just like how surely other metrics are used today in circumstances only known to developers and publishers. Sure. My gripe is more with how Metacritic mostly averages approximations from more or less credible sources in a way that sometimes obscures whether the work the publisher wanted done was actually performed. If the publisher, instead of using Metacritic, would just do the exact same approximation itself and base bonuses on basically exactly the same thing, then it obviously doesn't matter. Having no clue of the way publisher-developer relations work, I can't really guess what an alternative bonus setup would look like, other than that something thought through, agreed upon between all parties, feels more accurate and fair than arbitrary metascores.

If this looks like moving the goalposts then I'm sorry, but I don't feel like there ever were any goalposts in this discussion.

So basically, if whatever replaced Metacritic would be basically the same thing, or one of the other sites doing the same thing, then there's no point complaining. I don't know to what extent this is true, though.
 
I base my purchases off my interests and the gaming community. I know what I want before I buy it, but I rely on the gaming community to hear about cool indie games I never heard of.
 
I don't mind Metacritic, but the whole "if your game doesn't get this score, then you won't be getting a bonus" really is a disgusting thing to do to developers. If the game sells, then they should get a bonus regardless of the Metacritic score.
 
You're obviously right about how other metrics would be used in place of Metacritic, and I'm assuming other metrics were used before Metacritic was popular, just like how surely other metrics are used today in circumstances only known to developers and publishers. Sure. My gripe is more with how Metacritic mostly averages approximations from more or less credible sources in a way that sometimes obscures whether the work the publisher wanted done was actually performed. If the publisher, instead of using Metacritic, would just do the exact same approximation itself and base bonuses on basically exactly the same thing, then it obviously doesn't matter. Having no clue of the way publisher-developer relations work, I can't really guess what an alternative bonus setup would look like, other than that something thought through, agreed upon between all parties, feels more accurate and fair than arbitrary metascores.

Chris Avellone talked about this in a podcast not too long ago. Basically, a lot of the time developers receive "milestone payments": every 4-6 weeks developers get paid upon achieving certain objectives or making significant progress. Obviously, if developers have a good relationship with the publisher, the objectives can be flexible. I believe Metacritic only comes into play when dealing with bonuses. So it really comes down to negotiating a good deal for the actual development of the game, so that missing out on the bonus wouldn't hurt too badly. Obsidian, for example, got a not-so-good deal for New Vegas, which was a bigger issue than scoring 84 on Metacritic.
 
I still wonder...

Can't review sites still give out scores, but just throw up some kind of copyright deal where Metacritic can't use or link back to the site reviews?
 
Serious question because I didn't play the game but do you think they really deserved it?

It was better than Fallout 3 by quite a bit, so Obsidian definitely should have gotten a bonus. The management at Obsidian should have pressed for a sales-based bonus metric though, since reviewers are inconsistent, far more lenient toward some games and far less forgiving toward others.
 
Wait, doesn't IGN post press averages on their site next to their reviews? Why come at Metacritic when they do something similar?

They do a video that looks a bit more in-depth at reviews and games, even countering their own points. IGN is hypocritical and not perfect in a lot of ways, but I do think they're trying to give games more of a chance beyond 'scores'.
 
I think the concept of Metacritic is quite interesting: basically you get lots of subjective opinions, and by taking an average you get something that is kind of objective. It's a variation on the wisdom of crowds. But Metacritic spoils it by building in this weighting system (the Metascore is not a straight average), and the weighting values are Metacritic's own subjective opinion of the quality of the contributor. Effectively the whole approach is undermined by reintroducing this subjectivity, and so I think it is absolutely crazy for people to use Metacritic scores as either an objective measure of quality or as a basis for rewards, etc.
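To make the weighting point above concrete, here's a hypothetical sketch showing how a weighted average can diverge from a straight mean of the same reviews. The outlet names, scores, and weights are all invented for illustration; Metacritic does not disclose its actual weighting values.

```python
# Hypothetical illustration: a Metacritic-style weighted average vs. a straight mean.
# Outlets, scores, and weights below are invented; Metacritic's real weights are secret.

reviews = [
    # (outlet, score out of 100, hypothetical weight)
    ("BigSite",    90, 1.5),
    ("MidSite",    80, 1.0),
    ("SmallBlogA", 60, 0.5),
    ("SmallBlogB", 65, 0.5),
]

# Straight mean: every review counts equally.
straight_mean = sum(score for _, score, _ in reviews) / len(reviews)

# Weighted mean: each score multiplied by its weight, divided by total weight.
weighted_mean = (
    sum(score * w for _, score, w in reviews)
    / sum(w for _, _, w in reviews)
)

print(f"straight mean: {straight_mean:.1f}")  # straight mean: 73.8
print(f"weighted mean: {weighted_mean:.1f}")  # weighted mean: 79.3
```

Weighting the big outlet up and the small blogs down moves the aggregate by more than five points here, which is exactly the reintroduced subjectivity the post above objects to.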
 
No matter what bonuses are attached to attaining a high Metacritic score, the ultimate money-making vehicle is how the game performs at retail. If success at retail is linked to Metacritic score, then it makes perfect sense to focus on it from a business perspective.

As Keza lets slip a few times in the article, IGN's main concern is the fact that Metacritic lends the smaller sites an extra voice. An extra voice, which takes away a bit of the importance of mega-sites like IGN.

Keza's thinly veiled argument that sites like IGN have a higher standard of journalism than some small-time startup blog, and thus should matter more, is, well, I don't know, just a little bit funny given the type of publication she's associated with.
 
Publishers need a metric for quality, and reviews are swayed a lot less by advertising and weird market conditions than sales are. Now, it would probably make sense for everybody if the bonuses kicked in gradually rather than having sharp cutoffs, but in principle it seems like a pretty obvious and valuable business practice. I don't know that doing it using rotten tomatoes style ratings would work out much differently.
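The "gradual rather than sharp cutoff" idea in the post above can be sketched like this. The thresholds (75 and 90) and the $1M bonus pool are hypothetical, not taken from any real contract:

```python
# Hypothetical sketch: an all-or-nothing bonus cliff vs. a graduated ramp.
# Thresholds and the $1M pool are invented for illustration.

def cliff_bonus(metascore: float, pool: float = 1_000_000) -> float:
    """All-or-nothing: full bonus at 85+, nothing below."""
    return pool if metascore >= 85 else 0.0

def ramp_bonus(metascore: float, pool: float = 1_000_000) -> float:
    """Bonus scales linearly from 0% at a score of 75 to 100% at 90."""
    frac = (metascore - 75) / (90 - 75)
    return pool * min(1.0, max(0.0, frac))  # clamp to [0, 1]

for score in (74, 84, 85, 90):
    print(score, cliff_bonus(score), ramp_bonus(score))
```

Under the cliff, an 84 and an 85 differ by the full million; under the ramp they differ by roughly $67k, which removes most of the incentive to fight over a single Metacritic point.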
 
Chris Avellone talked about this in a podcast not too long ago. Basically, a lot of the time developers receive "milestone payments": every 4-6 weeks developers get paid upon achieving certain objectives or making significant progress. Obviously, if developers have a good relationship with the publisher, the objectives can be flexible. I believe Metacritic only comes into play when dealing with bonuses. So it really comes down to negotiating a good deal for the actual development of the game, so that missing out on the bonus wouldn't hurt too badly. Obsidian, for example, got a not-so-good deal for New Vegas, which was a bigger issue than scoring 84 on Metacritic.

This raises more points about the developer-publisher relationship, and possibly highlights why several notable devs are looking at Kickstarter appraisingly. If Obsidian's finances meant they didn't have the leeway to cope without that bonus, then the bonus *was* important... but, as you say, had the original deal been more generous, that wouldn't have been an issue.

In many cases, publishers have devs by the proverbial short an' curlies. The dev *needs* signings to keep doling out paychecks to their employees; after a project's completed, the money's going to stop coming in until there's a new deal on the cards. Devs *need* projects to survive, and publishers know that and can exploit the fact; there aren't many independent developers who are in a position to turn down a project that comes around.

I've mentioned a couple of times - although not in this thread - that I was actually laid off from a developer a few years ago for precisely this sort of reason; the deals were falling through, things weren't getting signed, previous deals didn't leave us with a lot of leeway to have spells with no work. It's a bit of a catch-22 for a dev, I don't really resent the management for having to make the difficult decision to drop people.

There can't be that many professions where financial stability is so hard to maintain
 
There can't be that many professions where financial stability is so hard to maintain

Any company or organization that mainly does project work faces this every time projects close and new ones need to start to keep the business and workforce afloat.

It takes dedicated sales and technical staff focused on building a project pipeline for an organization to get out of the inherent roller-coaster nature of its business.
 
No, but publishers making contracts and hiring people on the back of an average Metacritic score is a joke.

Publishers: learn what Metacritic is and how it is aggregated!
 
Considering any 7.5-range title seems to get a 9.0 at IGN, I can see where they're coming from. As a Sony fanboy, am I the only one who finds it odd that IGN hands out 9.0s to Sony games like candy? I got burned MANY times buying into their scores in the past.

I still remember that grudge match they had between Resistance 2 and Gears 2, calling Resistance 2 the better game. LOL.

10/10 for Uncharted 3? (92 MC)
9.5/10 for Resistance 2? (87 MC)
9.4/10 for Ratchet & Clank Future: Tools of Destruction? (89 MC)
9.1/10 for Resistance 1? (86 MC)
9/10 for Starhawk? (78 MC)
9/10 for Fat Princess? (79 MC)
9/10 for ModNation Racers? (82 MC)
9/10 for Resistance 3? (82 MC)
9/10 for Twisted Metal? (76 MC)
9/10 for Cuboid (PSN exclusive)? (79 MC)
9/10 for Folklore? (75 MC)

I can see why they want to downplay MC: it showcases their sensationalist scores. Being consistently about +10%, give or take, on select publishers' games doesn't look too good.

No offense to anyone who loves any of these games, but it's just something I've noticed.
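For what it's worth, the "+10% give or take" claim roughly checks out against the scores listed in the post above (IGN scores scaled to a 100-point scale, minus the quoted Metascores):

```python
# IGN score (x10, to put it on a 100-point scale) vs. the Metascore quoted above.
pairs = [
    (100, 92),  # Uncharted 3
    (95, 87),   # Resistance 2
    (94, 89),   # Ratchet & Clank Future: ToD
    (91, 86),   # Resistance 1
    (90, 78),   # Starhawk
    (90, 79),   # Fat Princess
    (90, 82),   # ModNation Racers
    (90, 82),   # Resistance 3
    (90, 76),   # Twisted Metal
    (90, 79),   # Cuboid
    (90, 75),   # Folklore
]

gaps = [ign - mc for ign, mc in pairs]
mean_gap = sum(gaps) / len(gaps)
print(f"average gap: +{mean_gap:.1f} points")  # average gap: +9.5 points
```

Every gap in this sample is positive, ranging from +5 to +15, so the poster's rough figure holds for these particular games at least.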
 
I know I'm in the reviled, reportedly brainwashed minority here, but I disagree. I can't see how you can be against Metacritic and not against the concept of professional reviews itself.

In my experience, a high Metacritic score usually correlates pretty well with a game that is at the top of its genre. Of course, caveats still apply; if you hate the genre, any score is superfluous. But within those limits, a Metacritic-style score is as close as we can humanly get to a reviewer-agreed score. I think it's rather robust and more objective, if objectivity is applicable at all in this context.

I have loved games that didn't get a high Metacritic score, but in almost every single case I'll be the first to admit that they're not particularly good games, or games I would recommend in general. Taste has to factor into your decision as well, and if the one gripe most reviewers had with a game is irrelevant to you, and the game does something that particularly clicks with you, then scores, in general, will matter little.

I think the fact that IGN does not (rightly so, because they're amateurish and mostly suck) have a lot of weight in Metacritic's score might have more than a little to do with the article.

Metacritic is only a bad tool if you use it wrong. Links and quotes for every article averaged are there for a reason! Like everything, it has its good and bad sides. Thanks to Metacritic, universally acclaimed gems with little or nothing in the way of advertising are much more visible and turn out much more profitable; this is particularly good for smaller, or even indie studios. Conversely, companies can get away with releasing crap with good publicity much less now. Divisive games are perfectly recognizable as such by the lack of "middle" reviews.


Developers are ruining themselves. If a company decided to fire its employees whenever GameSpot gave a game a bad review, did GameSpot ruin them?

This is another of the reasons trotted out to support the "Metacritic is evil" line: that developers are doing bad things because of Metacritic scores. I sincerely question whether the problem is with Metacritic itself. Why does nobody criticise these insane practices in their own right?

That being said, I agree that the industry's reliance on 90+ scores is ridiculous and hurting the business as a whole, by promoting cookie-cutter releases full of bullet-point feature-sets.

How is that a problem with Metacritic and not the game review industry at large? Which it indeed is; I'm sick of seeing Call of Duties and such consistently hit the 90s while incredibly original gems get the short end of the 70s or worse. But again, this is a (huge) problem with the review industry, not Metacritic itself.

It just seems to me that Metacritic is now the trendy, acceptable target that all the cool kids get a swing at, and then we go home and pretend everything else is fine. No, Metacritic is a perfect indicator and reflection of what the game and review industry actually is. Don't smash the mirror because you don't like what's reflected in it; the problem is much larger and more widespread.

Also, many people seem to think that because many popular sites' scores are bloated, Metacritic's are too, but remember they also apply a normalizing curve. This means that any game above 60-70 is usually rather good and worthy of attention, assuming you like the genre.
 
I'm sure there are games where this is not the case, but I think there are a lot of reasons a publisher might rationally want to rate (and reward) a developer based on Metacritic score rather than something like sales. Where a Metacritic score will inevitably be tilted by the reputation and generated expectations of the developer, sales data captures to a much larger degree the marketing campaign as well as the built-in brand loyalty to a particular IP. So with New Vegas, the Fallout IP and ad campaign were probably strong enough that any decent game was going to sell a few million; the same goes for any publisher doing a deal with a developer to make a licensed game, or a game based on an existing property not owned by the developer. An internal review system controlled by the publisher would merely amount to a gratuity and would be a farce, so Metacritic is the obvious choice: it's a pretty good proxy for quality, if imperfect, and it's easy to understand and negotiate around without getting caught up in complex sales-forecast models.
 