
Valve battles review-bombers by introducing review histograms

Gestault

Member
This is also super useful in cases where a patch either improves or introduces major issues into a game after release.
 

Aselith

Member
That's a very solid answer to review bombing. Glad they came up with a reasonable solution

How does this help? It only visually organizes when it starts getting review bombed and does nothing to discourage or prevent it.

It helps by showing you visually that a review bomb is taking place. You can't solve it but you can come up with good mitigation techniques like this.
 

Nzyme32

Member
Is there a minimum playtime for reviews?

Should be at a minimum of 4 hours so you can buy and refund

And then what do you do with games shorter than 4 hours, or shorter than 1 hour, or flexible enough to be beaten in less or more time, or games that get updated and end up shorter or longer than 4 hours, or games with no definable end state?

So many questions
 

Wulfram

Member

Ask 500 random people who own the game and have played it "Do you recommend this game?". Use that to give it the headline percentage score.

So long as user reviews are self selecting, they'll be inherently flawed and vulnerable to brigading and campaigns. But Steam has the ability to take a leaf out of proper polling's book and get a much better figure.
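Back-of-the-envelope, and purely my own illustration rather than anything Steam actually does: the standard margin-of-error formula shows that a few hundred randomly chosen owners would already pin the headline score down to within a few points.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a yes/no poll of n randomly chosen owners.
    p=0.5 is the worst case; z=1.96 is the 95% confidence multiplier."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (100, 500, 2000):
    print(f"{n:>5} owners polled -> +/- {margin_of_error(n):.1%}")
#   100 owners polled -> +/- 9.8%
#   500 owners polled -> +/- 4.4%
#  2000 owners polled -> +/- 2.2%
```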
 
Is there a minimum playtime for reviews?

Should be at a minimum of 4 hours so you can buy and refund

It's ten minutes. If you require too much, people having technical issues will have a difficult time getting the word out; they'd have to crash to desktop 500 times to be "eligible". If you require too little, you're dealing with people who didn't play the game. I don't know if there's a perfect balance to strike, but just requiring ownership puts it leagues ahead of most platforms.

I don't even think reviews should be removed in the event of a refund. Why should someone who crashes to desktop or has the game overheat their GPU to the point where the PC turns off (my experience with Friday the 13th) have to keep the game for negative feedback to be taken into account?
 

Nzyme32

Member
This is also super useful in cases where a patch either improves or introduces major issues into a game after release.

Yeah, my number one want here is that they add more data points for people to filter on or look at individually - every update, so you can see how it affected perception, plus bug fixes, DLC launches, and recorded periods of server stability issues.

It would also be good to somehow incorporate a timeline of more social issues, like the obvious case with Firewatch where a developer has stated something and that has had a dramatic effect - but of course this is much more dubious to implement, and it isn't particularly hard to google what news surrounded a game once you know the timeframe where something odd happened.
 

brad-t

Member
Did you actually read the blog post, or just drive-by shitpost? They explain their reasoning for not making changes to the review system itself, and how review ratings tend to trend over time for games that did suffer from review bombing.

I understand why they did it this way, and a histogram for reviews is a great thing to have regardless of this issue, but it's impossible to ignore that this move exists in a broader context of Valve doing everything in their power to avoid moderation or curation. I don't think it's unreasonable to expect that reviews unrelated to a game's quality be removed, and Valve is certainly capable of creating a system that flags suspicious reviews and then moderates them. The circumstances these reviews are left under are not exactly inconspicuous - they typically occur in clusters over a given time period, are usually superlatively positive or negative (usually the latter), and often come from users who've barely engaged with the game.

Valve's solution, instead of providing context for the review spike, requires the user to investigate why the spike occurred during a given period to even understand it. But I get that some might view any type of moderation or automated flagging as "arbitrary censorship." If the idea of a completely open platform for reviews with no real moderation is important to you, then no, I don't have any better suggestions. I don't think the alternatives Valve offered in their blog post are very good ideas either.
 
Worth noting: Valve did consider locking down user reviews when they detected review bombing beginning, but they rejected the idea since they didn't want to "stop the community having a discussion about the issue they're unhappy about, even though there are probably better places to have that conversation than in Steam User Reviews".
 

Nzyme32

Member
Ask 500 random people who own the game and have played it "Do you recommend this game?". Use that to give it the headline percentage score.

So long as user reviews are self selecting, they'll be inherently flawed and vulnerable to brigading and campaigns. But Steam has the ability to take a leaf out of proper polling's book and get a much better figure.

This doesn't help at all. What you are looking for is a single defining score with no context. The reason polling works is that the sample size can be large enough and you are getting a definitive piece of data - and that includes everyone who brigades and supports a set of options because of that brigading; their "votes" are still valid. You do not remove the bias of the result this way, just as with the final result of any political vote - you haven't removed anything that affected the vote.

The histogram showing the reviews over time is objective: it covers the whole data set and the timeframes the reviews fall into. Ideally, what you want built on top of that is a filter that takes out the obvious deviations relative to prior periods, assuming you are certain those prior periods are free of review bombing and that the deviations are not down to an update or other legitimate reasons. A function built around the standard deviation for time points we know are valid could provide this and effectively filter out anomalous data from brigading (whether positive or negative - if there is enough data to support it).
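To make that concrete, here's a rough sketch of the kind of filter I mean (the function, thresholds and example numbers are all my own invention, not anything Valve has described):

```python
from statistics import mean, stdev

def flag_anomalous_weeks(weekly_positive_ratio, baseline_weeks, z_threshold=3.0):
    """Flag weeks whose positive-review ratio deviates sharply from a
    trusted baseline window (assumed to be free of review bombing)."""
    baseline = [weekly_positive_ratio[i] for i in baseline_weeks]
    mu, sigma = mean(baseline), stdev(baseline)
    flagged = []
    for week, ratio in enumerate(weekly_positive_ratio):
        # z-score relative to the trusted baseline; a large |z| suggests brigading
        z = (ratio - mu) / sigma if sigma else 0.0
        if abs(z) > z_threshold:
            flagged.append((week, ratio, round(z, 1)))
    return flagged

# Made-up data: steady ~80% positive, then a sudden crater in the last week
ratios = [0.81, 0.79, 0.82, 0.80, 0.78, 0.83, 0.80, 0.79, 0.81, 0.35]
print(flag_anomalous_weeks(ratios, baseline_weeks=range(8)))
# -> [(9, 0.35, -27.1)]  only the final week stands out
```

Of course the hard part is deciding which periods count as a trusted baseline in the first place, which is exactly the judgement call Valve is leaving to the reader.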

Worth noting: Valve did consider locking down user reviews when they detected review bombing beginning, but they rejected the idea since they didn't want to "stop the community having a discussion about the issue they're unhappy about, even though there are probably better places to have that conversation than in Steam User Reviews".

And this is the thing I'm happy about - I want to make my own decisions from data with context, rather than attempting to block what might not be right to block, or forcing definitive scores with no context.
 
The uptime sine wave has been very useful over the years and I think visualising the review trend will be similarly effective.

It'll also be interesting to see net positive or negative reactions to specific game or content updates (Denuvo removal, large free updates etc).
 

Durante

Member
What I read here:
Nice to see Valve considering this issue deeply and on all levels, and providing a solution that empowers people to inform themselves better, rather than some half-baked "quick fix".

And then I read the GAF thread:
Like always, Valve came with half assed solution.
Come on, what the fuck?
Did you read that blog post? What exactly would be a not "half assed" solution in your opinion, which also doesn't have any of the other drawbacks of all the solutions explored and ultimately rejected by Valve?

Valve's solution, instead of providing context for the review spike, requires the user to investigate why the spike occurred during a given period to even understand it.
Yes, Valve tailors their solutions toward making sure that people who want to inform themselves (or want to buy a particular game, or in general, want to make their own decisions) have powerful tools to do so.

That's what makes their platform different. I would argue it is what makes their platform great.
If you want a platform where decisions are made for you instead, there are plenty out there.
 
Worth noting: Valve did consider locking down user reviews when they detected review bombing beginning, but they rejected the idea since they didn't want to "stop the community having a discussion about the issue they're unhappy about, even though there are probably better places to have that conversation than in Steam User Reviews".

Ideally it isn't, but realistically, what's the alternative? It's a common sentiment among Steam Discussions users that publishers/developers don't read them, so we should shut up and send emails to the publisher (assuming that contact information is even accessible to the public). Using GTA5 as an example, I highly doubt that posting on the Social Club or sending complaints through a feedback form on their website would've resulted in them backing off from targeting modders.

If they're not reading the discussion board on the same site where millions of people buy their games, I highly doubt they're reading their own forums, assuming they even have them.
 

Riposte

Member
I remember hearing about this long before the recent case. Review bombing is a fairly common practice, pretty much anywhere it is possible, so I'm not surprised they are adding this.
 
I like it. It gives the user more information without restricting the community's ability to voice its opinion. If you see an influx of negative reviews on an otherwise well-received game, you can see what the fuss was about and easily ignore them if they don't have anything to do with the game itself, or be informed of any publisher and developer shenanigans.
 
A step in the right direction. It does put more onus on the customer to research why the review histogram is the way it is, so it's unclear to me how much benefit this will have to the system; the worst-case scenario is that people start gaming the histogram somehow, but probably more likely is that most people won't click on the histogram at all and not much will have changed.

A second potential side effect is what happens if people read the histograms out of context. For example, Valve notes that service-style games tend to show review scores decreasing over time simply because late purchasers tend to be less engaged and less satisfied than early ones. Someone reading that histogram without that note from Valve might think a game is getting worse over time when really it isn't.

But by and large, I think this is a good addition. I don't want to pooh-pooh it very much because more information will generally be useful, especially if a lot of people get in the habit of seeking it out. I'm not sure it solves the Firewatch/PDP problem but it does give added context to what's going on, just like the Eurogamer article that showed SteamSpy's view of the same information.
 

Papercuts

fired zero bullets in the orphanage.
This type of thing is necessary when you look at how games can progress and change over time, specifically MP titles. A game not receiving multiple patches is a rarity now and sometimes patches outright fuck a game up, so seeing recent negativity lining up with patch dates is nice.

I don't feel this really does much about review bombing though.
 

Gurnlei

Member
Not getting the half-assed complaint(s). Looks like it was introduced in a clean, thorough way.

The ๖ۜBronx said:
With regard to Firewatch won't this make it seem like they just introduced a bad patch or something?

To the lay user that is.

You can click a column in the chart to see helpful reviews from that period.
 

Durante

Member
The ๖ۜBronx said:
With regard to Firewatch won't this make it seem like they just introduced a bad patch or something?

To the lay user that is.
It will make it obvious that people were generally pretty happy with the game, and then at some point a huge review-bomb happened. Then everyone interested in the game can click on that and find out what it was about -- and if they aren't racists and/or die-hard youtuber fans they can mentally shrug and buy the game.
 
I'd love to see Valve allow some sort of crowdsourced mechanism that links the histogram to patch notes. I wouldn't trust the publisher to maintain it themselves but the community could. It'd be great to see notable events in a game's lifespan, especially in the context of multiplayer games that will live or die based on the number of active players and the amount of content being pumped out.
 

gelf

Member
Certainly a welcome change. On the vast majority of occasions the reason for a review bomb is completely irrelevant to me, so being able to skip to a non-bombed time period for more useful impressions is very handy.
 

Mesoian

Member
can you leave a review without buying the game?

No, but you can make a new account, buy the game, write the review, return the game, and let the account linger forever.

Quite a few "there is 1 product in this user's library" reviews up on the Firewatch stuff.

The histogram thing is nice, but it still misses. It would be nice if it worked a bit more on a time bell curve for weird spikes. Right now they're filtering the 11th to the 16th, but this all started a few days prior to that.
 

Twookie

Member
good on valve, would've appreciated it sooner but I'm happy it's here now

it's a shame that so many don't actually read the blog post before commenting
 

Arulan

Member
It's not surprising to see GAF's (statistically) poor reaction to any news concerning Valve. Valve is one of the most positive influences on the industry, and their insistence on keeping the PC an open platform and removing themselves from the position of overlords is unheard of for a company in their position. Yet it seems there is always much concern about why Valve isn't following the backwards, closed-platform, my-say-is-absolute practices of other platform holders (consoles).

As for this update, I think revealing more information like this is a good thing for the end user. I thought the earlier recent/overall review score split was smart as well. I'm sure there will continue to be ongoing changes to the system, however.
 

Ikuu

Had his dog run over by Blizzard's CEO
good on valve, would've appreciated it sooner but I'm happy it's here now

it's a shame that so many don't actually read the blog post before commenting

People on here barely read the OP, let alone a linked article.
 

Wulfram

Member
This doesn't help at all. What you are looking for is a single defining score with no context. The reason polling works is that the sample size can be large enough and you are getting a definitive piece of data - and that includes everyone who brigades and supports a set of options because of that brigading; their "votes" are still valid. You do not remove the bias of the result this way, just as with the final result of any political vote - you haven't removed anything that affected the vote.

It's not the sample size that matters, it's the quality of the sample - how representative of the whole it is. A self-selecting sample is useless; a random sample polled via Steam could be pretty much perfect.

I'm not trying to remove any opinions, including those who brigade; I'm trying to give voice to the silent majority who don't review their Steam games.

The histogram showing the reviews over time is objective: it covers the whole data set and the timeframes the reviews fall into. Ideally, what you want built on top of that is a filter that takes out the obvious deviations relative to prior periods, assuming you are certain those prior periods are free of review bombing and that the deviations are not down to an update or other legitimate reasons. A function built around the standard deviation for time points we know are valid could provide this and effectively filter out anomalous data from brigading (whether positive or negative - if there is enough data to support it).

A histogram of garbage data is still garbage; it just lets you choose which period's nonsense you wish to believe.
 

Nzyme32

Member
I'd love to see Valve allow some sort of crowdsourced mechanism that links the histogram to patch notes. I wouldn't trust the publisher to maintain it themselves but the community could. It'd be great to see notable events in a game's lifespan, especially in the context of multiplayer games that will live or die based on the number of active players and the amount of content being pumped out.

Yeah! Very much in favour of this sort of thing. Thankfully, all this data is going to be available, so any other site could provide this or we could end up with browser extensions that add this data from other sources.
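To show what I mean, here's a rough mock-up of a community-maintained overlay (the dates, review counts and event labels below are entirely made up - this isn't any real Steam or third-party API):

```python
from datetime import date, timedelta

# Hypothetical community-submitted annotations for one game's timeline
events = {
    date(2017, 4, 26): "Free content update released",
    date(2017, 9, 11): "Off-topic controversy blows up on social media",
}

# Hypothetical weekly histogram buckets: (week start, positive reviews, negative reviews)
histogram = [
    (date(2017, 9, 4), 120, 30),
    (date(2017, 9, 11), 95, 840),
    (date(2017, 9, 18), 110, 410),
]

def annotate(histogram, events):
    """Attach any community events that fall inside each weekly bucket."""
    rows = []
    for start, pos, neg in histogram:
        end = start + timedelta(days=7)
        labels = [label for day, label in events.items() if start <= day < end]
        rows.append((start.isoformat(), pos, neg, "; ".join(labels)))
    return rows

for row in annotate(histogram, events):
    print(row)
# ('2017-09-04', 120, 30, '')
# ('2017-09-11', 95, 840, 'Off-topic controversy blows up on social media')
# ('2017-09-18', 110, 410, '')
```

A browser extension or third-party site could render exactly that kind of table right next to the official histogram.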
 
It's not the sample size that matters, it's the quality of the sample - how representative of the whole it is. A self-selecting sample is useless; a random sample polled via Steam could be pretty much perfect.

I'm not trying to remove any opinions, including those who brigade; I'm trying to give voice to the silent majority who don't review their Steam games.

Trying to scientifically poll the quality of any game on Steam is almost certainly not worth the amount of effort it would take.
 
If a game is being bombed, it seems to put up a big warning and automatically show the histogram.

[image attachment: EcexQJo.png]
 

Van Bur3n

Member
Unsurprisingly, the people upset about this are the least informed. As with most things in life. Hell, I wonder how many of these people actively use Steam, considering we're getting questions in here that anyone who uses Steam should know the answer to by now.

But the histogram should prove useful in showing major shifts in a game's overall reception and why that is - an easier way to identify whether review bombing is taking place and why (a bad patch, or reasons outside of the game). It can definitely be helpful for someone just trying to figure out whether a game is worth playing or not. The blog post explains things much better and even discusses the thought process they put into finding solutions.

It's not that hard to understand, guys.
 

Nzyme32

Member
It's not the sample size that matters, it's the quality of the sample - how representative of the whole it is. A self-selecting sample is useless; a random sample polled via Steam could be pretty much perfect.

I'm not trying to remove any opinions, including those who brigade; I'm trying to give voice to the silent majority who don't review their Steam games.

Which is once again useless - it's like saying you're going to make voting mandatory, so the "silent" groups that did not want to vote are now asked to vote. The result is largely that they either decline / opt out, or, when forced, provide garbage data in protest or because they didn't care ("the polls are always wrong" when the polling is done poorly). This changes nothing about whether the data is garbage or whether review bombing still gets collected (i.e. people with a position of hate against a game for reasons unrelated to the game itself, e.g. politics not represented in the game).

A histogram of garbage data is still garbage, it just lets you choose which periods nonsense you wish to believe.

And how is your system not also "garbage data" by the same logic!? You have a data set full of both valid and invalid data; the histogram gives you a far greater understanding of this, but it certainly still demands that you pay attention to the context and timing of the reviews and then actually READ the reviews submitted, to determine for yourself what is "garbage" and what isn't.

This is far, far superior to an arbitrary score or a random sampling with no context or timeframes to follow.
 

Morrigan Stark

Arrogant Smirk
Hmm, better than nothing, I guess.

You can click a column in the chart to see helpful reviews from that period.
The review bombers also bombed upvotes and downvotes, upvoting all the negative reviews and even downvoting positive ones. Including one review (it was posted on GAF in an earlier thread, I don't have the link handy atm) of someone who felt a deep personal connection to the game due to her loved one dying of dementia. Because people are that fucking petty.
 