
Rainbow Six Siege Technical Analysis And Frame Rate Test - Digital Foundry

Oh I am not mixing them up, no confusion about that. The GI in the game is not real time, just baked... like basically every game out there on console. Almost any game that uses cube maps would have that kind of "GI" (even though this game also has obvious baked light maps that you can see in some door entrances). Cube maps are not what I would really call GI, because they are not global and aren't even necessarily local. Something like what Alien: Isolation uses is GI (and is also real time).
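Since the baked-versus-real-time distinction keeps coming up, here's a toy sketch of the difference (illustrative only; this is not how Siege or any shipping engine actually computes lighting, and all numbers are made up):

```python
# Toy contrast of baked vs real-time GI. The "scene" is four surface
# patches; indirect light is a crude single bounce (average of the
# direct light, scattered by a surface albedo).

def one_bounce(direct, albedo=0.5):
    """Single-bounce indirect light: every patch receives the scene's
    average direct light, scattered by the surface albedo."""
    avg = sum(direct) / len(direct)
    return [albedo * avg for _ in direct]

# Bake step: run once, offline, against the lighting at bake time.
direct_at_bake = [1.0, 0.8, 0.0, 0.0]   # sun hits two of the four patches
lightmap = one_bounce(direct_at_bake)   # frozen into the shipped assets

# At runtime the light has moved, but the baked bounce hasn't:
direct_now = [0.0, 0.0, 0.0, 1.0]
baked_total = [d + i for d, i in zip(direct_now, lightmap)]

# A real-time GI system redoes the bounce against the current lighting,
# so the indirect term tracks the moved light:
realtime_total = [d + i for d, i in zip(direct_now, one_bounce(direct_now))]
```

The baked version keeps the stale indirect term after the light moves; the real-time version tracks it. That responsiveness, not the mere presence of bounce light, is what people usually mean by real-time GI.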

I think we should set up a thread about terminology usage at some point... just because.


Titan X. Sorry I should have typed that originally.

Ah, that's interesting, thanks for the info. So I take it now the info in the video above is inaccurate or just doesn't tell the full story? I mean, can GI be baked as well or does it always have to be real time to qualify as GI? I'm a tad confused.
 
Digital Foundry article is up (http://www.eurogamer.net/articles/digitalfoundry-2015-hands-on-with-rainbow-six-siege), will update the OP:

"In terms of the console betas, the PS4 game arrives with a native 1080p resolution, while Xbox One is pared back to 900p. As such, PS4 gains a clearer image due to the lack of upscaling, though the Microsoft platforms appears pretty similar in lower contrast scenes, due to the effectiveness of the anti-aliasing solution. Edge-smoothing is provided by a post-process algorithm in addition to a temporal sampling component that results in fairly clean quality, but this this does come with a few trade-offs: some blurring is present in motion, while the game takes on a slightly soft look in still scenes. On PC, we opt for 1080p resolution in combination with TAA and temporal filtering for anti-aliasing duties - the game's maximum preset for image quality. This delivers clean looking visuals similar to the PS4 game, but with less softness in still scenes, though temporal blurring is still noticeable in motion

The use of trilinear filtering leads to blurrier ground textures on PS4 (Xbox One uses something akin to 4x anisotropic filtering), while LOD streaming varies on both systems with neither gaining a permanent advantage. Some scenes feature shadows streaming in more slowly on PS4, while normal map and texture details are resolved to a higher degree. PC owners get higher resolution foliage and shadows, further draw distances, and improved texture filtering via the use of 16x AF, but otherwise the core assets and effects work - such as smoke and particles - remain the same as on consoles.

One major benefit to this is that both standard multiplayer and Terrorist Hunt modes operate at 60fps, whereas the latter is capped to 30fps on the PS4 and Xbox One. This lends the PC version a greater level of consistency across the different modes, and sees fast, low latency controls preserved across the whole Rainbow Six experience.

And yes, you read that correctly - Rainbow Six: Siege targets two different performance profiles on console, with the standard multiplayer mode operating at 60fps, while the Terrorist Hunt game is capped at 30fps. Both modes share the same maps, and graphical quality is also identical with no changes to levels of detail and the effects-work used - the baseline visuals are still budgeted around hitting 60fps, and yet, there's a big performance downgrade here. According to Ubisoft, Terrorist Hunt operates at a lower frame-rate due to this mode featuring advanced AI - far from acting like mindless drones, on higher difficulty settings, your opponents take time to put up barricades, place barbed wire around possible entry points, and set up charges to destroy the environment to their advantage, in addition to shooting down players from behind boarded up windows or wooden doors.

For the most part, Terrorist Hunt manages to run solidly at 30fps across both consoles during gameplay, with performance mostly impacted during the kill cam scenes where control is taken away from the player. At one point we encounter a substantial drop down to 18fps in the Xbox One game, though moments like these tend to be rare and not representative of the usual experience. That said, after playing the main multiplayer mode at 60fps, the drop down to 30fps is readily felt: the reduction in controller response and smoothness is substantial and the experience feels far less fluid and enjoyable to play as a result. After a few matches, it's possible to adjust to the change in motion and the appearance of heavier controls, though the shooting never feels quite as satisfying compared to the regular multiplayer games at double the frame-rate.

In comparison, the standard multiplayer PvP mode offers up more refined shooting action: it's here where twitch-based action gels nicely with the game's focus on tactical combat. Fire-fights and explosive encounters see the game lose its initially solid 60fps update, with regular excursions between 50-60fps in these moments, and lows hitting around the mid-40s when the engine is more heavily stressed. The use of adaptive v-sync leads to some tearing when frame-rates are impacted, but this often helps to keep controller response at a consistent level despite brief periods where judder is visible on screen. Ubisoft's implementation is mostly successful in keeping controller feedback consistent, although there are moments where dramatic variances in frame-times have a short but tangible effect on precision during shootouts"
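The adaptive v-sync behaviour the article describes can be sketched roughly like this (a simplified model of the general technique, not Ubisoft's actual code):

```python
import math

# Simplified adaptive v-sync model at a 60Hz refresh. A frame that
# finishes inside the ~16.7ms window is held for the sync (no tearing);
# a late frame is presented immediately, which tears but avoids
# snapping to the next 33.3ms boundary (the judder of full v-sync).

REFRESH_MS = 1000.0 / 60.0  # ~16.67 ms between refreshes

def present_adaptive(frame_ms):
    """Return (displayed_interval_ms, tears) for one frame."""
    if frame_ms <= REFRESH_MS:
        return REFRESH_MS, False       # made the window: wait for sync
    return frame_ms, True              # missed it: show now, tear

def present_full_vsync(frame_ms):
    """Full v-sync always waits for the next refresh boundary."""
    intervals = math.ceil(frame_ms / REFRESH_MS)
    return intervals * REFRESH_MS, False

# An 18ms frame: adaptive shows it at 18ms with a tear; full v-sync
# holds it to 33.3ms, which is where the judder would come from.
```

This is why the tearing "helps to keep controller response at a consistent level": the late frame reaches the screen sooner than it would under full v-sync.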
 
What I experienced was not a solid 30 either in Terrorist Hunt. I really hope they can make this solid.
I'd be okay with the mediocre graphics. The AI is certainly amazing.

Terrorist Hunt is by far the most exciting mode of this game imo.
It's amazing fun, especially on Realistic.
 
What I experienced was not a solid 30 either in Terrorist Hunt. I really hope they can make this solid.
I'd be okay with the mediocre graphics. The AI is certainly amazing.

Terrorist Hunt is by far the most exciting mode of this game imo.
It's amazing fun, especially on Realistic.

Wasn't this posted already, by the way?

This thread was posted 2 days ago (check the OP) with just the video of the frame rate test and technical analysis, and I said I'd update the OP once the more detailed article was up (so that it's all in one place). I'll ask a mod to lock this thread if there's another one for the article.
 
My comment from the other thread:

Terrorist Hunt at 30 is not good; looking at the different modes one after another was really jarring.

Nice to see that overall the framerate is pretty good in the main mode, only dropping from 60 during explosions and the like. Normal physical destruction (aka chipping away walls and the like) doesn't seem to affect the framerate on either console.
 
This thread was posted 2 days ago with just the video, and I said I'd update the OP once the article was up.
Yeah I copy-pasted that from the other thread, lol. I knew there was already a thread. This one. I copied my reaction from the other thread, including the last line that wasn't supposed to be in this thread.
 
Yeah I copy-pasted that from the other thread, lol. I knew there was already a thread. This one. I copied my reaction from the other thread.

Ohhh haha, I get ya! :P

My comment from the other thread:

Yeah, the 30fps is very jarring, and because it's online only, you'll continually be playing those modes. I know the AI is advanced and all, but I hope they can at least leave the frame rate unlocked: since it seems to be locked at 30fps, the actual frame rate must be quite a bit higher, as is the case with locked games (e.g. games like MGSV which are locked at 60fps must be running 70+fps most of the time to actually maintain that).
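The point about locked caps implying headroom can be shown with toy numbers (illustrative arithmetic only, not measurements from Siege or MGSV):

```python
# A 30fps cap gives each frame a ~33.3ms budget; a limiter idles away
# the leftover time. The lock only looks solid if typical frames
# finish well inside the budget, i.e. the uncapped rate is higher.

CAP_MS = 1000.0 / 30.0   # 33.33 ms per frame at 30fps

def capped_frame(work_ms):
    """Return (displayed_ms, idle_ms) for one frame under the cap."""
    if work_ms <= CAP_MS:
        return CAP_MS, CAP_MS - work_ms   # early: sleep out the budget
    return work_ms, 0.0                   # late: visible frame drop

# An engine doing 22ms of work per frame (~45fps uncapped) still
# presents an even 30fps, with ~11ms of slack absorbing spikes.
work = 22.0
shown, idle = capped_frame(work)
uncapped_fps = 1000.0 / work
```

So a rock-solid 30fps lock usually means the engine could run meaningfully faster uncapped, which is what makes a locked mode frustrating next to a 60fps one.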
 
Here are some images from the article in case anyone is wondering how each platform holds up:

https://i.gyazo.com/965aee95b749de92e9608940986420cc.png

https://i.gyazo.com/e99dfc4a5cd8251f842a439b14509359.png

https://i.gyazo.com/aa3b34a2ae157ce4dcd3a6ce3da4b833.png

https://i.gyazo.com/8c3ffed33f27b9e619f50d32c9113bdd.png

https://i.gyazo.com/acb0eb80d6f46c7acfdbbf3d1c703058.png

https://i.gyazo.com/93e74d71832c39e85ff2aa0ee6dea84b.png
 
Here are some images from the article in case anyone is wondering how each platform holds up:

https://i.gyazo.com/965aee95b749de92e9608940986420cc.png

https://i.gyazo.com/e99dfc4a5cd8251f842a439b14509359.png

https://i.gyazo.com/aa3b34a2ae157ce4dcd3a6ce3da4b833.png

https://i.gyazo.com/8c3ffed33f27b9e619f50d32c9113bdd.png

https://i.gyazo.com/acb0eb80d6f46c7acfdbbf3d1c703058.png

https://i.gyazo.com/93e74d71832c39e85ff2aa0ee6dea84b.png
Wow, that is not a game where you want black crush...
Can they fix that for Xbox One owners?
 
Wow, that is not a game where you want black crush...
Can they fix that for Xbox One owners?

Are you referring to this image or which one?

https://i.gyazo.com/93e74d71832c39e85ff2aa0ee6dea84b.png


Because I do remember there being an issue with crushed blacks on Xbox One games in the past (particularly during the launch period), but I thought they got fixed long ago. I guess not!
 
Are you referring to this image or which one?

https://i.gyazo.com/93e74d71832c39e85ff2aa0ee6dea84b.png

Because I do remember there being an issue with crushed blacks on Xbox One games in the past (particularly during the launch period), but I thought they got fixed long ago. I guess not!
All of them actually. But you can see it best with the jeep. In the one where you can only see part of the jeep it's almost a silhouette. No detail.
https://i.gyazo.com/8c3ffed33f27b9e619f50d32c9113bdd.png
I thought that was fixed too. But maybe it's up to the devs. Not sure.
 
All of them actually. But you can see it best with the jeep. In the one where you can only see part of the jeep it's almost a silhouette. No detail.
https://i.gyazo.com/8c3ffed33f27b9e619f50d32c9113bdd.png

I thought that was fixed too. But maybe it's up to the devs. Not sure.

Hmmm, yeah I see what you mean now. That IS apparent. IIRC, I once read that the crushed blacks issue was to do with the automatic Xbox One upscaler (when it's running at sub-1080p resolutions, I think) rather than the developer consciously putting it in.

Maybe someone can clarify that.
 
Ah, that's interesting, thanks for the info. So I take it now the info in the video above is inaccurate or just doesn't tell the full story? I mean, can GI be baked as well or does it always have to be real time to qualify as GI? I'm a tad confused.
No, it's accurate. GI has been baked in pretty much most games; real-time GI like The Tomorrow Children's or the SVOGI used in Unreal Engine 4 just handles this better, with more than one bounce. Driveclub also does this using screen space, and this combination of effects will continue. Unity is completely baked and that has GI; real-time GI is another thing.
 
Not sure what they were thinking aiming for 30fps in Terrorist Hunt. It's pretty jarring jumping between MP and Terrorist Hunt on the PS4, often on the same map.

Pretty much ensures I won't be picking up the game, despite me enjoying the mode quite a bit with my buddy.
 
I'm amazed that this is still an issue.

The funny thing is that Sony's official statement is that it shouldn't be an issue and that they maybe have to make the SDK's documentation clearer, implying that developers just don't really know how to appropriately set up texture filtering. Which is hard to believe when it's about developers like Ubisoft.
 
Damn, yet another game with missing AF on PS4. Seriously, what the hell is going on? I thought Sony was helping devs implement AF more easily on the PS4. Did some devs not receive this help or are they just lazy? Also, I can understand if Ubisoft capped the frame rate at 30fps for Terrorist Hunt because the AI is bottlenecking the CPU, but wouldn't it be possible to ramp up the visuals a bit? I mean, if that mode is CPU bottlenecked, turning up a few settings shouldn't impact performance, right? Someone please correct me if I'm wrong. I hope these issues get resolved before launch.

Despite its technical shortcomings, some aspects of the game did impress me. The SSR, in particular, looks fantastic from what I've seen, especially indoors. The GI also looks great, but seems like it's baked. Nice to hear that this game is using SSBC on consoles too, as long as it's implemented well. SSBC looked much better than plain old SSAO in Far Cry 4.

Oh I am not mixing them up, no confusion about that. The GI in the game is not real time, just baked... like basically every game out there on console. Almost any game that uses cube maps would have that kind of "GI" (even though this game also has obvious baked light maps that you can see in some door entrances). Cube maps are not what I would really call GI, because they are not global and aren't even necessarily local. Something like what Alien: Isolation uses is GI (and is also real time).
Actually, Driveclub uses real time GI.
 
pretty disappointing that ps4 still can't get AF right. I'm not sure wtf is up and why not a single dev has come forward and confirmed it's an issue. lame
 
Have they ever explained it or how it works? Or does anyone have screens and videos showing it?
I don't know if they explained it and I don't have the game, so I can't really show it. I could be mistaken, but I thought it was generally accepted that Driveclub had real time GI?
 
I don't know if they explained it and I don't have the game, so I can't really show it. I could be mistaken, but I thought it was generally accepted that Driveclub had real time GI?

I think it is generally talked about, but I have never seen any examples of it or info on it.
 
The funny thing is that Sony's official statement is that it shouldn't be an issue and that they maybe have to make the SDK's documentation clearer, implying that developers just don't really know how to appropriately set up texture filtering. Which is hard to believe when it's about developers like Ubisoft.
Yeah, that explanation seemed viable enough a few months from release, but we still regularly get games with AF issues.
 
Yeah, that explanation seemed viable enough a few months from release, but we still regularly get games with AF issues.

That's gross.

Jesus, 16x AF has been commonplace on PC for ages....

pretty disappointing that ps4 still can't get AF right. I'm not sure wtf is up and why not a single dev has come forward and confirmed it's an issue. lame

WTF is wrong with AF on PS4. There must be a reason why games have issues with it on Sony's console.

You see, that's what irritates me. It's not like these consoles can't handle high AF or will suffer from severe frame drops if the implementation is a tad higher; I mean ideally, I would like 16x AF (like Dark Souls II: SotFS), although I know that may not be possible for current gen only titles. But the fact of the matter is that history has shown an improvement in texture filtering doesn't have much performance cost at all (see Dying Light and DmC Definitive Edition pre and post patch; they had no performance cost whatsoever when a higher AF method was implemented).

At the very least on current gen titles, I expect 8x AF. I'm generally not sensitive to these issues (hell, I've never *ever* noticed screen tearing in any PS3/4 game I've played -I'm dead serious-), yet poor texture filtering implementation ruins the IQ for many games. Take Destiny as an example of this; despite a nice art direction and aesthetic design, the textures at a distance are really blurry and it's quite off-putting. Many other games do this as well, like the recent CoD games. It's irritating to say the least, as many of the games that suffer from this on PS4 are quite good graphically to begin with. They just need that extra polish.

Sorry for the rant lol, but it's something that has been going on for over a year, and I expect it to at least be sorted out (I know Sony have included something in their SDK to make it more "obvious" to devs or something, but clearly that hasn't worked). I'd expect some QA testers or at least developers to notice "Hey, this image looks a bit blurry at an angle or from a distance, maybe we can improve the texture filtering so our game can look more polished since it has minimal [to no] performance cost".

Perhaps I'm missing the point or something really obvious, for which I'd actually like to know lol. What do you think? Just sheer incompetence or something not seen as important?
 
I think it is generally talked about, but I have never seen any examples of it or info on it.
Yeah, this. But I think the racing face off thread is a great place to get images of the GI, though I don't remember if there were actually pics of it in that thread.

You see, that's what irritates me. It's not like these consoles can't handle high AF or will suffer from severe frame drops if the implementation is a tad higher; I mean ideally, I would like 16x AF (like Dark Souls II: SotFS), although I know that may not be possible for current gen only titles. But the fact of the matter is that history has shown an improvement in texture filtering doesn't have much performance cost at all (see Dying Light and DmC Definitive Edition pre and post patch; they had no performance cost whatsoever when a higher AF method was implemented).

At the very least on current gen titles, I expect 8x AF. I'm generally not sensitive to these issues (hell, I've never *ever* noticed screen tearing in any PS3/4 game I've played -I'm dead serious-), yet poor texture filtering implementation ruins the IQ for many games. Take Destiny as an example of this; despite a nice art direction and aesthetic design, the textures at a distance are really blurry and it's quite off-putting. Many other games do this as well, like the recent CoD games. It's irritating to say the least, as many of the games that suffer from this on PS4 are quite good graphically to begin with. They just need that extra polish.

Sorry for the rant lol, but it's something that has been going on for over a year, and I expect it to at least be sorted out (I know Sony have included something in their SDK to make it more "obvious" to devs or something, but clearly that hasn't worked). I'd expect some QA testers or at least developers to notice "Hey, this image looks a bit blurry at an angle or from a distance, maybe we can improve the texture filtering so our game can look more polished since it has minimal [to no] performance cost".

Perhaps I'm missing the point or something really obvious, for which I'd actually like to know lol. What do you think? Just sheer incompetence or something not seen as important?
I actually think your rant is warranted. It's pretty annoying how some devs seem to struggle with AF on the PS4. Like you said, the hardware is clearly able to support it without much impact on performance. While 4x AF seems to be the most common setting for AF on consoles this gen, it's still much better than trilinear filtering. At the very least, devs should patch in better AF as it seems to have no impact on performance at all, which could mean it's already implemented, but something in the SDK turned it off by default for some reason. Just my theory.
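For what it's worth, the reason AF matters so much on floors and other oblique surfaces can be sketched with the textbook mip-selection rule (an approximation; real GPU hardware is more involved):

```python
import math

# A surface seen at a glancing angle has a stretched footprint in
# texture space: long along one axis, short along the other. Plain
# trilinear picks the mip for the LONG axis, blurring the whole
# surface; AF picks the mip for the short axis and spends up to
# max_aniso extra samples covering the long one.

def mip_level(long_axis, short_axis, max_aniso=1):
    """Approximate mip LOD for a long_axis x short_axis texel
    footprint; max_aniso=1 behaves like plain trilinear."""
    ratio = min(long_axis / short_axis, max_aniso)
    return math.log2(long_axis / ratio)

# A floor texture at a steep angle: footprint 16 texels by 1 texel.
trilinear = mip_level(16, 1)            # mip 4
af4  = mip_level(16, 1, max_aniso=4)    # mip 2: noticeably sharper
af16 = mip_level(16, 1, max_aniso=16)   # mip 0: full-resolution texels
```

Each mip level halves the texture resolution, so trilinear's mip 4 on that floor carries 16x less detail along the view direction than 16x AF's mip 0, which is exactly why distant ground textures turn to mush without AF.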
 
They should have made the necessary sacrifices to keep it 60 fps at all times. Performance is far more important than visuals.
 
They should have made the necessary sacrifices to keep it 60 fps at all times. Performance is far more important than visuals.

Did you read or listen to what they said? It is at 30 due to the AI.

This inevitably means CPU issues.
 
Yeah, this. But I think the racing face off thread is a great place to get images of the GI, though I don't remember if there were actually pics of it in that thread.

I actually think your rant is warranted. It's pretty annoying how some devs seem to struggle with AF on the PS4. Like you said, the hardware is clearly able to support it without much impact on performance. While 4x AF seems to be the most common setting for AF on consoles this gen, it's still much better than trilinear filtering. At the very least, devs should patch in better AF as it seems to have no impact on performance at all, which could mean it's already implemented, but something in the SDK turned it off by default for some reason. Just my theory.

Thanks and yeah, you're completely right. 4x AF at the very minimum perhaps, and extra points for 8x AF and 16x AF (I wonder if there's such a thing as 32x AF lol).

Even in all the PC games I've played (and I've checked this with an FPS counter on Steam), when I increase the texture filtering quality, or go from 4x AF to 16x AF, there is literally no performance cost at all. That's even more proof.

It seems like something small that should go without saying and be implemented automatically... I guess not. And it's even more obvious this gen, as games look better and are more graphically complex, so poor texture filtering is something that stands out; you'd expect it from the PS3 gen or something, if not before!

Like someone said above, it's unacceptable in 2015.
 
Thanks and yeah, you're completely right. 4x AF at the very minimum perhaps, and extra points for 8x AF and 16x AF (I wonder if there's such a thing as 32x AF lol).

Even in all the PC games I've played (and I've checked this with an FPS counter on Steam), when I increase the texture filtering quality, or go from 4x AF to 16x AF, there is literally no performance cost at all. That's even more proof.

It seems like something small that should go without saying and be implemented automatically... I guess not. And it's even more obvious this gen, as games look better and are more graphically complex, so poor texture filtering is something that stands out; you'd expect it from the PS3 gen or something, if not before!

Like someone said above, it's unacceptable in 2015.

Are we 100% sure that the performance is negligible? PS4 and xb1 have different bandwidth considerations than dedicated GPUs.
 
Are we 100% sure that the performance is negligible? PS4 and xb1 have different bandwidth considerations than dedicated GPUs.

No we aren't sure at all. The developers that do implement AF in their games never seem to use 16x either. It is usually 4x or 8x. IIRC Uncharted 4 is using 8x.
 
Are we 100% sure that the performance is negligible? PS4 and xb1 have different bandwidth considerations than dedicated GPUs.

Yup. IIRC, NX Gamer (or it may have been Digital Foundry - I need to check on that), did a pre and post patch analysis of Dying Light and DmC Definitive Edition, and the increase in texture filtering had no performance impact.

Even my PC, which mind you is weaker than both consoles, could run higher AF in modern games with no performance cost. I just don't see why some games can but others can't, which comes back to, IMO, incompetence.

And to be honest, texture filtering has always been regarded as not being GPU heavy; compared to other GPU features anyway.
 
No we aren't sure at all. The developers that do implement AF in their games never seem to use 16x either. It is usually 4x or 8x. IIRC Uncharted 4 is using 8x.

Huh? Dark Souls II: Scholar of the First Sin has 16x AF IIRC on PS4.

Also, I'm not sure what the AF on Bloodborne or The Order: 1886 was either, but both looked really good, particularly the former, which despite not having a good AA solution, had really good texture filtering.

Here, have a read:

http://gamingbolt.com/has-sony-finally-fixed-ps4-anisotropic-filtering-issues
 
Thanks and yeah, you're completely right. 4x AF at the very minimum perhaps, and extra points for 8x AF and 16x AF (I wonder if there's such a thing as 32x AF lol).

Even in all the PC games I've played (and I've checked this with an FPS counter on Steam), when I increase the texture filtering quality, or go from 4x AF to 16x AF, there is literally no performance cost at all. That's even more proof.

It seems like something small that should go without saying and be implemented automatically... I guess not. And it's even more obvious this gen, as games look better and are more graphically complex, so poor texture filtering is something that stands out; you'd expect it from the PS3 gen or something, if not before!

Like someone said above, it's unacceptable in 2015.

To be fair, like Dictator said, it may not be that simple on consoles. Devs could be using the memory bandwidth for effects they think matter more visually. I really wish we could find out whether games this gen running at 4x AF is the result of devs not putting enough effort into making sure textures are crisp, or of sacrifices that have to be made for bandwidth-hungry effects. I think the former is more likely.
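To put a rough ceiling on the bandwidth argument, here's a back-of-the-envelope sketch (every number here is an assumption for illustration, not a measured figure for either console):

```python
# Worst-case extra texel traffic from AF: assume some fraction of a
# 1080p frame's pixels are oblique enough to trigger the maximum
# sample count, and that every extra sample misses the texture cache.

def worst_case_extra_bytes(pixels, oblique_fraction, aniso, bytes_per_texel=4):
    """Upper bound on extra texel bytes per frame from anisotropic
    sampling (real GPUs take far fewer samples and cache well)."""
    extra_samples = pixels * oblique_fraction * (aniso - 1)
    return extra_samples * bytes_per_texel

pixels = 1920 * 1080
extra = worst_case_extra_bytes(pixels, 0.20, 16)   # assume 20% oblique
extra_mb = extra / (1024 * 1024)
# ~24 MB/frame, ~1.4 GB/s at 60fps, even in this pessimistic model --
# a small slice of either console's memory bandwidth, which is why AF
# usually measures as near-free. The caveat is that this traffic
# competes with everything else sharing that bandwidth, CPU included.
```

So "near-free on PC" and "competes for shared bandwidth on console" can both be true at once, which fits the Dying Light / DmC patch observations without ruling out Dictator's point.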
 
Huh? Dark Souls II: Scholar of the First Sin has 16x AF IIRC on PS4.

Also, I'm not sure what the AF on Bloodborne or The Order: 1886 was either, but both looked really good, particularly the former, which despite not having a good AA solution, had really good texture filtering.

Here, have a read:

http://gamingbolt.com/has-sony-finally-fixed-ps4-anisotropic-filtering-issues

1886 had 4x on a lot of surfaces if I remember correctly.

Also, I am not sure if we should compare a decidedly last generation game's AF levels versus a newer game with different bandwidth considerations. Do not forget that although these consoles have nominally high levels of bandwidth... they share bandwidth with the CPU.
Yup. IIRC, NX Gamer (or it may have been Digital Foundry - I need to check on that), did a pre and post patch analysis of Dying Light and DmC Definitive Edition, and the increase in texture filtering had no performance impact.

Well, that would just imply that those games had the overhead for it. Not every game does.
 
1886 had 4x on a lot of surfaces if I remember correctly.

Also, I am not sure if we should compare a decidedly last generation game's AF levels versus a newer game with different bandwidth considerations. Do not forget that although these consoles have nominally high levels of bandwidth... they share bandwidth with the CPU.

What about Dying Light though? And I know what you mean, but surely the principle still stays the same? I mean, current gen only games that have obviously looked better than cross-gen games have had higher AF implementations. So there's obviously more to it than hardware, you get me?
 
1886 had 4x on a lot of surfaces if I remember correctly.

Also, I am not sure if we should compare a decidedly last generation game's AF levels versus a newer game with different bandwidth considerations. Do not forget that although these consoles have nominally high levels of bandwidth... they share bandwidth with the CPU.
Yeah, I don't remember The Order having particularly good AF. See here. Anyway, nothing is ever free. Things might use resources which are not fully utilized and thus have no impact on overall performance -- this often happens with AF on PC -- but that doesn't mean that they are always free.

Still, IMHO (and this has been my opinion for at least 10 years now) given the quality improvement good AF makes, if necessary other things should be sacrificed first before sacrificing AF quality.
 
No we aren't sure at all. The developers that do implement AF in their games never seem to use 16x either. It is usually 4x or 8x. IIRC Uncharted 4 is using 8x.
I honestly think 8x is a really nice amount for consoles and I hope we can see that more as time goes on. The difference between 8x and 16x is relatively minor and it would free up a bit more performance.
 
Huh? Dark Souls II: Scholar of the First Sin has 16x AF IIRC on PS4.

Also, I'm not sure what the AF on Bloodborne or The Order: 1886 was either, but both looked really good, particularly the former, which despite not having a good AA solution, had really good texture filtering.

DS2 is a last gen game. IIRC Bloodborne used 8x and the Order used adaptive 16x (used selectively).
 
Still, IMHO (and this has been my opinion for at least 10 years now) given the quality improvement good AF makes, if necessary other things should be sacrificed first before sacrificing AF quality.
AF is one of those tiny things that does a lot IMO. It seems strange to author really great high resolution textures and materials to basically have them all turn to matte smudge after 2 meters. Really bad priorities.
I honestly think 8x is a really nice amount for consoles and I hope we can see that more as time goes on. The difference between 8x and 16x is relatively minor and it would free up a bit more performance.

Yeah, 8x seems like the minimum it should be. It is high enough to prevent most of the problems you see at usual distances in most games. I think the road texture in every racing game should be 16x, though.
 
I honestly think 8x is a really nice amount for consoles and I hope we can see that more as time goes on. The difference between 8x and 16x is relatively minor and it would free up a bit more performance.

In shorter distances it is minor. Room sized oblique textured surfaces will be fine. Moving outdoors with larger surfaces it becomes fairly obvious.
 
Yeah, 8x seems like the minimum it should be. It is high enough to prevent most of the problems you see at usual distances in most games. I think the road texture in every racing game should be 16x, though.
Agreed about road textures in racing games. I don't play those so they slipped my mind.
 