
Digital Foundry: Tech Analysis: Rise of the Tomb Raider

Nothing smooth about drops to 52 or 53.
Drops are always annoying.
Even from 90 to 80.



When's the PC version coming?
Edit: Q1 it seems.
Really, I'll take falls from 90-80 or 60-53/55 any day of the week, for every game that would be 30fps with dips otherwise. Dips on the low end of 30fps are infinitely worse than dips at a much higher framerate....
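To put rough numbers on that point: a dip costs you the increase in frame time, and that increase grows sharply at low framerates. A quick back-of-envelope sketch (the specific framerates are just illustrative examples):

```python
# Frame time in milliseconds for a given framerate.
def frame_time_ms(fps):
    return 1000.0 / fps

# Extra milliseconds added to each frame when dipping from base_fps to dip_fps.
def dip_cost_ms(base_fps, dip_fps):
    return frame_time_ms(dip_fps) - frame_time_ms(base_fps)

# A 60 -> 53 dip adds only ~2.2 ms per frame...
print(round(dip_cost_ms(60, 53), 1))  # 2.2
# ...while a 30 -> 25 dip adds ~6.7 ms per frame, roughly three times worse.
print(round(dip_cost_ms(30, 25), 1))  # 6.7
```

Which is why a fall from 60 to the low 50s is far less jarring than even a small fall below 30.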

Don't look good? I disagree; they look pretty good to me.
That's PBR and material work more than a character model. Show me some pics of Luke Skywalker and say with a straight face it's impressive. Character models are impressive based on their facial detail, skin shaders, SSS, wrinkles, hair, deformities/imperfections and of course animations. Luke Skywalker loses in all these aforementioned aspects.......The animation in Battlefront is particularly unimpressive, not to mention its lack of detail.
 
I don't find that particular character model impressive, cutscene or not relative to other games or rather character models in other games.


[screenshots: framerate counter captures]


As I said, anytime there was heavy alpha and effects on screen and/or many soldiers/big battles, the framerate dropped massively. Now, this scene is quite early in the game; there are definitely more taxing scenes as you progress and of course much later in the game, so picture that as you want.

Scenes where you engaged in formation also saw massive framedrops, and even smaller battles saw their fair share of being constantly under 30fps. Also, Until Dawn looks better than Ryse, and it's not an action game like Ryse is; it's also 1080p and technically superior. Just a hint: look at some of the pools of water whilst Marius is fighting in Ryse.....on the XB1.
Have you actually played the damn game all the way through? That is not the norm. The same issues exist in Until Dawn (which I agree looks better overall) where it drops into the basement frame-rate wise. UD is also a damn good game...Ryse...not so much.

Can you please just explain why you dislike Xbox One so much? I honestly have never seen such irrational hate aimed at a machine. It's a weak piece of hardware but I enjoy seeing underdog hardware in action. That's one of the things that made ps2 so damn interesting. PS3 as well.

In a world of free to play, mobile, and other bullshit I actually find it offensive that you take such a hard line against Xbox One. They're doing good things for gaming even if the hardware sucks yet you feel the need to shit on it constantly? Why? You're hurting the hobby.
 
Have you actually played the damn game all the way through? That is not the norm. The same issues exist in Until Dawn (which I agree looks better overall) where it drops into the basement frame-rate wise. UD is also a damn good game...Ryse...not so much.

Can you please just explain why you dislike Xbox One so much? I honestly have never seen such irrational hate aimed at a machine. It's a weak piece of hardware but I enjoy seeing underdog hardware in action. That's one of the things that made ps2 so damn interesting. PS3 as well.
You said there were no teen drops; I showed you that there are, so what are you on about? Posting pics of a piece of software falling to the teens does not make me hate a piece of hardware, and it's DF's very own video as well.

I think you know very well I don't hate any piece of hardware; I've owned every Xbox except this one (I eventually buy every piece of hardware if there are enough exclusives that interest me), and FYI, PS3 was not the underdog hardware last gen.
 
You said there were no teen drops; I showed you that there are, so what are you on about? Posting pics of a piece of software falling to the teens does not make me hate a piece of hardware, and it's DF's very own video as well.

I think you know very well I don't hate any piece of hardware; I've owned every Xbox except this one (I eventually buy every piece of hardware if there are enough exclusives that interest me), and FYI, PS3 was not the underdog hardware last gen.
I know what the experience of the entire game is like, and I know that's one of the absolute slowest scenes in the game. The performance is very much in line with Until Dawn, barring the lack of a frame-rate cap.

In today's gaming world where these types of games are fighting for survival I think some of your posts border on irresponsible. I do not understand it. Try to be a little more upbeat or something. The XO is weak hardware but the people making games for it exclusively are doing their best to work around it. Give them some love once in a while. They have a damned difficult job. Your love of Until Dawn shows me how you are willing to overlook major issues on your preferred platform anyways.

The PS3 was hamstrung by the RSX. The Cell is wicked cool but it had to pick up ALL the slack. nVidia screwed Sony over, I feel. It had so much more potential that could have been realized with a better GPU. So much was accomplished on Cell.
 
I know what the experience of the entire game is like, and I know that's one of the absolute slowest scenes in the game. The performance is very much in line with Until Dawn, barring the lack of a frame-rate cap.
There are scenes that tax the GPU much more than this; even the beach scene, which is still early on, is rough on the framerate....
Dark10x said:
In today's gaming world where these types of games are fighting for survival I think some of your posts border on irresponsible. I do not understand it. Try to be a little more upbeat or something. The XO is weak hardware but the people making games for it exclusively are doing their best to work around it. Give them some love once in a while. They have a damned difficult job. Your love of Until Dawn shows me how you are willing to overlook major issues on your preferred platform anyways.
I don't think UD's framerate is great, I never said so; I'm only less harsh on it due to the slow nature of its gameplay, and that's only comparing it to something like Ryse, which is action heavy. I don't think I'm being unfair here in logic/reasoning.

For your information, I love action games; it's my favourite genre outside of fighters and racers. I can't wait till I see the unveiling of the next Ninja Gaiden; even though people shit on Team Hayashi, at least they give us native games at 60fps. Despite me loving NG, I will still be disappointed if they don't push the visuals for the next NG game. I can love something and want it to get better; I just don't accept mediocre efforts, especially when there's evidence that said work is mediocre.

Dark10x said:
The PS3 was hamstrung by the RSX. The Cell is wicked cool but it had to pick up ALL the slack. nVidia screwed Sony over, I feel. It had so much more potential that could have been realized with a better GPU. So much was accomplished on Cell.
Hardware is hardware regardless; the Cell CPU is hardware, and whether it made up for the RSX is beside the point. It was strong enough to do so, and the first-party efforts outshone the competing console as a result. Do you remember that the PS3 was going to be outfitted with two Cells? It would not even have had a traditional GPU as we know it......
 
I think you know very well I don't hate any piece of hardware; I've owned every Xbox except this one (I eventually buy every piece of hardware if there are enough exclusives that interest me), and FYI, PS3 was not the underdog hardware last gen.
It was in sales, mindshare and multiplatform games - for a very long time. Even the exclusives early on... you had Resistance coming out at the same time as Gears. To someone not counting flops but looking at games, PS3 sure would look like an underdog there.
 
It was in sales, mindshare and multiplatform games - for a very long time. Even the exclusives early on... you had Resistance coming out at the same time as Gears. To someone not counting flops but looking at games, PS3 sure would look like an underdog there.
Perhaps for multiplats early on, but I'll chalk that up more to the exotic nature of the hardware (how difficult it was to program for) as opposed to it being weaker than the 360; we all know it was not.

Eventually many multiplats performed on par with or outshone the 360 versions as the better devs got to know the hardware and how they should approach it. Let's just take the biggest franchise atm, GTA: Rockstar went from 640p on PS3 (in GTA4) to 720p with a better framerate than the 360 in GTA5.

Again, it all boils down to the devs and whether they are willing to use the hardware given or do the bare minimum. Look at High Voltage's effort with ZOE2, then look at Hexadrive: same hardware, completely bonkers results on the flip. Was it the hardware's fault for Hexa's efforts? Uhhhh...NO. That's all I'm saying.

Btw Dark, all that extra power to run ZOE2 at 60fps, with bokeh DOF, high-res alpha effects and a better resolution, was all found in the hardware once it was actually utilized. Hey, the PS3 has SPUs, why don't we use the darn things to get good results? Oh, a foreign concept indeed......

Am I harsh for wanting more Hexadrive efforts..? (In essence: double-the-framerate efforts, higher-resolution efforts, higher-quality-effects efforts.) I don't think so, when lower-end hardware is actually accomplishing just that....So no, I don't think I'm wrong in wanting Alien Isolation, Xenoverse or RE-R2 to have been a solid 60fps on PS4, when a 750ti does just that.
 
That's incorrect. It 99.9% never hits the 30s, and regularly stays in the 50s or high 40s, as well as basically 60 in closed areas.
https://youtu.be/RWWtm4Wq9QU?t=285

Come on ...

-----------

That's PBR and material work more than a character model. Show me some pics of Luke Skywalker and say with a straight face it's impressive. Character models are impressive based on their facial detail, skin shaders, SSS, wrinkles, hair, deformities/imperfections and of course animations. Luke Skywalker loses in all these aforementioned aspects.......The animation in Battlefront is particularly unimpressive, not to mention its lack of detail.
For a 60fps MP title it is impressive, yes. They look better than the character models from U4 MP.
The animations are really good for an FPS with such responsive movement, so I don't know what you are talking about.

----
As I said, anytime there was heavy alpha and effects on screen and/or many soldiers/big battles, the framerate dropped massively. Now, this scene is quite early in the game; there are definitely more taxing scenes as you progress and of course much later in the game, so picture that as you want.

Scenes where you engaged in formation also saw massive framedrops, and even smaller battles saw their fair share of being constantly under 30fps. Also, Until Dawn looks better than Ryse, and it's not an action game like Ryse is; it's also 1080p and technically superior. Just a hint: look at some of the pools of water whilst Marius is fighting in Ryse.....on the XB1.
That's the most demanding scene in the entire game.
And it completely breaks where there are two explosions close to the camera and like 70+ characters on screen ...

And how exactly is Until Dawn technically superior? Being a non-action game makes it less demanding, you know that? Also, it runs on a higher-spec platform, and it was not a launch title; it was made on an engine that had already been used on PS4 more than a year earlier.

Going by those posts, what do I even expect from you:
http://www.neogaf.com/forum/showpost.php?p=183247379&postcount=102
http://www.neogaf.com/forum/showpost.php?p=181227122&postcount=602
 
Thelastword won't give anything under 1080p on PS4 any credit, even at 60fps with a lot of action at all times and high fidelity.
Pretty much this. I would like to see what he has to say about UC4 MP. It may be 900p, but it is running at 60FPS and has a lot of shit going on at times. And personally, I think the visuals overall look closer to the SP than previous games' MP modes were to their respective SPs, so I find it very impressive overall.
 
Pretty much this. I would like to see what he has to say about UC4 MP. It may be 900p, but it is running at 60FPS and has a lot of shit going on at times. And personally, I think the visuals overall look closer to the SP than previous games' MP modes were to their respective SPs, so I find it very impressive overall.
I agree. The differences in previous Uncharted MP modes were drastic. I'm glad the compromises in that mode are at least in service of 60fps now.

I will say I'm not a fan of 900p myself at 30fps, but the added temporal resolution of 60fps helps a lot, especially when it's something as impressive as Battlefront or Uncharted 4.
 
I agree. The differences in previous Uncharted MP modes were drastic. I'm glad the compromises in that mode are at least in service of 60fps now.

I will say I'm not a fan of 900p myself at 30fps, but the added temporal resolution of 60fps helps a lot, especially when it's something as impressive as Battlefront or Uncharted 4.
Yep, I accept the compromises made by devs to hit 900p if the game runs at 60FPS or looks ahead of most other games in that specific mode. Since Battlefront and UC4 MP meet both those conditions, I think 900p may be a good choice. Furthermore, we are likely to still see improvements for UC4 MP, since the beta launches next month.
 
I agree. The differences in previous Uncharted MP modes were drastic. I'm glad the compromises in that mode are at least in service of 60fps now.

I will say I'm not a fan of 900p myself at 30fps, but the added temporal resolution of 60fps helps a lot, especially when it's something as impressive as Battlefront or Uncharted 4.

Agreed. I was very impressed by how good Battlefront looked; it was harder to tell it was 900p compared to other 900p games. UC4 looks to be at the same image quality as well.

Those bells and whistles in it are well worth the tradeoff at 60fps.
 
TRDE on PS4 definitely wasn't a locked 60, but its average was around 52 or 53, so even if it dropped, it maintained a consistent smoothness far beyond the 30fps-and-below TR2013 of the last generation.

I'm one of those who don't like variable framerates, and it bothered me in Second Son and Killzone, but in TRDE I had no problem with it. Just like you said, the average framerate stays pretty high, so overall it still felt good and smooth even if it dropped a little bit here and there.

But with Rise I don't think they can get to that same level, though IMO they don't need to. I would be happy with a perfect 30 frames per second without screen tearing!
 
There are scenes that tax the GPU much more than this; even the beach scene, which is still early on, is rough on the framerate....
The beach scene runs better than the catapult sequence. That opening area really is the choppiest area in the entire game, you know. By far.

I don't think UD's framerate is great, I never said so; I'm only less harsh on it due to the slow nature of its gameplay, and that's only comparing it to something like Ryse, which is action heavy. I don't think I'm being unfair here in logic/reasoning.
I think the fact that UD is such a slow game yet still struggles is more disappointing. The scene in Ryse that drops into the teens features loads of enemies on screen and lots of alpha effects. In comparison, the areas that drop into the teens in UD have basically one or two characters on screen at most. In my case, the type of game in question does not change how I feel about frame-rate. It doesn't matter if it's an action game or not - I like smooth frame-rates for visual reasons, not control reasons.

Btw Dark, all that extra power to run ZOE2 at 60fps, with bokeh DOF, high-res alpha effects and a better resolution, was all found in the hardware once it was actually utilized. Hey, the PS3 has SPUs, why don't we use the darn things to get good results? Oh, a foreign concept indeed......
I've never doubted that. I think it's ridiculous how you belittle developers that struggled with SPU usage, however, as if it were a trivial thing. Not every studio is fortunate enough to have brilliant coders that can deal with this stuff. That doesn't mean they are being lazy. Coding well for SPUs is a very VERY difficult task and not everyone can pull it off.

Still, we know that High Voltage Software are well below average, so the improvements to ZOE2 are not shocking.

Am I harsh for wanting more Hexadrive efforts..? (In essence: double-the-framerate efforts, higher-resolution efforts, higher-quality-effects efforts.) I don't think so, when lower-end hardware is actually accomplishing just that....So no, I don't think I'm wrong in wanting Alien Isolation, Xenoverse or RE-R2 to have been a solid 60fps on PS4, when a 750ti does just that.
Yes, you're harsh.

I will never agree with you on Alien Isolation. Have you actually played the game? With its incredible materials system, full volumetric lighting, and amazing visual design, the PS4 version is great. It runs at a completely stable 30fps. It doesn't matter if YOU think there is more potential there, the end result is a great product. Stop bringing up that damn 750ti - that's not the reason why PCs are pulling ahead. It's the CPU, not the GPU. I'm sure if you unlocked the frame-rate on PS4 it would turn in faster performance, but it would still be unstable. A locked 30fps is the better choice. Alien Isolation is a beautiful game that runs at a completely stable frame-rate. It launched with a few issues, mind you, but they were quickly addressed and it's completely solid now.

Xenoverse - who the **** cares? I mean, really? Was that a bad port? If so, does it REALLY matter? You're dredging up some low rent DBZ thing as an example.

RE-R2 is interesting as I feel it is the result of its very low budget and the need to release on five or six platforms. Clearly porting MT Framework to PS4 gave Capcom some issues but, even looking at the PC version, it's clear the game is not well optimized. Again, I chalk that up as budgetary issues - the game was made on the cheap. My PC can easily run RE5, RE6, and both Lost Planet games at 4K60 - RE-R2, though? I actually encountered slowdown at 1080p in spots despite the game looking much worse. That's on a GTX 780 + i5-3570k. The developers made huge strides on PS4, though, and the current version actually runs much better. Aside from the forest areas, which still dip, the rest of the game is very solid now. Considering their constraints I'd say they did their best.

Basically, I think you don't understand game development realities or the fact that not all coders are created equally. It's not about "being lazy" here.

What kind of display are you using again? I see you mentioned a 1440p monitor. I hope to hell that's not what you're playing console games on.
 
Really good post here, but I fear it's for naught. Thelastword will always stick to his belief that if a game runs better on the i3/750Ti combo than the PS4, then it's a horrible port and the devs are lazy. If you follow his posts, you'll realize that he keeps bringing up the same damn games. I'll be honest and say that I too think the PS4 should outperform that combo, but I don't go whining about it in every thread.
 
Really good post here, but I fear it's for naught. Thelastword will always stick to his belief that if a game runs better on the i3/750Ti combo than the PS4, then it's a horrible port and the devs are lazy. If you follow his posts, you'll realize that he keeps bringing up the same damn games. I'll be honest and say that I too think the PS4 should outperform that combo, but I don't go whining about it in every thread.

Isn't an i3 stronger as a CPU than the 6 core Jaguar in the PS4?
 
Isn't an i3 stronger as a CPU than the 6 core Jaguar in the PS4?
Yeah, most i3's should be stronger than the Jaguar cores. That's probably what's giving it the edge in cross gen games. Some current gen only games that may be optimized better for consoles like F1 2015, Project Cars, Evolve and Star Wars Battlefront (beta, at least) run better on the PS4. My theory is that the i3 allows the combo to perform better than the PS4 in cross gen games based on brute force on the CPU side. I could be wrong, though.
 
Isn't an i3 stronger as a CPU than the 6 core Jaguar in the PS4?

Yes, most definitely. It can even hold its own against "actual" eight-core AMD CPUs with higher clockspeeds (the console processors have eight cores, too, but two are reserved for background functionality). The Jaguars in the PS4 and X1 aren't gaming-orientated in the slightest; they're intended for low-power mobile devices, which is what happens when profitability replaces power as the primary goal. Considering how much the PS3 and X360 cost Sony/MS, though, it's not a surprise, and I doubt we'll see a similar loss-leading approach ever again.
 
Yeah, most i3's should be stronger than the Jaguar cores. That's probably what's giving it the edge in cross gen games. Some current gen only games that may be optimized better for consoles like F1 2015, Project Cars, Evolve and Star Wars Battlefront (beta, at least) run better on the PS4. My theory is that the i3 allows the combo to perform better than the PS4 in cross gen games based on brute force on the CPU side. I could be wrong, though.
Yep, that's exactly right. It's not the 750ti he keeps bringing up that gives the combo its edge; it's the CPU. It was always the CPU.

It is the inverse problem of PS3 where you had a powerful CPU coupled with a terrible GPU.
 
Yes, most definitely. It can even hold its own against "actual" eight-core AMD CPUs with higher clockspeeds (the console processors have eight cores, too, but two are reserved for background functionality). The Jaguars in the PS4 and X1 aren't gaming-orientated in the slightest; they're intended for low-power mobile devices, which is what happens when profitability replaces power as the primary goal. Considering how much the PS3 and X360 cost Sony/MS, though, it's not a surprise, and I doubt we'll see a similar loss-leading approach ever again.

Yeah, but if games were made and optimized by design for the six-core Jaguar architecture from the outset, like the technical director from 4A Games recommends, there would be no issue in that regard, right?

I wonder why devs don't just do that for multiplats with PC; wouldn't that theoretically make it easier to scale up to more powerful components on PC? Or is it still the case that PC games are not well suited to high core counts?
 
Yeah, but if games were made and optimized by design for the six-core Jaguar architecture from the outset, like the technical director from 4A Games recommends, there would be no issue in that regard, right?

I wonder why devs don't just do that for multiplats with PC; wouldn't that theoretically make it easier to scale up to more powerful components on PC? Or is it still the case that PC games are not well suited to high core counts?

I find it very hard to believe that multiplats are not already tightly optimized for the multicore consoles. In fact, this goes against the great scaling we found in PC games with activity on 8 cores or even more. Have you forgotten about Battlefield 4?
Johan explained that Battlefield utilizes 90-95% of the 6 parallel CPU cores available to developers on the octa-core AMD Jaguar that powers both PS4 and Xbox One.
http://www.gamepur.com/news/12874-battlefield-4-uses-90-95-cpu-power-ps4-and-xbox-one-says-dice.html


At this point, shall we take the plunge and dare to face reality? Those Jaguar cores are weak, to put it politely.
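For what it's worth, that 90-95% figure lines up with what Amdahl's law would predict. If we loosely read the utilization as the parallel fraction of the workload (that reading is my own assumption, not something the DICE article states), a quick sketch:

```python
# Amdahl's law: overall speedup on `cores` cores when `parallel_fraction`
# of the work parallelizes perfectly and the rest stays serial.
def amdahl_speedup(parallel_fraction, cores):
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# With 95% of the work parallel, the six available Jaguar cores give
# roughly a 4.8x speedup over one core; perfect scaling would be 6x.
print(round(amdahl_speedup(0.95, 6), 1))  # 4.8
```

In other words, engines like Frostbite are already extracting most of what six weak cores can offer, which is exactly why per-core performance ends up being the bottleneck.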
 
I agree. The differences in previous Uncharted MP modes were drastic. I'm glad the compromises in that mode are at least in service of 60fps now.

I will say I'm not a fan of 900p myself at 30fps, but the added temporal resolution of 60fps helps a lot, especially when it's something as impressive as Battlefront or Uncharted 4.

Yeah, not a fan of 900p either...but Battlefront looks really impressive. Like, a lot more impressive than any Frostbite game before it. Looking at it next to, say, Dragon Age Inquisition...it's a big difference. Now I am really keen to see what Visceral's 3rd-person action game looks like!
 
Yeah, but if games were made and optimized by design for the six-core Jaguar architecture from the outset, like the technical director from 4A Games recommends, there would be no issue in that regard, right?
What do you mean by 'made and optimized'? The CPUs are what they are. Games have already gotten bigger and do more; also keep in mind that you need the CPU for the GPU stuff. The GPU depends on the CPU. They are also nothing special in any way, besides being slow and missing features other CPUs already have. Also, games are already tailored for these systems, otherwise we would see even worse performance.

I wonder why devs don't just do that for multiplats with PC; wouldn't that theoretically make it easier to scale up to more powerful components on PC? Or is it still the case that PC games are not well suited to high core counts?

It shouldn't make any difference whether a PC has 2 cores and 4 HT threads or anything; the scheduler will take care of it, and I don't think there are many games nowadays that aren't suited to multicore.
 
I find it very hard to believe that multiplats are not already tightly optimized for the multicore consoles. In fact, this goes against the great scaling we found in PC games with activity on 8 cores or even more. Have you forgotten about Battlefield 4?

http://www.gamepur.com/news/12874-battlefield-4-uses-90-95-cpu-power-ps4-and-xbox-one-says-dice.html

At this point, shall we take the plunge and dare to face reality? Those Jaguar cores are weak, to put it politely.

What do you mean by 'made and optimized'? The CPUs are what they are. Games have already gotten bigger and do more; also keep in mind that you need the CPU for the GPU stuff. The GPU depends on the CPU. They are also nothing special in any way, besides being slow and missing features other CPUs already have. Also, games are already tailored for these systems, otherwise we would see even worse performance.


Well, we know that these Jaguar cores are not powerful by default...I'm not arguing otherwise.

My thing is, a lot of the games coming out seem to be trying to go beyond their means, especially in regards to FPS. But if you're making a game well within the scope of the hardware you've got, performance should not be all that much of an issue as long as it's not overboard.

UC4 single player is probably (probably...) gonna be a very solid 30, just because they were aiming for 60 initially before they scaled down the framerate ambition, and we know they had gotten to about 40fps before giving up on it.

Multiplayer would probably be 1080p locked 30 as well if they did not go for 60fps there too...

If your game design fits your hardware, then performance should not suffer all that much even if you're trying to be more ambitious; that's my main argument.

Is that a wrong assumption to make?
 
It would be great if the PS4 version supported PlayStation VR. By the end of 2016 it should be available. They could use the assets of the Xbox 360 version to achieve 60fps.
 
I'm one of those who don't like variable framerates, and it bothered me in Second Son and Killzone, but in TRDE I had no problem with it.

Yeah that's weird.. variable frame rates really bothered me as well in inFamous & Shadow Fall... so much so that I locked inFamous: SS to 30fps when that option was patched in.

I had no such issue with TRDE on PS4. Anyone know the technical reason why this would be?
 
Well, we know that these Jaguar cores are not powerful by default...I'm not arguing otherwise.
So we are in agreement that devs make do with what they have here, but at the same time it is true that if you refuse to acknowledge the limits of your target hardware, performance is never going to be satisfactory. That said, I don't think they have gone too far in that respect; sure, it drops frames for now, but they are trying to keep it playable.

My thing is, a lot of the games coming out seem to be trying to go beyond their means, especially in regards to FPS. But if you're making a game well within the scope of the hardware you've got, performance should not be all that much of an issue as long as it's not overboard.
So you want them to reduce the scope and ambition of their games? I don't think this is justified for now; even if the framerate is not rock solid, performance is still acceptable. We are not talking about 20fps for extended periods of time, like Crysis 3 on PS3 or Skyrim on the same platform.
Fallout 4 performs much better than those two. It should perhaps perform even better in some cases; Bethesda might be working on something already.

If your game design fits your hardware, then performance should not suffer all that much even if you're trying to be more ambitious; that's my main argument.
Is that a wrong assumption to make?
Essentially your position is that devs should not push those CPUs too far, but as it stands I believe they have not. Drops do happen, but they do not dilute the experience that Fallout 4 provides.
I don't think you would be okay with further scaled-down LOD and draw distance so the framerate could stick closer to the 30fps target.
Devs make choices with what they have, and apparently a sizeable part of them are okay with pushing those Jaguar cores even if it means the framerate can drop below 30fps.

It is a sad truth that performance matters a lot less than visuals in the grand scheme of things. Publishers and developers know their games are going to be judged on how good they look and not on how they run most of the time.
 
Yeah that's weird.. variable frame rates really bothered me as well in inFamous & Shadow Fall... so much so that I locked inFamous: SS to 30fps when that option was patched in.

I had no such issue with TRDE on PS4. Anyone know the technical reason why this would be?

Maybe the higher average framerate. I don't recall Infamous and SF ever spending that much time above 50.
 
An i3 is 2-3 times faster than a Jaguar in IPC. I pointed it out last year and even delved into Intel notebook crap and found the Celeron is twice as fast in terms of performance per watt.

I wanted some different tests, as an i3 is also dual core. Since, AFAIK, AMD haven't released a six-core version of the Jaguar, a desktop Athlon 5350 (a 4-core Jaguar clocked at 2GHz) might be a good starting point. They did a 5150 at 1.6GHz like the PS4, but it's only a 4-core.

The AMD Phenom II 955 from 2009 is 2x faster. Seems like Jaguar is roughly in the ballpark of an Athlon 620 in scores, but the 620 wasn't far behind the Phenom II in IPC. Jaguar is the lowest of the low, as it's a ~25W notebook CPU.

Alien Isolation doesn't take much to run; in PC terms, yes, it runs on a toaster.
[CPU benchmark chart]

But a 55W Celeron is faster, as is a ~25W Celeron, from what I checked last year. Perhaps the Jag just falls short, but in CPU terms it's one of the least demanding games. Being made by a small dev team, I think we should be happy they did all those versions, and you shouldn't hang your hat on it, which lastword often does, along with other one-off situations.

For testing, NXgamer throws in an A8 or A10, but those are faster than the Jaguar.

Anyways, I'd certainly be interested in how low you can go in desktop CPU terms to run Alien Isolation. Can a Jaguar 5150 run it at 60fps?

Not sure how much overhead the Windows OS takes and how apples-to-apples you can get when we get this low. In PC terms there's literally no need to buy a CPU like this; the price becomes quite silly, and you don't need to go this low when CPUs 2-3 times faster are available cheap new. The used CPU market is a system builder's paradise, as in the last 5 years Intel haven't progressed much and ultra bargains can be found.
 
An i3 is 2-3 times faster than a Jaguar in IPC. I pointed it out last year and even delved into Intel notebook crap and found the Celeron is twice as fast in terms of performance per watt.

I wanted some different tests, as an i3 is also dual core. Since, AFAIK, AMD haven't released a six-core version of the Jaguar, a desktop Athlon 5350 (a 4-core Jaguar clocked at 2GHz) might be a good starting point. They did a 5150 at 1.6GHz like the PS4, but it's only a 4-core.

The AMD Phenom II 955 from 2009 is 2x faster. Seems like Jaguar is roughly in the ballpark of an Athlon 620 in scores, but the 620 wasn't far behind the Phenom II in IPC. Jaguar is the lowest of the low, as it's a ~25W notebook CPU.

Alien Isolation doesn't take much to run; in PC terms, yes, it runs on a toaster.
[CPU benchmark chart]

But a 55W Celeron is faster, as is a ~25W Celeron, from what I checked last year. Perhaps the Jag just falls short, but in CPU terms it's one of the least demanding games. Being made by a small dev team, I think we should be happy they did all those versions, and you shouldn't hang your hat on it, which lastword often does, along with other one-off situations.

For testing, NXgamer throws in an A8 or A10, but those are faster than the Jaguar.

Anyways, I'd certainly be interested in how low you can go in desktop CPU terms to run Alien Isolation. Can a Jaguar 5150 run it at 60fps?

Not sure how much overhead the Windows OS takes and how apples-to-apples you can get when we get this low. In PC terms there's literally no need to buy a CPU like this; the price becomes quite silly, and you don't need to go this low when CPUs 2-3 times faster are available cheap. The used CPU market is a system builder's paradise, as in the last 5 years Intel haven't progressed much and ultra bargains can be found.

According to Sebbi (a Ubisoft dev) on Beyond3D, a 1.6GHz Haswell core is twice as fast as a Jaguar core, so a 3.2GHz Haswell core is 4 times faster.
DX12 is going to make those CPUs fly.
 
So you want them to reduce the scope and ambition of their games? I don't think this is justified for now; even if the framerate is not rock solid, performance is still acceptable. We are not talking about 20fps for extended periods of time, like Crysis 3 on PS3 or Skyrim on the same platform.
Fallout 4 performs much better than those two. It should perhaps perform even better in some cases; Bethesda might be working on something already.

I think you're absolutely correct on Fallout 4. I don't think they actively need to scale back the engine load for that game; I feel it's a matter of slightly more efficient coding being necessary to get that FPS up a bit more. Bethesda by default doesn't really think about technical polish until afterward, so we'll have to wait and see.

And I also think you're correct that a majority of games are fine for now in regards to the trade-off of CPU power.

But some games, I think, do go too far already, like COD BO3. Sustained drops far below 60 even with dynamic res are unnecessary, considering how AW ran far better and more consistently while at the same time looking better graphics-wise.

Having played the campaign, I don't think shoving a bazillion robots on screen at once was a fair trade-off for performance. But I also felt the same about AC Unity in regards to its NPC count, so that may be just me. It comes down to each individual game.
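For reference, the dynamic res mentioned for BO3 is essentially a feedback loop on frame time: drop the render scale when frames run over budget, raise it back when there's headroom. A toy sketch of the idea (the scale bounds, step, and thresholds are made-up illustrative numbers, not Treyarch's):

```python
# Toy dynamic-resolution controller: lower the render scale when frames
# run long, raise it back when there is headroom. Numbers are illustrative.

TARGET_MS = 16.7  # 60fps frame budget

def adjust_scale(scale, last_frame_ms, lo=0.6, hi=1.0, step=0.05):
    if last_frame_ms > TARGET_MS:        # over budget: drop resolution
        return max(lo, round(scale - step, 2))
    if last_frame_ms < TARGET_MS * 0.9:  # comfortable headroom: raise it
        return min(hi, round(scale + step, 2))
    return scale

scale = 1.0
for frame_ms in [18.0, 19.0, 15.2, 14.0, 14.0]:
    scale = adjust_scale(scale, frame_ms)
```

The point of the trade-off: when even the lower bound of the scale can't buy back the frame budget, you get the sustained drops below 60 anyway.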

According to Sebbi (a Ubisoft dev) on Beyond3D, a 1.6GHz Haswell core is twice as fast as a Jaguar core, so a 3.2GHz Haswell core is 4 times faster.
DX12 is going to make those CPUs fly.

How does that actually work, by the way? If a single 3.2GHz Haswell core is hypothetically 4 times faster than a 1.6GHz Jaguar core, and there are 8 cores in each console CPU, do they stack together?
 
According to Sebbi (a Ubisoft dev) on Beyond3D, a 1.6GHz Haswell core is twice as fast as a Jaguar core, so a 3.2GHz Haswell core is 4 times faster.
DX12 is going to make those CPUs fly.

In my post last year I said 3-4 times faster. I'm just talking loosely here.

Thing is, as fast as the i3 is, DF have found dual core to be a problem in some games, depending on the game.

This highlights the PC desktop space. Intel keep 4 cores as some sort of premium. You can get away with 2 cores, but I'd go for a 4-core Intel when building a system, and even look into used parts and buy the rest new if you must use new stuff. For testing, I don't think an i3 is a good solution for Digital Foundry: its IPC is way up there with the highest, but it's just two cores. This has been the case for a while; even 5 years ago a dual core could live with the six-core £1000 monster, as the IPC is very high in everything they do. AMD have sliced up their poor CPUs seven ways to Sunday in the last 10 years and barely moved.
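The dual-core-vs-many-slow-cores point is essentially Amdahl's law: if part of the frame is serial, per-core speed wins. A rough sketch, assuming a made-up 30/70 serial/parallel split and the ~4x per-core figure from earlier in the thread:

```python
# Amdahl-style frame-time model: serial work runs on one core, the rest
# splits evenly across all cores. core_speed is in "Jaguar" units.

def frame_ms(total_ms, serial_frac, cores, core_speed):
    serial = total_ms * serial_frac / core_speed
    parallel = total_ms * (1 - serial_frac) / (cores * core_speed)
    return serial + parallel

WORK = 33.3    # ms of CPU work if a single Jaguar core did everything
SERIAL = 0.3   # assumed 30% of the frame can't be parallelised

jaguar_8c = frame_ms(WORK, SERIAL, cores=8, core_speed=1.0)
i3_2c = frame_ms(WORK, SERIAL, cores=2, core_speed=4.0)  # ~4x per core

# The two fast cores finish the frame far sooner, because the serial 30%
# throttles the eight slow cores no matter how many there are.
```

That is also why the i3 only falls over in games whose work is split wide enough that two cores physically can't carry it.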
 
I think you're absolutely correct on Fallout 4. I don't think they actively need to scale back the engine load for that game; I feel it's a matter of slightly more efficient coding being necessary to get that FPS up a bit more. Bethesda by default doesn't really think about technical polish until afterward, so we'll have to wait and see.
I'm not convinced they have not already made the most of the Jaguar cores, considering the game on PC is very well multithreaded, and DX11 is famously not that multithreading-friendly, particularly on AMD. I assume console APIs do not have the same limitation. DX12 scaling seems very impressive compared to DX11.

And I also think you're correct that a majority of games are fine for now in regards to the trade-off of CPU power.
Fine by me, that is; I do not have the pretension of speaking for anybody other than myself. I understand some do not accept drops below 30fps at all, but this is difficult to maintain in every situation as games become more and more ambitious visually. You know as well as I do that visuals take precedence over performance and resolution; naturally the Xbox One is hit harder and sooner.

But some games, I think, do go too far already, like COD BO3. Sustained drops far below 60 even with dynamic res are unnecessary, considering how AW ran far better and more consistently while at the same time looking better graphics-wise.
Treyarch may have gone a bit too far indeed, although they deserve a lot of credit for the visual spectacle of the SP campaign; it looks really great.
I would not instinctively suspect Treyarch do not know how to code for consoles; rather, they have prioritised tech over a pristine 60fps. You have every right not to agree with them, and to think they should have scaled things back somewhat to hit a consistent 60fps.
I do not agree with you regarding AW, which I played for 14 hours (SP only) on PC. It looks better, sure, but it is not as "busy" as Black Ops 3. I don't think you can use it to downplay Treyarch's effort on consoles; they may not have made the right choice according to you, but it is easy to tell why the game runs a bit worse than AW. Quite an insane amount of things happen on screen in the Cairo level, for instance. Pay attention to the many AIs running around, the particles, the great volumetric lighting.
Tech-wise Treyarch have done well, but the hardware apparently can't keep up.

Having played the campaign, I don't think shoving a bazillion robots on screen at once was a fair trade-off for performance. But I also felt the same about AC Unity in regards to its NPC count, so that may be just me. It comes down to each individual game.
Likewise, it is just my opinion that their trade-offs are valid. Yes, it is not consistent performance, and again I wish to emphasize that your opinion is completely valid, but gamers are very quick to point out graphical flaws, especially in AAA games, so I understand why they chose this aspect to take center stage. I suspect they would have gotten far harsher complaints about the toned-down visuals even if their games sustained 30fps (Unity) or 60fps (COD).
I mean, have you seen the reception Syndicate had? Guess which aspect got a lot of flak. Yet performance is very good on PS4 and Xbox One, exactly what they did not go for with Unity.
Yet their performance achievement is seldom underlined. It is possible they reverse course and set the graphical bar higher in the next entry, with lower performance as a result.

How does that actually work by the way? If a single 3.2 Haswell core is hypothetically 4 times faster than a 1.6 Jaguar core, and there's 8 cores of them in each CPU, do they stack together?
I don't know, but what it could mean is that a single 3.2GHz Haswell core can handle the workload of two 1.6GHz Jaguar cores in the same time frame. That would explain why a Core i3 does so well: two physical cores and two logical ones.
We need an API as efficient as GNM/the custom D3D on Xbox One to properly assess CPU performance. The next wave of DX12 multiplats will be extremely interesting; I wonder what CPU you will need then to have a console experience. If a Core i3 does the job fine with the back-breaking overhead of DX11, imagine what it could do with near-console API efficiency...
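A small illustration of why cores don't simply "stack": a frame is paced by its busiest thread, so fewer, faster cores only win when the heaviest job can't be split any further. All workload numbers here are made up for the sketch:

```python
# A frame is paced by its busiest thread, not by total CPU throughput,
# so extra cores only help work the engine can actually spread out.

def frame_ms(thread_loads_ms):
    return max(thread_loads_ms)  # slowest thread sets the frame time

# Eight Jaguar cores; a 14ms render-submission job is stuck on one core:
jaguar = frame_ms([14, 6, 6, 5, 5, 4, 4, 3])

# Four Haswell threads at ~4x per-core speed, carrying the same total work:
haswell = frame_ms([14 / 4, (6 + 6 + 5) / 4, (5 + 4 + 4) / 4, 3 / 4])
```

That 14ms single-thread job is exactly the kind of DX11 submission bottleneck DX12 is supposed to break up.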
 
I mean, have you seen the reception Syndicate had? Guess which aspect got a lot of flak. Yet performance is very good on PS4 and Xbox One, exactly what they did not go for with Unity.
Yet their performance achievement is seldom underlined. It is possible they reverse course and set the graphical bar higher in the next entry, with lower performance as a result.

I don't think that's necessarily the case. Far Cry 4 looks way better than Assassin's Creed this gen, and yet hits its performance target consistently. Visuals and performance should be balanced correctly, not just one or the other.

I think the outcry against Assassin's Creed these days is more that the game itself is tired, and more importantly, the cutbacks are already obvious because of what we know they went with in Unity.

Since we know they went for crazy NPC counts in Unity at the cost of performance, simply cutting out the NPCs in the next game was not going to be received well even if it had higher performance.

If Unity had not had that high NPC count at the cost of performance to start with, nobody would have known what was cut and what was kept at the cost of performance, thus no outrage.
 
I don't think that's necessarily the case. Far Cry 4 looks way better than Assassin's Creed this gen, and yet hits its performance target consistently. Visuals and performance should be balanced correctly, not just one or the other.
"Balanced" is subjective. They do have a balance, actually; it's just that it leans more towards visuals than performance these days. Syndicate looks better to me than Far Cry 4, hence the drop in resolution.
Ubisoft are everything but amateurs when it comes to tech, as they have proved for years and years.
Now, whether or not you agree with their balance is another thing entirely.

I think the outcry against Assassin's Creed these days is more that the game itself is tired, and more importantly, the cutbacks are already obvious because of what we know they went with in Unity.
Unity perhaps pushed consoles too hard and we saw the results; however, if the reception of Syndicate's visuals is not good enough by their standards, then it is possible they will put further emphasis on graphics and aim for the same resolution.
Franchise fatigue is orthogonal to tech discussions.

I believe Unity and Syndicate are both very well optimized games. The hardware can only do so much, and that is true on the PC side as well, although a very heavy and ancient (2009) API can't be helping at all. Surely current mainstream gaming CPUs could do more with DX12.

Since we know they went for crazy NPC counts in Unity at the cost of performance, simply cutting out the NPCs in the next game was not going to be received well even if it had higher performance.
Actually, I'm not certain the number of NPCs was the only culprit; rather, the impressive amount of geometry must hit GPUs pretty hard. Syndicate also impresses in that department, but lacks the beautiful lighting. Once again a matter of balance; they have done well in regards to performance, but some would disagree.
That will not be a problem on upper mid-range PCs (7970/770 and up) anyway.

If Unity had not had that high NPC count at the cost of performance to start with, nobody would have known what was cut and what was kept at the cost of performance, thus no outrage.
The outrage is still not justified. Gamers wanted solid performance? Ubisoft delivered just that. They can't blame Ubisoft for making trade-offs with console hardware.
You can't have it all, and that applies to the PC platform as well, to a degree.
 
Ubisoft said the NPC count wasn't to blame. They even recycle old NPC data so that the crowd size doesn't impact the CPU.

Having played the PC version, I'd agree. Framerate didn't suffer with one or 20k NPCs on screen; it had more to do with the background systems and geometry.
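The "recycle old NPC data" idea mentioned above is basically object pooling: reuse existing crowd records instead of allocating and fully simulating fresh ones. A minimal sketch of the pattern (the class and field names are hypothetical, not Ubisoft's):

```python
# Toy NPC pool: crowd members are recycled from pre-built records rather
# than allocated and fully simulated per spawn. Names are hypothetical.

class NPCPool:
    def __init__(self, size):
        self.free = [{"id": i, "anim": "walk"} for i in range(size)]
        self.active = []

    def spawn(self):
        # Reuse a free record; if none are left, recycle the oldest NPC.
        npc = self.free.pop() if self.free else self.active.pop(0)
        self.active.append(npc)
        return npc

    def despawn(self, npc):
        self.active.remove(npc)
        self.free.append(npc)

pool = NPCPool(size=3)
a, b, c = pool.spawn(), pool.spawn(), pool.spawn()
d = pool.spawn()  # pool exhausted: the oldest crowd NPC is recycled as d
```

The pay-off is a hard cap on allocation and simulation cost regardless of how many NPCs appear on screen, which matches the observation that crowd size barely moved the framerate.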
 
The beach scene runs better than the catapult sequence. That opening area really is the choppiest area in the entire game, you know. By far.
You said Ryse only dropped to the low 20s; I said it dropped to the teens. I showed you where it did, and now your argument is "this is the only area it's dropping to the teens"... I have proven what I was saying, yet you are just moving the goalposts. I never expected you to do this, tbh.

That scene with the catapult is not the most intensive in Ryse. That scene is controlled, and the enemies are far enough away, with their reduced AI and animations, that they're basically fodder till you hit some key markers. Framerate tanks when there is a deluge of fire/brimstone/arrows (in essence alpha effects), especially if it lands near you. You are never really close to the enemies in that scene, yet there are scenes in Ryse where you have just as many men on screen, or close enough, running about while you're actively engaging them on foot (which means they now possess better AI and animations), and those scenes also have heavy usage of brimstone/fire or alpha effects just the same. Saying this is the most intensive scene in Ryse is like saying the controlled El Gigante fight in RE5 was the most intensive in that game.

In any case, DF usually only gives a small sample of footage from very early in the games they show. Most setpieces later on in games are much more intense on the GPU, but that's rarely tested. If this game is falling to the teens in this small sample (which even you did not believe), how surprised would you be to see that this was not the only place XB1 Ryse had issues? Even in smaller-scale battles in this very sample, Ryse would fall into the mid 20s and low 20s quite often.

Yep, that's exactly right. It's not the 750ti that he keeps bringing up, it's the CPU. It was always the CPU.

It is the inverse problem of PS3 where you had a powerful CPU coupled with a terrible GPU.
Both consoles have the same CPU, so how is it that you want to push a CPU angle when much weaker CPUs than the i3 are running all the games I mentioned at 60fps with the weaker 750ti? How is it that the same CPU and a weaker GPU (XB1) is running RE-R2 at 60fps at 1080p when the PS4 is not? So the 40% additional headroom of the PS4 GPU can't give it an advantage over the XB1 in a GPU-bound game when they're both at 1080p, while the PS4 is struggling with GPU-related effects? Is it the CPU doing the rendering of grass in RE-R2?

More so, is it the overclocked CPU in the XB1, which we all like to declare garbage, that conveniently pulled in a 20fps difference in framerate over the PS4 version at release? Even now, after a patch to the PS4 version, the XB1 version is still ahead by 10fps in said scenes.

I'd like some answers with evidence... Can you tell me how all these games that DF articles described as CPU-bound got magical boosts in framerate? Did the CPU of the PS4 change overnight in such cases?
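One way to settle CPU-bound vs GPU-bound arguments like this is to compare per-frame CPU and GPU times, which profilers expose: whichever side's time dominates the frame is the limiter. A minimal sketch of that reasoning (the margin and the sample timings are arbitrary illustrations):

```python
# Classify the per-frame bottleneck: whichever side's time dominates the
# frame is the limiter. Times would come from a profiler in practice.

def bottleneck(cpu_ms, gpu_ms, margin=1.1):
    if cpu_ms > gpu_ms * margin:
        return "CPU-bound"
    if gpu_ms > cpu_ms * margin:
        return "GPU-bound"
    return "balanced"

# Heavy alpha/grass scenes push GPU time up; crowds/AI push CPU time up.
print(bottleneck(cpu_ms=9.0, gpu_ms=21.0))   # GPU-bound
print(bottleneck(cpu_ms=18.0, gpu_ms=12.0))  # CPU-bound
```

By this logic, a patch that lifts framerate without a CPU change was almost certainly relieving the GPU (or API overhead), not the silicon.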
 