Alpha Phoenix
Member
I think you will be disappointed if you're expecting the PS4 version to be 60fps. It doesn't sound good.
I'll look forward to seeing the 60fps version in action to see how the game looks then, though.
Really, I'll take drops from 90 to 80, or from 60 to 53-55, any day of the week over any game that would otherwise be 30fps with dips. Dips at the low end of 30fps are infinitely worse than dips at a much higher framerate...

Nothing smooth about drops to 52 or 53.
Drops are always annoying.
Even from 90 to 80.
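The reason dips at 30fps feel so much worse than dips at 90fps is that what you actually perceive is frame *time*, the reciprocal of the framerate. A quick sketch (plain Python, using the drop ranges mentioned in the posts above) makes the asymmetry obvious:

```python
# Sketch: why a dip from 30fps hurts more than one from 90fps.
# Frame *time* (ms per frame) is the perceptual quantity; fps is its reciprocal.

def frame_time_ms(fps: float) -> float:
    """Milliseconds spent on one frame at a given framerate."""
    return 1000.0 / fps

# (before, after) fps pairs taken from the drops discussed above
for before, after in [(90, 80), (60, 53), (30, 25)]:
    delta = frame_time_ms(after) - frame_time_ms(before)
    print(f"{before} -> {after} fps: frame time grows by {delta:.1f} ms")
```

The same handful of lost fps costs about 1.4 ms of extra frame time at 90fps but almost 6.7 ms down at 30fps, which is why low-end dips read as visible chop while high-end dips are barely noticeable.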
When's the PC version coming?
Edit: Q1, it seems.
That's PBR and material work more than a character model. Show me some pics of Luke Skywalker and say with a straight face that it's impressive. Character models are impressive based on their facial detail, skin shaders, SSS, wrinkles, hair, deformities/imperfections and of course animations. Luke Skywalker loses in all of these aforementioned aspects... The animation in Battlefront is particularly unimpressive, barring a lack of detail.

Don't look good? I disagree, they look pretty good to me.
Have you actually played the damn game all the way through? That is not the norm. The same issues exist in Until Dawn (which I agree looks better overall), where it drops into the basement frame-rate wise. UD is also a damn good game... Ryse... not so much.

I don't find that particular character model impressive, cutscene or not, relative to other games, or rather to character models in other games.
[frame-rate capture screenshots from Digital Foundry's Ryse video]
As I said, anytime there was heavy alpha and effects on screen and/or many soldiers/big battles, the framerate dropped massively. Now, this scene is quite early in the game; there are definitely more taxing scenes as you progress, and of course much later in the game, so picture that as you want.
Scenes where you engaged in formation also saw massive framedrops, and even smaller battles saw their fair share of being constantly under 30fps. Also, Until Dawn looks better than Ryse, and it's not an action game as Ryse is; it's also 1080p and technically superior. Just a hint: look at some of the pools of water whilst Marius is fighting in Ryse... on the XB1.
You said there were no teen drops; I showed you that there are. What are you on about? Posting pics of a piece of software falling to the teens does not make me hate a piece of hardware, and it's DF's very own video as well.

Have you actually played the damn game all the way through? That is not the norm. The same issues exist in Until Dawn (which I agree looks better overall), where it drops into the basement frame-rate wise. UD is also a damn good game... Ryse... not so much.
Can you please just explain why you dislike the Xbox One so much? I honestly have never seen such irrational hate aimed at a machine. It's a weak piece of hardware, but I enjoy seeing underdog hardware in action. That's one of the things that made the PS2 so damn interesting. The PS3 as well.
You would basically need an iGPU at R9 290 level, and if you consider that the PS4 is essentially running an iGPU-class 7850, I don't really think 4K 30fps is a pipe dream a few years from now.
I know what the experience of the entire game is like, and I know that's one of the absolute slowest scenes in the game. The performance is very much in line with Until Dawn, barring the lack of a frame-rate cap.

You said there were no teen drops; I showed you that there are. What are you on about? Posting pics of a piece of software falling to the teens does not make me hate a piece of hardware, and it's DF's very own video as well.
I think you know very well I don't hate any piece of hardware. I've owned every Xbox except this one (I eventually buy every piece of hardware if there are enough exclusives that interest me), and FYI, the PS3 was not the underdog hardware last gen.
There are scenes that task the GPU much more than this; even the beach scene, which is still early on, is a tear on the framerate...

I know what the experience of the entire game is like, and I know that's one of the absolute slowest scenes in the game. The performance is very much in line with Until Dawn, barring the lack of a frame-rate cap.
I don't think UD's framerate is great, I never said so; I'm only less harsh on it due to the slow nature of its gameplay, and that's only comparing it to something like Ryse, which is action heavy. I don't think I'm being unfair here in logic/reasoning.

Dark10x said: In today's gaming world, where these types of games are fighting for survival, I think some of your posts border on irresponsible. I do not understand it. Try to be a little more upbeat or something. The XO is weak hardware, but the people making games for it exclusively are doing their best to work around it. Give them some love once in a while. They have a damned difficult job. Your love of Until Dawn shows me how you are willing to overlook major issues on your preferred platform anyway.
Hardware is hardware regardless; the Cell CPU is hardware. Whether it made up for the RSX is beside the point; it was strong enough to do so, and the first-party efforts outshone the competing console as a result. Do you remember that the PS3 was going to be outfitted with two Cells? It would not even have had a traditional GPU as we know it...

Dark10x said: The PS3 was hamstrung by the RSX. The Cell is wicked cool, but it had to pick up ALL the slack. nVidia screwed Sony over, I feel. It had so much more potential that could have been realized with a better GPU. So much was accomplished on Cell.
It was, in sales, mindshare and multiplatform games, for a very long time. Even the exclusives early on... you had Resistance coming out at the same time as Gears. To someone not counting flops but looking at games, the PS3 sure would look like an underdog there.

I think you know very well I don't hate any piece of hardware. I've owned every Xbox except this one (I eventually buy every piece of hardware if there are enough exclusives that interest me), and FYI, the PS3 was not the underdog hardware last gen.
Perhaps for multiplats early on, but I'll chalk that up more to the exotic nature of the hardware (how difficult it was to program for) than to it being weaker than the 360; we all know it was not.

It was, in sales, mindshare and multiplatform games, for a very long time. Even the exclusives early on... you had Resistance coming out at the same time as Gears. To someone not counting flops but looking at games, the PS3 sure would look like an underdog there.
https://youtu.be/RWWtm4Wq9QU?t=285

That's incorrect. It 99.9% never hits the 30s, and regularly stays in the 50s or high 40s, as well as being basically 60 in closed areas.
For a 60fps MP title it is impressive, yes. They look better than the character models from U4 MP.

That's PBR and material work more than a character model. Show me some pics of Luke Skywalker and say with a straight face that it's impressive. Character models are impressive based on their facial detail, skin shaders, SSS, wrinkles, hair, deformities/imperfections and of course animations. Luke Skywalker loses in all of these aforementioned aspects... The animation in Battlefront is particularly unimpressive, barring a lack of detail.
That's the most demanding scene in the entire game.

As I said, anytime there was heavy alpha and effects on screen and/or many soldiers/big battles, the framerate dropped massively. Now, this scene is quite early in the game; there are definitely more taxing scenes as you progress, and of course much later in the game, so picture that as you want.
Scenes where you engaged in formation also saw massive framedrops, and even smaller battles saw their fair share of being constantly under 30fps. Also, Until Dawn looks better than Ryse, and it's not an action game as Ryse is; it's also 1080p and technically superior. Just a hint: look at some of the pools of water whilst Marius is fighting in Ryse... on the XB1.
How is that even relevant to this thread?
Thelastword won't give anything under 1080p on PS4 any credit, even at 60fps with a lot of action at all times and high fidelity.

https://youtu.be/RWWtm4Wq9QU?t=285
Come on ...
-----------
For a 60fps MP title it is impressive, yes. They look better than the character models from U4 MP.
Animations are really good for an FPS game with such responsive movement, so I don't know what you are talking about.
Thelastword won't give anything under 1080p on PS4 any credit, even at 60fps with a lot of action at all times and high fidelity.
Pretty much this. I would like to see what he has to say about UC4 MP. It may be 900p, but it is running at 60FPS and has a lot of shit going on at times. And personally, I think the visuals overall look closer to the SP than previous games' MP modes were to their respective SPs, so I find it very impressive overall.

Thelastword won't give anything under 1080p on PS4 any credit, even at 60fps with a lot of action at all times and high fidelity.
This is off topic for the thread, but since people are still discussing the choice of the Tomb Raider release date, here is what Aaron Greenberg said about it a few weeks ago:
Source: https://www.youtube.com/watch?v=6-5Z_7yM_5o
Via: http://attackofthefanboy.com/news/a...-vs-tomb-raider-we-hope-people-will-buy-both/
I agree. The differences in previous Uncharted MP modes were drastic. I'm glad the compromises in that mode are at least in service of 60fps now.

Pretty much this. I would like to see what he has to say about UC4 MP. It may be 900p, but it is running at 60FPS and has a lot of shit going on at times. And personally, I think the visuals overall look closer to the SP than previous games' MP modes were to their respective SPs, so I find it very impressive overall.
Yep, I accept the compromises made by devs to hit 900p if the game runs at 60FPS, or if it looks ahead of most other games in that specific mode. Since Battlefront and UC4 MP meet both those conditions, I think 900p may be a good choice. Furthermore, we are likely to still see improvements to UC4 MP, since the beta launches next month.

I agree. The differences in previous Uncharted MP modes were drastic. I'm glad the compromises in that mode are at least in service of 60fps now.
I will say I'm not a fan of 900p myself at 30fps, but the added temporal resolution of 60fps helps a lot, especially when it's something as impressive as Battlefront or Uncharted 4.
TRDE on PS4 definitely wasn't a locked 60, but its average was around 52 or 53, so even if it dropped, it maintained a good amount of consistent smoothness, far beyond what the 30fps-and-below TR2013 was on the last generation.
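As an aside on how an "average of 52 or 53" is usually computed: frame-rate tools derive it from captured frame times, and the honest average is total frames divided by total time (the harmonic mean of the instantaneous fps values), not the arithmetic mean of per-frame fps readings. A small sketch with invented frame times (not actual TRDE data) shows the difference:

```python
# Sketch: averaging a variable framerate. Averaging per-frame fps readings
# overstates smoothness; the time-weighted average is total frames / total time.
# The frame times below are invented sample data, not TRDE measurements.

frame_times_ms = [16.7] * 60 + [33.3] * 10  # mostly 60fps with a 30fps stutter

total_time_s = sum(frame_times_ms) / 1000.0
true_avg_fps = len(frame_times_ms) / total_time_s  # time-weighted (harmonic mean)

# naive approach: convert each frame to fps, then take the arithmetic mean
naive_avg_fps = sum(1000.0 / t for t in frame_times_ms) / len(frame_times_ms)

print(f"time-weighted average: {true_avg_fps:.1f} fps")
print(f"naive per-frame average: {naive_avg_fps:.1f} fps")
```

Here the naive average reads about 3fps higher than the time-weighted one, because the stutter frames occupy a disproportionate share of the wall-clock time.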
The beach scene runs better than the catapult sequence. That opening area really is the choppiest area in the entire game, you know. By far.

There are scenes that task the GPU much more than this; even the beach scene, which is still early on, is a tear on the framerate...
I think the fact that UD is such a slow game yet still struggles is more disappointing. The scene in Ryse that drops into the teens features loads of enemies on screen and lots of alpha effects. In comparison, the areas that drop into the teens in UD have basically one or two characters on screen at most. In my case, the type of game in question does not change how I feel about frame-rate. It doesn't matter if it's an action game or not; I like smooth frame-rates for visual reasons, not control reasons.

I don't think UD's framerate is great, I never said so; I'm only less harsh on it due to the slow nature of its gameplay, and that's only comparing it to something like Ryse, which is action heavy. I don't think I'm being unfair here in logic/reasoning.
I've never doubted that. I think it's ridiculous how you belittle developers that struggled with SPU usage, however, as if it were a trivial thing. Not every studio is fortunate enough to have brilliant coders who can deal with this stuff. That doesn't mean they are being lazy. Coding well for SPUs is a very, VERY difficult task, and not everyone can pull it off.

Btw Dark, all that extra power to run ZOE2 at 60fps, with bokeh DOF, high-res alpha effects and a better resolution, was all found in the hardware, which was actually utilized. Hey, the PS3 has SPUs, why don't we use the darn things to get good results? Oh, a foreign concept indeed...
Yes, you're harsh.

Am I harsh for wanting more Hexadrive-type efforts? (In essence, double-the-framerate efforts, higher-resolution efforts, higher-quality-effects efforts.) I don't think so, when lower-end hardware is actually accomplishing just that... So no, I don't think I'm wrong in wanting Alien Isolation, Xenoverse or RE-R2 to have been a solid 60fps on a PS4, when a 750 Ti does just that.
Really good post here, but I fear it's for naught. Thelastword will always stay in his belief that if a game runs better on the i3/750Ti combo than on the PS4, then it's a horrible port and the devs are lazy. If you follow his posts, you'll realize that he keeps bringing up the same damn games. I'll be honest and say that I too think the PS4 should outperform that combo, but I don't go whining about it in every thread.

*snip*
Don't look good? I disagree, they look pretty good to me.
Isn't an i3 stronger as a CPU than the 6 core Jaguar in the PS4?
Yeah, most i3s should be stronger than the Jaguar cores. That's probably what's giving it the edge in cross-gen games. Some current-gen-only games that may be optimized better for consoles, like F1 2015, Project Cars, Evolve and Star Wars Battlefront (the beta, at least), run better on the PS4. My theory is that the i3 allows the combo to perform better than the PS4 in cross-gen games based on brute force on the CPU side. I could be wrong, though.

Isn't an i3 stronger as a CPU than the 6-core Jaguar in the PS4?
Yep, that's exactly right. It's not the 750 Ti that he keeps bringing up, it's the CPU. It was always the CPU.

Yeah, most i3s should be stronger than the Jaguar cores. That's probably what's giving it the edge in cross-gen games. Some current-gen-only games that may be optimized better for consoles, like F1 2015, Project Cars, Evolve and Star Wars Battlefront (the beta, at least), run better on the PS4. My theory is that the i3 allows the combo to perform better than the PS4 in cross-gen games based on brute force on the CPU side. I could be wrong, though.
Yes, most definitely. It can even hold its own against "actual" eight-core AMD CPUs with higher clockspeeds (the console processors have eight cores too, but two are reserved for background functionality). The Jaguars in the PS4 and X1 aren't gaming-oriented in the slightest; they're intended for low-power mobile devices, which is what happens when profitability replaces power as the primary goal. Considering how much the PS3 and X360 cost Sony/MS, though, it's not a surprise, and I doubt we'll see a similar loss-leading approach ever again.
Yeah, but if games were designed and optimized for the 6-core Jaguar architecture from the outset, like the technical director at 4A Games recommends, there would be no issue in that regard, right?
I wonder why devs don't just do that for multiplats with PC; wouldn't that theoretically make it easier to scale up to more powerful components on PC? Or is it still the case that PC games are not well suited to high core counts?
http://www.gamepur.com/news/12874-battlefield-4-uses-90-95-cpu-power-ps4-and-xbox-one-says-dice.html

Johan explained that Battlefield utilizes 90-95% of the 6 parallel CPU cores available to developers on the octa-core AMD Jaguar that powers both the PS4 and Xbox One.
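For what it's worth, the way engines like Frostbite reportedly reach that 90-95% figure is job-based parallelism: each frame is broken into many small, independent tasks that are fanned out across a fixed pool of workers, rather than running one big thread per subsystem. A toy sketch of the idea follows; the job names and counts are made up, and a real engine uses a custom fiber/job scheduler rather than Python threads:

```python
# Toy sketch of the job-based parallelism consoles push engines toward:
# a frame is split into many small independent jobs fanned out across a
# fixed worker pool (6 cores are available to games on PS4/XB1), instead
# of dedicating one large thread to each subsystem.

from concurrent.futures import ThreadPoolExecutor

GAME_CORES = 6  # cores available to titles on PS4/Xbox One

def run_job(job: str) -> str:
    # A real engine would do culling, animation, physics islands, etc. here.
    return f"{job} done"

# Hypothetical per-frame job list (names invented for illustration)
jobs = ([f"cull_chunk_{i}" for i in range(8)]
        + [f"animate_actor_{i}" for i in range(8)])

with ThreadPoolExecutor(max_workers=GAME_CORES) as pool:
    results = list(pool.map(run_job, jobs))

print(f"{len(results)} jobs completed on a {GAME_CORES}-wide pool")
```

Because the jobs are small and uniform, the scheduler keeps all six workers busy until the frame's work runs out, which is how utilization in the 90%+ range becomes achievable.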
What do you mean by "made and optimized"? The CPUs are what they are. Games have already gotten bigger and do more; also keep in mind that you need the CPU for the GPU stuff. The GPU depends on the CPU. They are also nothing special in any way, besides being slow and missing features other CPUs already have. Also, games are already tailored for these systems; otherwise we would see even worse performance.

Yeah, but if games were designed and optimized for the 6-core Jaguar architecture from the outset, like the technical director at 4A Games recommends, there would be no issue in that regard, right?
I wonder why devs don't just do that for multiplats with PC; wouldn't that theoretically make it easier to scale up to more powerful components on PC? Or is it still the case that PC games are not well suited to high core counts?
I find it very hard to believe that multiplats are not already tightly optimized for the multicore consoles. In fact, this goes against the great scaling we've seen in PC games, with activity on 8 cores or even more. Have you forgotten about Battlefield 4?
http://www.gamepur.com/news/12874-battlefield-4-uses-90-95-cpu-power-ps4-and-xbox-one-says-dice.html
At this point, shall we take the plunge and dare to face reality? Those Jaguar cores are weak, to put it politely.

What do you mean by "made and optimized"? The CPUs are what they are. Games have already gotten bigger and do more; also keep in mind that you need the CPU for the GPU stuff. The GPU depends on the CPU. They are also nothing special in any way, besides being slow and missing features other CPUs already have. Also, games are already tailored for these systems; otherwise we would see even worse performance.
I'm one of those who don't like variable framerates, and it bothered me in Second Son and Killzone, but in TRDE I had no problem with it.
So we are in agreement that devs do what they can with what they have here, but at the same time it is true that if you refuse to acknowledge the limits of your target hardware, performance is never going to be satisfactory. That is true, but I don't think they have gone too far in that respect; sure, it drops frames for now, but they are trying to keep it playable.

Well, we know that these Jaguar cores are not powerful by default... I'm not arguing otherwise.

So you want them to reduce the scope and ambition of their games? I don't think this is justified for now; even if the framerate is not rock solid, performance is still acceptable. We are not talking about 20fps for extended periods of time, like Crysis 3 on PS3 or Skyrim on the same platform.

My thing is, a lot of the games coming out seem to be trying to go beyond their means, especially in regard to FPS. But if you're making a game well within the scope of the hardware you've got, performance should not be all that much of an issue, as long as it's not overboard.

Essentially your position is that devs should not push those CPUs too far, but as it stands I believe they have not. Drops do happen, but they do not dilute the experience that Fallout 4 provides.

If your game design fits your hardware, then performance should not suffer all that much, even if you're trying to be more ambitious, is my main argument.
Is that a wrong assumption to make?
Yeah that's weird.. variable frame rates really bothered me as well in inFamous & Shadow Fall... so much so that I locked inFamous: SS to 30fps when that option was patched in.
I had no such issue with TRDE on PS4. Anyone know the technical reason why this would be?
An i3 is 2-3 times faster than a Jaguar in IPC. I pointed it out last year, and even delving into Intel's notebook parts, I found the Celeron is twice as fast in per-watt terms.
I wanted some different tests, as an i3 is also dual core. Since AMD, AFAIK, hasn't released a six-core version of Jaguar, a desktop Athlon 5350 (a 4-core Jaguar clocked at 2GHz) might be a good starting point. They did a 5150 at 1.6GHz like the PS4, but it's also only a 4-core.
An AMD Phenom II 955 from 2009 is 2x faster. Jaguar seems to be roughly in the ballpark of an Athlon 620 in scores, but the 620 wasn't far behind the Phenom II in IPC. Jaguar is the lowest of the low, as it's a ~25W notebook CPU.
Alien Isolation doesn't take much to run; in PC terms, yes, it runs on a toaster.
But a 55W Celeron is faster, as is a ~25W Celeron, from what I checked last year. Perhaps the Jag is just shy of it, but in CPU terms this is one of the least demanding games. It being made by a small dev team, I think we should be happy they did all those versions, and not hang our hats on it, which lastword often does, along with other one-off situations.
For testing, NXGamer throws in an A8 or A10, but that is faster than the Jaguar.
Anyway, I'd certainly be interested in how low you can go in desktop-CPU terms to run Alien Isolation. Can a Jaguar 5150 run it at 60fps?
I'm not sure how much overhead the Windows OS takes, or how apples-to-apples you can get when we go this low. In PC terms there's literally no need to buy a CPU like this; the price becomes quite silly, and you don't need to go this low when CPUs 2-3 times faster are available for cheap. The used CPU market is a system builder's paradise, as Intel hasn't progressed much in the last 5 years and ultra bargains can be found.
So you want them to reduce the scope and ambition of their games ? I don't think this is justified for now, even if the framerate is not rock solid performance is still acceptable. We are not talking about 20fps for extended periods of time, like Crysis 3 on PS3 or Skyrim on the same platform.
Fallout 4 performs much better than those two. It should perhaps perform even better in some cases, Bethesda might be working on something already.
According to sebbbi (a Ubisoft dev) on Beyond3D, a 1.6GHz Haswell core is twice as fast as a Jaguar core. So a 3.2GHz Haswell core is 4 times faster.
DX12 is going to make those CPUs fly.
I'm not convinced they haven't already made the most of the Jaguar cores, considering the game on PC is very well multithreaded, and DX11 is famously not that multithreading-friendly, particularly on AMD. I assume console APIs do not have the same limitation. DX12 scaling seems very impressive compared to DX11.

I think you're absolutely correct on Fallout 4. I don't think they actively need to scale back the engine load for that game; I feel it's a matter of slightly more efficient coding being necessary to get that FPS up a bit more. Bethesda by default doesn't really think about technical polish until afterward, so we'll have to wait and see.
Fine by me, that is; I do not have the pretension of speaking for anybody other than myself. I understand some do not accept drops below 30fps at all, but this is difficult to maintain in every situation as games become more and more ambitious visually. You know as well as I do that visuals take precedence over performance and resolution; naturally, the Xbox One is hit harder and sooner.

And I also think you're correct that a majority of games are fine for now in regard to the trade-off of CPU power.
Treyarch may have gone a bit too far indeed, although they deserve a lot of credit for the visual spectacle of the SP campaign; it looks really great.

But some games I think do go too far already, like COD BO3. Sustained drops far below 60, even with dynamic res, are unnecessary, considering how AW ran far better and more consistently while at the same time looking better graphics-wise.
Likewise, it is just my opinion that their trade-offs are valid. Yes, it is not consistent performance, and again I wish to emphasize that your opinion is completely valid, but gamers are very quick to point out graphical flaws, especially in AAA games, so I understand why they chose this aspect to take center stage. I suspect they would have gotten far harsher complaints about toned-down visuals even if their games had sustained 30fps (Unity) or 60fps (COD).

Having played the campaign, I don't think shoving a bazillion robots on screen at once was a fair trade-off for performance. But I also felt the same about AC Unity in regard to its NPC count, so that may just be me. It comes down to each individual game.
I don't know, but what it could mean is that a single 3.2GHz Haswell core can handle the workload of four 1.6GHz Jaguar cores in the same time frame. That would explain why a Core i3 does so well: two physical cores and two logical ones.

How does that actually work, by the way? If a single 3.2 Haswell core is hypothetically 4 times faster than a 1.6 Jaguar core, and there are 8 of those cores in each CPU, do they stack together?
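Taking the thread's numbers at face value (roughly 2x per-clock throughput for Haswell, linear scaling with clock speed, and perfect scaling across cores — all simplifications that real workloads violate), the napkin math for how cores "stack" looks like this:

```python
# Napkin math for the "do they stack?" question, using the thread's figures:
# a Haswell core does ~2x a Jaguar core's work per clock (per the sebbbi quote),
# and we assume linear scaling with clock and perfect scaling across cores.
# Neither assumption holds exactly in practice.

JAGUAR_CLOCK = 1.6   # GHz, PS4
HASWELL_CLOCK = 3.2  # GHz, a typical desktop i3
IPC_RATIO = 2.0      # Haswell vs Jaguar work per clock, per the quoted claim

# One Haswell core relative to one Jaguar core
per_core_ratio = IPC_RATIO * (HASWELL_CLOCK / JAGUAR_CLOCK)  # 2.0 * 2.0 = 4.0

# Aggregate throughput in arbitrary "Jaguar GHz" units
jaguar_total = 6 * JAGUAR_CLOCK * 1.0     # 6 cores available to games
i3_total = 2 * HASWELL_CLOCK * IPC_RATIO  # 2 physical cores, ignoring HT

print(f"one 3.2GHz Haswell core ~ {per_core_ratio:.0f} Jaguar cores at 1.6GHz")
print(f"6 Jaguar cores: {jaguar_total:.1f} units vs dual-core i3: {i3_total:.1f} units")
```

By this crude accounting, a dual-core i3 slightly out-muscles the six Jaguar cores available to games even before Hyper-Threading is counted, which lines up with the benchmark results people keep citing in this thread.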
I mean, have you seen the reception Syndicate had? Guess which aspect got a lot of flak. Yet performance is very good on PS4 and Xbox One, exactly what they did not go for with Unity.
Yet their performance achievement is seldom underlined. It is possible they reverse course and set the graphical bar higher in the next entry, with lower performance as a result.
Performance aside, I am really disliking Lara's new face.
Balanced is subjective; they do have a balance, actually, it's just that it leans more towards visuals than performance these days. Syndicate looks better to me than Far Cry 4, hence the drop in resolution.

I don't think that's necessarily the case. Far Cry 4 looks way better than Assassin's Creed this gen, and yet hits its performance target consistently. Visuals and performance should be balanced correctly, not just one or the other.
Unity perhaps pushed the consoles too hard and we saw the results; however, if the reception of Syndicate's visuals is not good enough by their standards, then it is possible they will put further emphasis on graphics and aim for the same resolution.

I think the outcry against Assassin's Creed these days is more that the game itself is tired and, more importantly, that the cutbacks are already obvious because of what we know they went with in Unity.
Since we know they went for crazy NPC counts in Unity at the cost of performance, simply cutting out the NPCs in the next game was not going to be received well, even if it had higher performance.
Actually, I'm not certain the number of NPCs was the only culprit; rather, the impressive amount of geometry must hit GPUs pretty hard. Syndicate also impresses in that department, but lacks the beautiful lighting. Once again, a matter of balance: they have done well in regard to performance, but some would disagree.
That will not be a problem on upper-mid-range PCs (7970/770 and up) anyway.
The outrage is still not justified. Gamers wanted solid performance? Ubisoft delivered just that. They can't blame Ubisoft for making trade-offs with console hardware.

If Unity had not had that high NPC count at the cost of performance to start with, nobody would have known what was cut and what was kept at the cost of performance, and thus no outrage.
You can't have it all, and that applies to the PC platform as well, to a degree.
You said Ryse only dropped to the low 20s; I said it dropped to the teens. I showed you where it did, and now your argument is "this is the only area where it drops to the teens"... I have proven what I was saying, yet you are just moving the goalposts. I never expected you to be doing this, tbh...

The beach scene runs better than the catapult sequence. That opening area really is the choppiest area in the entire game, you know. By far.
Both consoles have the same CPU, so how is it that you want to push a CPU angle when much weaker CPUs than the i3 are running all the games I mentioned at 60fps with the weaker 750 Ti? How is it that the same CPU and a weaker GPU (XB1) are running RE-R2 at 60fps at 1080p when the PS4 is not? So the 40% additional headroom of the PS4's GPU can't give it an advantage over the XB1 in a GPU-bound game when they're both at 1080p, when the PS4 is struggling with GPU-related effects? Is it the CPU doing the rendering of the grass in RE-R2?

Yep, that's exactly right. It's not the 750 Ti that he keeps bringing up, it's the CPU. It was always the CPU.
It is the inverse problem of PS3 where you had a powerful CPU coupled with a terrible GPU.