Then what is it?
1.23 versus 1.84!!
The thing that doesn't make sense is that the Xbox 360 is already pumping out third-party games with suboptimal framerates, so why would they go into a new gen with a low-power GPU pumping out more suboptimal framerates compared to PS4?
All the services and gimmicks in the world will not excuse poor-performing console versions. That just doesn't gel with me.
And if they just make third-party games with Durango in mind, it sets third-party games back graphically, with Durango being the lowest common denominator.
This is the problem that happens when you have one console that is much more powerful than the other.
You must have a poor memory. Adding AA and 480p alone set it apart from PS2, which usually ran at sub-480p and only had 480i modes.
1.23 versus 1.84!!
Seriously though, as per every other generation, I suggest you all wait until you see and then actually play the games. I'm not downplaying the fact that PS4 can leverage more power than Durango according to the specs, but that does nothing for me if I want to play, say, PGR5 (hopefully) or the next Halo.
If it means that PS4 can have 2xMSAA at 1080p versus Durango at 1080p only, that is already a big difference. And considering how popular Digital Foundry comparisons are, next gen will not make Durango stand out well.
But then we have that EA guy who said that both consoles are comparable in performance.
So maybe the extra stuff in Durango can offset it, making the difference between it and the PS4 only 10%.
Damn, I hope some stuff leaks this week from the internal Microsoft presentation.
April is too far away.
Even if the PS4 version is better, Digital Foundry would probably still say "buy the 720 version".
You have an interesting definition of 'significantly more powerful'.
This. We have an apples-to-apples comparison with these new systems, unlike systems of the past. It's not as complex a comparison as PS2/GC/Xbox or PS3/360. So bringing up old games like Splinter Cell is silly. If the rumored specs are true, PS4 will have better looking games. End of story. It will have, in general, better looking first-party games, and it will have slightly to moderately better looking and performing third-party games. The difference between 1.2 and 1.84 TFLOPS of the same architecture (not to mention much faster RAM) is quite a large difference, and it's funny seeing it get downplayed so much. Anybody who thinks third-party devs won't take advantage of that extra power to AT LEAST give the PS4 version of games some better AA, better textures, and/or a better framerate is kidding themselves.
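For reference, the raw compute gap the post refers to can be checked quickly (a rough sketch; both TFLOPS figures come from the rumored specs discussed in this thread, not confirmed hardware):

```python
# Rumored peak compute figures (TFLOPS) from the leaked specs.
ps4_tflops = 1.84
durango_tflops = 1.23

ratio = ps4_tflops / durango_tflops
print(f"PS4 advantage: {ratio:.2f}x (~{(ratio - 1) * 100:.0f}% more raw compute)")
```

So the rumored gap is roughly a 50% raw-compute advantage for PS4, before any memory-bandwidth or OS-reservation differences are counted.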
Even for first party games, I think a Durango GPU in the PS4 ballpark range would greatly help them.
More power is always good, well, dependent on cost. However, I still say wait till you see the games!
Nah. According to the leaked slide (the Durango conference one with the "100% GPU efficiency" claim), 1080p with 2xMSAA should not be a problem on Durango.
Even if the PS4 version is better, Digital Foundry would probably still say "buy the 720 version".
The people in this thread arguing that PS4 will have better visuals are clearly wrong. I posted about this before. Both systems have the same graphics architecture and the same CPU; both GPUs are capable of EXACTLY the same effects. It's even closer than it was with PS3 and 360 (Nvidia vs. ATI, which had different methods for handling graphics).
The only differences we will see are:
resolution
frame-rate
That's pretty much it. If one system renders the same thing but slower, it will drop down in resolution. Look at a PC: is there really a difference between a 7950 and a 7970 other than speed? They both can render exactly the same stuff, but one would use a lower resolution to keep up.
Ok, now I'm confused... for you, better resolution or fps doesn't mean better visuals?
If you render at a lower resolution, then you are not rendering the exact same stuff.
Not sure if you're serious with this post at all. You're wrong. The constant posts saying they're the same when they are not. Again, stop using a PC setup as a reference.
Even if the PS4 version is better, Digital Foundry would probably still say "buy the 720 version".
Cheap shot, which is not even true.
Most multi-platform titles look and perform the same on PS360. So even if PS4 has the better specs, wouldn't MS force multi-platform developers to make the games as equal as possible?
People assume too much about the Xbox.
Going by the rumors, they don't have the same graphics hardware.
PS4: 18 Shader Cores, 1152 Ops/clock and 32 ROPS.
720: 12 Shader Cores, 768 Ops/clock and 16 ROPS.
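Those per-clock figures line up with the leaked TFLOPS numbers if you assume an 800MHz GPU clock and two floating-point operations (one fused multiply-add) per ALU lane per cycle, which is standard for GCN. A quick sketch:

```python
def peak_tflops(ops_per_clock, clock_ghz, flops_per_op=2):
    """Peak single-precision TFLOPS: ALU lanes x 2 (FMA) x clock."""
    return ops_per_clock * flops_per_op * clock_ghz / 1000

ps4 = peak_tflops(1152, 0.8)      # 18 shader cores x 64 lanes each
durango = peak_tflops(768, 0.8)   # 12 shader cores x 64 lanes each
print(f"PS4: {ps4:.2f} TF, Durango: {durango:.2f} TF")
```

That reproduces the 1.84 vs 1.23 figures quoted throughout the thread, so the two sets of rumors are at least internally consistent.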
I just can't see MS letting Sony have better specs. It would be the first time ever; I'm very curious how the devs will take it.
If it's for real, then we are looking at a new MS.
People need to also consider the OS drain on the graphical capabilities.
3GB of memory and 2 CPU cores reserved for the OS on Durango make the gap quite significant between the two.
But I do think NextBox will have games at the unveiling that look in the same ballpark, visually, as what we saw on 20th Feb. So from that p.o.v, the power difference won't matter to some.
Why? The only precedent being the Xbox? The 360 and PS3 were more or less equal in terms of power, and the Xbox was more powerful than the PS2 because it released much later.
This is the first time they've released consoles at the same time; there's no precedent at all to suggest MS couldn't possibly release a console that's slightly underpowered. Thinking like that is idiotic.
If you take two identical PCs,
one with Pitcairn with (PS4: 18 Shader Cores, 1152 Ops/clock and 32 ROPS.)
one with Cape Verde with (720: 12 Shader Cores, 768 Ops/clock and 16 ROPS)
what would be the expected benchmark results? I know apples and oranges maybe, but how would they bench compared to each other?
The 3GB for the OS figure is absolutely ludicrous and I can't believe that people are still throwing it around as though it's fact. If anyone can come up with anything that would require that much memory, go ahead, but otherwise I'm going to have a hard time taking it seriously.
I think, and this is the only logical conclusion i can draw, that by 3GB for OS he means 3GB reserved for the OS and other applications/services
Has anyone else heard from devs that they doubled a specific part of the hardware prior to Sony's announcement? Or is it already "public" knowledge?
I think it would be more relevant if you posted a table which indicates performance at 1920 x 1080 (or 1200) only, since it's the resolution most games will be aiming for.
Sorry, but can somebody educate me on how 32 ROPs are a significant advantage? Do bear in mind the shading power of both consoles and their targeted resolution. I ask because the TMUs and the extra CUs are a better measure of the PS4 GPU's superiority, not the extra ROPs.
The closest I can think of would be 7770 vs 7850:
7770: 10 shader cores, 16 ROPs, 1.28TF card
7850: 16 shader cores, 32 ROPs, 1.76TF card
This closes the gap between the two a little, as it's 10 vs 16 rather than 12 vs 18, as can be seen from the TF count.
While close, this comparison is crude: see: http://www.hwcompare.com/12033/radeon-hd-7770-vs-radeon-hd-7850/
The problem with this is the 7770 above is clocked at 1000Mhz rather than the expected 800Mhz, there are no direct comparisons I can find that are a closer approximation however.
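One way to reduce the clock mismatch noted above is to rescale both retail cards' peak throughput to the rumored 800MHz console clock (a rough sketch; real benchmark scaling is never perfectly linear, and the 7850's stock clock of 860MHz is assumed here):

```python
# (shader lanes, stock clock in GHz) for the retail cards
cards = {"7770": (640, 1.00), "7850": (1024, 0.86)}

results = {}
for name, (lanes, clock) in cards.items():
    stock = lanes * 2 * clock / 1000   # peak TFLOPS at stock clock (2 flops/lane/cycle)
    at_800 = lanes * 2 * 0.8 / 1000    # rescaled to the rumored 800MHz console clock
    results[name] = (stock, at_800)
    print(f"{name}: {stock:.2f} TF stock -> {at_800:.2f} TF at 800MHz")
```

At matched clocks the gap becomes exactly the 16:10 shader-core ratio (1.6x), slightly wider than the consoles' rumored 18:12 ratio (1.5x), so the stock-clocked retail comparison understates the 7850's side a little.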
No, the common belief is that MS would not have changed their specs at all. No downclocking, no upclocking. Nada.
BTW, were you told this? Or are you repeating BClifford's post on B3D?
Edge specifically said that Sony were trying to match Durango's 8GB of RAM several days before the PS Meeting.
If that is what you are referring to...
The difference is pretty much the same at 1920x1200. Since we're not even certain that games will actually be running at 1080p, might as well link this.
So if one were to normalize the data, then:
7850: 100%
7770: 66%
A ~34% difference in performance?
A friend from a PS3 dev. They were talking about the dev kit, and it appears that Durango would match the PS4's 8GB because MS feared the public freaking out over "the PS4 has twice as much RAM".
Not sure. I'm referring to a conversation that happened after the PS Meeting; it appears to be a reaction to that.
Let me see if I'm right.
Fillrate = ROPs * Hz
16 * 800MHz = 12.8 GPixel/s
32 * 800MHz = 25.6 GPixel/s
1920 * 1080 = 2,073,600 pixels per frame.
2,073,600 / 1,000,000,000 = 0.0020736 GPixels per frame.
For 60fps using only one render target of 1920 * 1080:
0.0020736 * 60 (as in fps) = 0.124416 GPixel/s
So even at 12.8 GPixel/s you can fill 102 render targets of 1920*1080.
So yeah, 32 ROPs is probably overkill for 1080p gaming.
But 32 ROPs seems to fit well with 24fps 4K movies.
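The fillrate arithmetic above can be sketched directly (assuming, as the post does, an 800MHz GPU clock and one pixel written per ROP per clock, which ignores blending and overdraw costs):

```python
def fill_rate_gpix(rops, clock_mhz=800):
    # One pixel per ROP per clock -> GPixels/s
    return rops * clock_mhz / 1000

frame_gpix = 1920 * 1080 / 1e9       # pixels in one 1080p render target, in GPixels
budget_60fps = frame_gpix * 60       # fill needed for one 1080p target at 60fps

for rops in (16, 32):
    rate = fill_rate_gpix(rops)
    print(f"{rops} ROPs: {rate:.1f} GPix/s -> {rate / budget_60fps:.0f}x one 1080p target at 60fps")
```

This reproduces the ~102x headroom figure for 16 ROPs; in practice that headroom is consumed by multiple render targets, blending, and overdraw, which is why raw fillrate alone doesn't settle the 16-vs-32 ROP question.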