Could it have been possible (or mattered) if Xbox got bumped to 10-12GB of RAM?
This is the Nintendo line of defense. It's the Alamo. And while it worked for the Wii, it wasn't quite so successful the second time out.

There are many factors that lead to the success or failure of a console, some more than others. Power really can matter. And timing. And price. And message. It all factors in. Whatever game can be made on weaker hardware can be made better on more capable hardware. No company has a monopoly on creativity, but every game designer eventually hits a technological wall, a point where the hardware says, "Sorry, no further," and they have to start making compromises. Hard choices, sometimes. And cuts. With more power, you get to push further before you start making those compromises.

I believe it's great games that ultimately sell consoles and lead to success. Even with a shaky start, a console with enough must-have games will get back into the race. And a power gap (all other things equal) allows devs to create the best versions of great multiplatform titles, or fantastic first-party exclusives. Power matters. It's not all there is to it, but it's still a factor.
People saying power doesn't matter... well, let's just say I wonder about their motivations.
We really don't have an exact PC equivalent of the PS4 GPU. It sits right in the middle between the 7850 [16 CUs] and the 7870 [20 CUs].
It's closer to the 7850 than 7870.
So that guy who is part of the silicon maker is lying about over 200 GB/s of data? Have you watched the architecture panel?
Is the PS4 GPU more powerful than the 7850?
They've lied before.
Yes, a little.
Goddamn, X360 is beating even the PS4. The true RAPID SPEED RAM.
The only standard feature that the PS4 has over either is bandwidth I think.
It's not only that. The PS4 GPU has 2 more Compute Units than the 7850, and it has a different architecture for direct compute [8 schedulers with 8 pipelines each; the standard is 2-2], plus a streamlined communication path between the GPU and CPU.
True, but I was indicating that the bandwidth is also higher compared to the 7870 as well.
I'd like to know the visual or real-world performance difference when it comes to multiplat titles. Guess we'll be waiting for Eurogamer's famous comparisons.
Again, no. The PS4 has 176 GB/s of RAM bandwidth that must be shared between the GPU and CPU. [7850: 130 GB/s, 7870: 175 GB/s]
Sony chose their bandwidths to eliminate bottlenecks. The PS4 GPU will use ~150 GB/s, which leaves a good amount for the octa-core Jaguar [~30 GB/s].
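The arithmetic in that post is easy to sanity-check. Quick sketch below; note the ~30 GB/s CPU share is the poster's estimate, not a confirmed spec:

```python
# Sanity-checking the split described above. The 176 GB/s total is the
# published GDDR5 figure; the ~30 GB/s CPU share is an assumption from
# the post, not a confirmed spec.
TOTAL_BW_GBPS = 176.0   # PS4 unified GDDR5 bandwidth
CPU_BW_GBPS = 30.0      # assumed ceiling for the eight Jaguar cores

gpu_bw = TOTAL_BW_GBPS - CPU_BW_GBPS
print(f"Left for the GPU: ~{gpu_bw:.0f} GB/s")  # ~146 GB/s
```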
So we may well see 30fps vs 60fps games then.
Lol is this graphic really posted to MS? My God 360 seems next generation here.
The Xbox 360 has 22.4 GB/s of GDDR3 bandwidth and a 256 GB/s of EDRAM bandwidth for a total of 278.4 GB/s total system bandwidth.
The 7770 shown does not have this same hardware architecture, some low-level bits are different, and there is no ESRAM...
In other words, it's a bit misleading.
Sebbi on B3D said it's more likely that resolution would drop rather than framerate, based on looking at it from their architectures.
We all know the PS4 has the potential for a sizeable lead, but the schoolyard graphs and simple number comparisons are sad.
Why is it sad? Given that the PS4 GPU will also be faster than the one in the graph, this should be a decent indication of the performance differences.
Well, the difference will come down to a developer's willingness to exploit the advantages on offer with the PS4... some won't.
The Xbone GPU is a 7850 without 4 CUs [12 CUs], and the PS4 GPU is a 7850 with 2 additional CUs [18 CUs].
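If those CU counts are right, you can turn them into rough theoretical throughput. The clocks used here (800 MHz PS4, 853 MHz Xbone) are the commonly reported figures, so treat this as a back-of-the-envelope sketch, not confirmed spec:

```python
# Rough theoretical throughput from the CU counts above. GCN: 64 ALUs per
# CU, 2 FLOPs per ALU per cycle (fused multiply-add). The clocks are
# assumptions based on commonly reported figures.
def gcn_gflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz

ps4 = gcn_gflops(18, 0.800)    # ~1843 GFLOPS
xbone = gcn_gflops(12, 0.853)  # ~1310 GFLOPS
print(ps4, xbone, ps4 / xbone)
```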
That's not the worst one:
The Xbox 360's CPU has more general purpose processing power because it has three general purpose cores, and Cell has just one.
Has this been confirmed?
Yes, all the old VGleaks/Digital Foundry rumors were 100% spot on [except the crazy GDDR5 amount]. Both the Xbone and PS4 use the standard GCN architecture from the 78xx series, with Sony making a few specific changes inside the GPU to make it better for gaming.
Yes.
EDIT: Read "to" as "by". I dunno if anyone has rubbed their noses in it.
http://mobile.pcauthority.com.au//Article.aspx?CIID=23155&type=Feature&page=4
They sure like to add numbers together.
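For what it's worth, the 278.4 GB/s headline really is just the two buses summed. A trivial sketch of the math being mocked here:

```python
# The 278.4 GB/s figure is literally the two buses added together, even
# though the 256 GB/s side only ever reaches the 10 MB EDRAM framebuffer.
gddr3_bw = 22.4   # GB/s, main memory
edram_bw = 256.0  # GB/s, internal to the EDRAM die
print(gddr3_bw + edram_bw)  # 278.4
```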
I'm not a techie, so can someone explain why they went with 102 GB/s ESRAM and not the EDRAM of the X360, which seems to have huge bandwidth?
Didn't the rumors put XB1 more in line with the 77 series? First I'm hearing it's similar to the 78 series like the PS4.
Gemüsepizza said: It's GCN. And it needs eSRAM because it has DDR3 RAM instead of GDDR5 RAM.
What's "more likely" is up to developers. They decide if they use the increased hardware power for higher resolutions or for higher framerates. In both cases the GPU has to render more pixels.
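Either choice costs pixels per second. Quick sketch with the usual target resolutions (illustrative numbers, not tied to any specific game):

```python
# Both options mean more pixels per second; the targets below are the
# common ones from this generation's comparisons.
def pixels_per_second(width, height, fps):
    return width * height * fps

p1080_60 = pixels_per_second(1920, 1080, 60)
p1080_30 = pixels_per_second(1920, 1080, 30)
p720_60 = pixels_per_second(1280, 720, 60)
print(p1080_60 / p1080_30)  # 2.0  -- 60fps doubles the pixel rate
print(p1080_60 / p720_60)   # 2.25 -- 1080p is 2.25x the pixels of 720p
```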
Cloud bullshit needs to stop, at least in terms of it helping the game to render better visuals in real time. ~95% of the game rendering pipeline requires extremely fast access times, and the remaining 5% could receive help from the cloud, but it would not be much. Global Illumination states could be rendered in the cloud and buffered to the console for later use [no one will notice if the global lighting is off by 5 minutes of sun travel time].
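To put a number on that latency tolerance, here's a toy sketch assuming a real-time day/night cycle (illustrative assumption, not from any actual game):

```python
# Toy illustration of the "off by 5 minutes of sun travel time" point:
# with a real-time 24h day/night cycle, a lighting state that arrives
# 5 minutes late puts the sun direction off by barely a degree.
SUN_DEG_PER_SEC = 360 / (24 * 3600)  # sun angular speed over 24 hours
delay_s = 5 * 60                     # GI state arrives 5 minutes late
error_deg = SUN_DEG_PER_SEC * delay_s
print(f"Sun direction off by {error_deg:.2f} degrees")  # 1.25 degrees
```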
Holy Hell! I had forgotten that one. Their bullshit knows no limits.
Technically, this and the bandwidth figures are true. Too bad the Cell's SPEs more than make up for its single general-purpose core, and the vast majority of that bandwidth can only be used to access a small framebuffer.