Sorry, I was just using the letters as acronyms of a sort.
I could have also said Customized Unit and De-constructor Can't Barbeque.
Death by Bus...
edit: ah... it's a C... -.-
me stupid
Ditto. Some of you people are hilarious. 4X MSAA @ 1080p? MSAA is the most brute-force, expensive feature available today. I personally like 2X MSAA + High Quality FXAA. That's it. Any more and you might as well lower the resolution.
32 MB is 100% correct. I'll throw in one more:
102 GB/s
I think the 8GB figure is nonsense. DDR4 is not even in production; at CES it was still being validated and whatnot. Maybe MS got some advance order or some pre-JEDEC-certified order, but that is a hell of a risk to take for a mass-market console that has to be delivered on time. How high would you have to clock DDR3 to get that kind of bandwidth? DDR3 struggles to get above 2GHz in decent supply quantities as well.
I think 4GB of GDDR5.
Ok, consider me underwhelmed if that is the speed of the ED/ESRAM
It's too little, and too slow.
Ideally it'd be GDDR5 level at least, and be enough to store plenty of buffers in esram for processing without touching external ram.
10MB was supposed to be enough for 360 until deferred rendering turned up, and I think 32MB would be barely enough on Durango.
If you still end up moving buffers in and out of main memory, then the already limited bandwidth of DDR3 gets squeezed even more.
3(21) 432
CU 4x3x2?
CU (or CP) DCB
There are acronyms in computing, but none of those make sense to me for the box.
He said GAF is wrong. So it'd be nice to get a little more of a hint on the RAM side from him.
Ok, consider me underwhelmed if that is the speed of the ED/ESRAM
It's too little, and too slow.
Won't be a problem.
It's probably the bandwidth between the GPU and the eDRAM unit...the equivalent of the 32GB/s between the Xenos parent die and daughter die. It'd be a perfect scaling, 3.2x the memory, 3.2x the bandwidth. The internal bandwidth between the ROPs and the eDRAM memory is a different thing and will be as large as you'd expect.
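For what it's worth, the scaling claim checks out arithmetically. A minimal sketch, assuming the 10MB/32GB/s Xenos figures from the post above and the 32MB/~102GB/s Durango figures floated earlier in the thread:

```python
# Sanity-checking the "perfect scaling" claim: Durango's rumored ESRAM
# versus the Xbox 360's eDRAM daughter die.
xenos_edram_mb = 10      # Xbox 360 daughter-die eDRAM size (MB)
xenos_link_gbs = 32      # parent die <-> daughter die bandwidth (GB/s)
durango_esram_mb = 32    # rumored Durango ESRAM size (MB)

scale = durango_esram_mb / xenos_edram_mb   # 3.2x the memory
scaled_bw = xenos_link_gbs * scale          # 3.2x the bandwidth

print(scale)      # 3.2
print(scaled_bw)  # 102.4, right on the rumored ~102 GB/s
```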
Some of you guys saying you want AA and that 2xAA is okay have never wandered into the high-res screenshot thread where they're doing 4xSGSSAA and shit.
IQ matters less for console games; just give me 16xAF and high-quality SMAA at 1080p.
probably at very low frame rate....
Wrong.
probably at very low frame rate....
Depends on the game, but older games are totally doable with PC hardware.
Control Unit; Data Control Block
Maybe.
MS going mainframe. z/os on this biatch.
I'm driving to work in San Francisco and I hear over the radio that the PS4 and the next Xbox will be revealed soon, around GDC.
Some people here beg to differ I think.
DDR3 is widely available at 2133MHz; so far the only difference between early DDR4 and DDR3 is power consumption. After DDR4 takes over it gets cheaper, will be available at higher clocks, and can be stacked.
Maybe the 720 will use 2.5D stacking and DDR3 - I don't know anything. You'd have to ask around here; maybe someone will come forward.
Hmm, ok. Plus maybe the blitter comes in somewhere there too.
But that still means your GPU only has half the available bandwidth vs orbis, doesn't it?
They should be close enough; if you don't care for their exclusive software, you know what to buy already. As long as both consoles aren't at parity, so as not to be a clone of this gen, it's all good with me.
As our well-placed sources told us last year, custom silicon based on AMD's A8-series APU and HD 7670 GPU will be used in the PlayStation 4, while the Xbox 720 will combine an IBM PowerPC CPU and a custom version of AMD's 6670 GPU.
DDR3 is available @ 2133MHz, yes, but not in the quantities required to supply a console manufacturer shipping 10 million-plus consoles a year. The vast majority of DDR3 is supplied at speeds below 2000MHz; if you try to launch a console which uses high-spec DDR3, I guarantee you will run into supply issues, as your suppliers will be cherry-picking parts for you, and I am sure not a lot of the chips will be capable of such speed.
As for starting on DDR3 and switching to DDR4: that would take months and a significant redesign of the memory controller, and possibly impact program timings etc. Not something you really want to be doing. Throw in the fact that you would need a wider memory bus to hit such high bandwidth than a GDDR5 solution would, and you can see why I would throw out the DDR3/4 logic.
Wait, what, IGN?? They say it's the custom 6670, mwahhhh, not sure about that IGN.
http://uk.ign.com/articles/2013/01/16/next-gen-secrets-allegedly-leaked-to-nvidia-by-former-amd-executives?utm_campaign=ign+main+twitter&utm_source=twitter&utm_medium=social
edit: oh wait, they're citing sources a year old 0_0
jeovaTheThird said:hint : KYZIKPKNF DS VJIRD
deanos said:i know who told you that, tell him that they know and to expect a phone call.
Wait what ign?? they say its the custom 6670 mwahhhh not sure about that ign.
http://uk.ign.com/articles/2013/01/16/next-gen-secrets-allegedly-leaked-to-nvidia-by-former-amd-executives?utm_campaign=ign+main+twitter&utm_source=twitter&utm_medium=social
SHIT JUST GOT REAL
32 MB is 100% correct. I'll throw in one more:
102 GB/s
How much DDR3?
DDR3 2133 with 384-bit bus = 102.3 GB/s
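The arithmetic behind that figure, as a minimal sketch: peak DDR bandwidth is just the effective transfer rate in MT/s times the bus width in bytes (the function name here is mine, for illustration):

```python
def peak_bandwidth_gbs(transfer_rate_mts: float, bus_width_bits: int) -> float:
    """Peak theoretical DDR bandwidth in GB/s (decimal gigabytes).

    transfer_rate_mts: effective transfer rate in megatransfers per second
    bus_width_bits:    memory bus width in bits
    """
    bytes_per_transfer = bus_width_bits / 8
    return transfer_rate_mts * bytes_per_transfer / 1000

# DDR3-2133 on a 384-bit bus:
print(peak_bandwidth_gbs(2133, 384))  # 102.384, i.e. the ~102 GB/s above
```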
LOL! Yeah, he's been offline for the past 15+ minutes. I love this bitter "drama".