
Rumored Chinese Forum Xbox720 specs: 8CoreCPU,8GB,HD8800GPU,W8,640GBHDD

AzaK

Member
Jun 11, 2011
8,363
1
0
32MB of ESRAM isn't cheap either, especially when you have to embed it within the GPU itself.

Both companies went their own ways for ram implementation.

I think it's premature to determine which one is the right choice.

Both will be the right choice, because apart from exclusives the games will be identical no matter whose hardware is "weakest".
 
Aug 27, 2009
10,999
0
0
We need to assume that both MS and Sony have done their maths and optimised their architectures for end-to-end efficiency.

If MS had skimped on bandwidth, they wouldn't need a decent GPU, because it'd just be sitting there waiting for data, and that would be a waste of money to include it. So you have to assume that, along the pipeline overall, DDR3 with 60-70 GB/s is enough.

Exactly. I expect the next Xbox to perform like a well oiled machine, while perhaps not looking as good when you look at the separate components on paper. You know, the same way Xbox 360 overcame its alleged deficiencies compared to the much stronger (again, on paper) PS3 because it was a brilliantly optimized design.

But who knows, maybe my expectations will be shattered. Then again, I'm by no means a graphics whore, and I would much rather see next gen games improved in other ways. A large pool of memory, even if it's relatively slow by today's standards, offers exactly that opportunity.

By the way, there's been a lot of talk about the possibility of ray tracing, but what about voxels and point clouds (you know, the thing used in Euclideon's "infinite detail" technology)? Voxels are known to be very demanding when it comes to memory, so Durango's large memory pool seems like a natural fit.
 

eso76

Member
Jun 25, 2004
14,919
67
1,615
44
We need to assume that both MS and Sony have done their maths and optimised their architectures for end-to-end efficiency.

If MS had skimped on bandwidth, they wouldn't need a decent GPU, because it'd just be sitting there waiting for data, and that would be a waste of money to include it. So you have to assume that, along the pipeline overall, DDR3 with 60-70 GB/s is enough.

Yep, that's exactly what I've been saying.

It wouldn't make sense for them to go with a decent/good GPU and gimp it with low-bandwidth RAM.

An AMD 78XX comes with 2GB of GDDR5, without which the card just stops being a 78XX. Might as well opt for something weaker and cheaper if you're going that route, and we shouldn't even consider the possibility that MS engineers overlooked that.
They certainly did their maths, and questioning their experience is just silly.

In short, it wouldn't make sense to intentionally cripple the GPU, and it certainly wasn't a mistake on their side.

(I hope)
 
Aug 27, 2009
10,999
0
0
In short, it wouldn't make sense to intentionally cripple the GPU, and it certainly wasn't a mistake on their side.

(I hope)

As long as Nick Baker and his team are around (and according to his LinkedIn profile, he's still the general manager of Xbox console architecture), I have faith that they won't make any such oversights. Among other things, he's the one who realized that PS3's memory subsystem and Cell's complex architecture would pose problems for most developers back before the two machines even launched.
 

mrklaw

MrArseFace
Jun 10, 2004
59,895
2
0
Windsor, UK
Yep, that's exactly what I've been saying.

It wouldn't make sense for them to go with a decent/good GPU and gimp it with low-bandwidth RAM.

An AMD 78XX comes with 2GB of GDDR5, without which the card just stops being a 78XX. Might as well opt for something weaker and cheaper if you're going that route, and we shouldn't even consider the possibility that MS engineers overlooked that.
They certainly did their maths, and questioning their experience is just silly.

In short, it wouldn't make sense to intentionally cripple the GPU, and it certainly wasn't a mistake on their side.

(I hope)

However... if they have specced a slightly lower-powered GPU (whether for cost or performance reasons), then they would have a lower theoretical target to hit. But then if they hit 80% utilisation and Sony 60%, it becomes a wash.
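To put numbers on that "wash" idea: effective throughput is peak multiplied by achieved utilisation, so a weaker part driven harder can match a stronger part driven softer. The peak TFLOPS figures in this sketch are purely hypothetical placeholders, not from any leak; only the arithmetic is the point.

```python
# Hypothetical illustration of the utilisation argument above.
# Peak TFLOPS values are invented; only the arithmetic matters.

def effective_tflops(peak_tflops, utilisation):
    """Throughput actually achieved: peak scaled by how busy the GPU is kept."""
    return peak_tflops * utilisation

# A 1.2 TFLOPS part kept 80% busy matches a 1.6 TFLOPS part kept 60% busy.
weaker_but_efficient = effective_tflops(1.2, 0.80)   # ~0.96 effective TFLOPS
stronger_but_idle = effective_tflops(1.6, 0.60)      # ~0.96 effective TFLOPS
```

With those (made-up) numbers the two machines land in the same place, which is exactly the "it becomes a wash" scenario.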
 

specialguy

Banned
Feb 17, 2011
4,527
0
630
We need to assume that both MS and Sony have done their maths and optimised their architectures for end-to-end efficiency.

If MS had skimped on bandwidth, they wouldn't need a decent GPU, because it'd just be sitting there waiting for data, and that would be a waste of money to include it. So you have to assume that, along the pipeline overall, DDR3 with 60-70 GB/s is enough.



The fun part is how the edram works, how much of the rendering pipeline can genuinely use that to save main memory bandwidth, and what special sauce there is. (Somewhat offset by promises of magic from edram with 360 that didn't entirely pan out)


Plenty of unclear details to fuel crazy speculation. Love it :)

yeah, of course it's not "just" 70 GB/s. It's 70GB/s combined with whatever the ESRAM can save out of the equation to reduce the load that would otherwise hit the main bandwidth. The question is how all that plays out in the real world.

More or less, the PS3 had twice the system bandwidth of the 360, yet in general I think the 360 was less bandwidth-constrained thanks to its eDRAM.
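The point about the ESRAM "saving" bandwidth can be sketched as a toy model: the ESRAM doesn't make the DDR3 bus faster, it diverts a fraction of frame traffic away from it. The traffic numbers here are invented for illustration.

```python
# Toy model: ESRAM absorbs a fraction of frame traffic (e.g. render targets),
# reducing what has to hit the 68 GB/s DDR3 bus. Numbers are illustrative.

def ddr3_load(total_traffic_gbs, esram_fraction):
    """GB/s still hitting main memory after ESRAM absorbs esram_fraction of it."""
    return total_traffic_gbs * (1.0 - esram_fraction)

# A renderer needing 100 GB/s, with 40% of its traffic held in ESRAM,
# leaves 60 GB/s for DDR3 -- within the rumoured 68 GB/s budget.
remaining = ddr3_load(100.0, 0.4)
```

Whether real workloads can keep a useful fraction of their traffic inside 32MB is exactly the open question the thread is arguing about.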
 

Ashes

Member
Dec 11, 2008
23,378
0
1,085
Greater London
Combined bandwidth? Huh?

Anyway, if it's a wash, why, as a gamer deciding primarily on better hardware, would you opt for X2 then?

Surely the rumoured specs suggest much better frame rates for PS4. Unless it is a wash, that is.
 

derFeef

Member
Jan 21, 2010
37,914
3
0
Combined bandwidth? Huh?

Anyway, if it's a wash, why, as a gamer deciding primarily on better hardware, would you opt for X2 then?

Surely the rumoured specs suggest much better frame rates for PS4. Unless it is a wash, that is.

No they don't. Consoles work on a target-framerate basis; if developers can't get it to work, compromises will be made. If a port has a bad framerate, blame the developer for not putting enough effort into it.
 

mrklaw

MrArseFace
Jun 10, 2004
59,895
2
0
Windsor, UK
Combined bandwidth? Huh?

Anyway, if it's a wash, why, as a gamer deciding primarily on better hardware, would you opt for X2 then?

Surely the rumoured specs suggest much better frame rates for PS4. Unless it is a wash, that is.

...because if it is a wash, then there wouldn't be much better framerates for PS4.

We simply don't know enough about Durango's architecture to be able to dismiss the slower RAM.
 

eso76

Member
Jun 25, 2004
14,919
67
1,615
44
however...if they have specced a slightly lower powered GPU (whether for cost or performance reasons) then they would have a lower theoretical target to hit. But then if they hit 80% use and Sony 60% then it becomes a wash.

Can't wait for today's scoop, and let's hope it's something relevant.
(and let's hope it is indeed a scoop)
(and let's hope it's really coming)

So far, it's basically confirmed that there are 3 special units (at least 2 of which help the CPU, IIRC) and that the system uses 'we have no idea what' (not an HDD, not an SSD) as storage.

That, and we obviously don't know the full story on RAM.
Is the 32MB of ESRAM even confirmed, or just hypothetical at this point?
 

amstradcpc

Member
May 26, 2009
2,502
372
960
Can't wait for today's scoop, and let's hope it's something relevant.
(and let's hope it is indeed a scoop)
(and let's hope it's really coming)

So far, it's basically confirmed that there are 3 special units (at least 2 of which help the CPU, IIRC) and that the system uses 'we have no idea what' (not an HDD, not an SSD) as storage.

- One is the audio DSP.
- Another is said to be able to help with graphics tasks too. This will surely be similar to the PS4's: vector units assisting the CPU.
- The third is graphics-centric, and is also rumoured to be Gandalf hidden in chip form.
 

Ashes

Member
Dec 11, 2008
23,378
0
1,085
Greater London
...because if it is a wash, then there wouldn't be much better framerates for PS4.

We simply don't know enough about Durango's architecture to be able to dismiss the slower ram.

Oh... I thought you implied that developers would target 4 gigs of RAM using slower bandwidth, kind of like finding a middle ground...

How else would you predict a fifth less utilisation as the discrepancy between the two systems? Surely you don't think Sony are that inept, or Microsoft that far ahead?

By the amount suggested, it feels like they [devs] would target 30fps for one and 60fps for the other, hoping that most people wouldn't notice the difference in frame rate and would dive in for the 1080p eye candy.

We simply don't know enough about Durango's architecture to be able to dismiss the slower RAM.

This. 100% agree with this. And not just the RAM, the GPU too.

So as not to have to type it up again: here, and the argument with the poster aside, this too.
 

oldergamer

Member
Aug 20, 2004
3,653
3,406
1,605

thehypocrite

Member
Mar 28, 2010
6,252
3
0
Dominican Republic
I meant for public consumption. I knew developers could license it for a little while. Still, what makes everyone think the lighting engine has been removed? I don't see where this is coming from...

Where do you think leaks come from? Developers. Who do you think would detect a feature like that is missing? Developers. It isn't as far fetched as you think.
 

rudieboy77

Member
Apr 8, 2012
2,927
5
580
I just noticed a news crawl on MSNBC talking about the specs for both consoles being leaked. I'm wondering if they are just repeating what was written on Reddit. Not that I trust the news per se, but it would be really weak to cite someone on Reddit and then put it on national TV.
 

Reiko

Banned
Apr 29, 2012
10,989
2
0
I just noticed a news crawl on MSNBC talking about the specs for both consoles being leaked. I'm wondering if they are just repeating what was written on Reddit. Not that I trust the news per se, but it would be really weak to cite someone on Reddit and then put it on national TV.

lol That would be hilarious.
 

amstradcpc

Member
May 26, 2009
2,502
372
960
He must be checking the post against the Encyclopaedia Britannica!

The Gandalf chip must be the most difficult to explain in history. Magic is strange by nature.
 

jaypah

Member
Nov 1, 2006
11,061
5
910
41
Why are people referencing Karak for this supposed Monday news? There was a journalist who said on Twitter that he'd be able to leak some info last Friday, and mentioned it could be delayed until today.

I believe Karak said he had info but needed to get it all together, so yesterday was a possibility, and if not, then today. I think.
 

Musiol

Member
Aug 30, 2009
457
0
840
Poland
http://www.vgleaks.com/world-exclusive-durango-unveiled/

Here's VGLeaks post about Durango.

As we promised during the weekend in the next weeks we will unveil Durango and Orbis. All the technical info you want to know about the next generation machines from Sony and Microsoft.

The first one is Durango. In this article we present the system overview with the general components and some technical details about them.

How are Durango components connected?

Here you can see the Durango system block diagram:

Durango Architecture



Let’s check what’s inside the box:

CPU:

- x64 Architecture

- 8 CPU cores running at 1.6 gigahertz (GHz)

- each CPU thread has its own 32 KB L1 instruction cache and 32 KB L1 data cache

- each module of four CPU cores has a 2 MB L2 cache resulting in a total of 4 MB of L2 cache

- each core has one fully independent hardware thread with no shared execution resources

- each hardware thread can issue two instructions per clock

GPU:

- custom D3D11.1 class 800-MHz graphics processor

- 12 shader cores providing a total of 768 threads

- each thread can perform one scalar multiplication and addition operation (MADD) per clock cycle

- at peak performance, the GPU can effectively issue 1.2 trillion floating-point operations per second

High-fidelity Natural User Interface (NUI) sensor is always present

Storage and Memory:

- 8 gigabytes (GB) of DDR3 RAM (68 GB/s)

- 32 MB of fast embedded SRAM (ESRAM) (102 GB/s)

- from the GPU’s perspective the bandwidths of system memory and ESRAM are parallel providing combined peak bandwidth of 170 GB/sec.

- Hard drive is always present

- 50 GB 6x Blu-ray Disc drive

Networking:

- Gigabit Ethernet

- Wi-Fi and Wi-Fi Direct

Hardware Accelerators:

- Move engines

- Image, video, and audio codecs

- Kinect multichannel echo cancellation (MEC) hardware

- Cryptography engines for encryption and decryption, and hashing
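The leaked figures above are at least internally consistent, which is worth checking: 12 shader cores at 64 threads each gives the 768 threads, one MADD counts as two floating-point operations per clock, and the 170 GB/s is just the straight sum of the two memory pools. The spec values themselves are rumours; only the arithmetic in this sketch is ours.

```python
# Cross-checking the internal arithmetic of the leaked Durango figures.

threads = 12 * 64                 # 12 shader cores x 64 threads each = 768
flops_per_thread = 2              # one MADD = a multiply plus an add per clock
clock_ghz = 0.8                   # 800 MHz
peak_gflops = threads * flops_per_thread * clock_ghz
                                  # = 1228.8 GFLOPS, the rumoured "1.2 trillion"

l2_total_mb = 2 * 2               # two 4-core modules x 2 MB L2 each = 4 MB

combined_bw_gbs = 68 + 102        # DDR3 + ESRAM peaks; the quoted 170 GB/s is a
                                  # straight sum of two parallel paths, which real
                                  # workloads would rarely saturate simultaneously
```

Note the caveat in the last comment: the 170 GB/s only holds when the GPU is hitting both the DDR3 and ESRAM peaks at the same instant, which is the part several posters below question.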
 

JonathanPower

Member
Jan 15, 2012
1,920
0
0
yeah, of course it's not "just" 70 GB/s. It's 70GB/s combined with whatever the ESRAM can save out of the equation to reduce the load that would otherwise hit the main bandwidth. The question is how all that plays out in the real world.

more or less the PS3 had twice the system bandwidth as 360, yet in general I think 360 was less bandwidth constrained thanks to EDRAM.

The problem is that it's much easier to get performance out of a more "traditional" architecture with one high-bandwidth memory pool than it is to work around eDRAM. I think PS4 games will have better frame rates and higher resolutions, and that 1080p gaming will be more common on PS4.
 

luffeN

Member
Sep 13, 2006
5,025
0
1,030
HDMI out? <--- Edit: Me being stupid.

 

JaggedSac

Member
Jan 14, 2010
17,476
2
0
- from the GPU's perspective the bandwidths of system memory and ESRAM are parallel providing combined peak bandwidth of 170 GB/sec.

Say what? Can you just add the numbers together like that?

Hardware Accelerators:

- Move engines

- Image, video, and audio codecs

- Kinect multichannel echo cancellation (MEC) hardware

- Cryptography engines for encryption and decryption, and hashing

What the hell is a move engine? Something specifically dedicated to moving data between RAM and ESRAM?

High-fidelity Natural User Interface (NUI) sensor is always present

This doesn't make sense given that Kinect plugs into an input port on the box; how can they assume it is always present?
 

Can Crusher

Banned
Dec 10, 2012
11,377
0
0
High-fidelity Natural User Interface (NUI) sensor is always present

What?

from the GPU’s perspective the bandwidths of system memory and ESRAM are parallel providing combined peak bandwidth of 170 GB/sec.

Does it mean what I think it means?

Hardware Accelerators:

- Move engines What?

- Image, video, and audio codecs

- Kinect multichannel echo cancellation (MEC) hardware

- Cryptography engines for encryption and decryption, and hashing What?


Well I'll let the experts dissect it...I don't know if this is good news or not!