
DigitalFoundry: X1 memory performance improved for production console (ESRAM 192 GB/s)


Proelite

Member
These are tales of a science called mathematics.


To summarize

Before
800 (freq) * 128 (bytes) = 102 GB/s
After
750 (freq) * 128 (bytes) = 96 GB/s
96 GB/s * 2 (simultaneous read/write) = 192 GB/s

Please point at the part that is supposedly "from my ass".

If you're so confident in your math, bet your account on it that the GPU is now 750MHz.
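(For reference, the disputed arithmetic is easy to check. A minimal sketch in Python, assuming the commonly cited 128-byte-per-cycle ESRAM interface; the "duplex" flag models counting simultaneous read and write together, as the DF article describes:)

```python
# Sketch of the bandwidth arithmetic under discussion. Assumes a
# 128-byte-per-cycle ESRAM interface (an assumption, not a spec sheet).
BYTES_PER_CYCLE = 128

def bandwidth_gb_s(freq_mhz: float, duplex: bool = False) -> float:
    """Peak bandwidth in GB/s for a given clock, optionally counting
    read and write directions together."""
    gb_s = freq_mhz * 1e6 * BYTES_PER_CYCLE / 1e9
    return gb_s * 2 if duplex else gb_s

print(bandwidth_gb_s(800))               # 102.4 GB/s  (the old figure)
print(bandwidth_gb_s(750))               # 96.0 GB/s
print(bandwidth_gb_s(750, duplex=True))  # 192.0 GB/s  (the new figure)
```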
 
No downclocking? B...b...b...but it was a #truthfact. Like PoP. And Mirror's Edge 2. And no PS4 paywall. And even worse DRM.

CBOAT never said anything about downclocking.
CBOAT admitted PoP was a mistake/was pulled.
ME2 was at EA's conference.
CBOAT never said PS4 wouldn't have a paywall.
MS clearly removed their DRM on short notice, so we'll never know if the DRM was going to get worse.

So, no.

Edit: I see many others already replied basically the same thing, oh well.
 

SSM25

Member
These are tales of a science called mathematics.


To summarize

Before
800 (freq) * 128 (bytes) = 102 GB/s
After
750 (freq) * 128 (bytes) = 96 GB/s
96 GB/s * 2 (simultaneous read/write) = 192 GB/s

Please point at the part that is supposedly "from my ass".

It was supposed to be 204 GB/s before?
 

Acheteedo

Member
Leadbetter copy-pasting MS PR, why am I not surprised?

This has been explained in the DF comments already:

Reads more like creative accounting to disguise a GPU downclock from 800MHz by 50MHz.

750 (freq) * 128 (bytes) = 96 GB/s
96 GB/s * 2 (simultaneous read/write) = 192 GB/s

How much does MS pay you to write this garbage, Leadbetter?

Has a 50MHz downclock been rumoured before? If not, then aren't you just re-aligning facts/figures to meet your agenda? Sorta like truthers.
 

benny_a

extra source of jiggaflops
It was supposed to be 204 GB/s before?
See here:
Let's go through this logically, I'll lay it out so everyone can understand.

Previously MS engineers thought read/write was only unidirectional, and the APU-to-ESRAM bandwidth was pegged at 102GB/s. In reality it was bidirectional regardless of what they thought, which meant it was actually 204GB/s, and that lines up perfectly with 800MHz for the GPU.

Now we have information saying that MS engineers have discovered that transfers are bidirectional (not that they are only now becoming bidirectional, just that MS found out they are) and that the consolidated read/write bandwidth is 192GB/s, which is 96GB/s in each direction. That figure is lower than the old 102GB/s figure, and it implies a GPU clock of 750MHz.

So yes, 192 is higher than 102, but the two are not comparable, as the latter is unidirectional bandwidth and the former is bidirectional. The fact that MS engineers didn't know or realise that you could run read/write operations simultaneously is irrelevant, because it was still possible; this is not a new addition, more a new discovery. Think of it like a scientific discovery: just because an apple fell on Newton, it doesn't mean he invented gravity. It existed before that; he just discovered it.

So we've actually gone from 204GB/s to 192GB/s, or on the old measure, from 102GB/s to 96GB/s; it's not that hard to understand. Leadbetter has this one wrong and he should try to correct it.
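(To make the arithmetic in that explanation explicit, here is a small sketch that works backwards from the quoted bandwidth figures to the clock they imply, assuming the same 128-byte-per-cycle interface as above:)

```python
# Working backwards from the quoted figures to the implied GPU clock.
# Assumes a 128-byte-per-cycle ESRAM interface.
BYTES_PER_CYCLE = 128

def implied_clock_mhz(bidirectional_gb_s: float) -> float:
    """GPU clock implied by a combined read+write bandwidth figure."""
    per_direction = bidirectional_gb_s / 2             # GB/s each way
    return per_direction * 1e9 / BYTES_PER_CYCLE / 1e6

print(implied_clock_mhz(204.8))  # 800.0 MHz -- the old spec, counted both ways
print(implied_clock_mhz(192.0))  # 750.0 MHz -- the figure in the DF article
```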
 

spwolf

Member
People act like the two GPUs are light years apart, which cracks me up. In reality they are very similar. That "50% power" difference is funny too. At 1080p you simply aren't going to see much of a difference.

In reality you get a tiny FPS (frames per second) boost that may not even matter, and similar "very high" settings vs possible "high" settings (using PC wording there). We'll probably have 3rd-party games where the X1 runs a game at 68fps and the PS4 runs it at 85fps, and both would then be locked to 60fps. They're so similar I doubt you'll see any worthy games sub-30fps.

Granted, sub-30fps seemed to be a norm for multiple PS4 games at E3.

If all we know about the power gap is true, then the difference will be easy to spot in multiplatform games: not just fps but better graphics/effects, etc.
 
Doesn't this all hinge on whether or not DF's information is correct?

He and others are assuming that there is some truth to this 192GB/s figure.
Given that DF said a single read or write is still 102GB/s, people are choosing the one number that fits their math assumption and ignoring the rest. That's the first sign to me that people are stretching.
 
Has a 50MHz downclock been rumoured before? If not, then aren't you just re-aligning facts/figures to meet your agenda? Sorta like truthers.


Yeah, a downclock was talked about in the same thread as the #TRUEFACT. A few of the GAF insiders said they "heard" there may have been a downclock to get yields up. Nothing was really confirmed though.
 

IcyEyes

Member
90% of games will look the same on both. Stop dreaming.

Maybe this can help:

differences in multi-platform games may not become evident until developers are working with more mature tools and libraries.
At that point it's possible that we may see ambitious titles operating at a lower resolution on Xbox One compared to the PlayStation 4.
 
These are tales of a science called mathematics.


To summarize

Before
800 (freq) * 128 (bytes) = 102 GB/s
After
750 (freq) * 128 (bytes) = 96 GB/s
96 GB/s * 2 (simultaneous read/write) = 192 GB/s

Please point at the part that is supposedly "from my ass".

Oops =P

Has a 50MHz downclock been rumoured before? If not, then aren't you just re-aligning facts/figures to meet your agenda? Sorta like truthers.

The rumor from Thuway was 100-300MHz, which would have been nuts if true.
 

SRTtoZ

Member
I believe it was Thuway who was one of the people behind the downclock rumor AND the yield issues, and then Cboat came into the thread and said confirmed, but we don't know if he meant both or just the yield issues.

Please correct me if I am wrong.
 

charsace

Member
Because the math in the article makes ZERO sense. When someone actually started trying to make sense of it all, they came up with this:

[attached image not preserved]

That shows a downclock and makes sense when actual math is considered. Is it 100% confirmed? Idk

The article is saying that there are some extra resources available at a different interval. At best, every X cycles in the processing pipeline you have something extra to play around with. The math people are posting in this thread is based on the interval they already know, which is that you perform one read/write per clock edge. We already know that a read and a write are performed each clock; the article isn't talking about that. What the article describes is that every X read/write edges you get an extra one, or something extra within the edge.

IMO the math that people are posting has nothing to do with the problem I am picking up from the article.
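(Purely illustrative arithmetic for that reading of the article: the 7-in-8 duty cycle below is an assumption for illustration, not a confirmed spec, but it shows that the 192GB/s figure by itself doesn't force a 750MHz conclusion:)

```python
# Illustrative arithmetic for the "spare holes" reading of the article.
# The 7-in-8 fraction is an assumption, not taken from any spec.
BYTES_PER_CYCLE = 128

def holes_bandwidth_gb_s(freq_mhz: float, extra_fraction: float) -> float:
    """One guaranteed op per cycle, plus a second op in the fraction of
    cycles where a spare 'hole' happens to be available."""
    base = freq_mhz * 1e6 * BYTES_PER_CYCLE / 1e9
    return base * (1 + extra_fraction)

# At 800MHz, a second op in 7 of every 8 cycles also lands on 192 GB/s:
print(holes_bandwidth_gb_s(800, 7 / 8))  # 192.0
```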
 

Scooter

Banned
If you're so confident in your math, bet your account on it that the GPU is now 750MHz.


Ban bets are not allowed anymore AFAIK, but if a mod allows it I'll do it, as long as you also bet yours that it's 800MHz. Btw, I'm not the one acting like an expert judging others from my pedestal; all I did was use some common sense and basic math.
 

shandy706

Member
You can see a difference in 480p let alone 1080p. It's not the resolution, it's how much computation you do before finding which color a pixel should be.

I'm mainly a PC gamer that buys all consoles. The two GPUs in the end products simply aren't massively different in the world of gaming GPUs.


Or you'll get 40 fps on the Xbone and 60 on the PS4. Or 24 on the Xbone and 30 on the PS4.

Possibly, but unlikely when it comes to 3rd party. They'll most likely aim for 30 or 60. Twenty FPS is probably an extreme example.

That clearly means PS4 is shit, amirite?

Why do you think the PS4 is crap? I think it'll be just fine. If you feel it is, that's your choice. :)
 

ekim

Member
These are tales of a science called mathematics.


To summarize

Before
800 (freq) * 128 (bytes) = 102 GB/s
After
750 (freq) * 128 (bytes) = 96 GB/s
96 GB/s * 2 (simultaneous read/write) = 192 GB/s

Please point at the part that is supposedly "from my ass".

From the article:
However, with near-final production silicon, Microsoft techs have found that the hardware is capable of reading and writing simultaneously.
Apparently, there are spare processing cycle "holes" that can be utilised for additional operations.

But I guess you know how the ESRAM works in Xbox One.
 

coldone

Member
Well it is possible the eSRAM has an independent clock maybe?

It has to be done before the chip is designed. You would need an asynchronous bridge between the GPU and memory to clock them independently, and the bridge would buffer requests, which adds latency.

Clocking them independently only helps when the GPU is operating on cached data. Most of the time it would stall waiting for data to arrive from eSRAM.
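(A toy illustration of that latency point; every cycle count below is made up for illustration, not taken from any real spec. The idea is that each request through an asynchronous bridge pays buffering cycles in both directions:)

```python
# Toy model of why an asynchronous GPU/eSRAM bridge hurts latency.
# All numbers here are hypothetical, chosen only to show the shape.
def round_trip_ns(gpu_mhz: float, sram_mhz: float,
                  sram_access_cycles: int, bridge_cycles: int) -> float:
    """Latency of one eSRAM access as seen by the GPU, in nanoseconds."""
    gpu_ns = 1e3 / gpu_mhz     # duration of one GPU cycle
    sram_ns = 1e3 / sram_mhz   # duration of one eSRAM cycle
    # The request crosses the bridge, is serviced, then the reply
    # crosses back; both crossings pay the buffering cost.
    return 2 * bridge_cycles * gpu_ns + sram_access_cycles * sram_ns

# Same 800MHz clock on both sides, no bridge vs. a hypothetical 4-cycle bridge:
print(round_trip_ns(800, 800, sram_access_cycles=10, bridge_cycles=0))  # 12.5 ns
print(round_trip_ns(800, 800, sram_access_cycles=10, bridge_cycles=4))  # 22.5 ns
```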
 

Acheteedo

Member
Yeah, a downclock was talked about in the same thread as the #TRUEFACT. A few of the GAF insiders said they "heard" there may have been a downclock to get yields up. Nothing was really confirmed though.

Yes, I know of the downclock rumour, but I never heard 50MHz specifically. Such a specific rumour would need to exist to give this any weight at all; otherwise it's just agenda-driven speculation.
 
It's hilarious, this attempt to turn positive news into a negative.

I guess you are free to bring up arguments and engage in the discussion.

Yes, I know of the downclock rumour, but I never heard 50MHz specifically. Such a specific rumour would need to exist to give this any weight at all; otherwise it's just agenda-driven speculation.

So you are disregarding the previous rumors because they did not specify the exact same figure?
 

benny_a

extra source of jiggaflops
Yes, I know of the downclock rumour, but I never heard 50MHz specifically. Such a specific rumour would need to exist to give this any weight at all; otherwise it's just agenda-driven speculation.
Why is a specific number necessary to be rumored?

Do you understand the math that is being done here or do you think the values are chosen at random to make the Xbox One look bad?

I feel the posts about the downclock in this thread are very clear and specific with how they arrived at that conclusion. They are based on the numbers that seem to come from the source of the article and aren't the part that is editorialized.
 

Freki

Member
Ban bets are not allowed anymore AFAIK, but if a mod allows it I'll do it, as long as you also bet yours that it's 800MHz. Btw, I'm not the one acting like an expert judging others from my pedestal; all I did was use some common sense and basic math.

I'd suggest you bet against it being 192GB/s peak performance @ 800MHz... (That combination makes little sense given the information we have so far.)
 

chubigans

y'all should be ashamed
I'll wait on the new thread until we can get more info. This is all so crazy I don't know what to think anymore. Someone else can make it. :p
 
The article is saying that there are some extra resources available at a different interval. At best, every X cycles in the processing pipeline you have something extra to play around with. The math people are posting in this thread is based on the interval they already know, which is that you perform one read/write per clock edge. We already know that a read and a write are performed each clock; the article isn't talking about that. What the article describes is that every X read/write edges you get an extra one, or something extra within the edge.

IMO the math that people are posting has nothing to do with the problem I am picking up from the article.

I see "forwarding" being suggested a few times on Beyond3D.
It sounds like coalesced reads and writes.
 

Ding-Ding

Member
Ban bets are not allowed anymore AFAIK, but if a mod allows it I'll do it, as long as you also bet yours that it's 800MHz. Btw, I'm not the one acting like an expert judging others from my pedestal; like you said, all I do is basic math.

Don't worry about it. Proelite's maths has been questionable for a while. I am pretty sure he was the guy calling the GPU in the "One" more powerful than the PS4's (I think he was saying it was 2.5TF, if I remember correctly).

Besides, he was relegated to junior for a reason.
 

Acheteedo

Member
Why is a specific number necessary to be rumored?

Do you understand the math that is being done here or do you think the values are chosen at random to make the Xbox One look bad?

I see someone without inside information trying to debunk someone with inside information based on essentially nothing.
 
Don't worry about it. Proelite's maths has been questionable for a while. I am pretty sure he was the guy calling the GPU in the "One" more powerful than the PS4's (I think he was saying it was 2.5TF, if I remember correctly).

Besides, he was relegated to junior for a reason.

Wasn't that Reiko and MisterXteam or something like that?

It's hilarious how, the day after Cerny talks about XB1's eSRAM, this article is put out and the math doesn't quite add up.

I'm pretty sure that talk had nothing to do with it. While interesting, it was more a trip down memory lane.
 

Scooter

Banned
From the article:


But I guess you know how the ESRAM works in Xbox One.


What's your point? They're pretty much admitting that they're adding read and write together to give us theoretical bullshit numbers, which was the basic premise of my argument. Even if their PR is somehow true, the downclocking is still real.
 
Lemme get this straight. Yesterday, Cerny talks very precisely about the high system bandwidth of the PS4; then out of nowhere Microsoft responds the next day by saying their bandwidth is somehow like 80 percent bigger than they expected, and thus similar to the PS4's. They also claim that us gamers should not worry about specs anyway... and that this is no longer about bits, even though they talked about transistors at their reveal...

And some people believe them... scary stuff.
 

Proelite

Member
Don't worry about it. Proelite's maths has been questionable for a while. I am pretty sure he was the guy calling the GPU in the "One" more powerful than the PS4's (I think he was saying it was 2.5TF, if I remember correctly).

Besides, he was relegated to junior for a reason.

No I didn't.

I said it would take at least a 2.5TF GPU in a PC to do what the One's GPU is capable of. A sentiment that I still stand by.

I was optimistic about the One's graphical power despite the 1.24-teraflop GPU because of the move engine and ESRAM, and back when the PS4 only had 4GB of RAM, I was confident that it would best the PS4 in ports due to a 1.5-gigabyte RAM advantage.
 

GribbleGrunger

Dreams in Digital
I believe it was Thuway who was one of the people behind the downclock rumor AND the yield issues, and then Cboat came into the thread and said confirmed, but we don't know if he meant both or just the yield issues.

Please correct me if I am wrong.

Cboat only ever confirmed yield problems. The rest was speculation based on what Microsoft's approach would be. A downclock seemed the obvious solution because it meant they could 'ship' more consoles. With MS suddenly announcing that power doesn't matter, I'd say it's likely they went with this option.
 