This comment coming from you... Gaf are experts at silicon behavior, don't doubt.
So a 50MHz downclock for the ESRAM looks like it was true? Oh man, the Xbone can't catch a break.
At 750MHz the GPU would be 1.152 TFLOPS, but we know that games only have access to 90% of that, which makes the actual number available to devs 1.037 TFLOPS.
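As a quick sanity check, that figure falls out of the usual shader math (a minimal sketch; the 768 ALUs and 2 FLOPs per cycle are the commonly reported Xbox One GPU specs, and the 90% game-available share is the claim above, not an official number):

```python
# Back-of-the-envelope TFLOPS math for the rumoured 750MHz clock.
shaders = 768          # commonly reported Xbox One shader ALU count
flops_per_cycle = 2    # one fused multiply-add per ALU per cycle
clock_ghz = 0.750      # the rumoured downclocked frequency

peak_tflops = shaders * flops_per_cycle * clock_ghz / 1000
dev_tflops = peak_tflops * 0.90   # 90% said to be accessible to games

print(f"peak: {peak_tflops:.3f} TFLOPS")  # 1.152
print(f"devs: {dev_tflops:.3f} TFLOPS")   # 1.037
```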
These are tales of a science called mathematics.
To summarize
Before
800 (freq) * 128 (bytes) = 102 GB/s
After
750 (freq) * 128 (bytes) = 96 GB/s
96 GB/s * 2 (simultaneous read/write) = 192 GB/s
Please point at the part that is supposedly "from my ass".
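For anyone who wants to run the numbers themselves, here's that arithmetic as a short script (a minimal sketch; it assumes the 128-byte-per-cycle ESRAM interface that the math above is built on):

```python
# ESRAM bandwidth per the post above: bytes-per-cycle times clock rate.
BUS_BYTES_PER_CYCLE = 128  # assumed 1024-bit (128-byte) interface

def bandwidth_gbs(clock_mhz):
    """One transfer direction, in GB/s (decimal, 10^9 bytes)."""
    return clock_mhz * 1e6 * BUS_BYTES_PER_CYCLE / 1e9

print(bandwidth_gbs(800))      # 102.4 -> the original "before" figure
print(bandwidth_gbs(750))      # 96.0  -> the implied downclocked figure
print(bandwidth_gbs(750) * 2)  # 192.0 -> simultaneous read + write
```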
No downclocking? B...b...b...but it was a #truthfact. Like Pop. And Mirror's Edge 2. And no PS4 paywall. And even worse DRM.
Well it is possible the eSRAM has an independent clock maybe?
These are tales of a science called mathematics.
To summarize
Before
800 (freq) * 128 (bytes) = 102 GB/s
After
750 (freq) * 128 (bytes) = 96 GB/s
96 GB/s * 2 (simultaneous read/write) = 192 GB/s
Please point at the part that is supposedly "from my ass".
Gaf are experts at silicon behavior, don't doubt.
Leadbetter copy-pasting MS PR, why am I not surprised?
This has been explained in the DF comments already:
Reads more like creative accounting to disguise a 50MHz GPU downclock from 800MHz.
750 (freq) * 128 (bytes) = 96 GB/s
96 GB/s * 2 (simultaneous read/write) = 192 GB/s
How much does MS pay you to write this garbage, Leadbetter?
See here: It was supposed to be 204 GB/s before?
Let's go through this logically; I'll lay it out so everyone can understand.
Previously, MS engineers thought read/write was unidirectional, and the APU-to-ESRAM bandwidth was pegged at 102GB/s. In reality it was bidirectional regardless of what they thought, which means it was actually 204GB/s, and that lines up perfectly with 800MHz for the GPU.
Now we have information saying that MS engineers have discovered the link is bidirectional (not that it is now, just that they found out it is) and the consolidated read/write bandwidth is 192GB/s, which is 96GB/s in each direction. That per-direction figure is lower than the old 102GB/s figure, and it implies a GPU clock of 750MHz.
So yes, 192 is higher than 102, but the two aren't comparable: the latter is unidirectional bandwidth and the former is bidirectional. The fact that MS engineers didn't know or realise that you could run read/write operations simultaneously is irrelevant, because it was always possible; this is not a new addition, more a new discovery. Think of it like a scientific discovery: just because an apple fell on Newton doesn't mean he invented gravity. It existed before that; he just discovered it.
So we've actually gone from 204GB/s to 192GB/s, or on the old measure, from 102GB/s to 96GB/s. It's not that hard to understand. Leadbetter has this one wrong and he should correct it.
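Put as a sketch (same 128-byte-per-cycle assumption as above), the apples-to-apples comparison looks like this:

```python
# Comparing like with like: per-direction ESRAM bandwidth, before vs. after.
BUS_BYTES_PER_CYCLE = 128  # assumed interface width in bytes

def per_direction_gbs(clock_mhz):
    return clock_mhz * 1e6 * BUS_BYTES_PER_CYCLE / 1e9

old_one_way = per_direction_gbs(800)  # 102.4 GB/s
old_two_way = old_one_way * 2         # 204.8 GB/s (always possible, just unnoticed)

new_two_way = 192.0                   # the figure in the article
new_one_way = new_two_way / 2         # 96.0 GB/s

# The clock implied by the new per-direction figure:
implied_mhz = new_one_way * 1e9 / (BUS_BYTES_PER_CYCLE * 1e6)
print(implied_mhz)                    # 750.0
```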
Doesn't this all hinge on whether or not DF's information is correct?
If you're so confident in your math, bet your account on it that the GPU is now 750MHz.
It was supposed to be 204 GB/s before?
People act like the two GPUs are light years apart, which cracks me up. In reality they are very similar. That "50% power" difference is funny too. At 1080p you simply aren't going to see much of a difference.
In reality you get a tiny FPS (frames per second) boost that may not even matter, and similar very-high settings vs. possibly high settings (using PC "wording" there). We'll probably have 3rd party games where the X1 runs a game at 68fps and the PS4 runs it at 85fps... both would be locked to 60fps anyway. They're so similar I doubt you'll see any worthy games sub-30fps.
Granted, sub-30fps seemed to be a norm for multiple PS4 games at E3.
Gaf are experts at silicon behavior, don't doubt.
See here:
Given that DF said a single read or write is 102, people are still choosing the one number that fits their math assumption while ignoring the rest. First sign to me that people are stretching.
Doesn't this all hinge on whether or not DF's information is correct?
He and others are assuming that there is some truth to this 192GB/s figure.
Has a 50MHz downclock been rumoured before? If not, then aren't you just re-aligning facts/figures to meet your agenda? Sorta like truthers.
90% of games will look the same on both. Stop dreaming.
Differences in multi-platform games may not become evident until developers are working with more mature tools and libraries.
At that point it's possible that we may see ambitious titles operating at a lower resolution on Xbox One compared to the PlayStation 4.
These are tales of a science called mathematics.
To summarize
Before
800 (freq) * 128 (bytes) = 102 GB/s
After
750 (freq) * 128 (bytes) = 96 GB/s
96 GB/s * 2 (simultaneous read/write) = 192 GB/s
Please point at the part that is supposedly "from my ass".
Has a 50MHz downclock been rumoured before? If not, then aren't you just re-aligning facts/figures to meet your agenda? Sorta like truthers.
Because the math in the article makes ZERO sense. When someone actually started trying to make sense of it all, they came up with this:
That shows a downclock and makes sense when the actual math is considered. Is it 100% confirmed? Idk.
If you're so confident in your math, bet your account on it that the GPU is now 750MHz.
You can see a difference at 480p, let alone 1080p. It's not the resolution, it's how much computation you do before finding which color a pixel should be.
Or you'll get 40fps on the Xbone and 60 on the PS4. Or 24 on the Xbone and 30 on the PS4.
That clearly means PS4 is shit, amirite?
These are tales of a science called mathematics.
To summarize
Before
800 (freq) * 128 (bytes) = 102 GB/s
After
750 (freq) * 128 (bytes) = 96 GB/s
96 GB/s * 2 (simultaneous read/write) = 192 GB/s
Please point at the part that is supposedly "from my ass".
However, with near-final production silicon, Microsoft techs have found that the hardware is capable of reading and writing simultaneously.
Apparently, there are spare processing cycle "holes" that can be utilised for additional operations.
Well it is possible the eSRAM has an independent clock maybe?
Yeah, a downclock was talked about in the same #TRUEFACT thread. A few of the GAF insiders said they "heard" that there may have been a downclock to get yields up. Nothing really confirmed though.
It's hilarious, this attempt to turn positive news into something negative.
Yes, I know of the downclock rumour, but I never heard of 50MHz specifically. Such a specific rumour would need to exist to give this any weight at all; otherwise it's just agenda-driven speculation.
It's hilarious, this attempt to turn positive news into something negative.
Why is a specific number necessary to be rumored?
Yes, I know of the downclock rumour, but I never heard of 50MHz specifically. Such a specific rumour would need to exist to give this any weight at all; otherwise it's just agenda-driven speculation.
Ban bets are not allowed anymore afaik, but if a mod allows it I'll do it, as long as you also bet yours that it's 800MHz. Btw I'm not the one acting like an expert judging others from my pedestal; all I did was use some common sense and basic math.
The article is saying that there are some extra resources available at a different interval. At best, every X time in the processing pipeline you have something extra to play around with. The math people are posting in this thread is based on the interval they know, which is that you perform one read/write per clock edge. We already know that a read and a write are performed each clock; the article isn't talking about that. What the article describes is that every X read/write edges you get an extra one, or something along those lines.
IMO the math that people are posting has nothing to do with the problem I am picking up from the article.
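To make that reading concrete, here's a hypothetical model of it (the bonus-slot interval X is made up for illustration; nothing in the article gives an actual value):

```python
# "Spare cycle holes" reading of the article: one transfer per cycle,
# plus an extra transfer slot every X cycles. X is a placeholder.
BUS_BYTES_PER_CYCLE = 128  # assumed interface width in bytes

def effective_gbs(clock_mhz, hole_every_x):
    base = clock_mhz * 1e6 * BUS_BYTES_PER_CYCLE / 1e9  # one op per cycle
    return base * (1 + 1 / hole_every_x)                # plus bonus slots

# Under this model, 800MHz with a hole every 8th cycle gives ~115 GB/s,
# not a clean 2x -- which is why the thread's clean 96 * 2 = 192 math
# reads to many here as a straight 750MHz downclock instead.
print(effective_gbs(800, 8))  # 115.2
```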
Ban bets are not allowed anymore afaik, but if a mod allows it I'll do it, as long as you also bet yours that it's 800MHz. Btw I'm not the one acting like an expert judging others from my pedestal; like you said, all I do is basic math.
Seriously, you're going to argue with him over a 5% difference?
More like 95%.
Why is a specific number necessary to be rumored?
Do you understand the math that is being done here or do you think the values are chosen at random to make the Xbox One look bad?
It's hilarious, this attempt to turn positive news into something negative.
Don't worry about it. Proelite's math has been questionable for a while. I'm pretty sure he was the guy calling the GPU in the "One" more powerful than the PS4's (I think he was saying it was 2.5TF, if I remember correctly).
Besides, he was relegated to junior for a reason.
It's hilarious how, the day after Cerny talks about XB1's eSRAM, this article is put out and the math doesn't quite add up.
From the article:
But I guess you know how the ESRAM works in Xbox One.
Gemüsepizza said: I guess you are free to bring up arguments and engage in the discussion.
But it is based on the number from the source in the article.
I see someone without inside information trying to debunk someone with inside information, based on essentially nothing.
Don't worry about it. Proelite's math has been questionable for a while. I'm pretty sure he was the guy calling the GPU in the "One" more powerful than the PS4's (I think he was saying it was 2.5TF, if I remember correctly).
Besides, he was relegated to junior for a reason.
Wasn't that Reiko and MisterXteam, or something like that?
But it is based on the number from the source in the article.
I believe it was Thuway who was one of the people behind the downclock rumor AND the yield issues, and then Cboat came in the thread and said confirmed, but we don't know if he meant both or just the yield issues.
Please correct me if I am wrong.
It's hilarious how, the day after Cerny talks about XB1's eSRAM, this article is put out and the math doesn't quite add up.
The RAM is embedded in the GPU.
I ask again: are we certain that the eSRAM is synchronous?
So - which is it?
People shouldn't do math with things they don't fully understand.
Sorry, I will not join the superficial speculation.