
Is the Console GPU FLOPs era going the way of the Bit era? I for one welcome our new GB/s marketing overlord

Onocromin

Banned
Also, I still call BS. I don't suddenly live in a world where people forgot about the term graphical bit rate or 256+ bit graphics. Instead you're all mentally inept and incapable of coming to terms with the fact that you're not actually legitimately mentally adept or stable if you think graphical bit rate never mattered and can't manage to use this terminology yourselves.

In legitimate real society, which is where I'm from, where I'm quite certain I can find real, actual people / actual gamers, terms like graphical bit rate etc. are still used and appreciated in the face-to-face gaming community. And anyone arguing otherwise is simply going to be ignored by me from here on out because, as I've pointed out, people online have lost their mental aptitude when speaking about relevant matters.

And I argue that the only reason people here or on other forums refuse to believe graphical bit rate (still can't believe no one married the word bit with graphics after browsing this particular thread, which is literally about graphical bit rates) ever was a relevant term is because they either have mental issues or were in fact born after manufacturers, not gamers, abandoned attaching the graphical bit rate to their naming conventions.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
Bits were so important in marketing until we reached 32 bits. For example, Sony didn't make a big deal about the PS1 being 32-bit even though that was the big marketing angle in the 80's & 90's; it was probably the CDs that gave the PS1 its marketing edge. Atari & Nintendo used 64 in their names, but it was too late, no one cared about bits anymore. Dreamcast made sure to mention that it was 128-bit, but it was dead, Jim. The PS2 was 128-bit, but it was the DVD format that sold the PS2. The bit era of marketing faded away, and the mention of FLOPs mostly went unnoticed until the PS4 had 1.8 TFLOPs of GPU power. Then came the mid-gen consoles with 4 & 6 TFLOPs, using it as a buzzword to sell consoles, but neither of them set the world on fire. And now we are here with the new consoles: one is boasting that it has 12 TFLOPs & the other is boasting that its SSD can move 5.5GB/s.


Is Xbox Series X wearing that 64 on its chest? Will the PS5 SSD be another PS3 Blu-ray drive? Find out next time on Dragon Ball Z
This is actually a good question/point. But I think if the XSX had 40% more TFs than the PS5 it would have actually mattered some. So I don't think the TF era is over.
 

Genx3

Member
Teraflops are perfect for comparing GPUs built on the same architecture, such as the PS5 and XSX, the PS4 and XB1, or the Pro and XB1X.
When comparing totally different architectures it has less significance.

If you know the general performance of one architecture vs. another on a 1TF vs. 1TF basis, then even different architectures can be compared.
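
For anyone curious where the headline figures actually come from, here's a rough back-of-the-envelope sketch (the CU counts and clocks are the publicly quoted ones, assumed here purely for illustration):

```python
# Rough sketch of how peak FP32 TFLOPs figures are typically derived:
# shader ALUs x 2 ops per clock (fused multiply-add) x clock speed.
def fp32_tflops(compute_units: int, clock_ghz: float, alus_per_cu: int = 64) -> float:
    """Theoretical peak FP32 throughput in TFLOPs."""
    ops_per_clock = 2  # one FMA counts as two floating-point operations
    return compute_units * alus_per_cu * ops_per_clock * clock_ghz / 1000

print(f"PS5: {fp32_tflops(36, 2.23):.2f} TFLOPs")   # ~10.28 (36 CUs @ up to 2.23GHz)
print(f"XSX: {fp32_tflops(52, 1.825):.2f} TFLOPs")  # ~12.15 (52 CUs @ 1.825GHz)
```

The same math applies to the older GCN consoles too, which is exactly why the comparison only holds when both architectures do roughly the same amount of real work per theoretical FLOP.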
 

phil_t98

#SonyToo
For the PS4, people made a bigger deal of the 8GB of GDDR5, but it did start console fans talking about GPU FLOPs. It was the PS4 Pro & Xbox One X that had the warriors talking about FLOPs on the level of the bit wars from back in the day.
Not at all, you must be misremembering the talk about power. All the talk was about the PS4 being more powerful.
 

Journey

Banned
You're right, we can't just look at Teraflops.

Bandwidth is huge!

A 560GB/s transfer rate for all 10GB reserved for VRAM is pretty impressive.

If the PS5 reserves around 3.5GB for the OS like the PS4 does, that leaves 12.5GB for games at 448GB/s even if the CPU and audio used ZERO, so the amount actually used for graphics will likely be closer to 10GB, which is what MS allocated @560GB/s.

Although I do LOVE what Sony is doing with its SSD, I think gamers are exaggerating the impact it will have on graphics performance. It will be great to see what they can do with level design and instant loading, but I don't believe it will do anything to close the gap as far as graphics go.
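
The raw bandwidth figures themselves are easy to sanity-check, by the way. A quick sketch, assuming the widely reported 14Gbps GDDR6 chips and bus widths (not numbers taken from this thread):

```python
# GDDR6 peak bandwidth = bus width (bits) x per-pin data rate (Gbps) / 8 bits per byte.
def gddr6_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float = 14.0) -> float:
    return bus_width_bits * data_rate_gbps / 8

print(gddr6_bandwidth_gb_s(256))  # PS5: 256-bit bus            -> 448.0 GB/s (all 16GB)
print(gddr6_bandwidth_gb_s(320))  # XSX: full 320-bit bus       -> 560.0 GB/s (10GB "GPU-optimal")
print(gddr6_bandwidth_gb_s(192))  # XSX: six-chip 192-bit path  -> 336.0 GB/s (remaining 6GB)
```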
 

onQ123

Member
You're right, we can't just look at Teraflops.

Bandwidth is huge!

A 560GB/s transfer rate for all 10GB reserved for VRAM is pretty impressive.

If the PS5 reserves around 3.5GB for the OS like the PS4 does, that leaves 12.5GB for games at 448GB/s even if the CPU and audio used ZERO, so the amount actually used for graphics will likely be closer to 10GB, which is what MS allocated @560GB/s.

Although I do LOVE what Sony is doing with its SSD, I think gamers are exaggerating the impact it will have on graphics performance. It will be great to see what they can do with level design and instant loading, but I don't believe it will do anything to close the gap as far as graphics go.


It's going to do a lot for graphics because it will keep higher-quality assets in RAM as they're needed, instead of devs having to load mostly lower-quality assets.
 

-Minsc-

Member
Back in the simpler days, when it was all about the bits, Hz, MHz, GHz...

Hearing teraflops makes me wonder if the machine can time travel.
 

M1chl

Currently Gif and Meme Champion
Good thing that we are not talking about GB/s in terms of overall bandwidth of consoles.
 

onQ123

Member
Because this hype is centered on the super-fast SSD and not on super-fast internal data bandwidth.

It all matters, and if you look on the 1st page I mentioned that it's the internal bandwidth that's over 1TB/s, but that pool is small and needs to have data fed to it by a bigger pool of memory, and that bigger pool of memory needs to be fed data by an even bigger pool of storage.
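
Roughly, the hierarchy I'm talking about looks like this (the figures are ballpark assumptions for illustration, not exact console specs):

```python
# Each tier is faster but smaller than the one below it, and has to be fed by it.
memory_hierarchy = [
    ("On-chip caches / internal buses", ">1000 GB/s",   "tens of MB"),
    ("GDDR6 main memory",               "448-560 GB/s", "16 GB"),
    ("NVMe SSD",                        "2.4-5.5 GB/s", "~1 TB"),
]
for tier, bandwidth, capacity in memory_hierarchy:
    print(f"{tier:33s} {bandwidth:>13s}  {capacity}")
```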
 

onQ123

Member
Kinda crazy how the 1st demo of next-generation games that got everyone talking focused on having a high level of detail thanks to fast I/O, and even though GPU power is needed, everyone is talking about the SSD, even the people who tried to downplay it from the start.
 

onQ123

Member
[image attachments]
 

RayHell

Member
SSD speed makes a big difference at the engine level.

VRAM = what you currently see on screen + a buffer
SSD = what is coming next, once you start digging into the VRAM buffer.

The faster the SSD, the smaller this buffer can be, so more VRAM is available for what you currently see on screen.
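
A toy sketch of that trade-off (the look-ahead size and storage speeds are illustrative assumptions, not measured engine numbers):

```python
# The slower the storage, the longer it takes to swap in the "what comes next"
# data, so the larger the slice of VRAM you must hold back as a look-ahead buffer.
def refill_time_s(buffer_gb: float, storage_gb_per_s: float) -> float:
    """Seconds needed to stream one buffer's worth of new assets from storage."""
    return buffer_gb / storage_gb_per_s

for name, speed in [("HDD ~0.1GB/s", 0.1), ("SATA SSD ~0.5GB/s", 0.5),
                    ("XSX raw ~2.4GB/s", 2.4), ("PS5 raw ~5.5GB/s", 5.5)]:
    print(f"{name:18s} refills a 4GB look-ahead buffer in {refill_time_s(4, speed):5.1f}s")
```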
 
As long as both consoles share the same or similar GPU architecture, it will always be relevant to compare them using TFLOPS.
 

RaySoft

Member
Bits were so important in marketing until we reached 32 bits. For example, Sony didn't make a big deal about the PS1 being 32-bit even though that was the big marketing angle in the 80's & 90's; it was probably the CDs that gave the PS1 its marketing edge. Atari & Nintendo used 64 in their names, but it was too late, no one cared about bits anymore. Dreamcast made sure to mention that it was 128-bit, but it was dead, Jim. The PS2 was 128-bit, but it was the DVD format that sold the PS2. The bit era of marketing faded away, and the mention of FLOPs mostly went unnoticed until the PS4 had 1.8 TFLOPs of GPU power. Then came the mid-gen consoles with 4 & 6 TFLOPs, using it as a buzzword to sell consoles, but neither of them set the world on fire. And now we are here with the new consoles: one is boasting that it has 12 TFLOPs & the other is boasting that its SSD can move 5.5GB/s.


Is Xbox Series X wearing that 64 on its chest? Will the PS5 SSD be another PS3 Blu-ray drive? Find out next time on Dragon Ball Z
Even though the N64's processor could do 64-bit calculations without splitting, its FSB was 32-bit.
All games were compiled as 32-bit as well, since its processor was faster in 32-bit mode than in 64-bit.
Similar things could be said of many consoles of the era.
One could have 32-bit data registers but 16-bit address registers.
The Atari Jaguar, for instance, claimed 64 bits even though its two main custom chips were 32-bit.

The only way to navigate through the propaganda is education, and that holds true for everything in life.
 
Slowest part, yes, but least important? It does not seem like that to developers, as it was their number one request when Cerny was doing interviews. With slow storage, you have to waste time adapting game design and waste precious RAM to compensate for it.
Right, so if developers were saying this to Sony, they must have said it to Microsoft too?

To be honest, I personally feel Sony may have gone a bit overboard on the SSD while cutting corners / skimping on everything else, like maxing out GPU performance. That's all too clear looking at the specs on paper.

I’ll wait and see though.
 

Rob_27

Member
Leave it as it is until petaflops are in consoles. It's about raw power. Do you think they sell mainframes on their storage subsystem's performance? No, it's how many MIPS or transactions per day they can do.
 

Silver Wattle

Gold Member
Hard to see how they could switch to memory bandwidth of all things. Just look at the XSX and its "560GB/s" bandwidth: look at how many dunces think it's 560GB/s for the full 16GB, or even worse, the special people who think it's 560GB/s + 336GB/s.

Also, what will memory bandwidth even be by the PS5 generation? 2TB/s+? At that point it's hard to care.
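
For the record, here's a quick sketch of why those two figures can't just be added together (the chip layout is the widely reported one and is assumed here, not taken from this thread):

```python
# XSX memory layout as widely reported: ten GDDR6 chips on one 320-bit bus,
# six 2GB chips + four 1GB chips. The "GPU-optimal" 10GB is striped across
# all ten chips; the other 6GB lives only on the upper halves of the six
# 2GB chips. Both pools share the SAME physical bus, so the ceiling is
# 560GB/s total, never 560 + 336, and never 560GB/s across the full 16GB.
FAST_POOL_GB_S = 320 * 14 / 8   # 560.0 GB/s when hitting all ten chips
SLOW_POOL_GB_S = 192 * 14 / 8   # 336.0 GB/s when hitting only the six 2GB chips

def effective_bandwidth(fraction_of_traffic_to_slow_pool: float) -> float:
    """Very crude model: traffic to the slow pool drags the average down."""
    slow = fraction_of_traffic_to_slow_pool
    return (1 - slow) * FAST_POOL_GB_S + slow * SLOW_POOL_GB_S

print(effective_bandwidth(0.0))   # 560.0 GB/s if nothing touches the slow 6GB
print(effective_bandwidth(0.25))  # 504.0 GB/s with a quarter of traffic in the slow pool
```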
 