
Is the Console GPU FLOPs era going the way of the Bit era? I for one welcome our new GB/s marketing overlord

onQ123

Member
Look, I'm no expert, but this sounds like a load of waffle. Your storage isn't going to do anything for graphics; since when did HDDs/SSDs have processing power? Stuff is going to load a bit quicker, that's it. Keep your expectations down.
I'll bet the difference in loading times between the two consoles will be small, too. Haven't noticed much after moving my games from SSD to NVMe.


SMH, you people (yes, I said you people) keep saying it's just loading times, but you're wrong, because the speed enables better memory management: you're not filling the RAM up with stuff that you may or may not need way ahead of time. With a fast SSD, a lot of stuff that used to fill up RAM can just be streamed in as needed, so you can have higher-quality assets in RAM.
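Rough toy sketch of what I mean (every number and name below is made up for illustration, not from any actual engine):

```python
# Toy numbers only: the point is what has to sit in RAM, not the exact figures.
RAM_BUDGET_MB = 8_000
ASSET_SIZE_MB = 40        # hypothetical average asset size
HDD_MB_PER_S = 100        # hypothetical last-gen HDD throughput
SSD_MB_PER_S = 5_000      # ballpark of the raw SSD figures being quoted

def resident_assets(prefetch_everything: bool, near_player: int, whole_level: int) -> int:
    """How many assets must be held in RAM at once."""
    return whole_level if prefetch_everything else near_player

hdd_mb = resident_assets(True, near_player=50, whole_level=200) * ASSET_SIZE_MB
ssd_mb = resident_assets(False, near_player=50, whole_level=200) * ASSET_SIZE_MB

print(f"HDD-style prefetch ties up {hdd_mb} MB of the {RAM_BUDGET_MB} MB budget")
print(f"SSD streaming ties up {ssd_mb} MB, leaving room for higher-quality assets")
print(f"Fetching one missing asset: HDD {ASSET_SIZE_MB / HDD_MB_PER_S:.2f} s, SSD {ASSET_SIZE_MB / SSD_MB_PER_S:.3f} s")
```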


Why is this so hard to get?

but that's not even what this thread is about.
 

NT80

Member
The Sega Megadrive actually had 16 bit on the front of the console to show it off. I remember there being a big difference between the NES/Master System and the PC Engine. I was really surprised at the time to learn the PC Engine was 8 bit like them.
 

onQ123

Member
The Sega Megadrive actually had 16 bit on the front of the console to show it off. I remember there being a big difference between the NES/Master System and the PC Engine. I was really surprised at the time to learn the PC Engine was 8 bit like them.

And their next console had 16 in the name lol

[Image: TurboGrafx-16]
 

SonGoku

Member
Not until we move further into RT
By PS6/XB5 launch it'll be all about which GPU can push more Giga Rays
 

bohrdom

Banned
It's kind of unfortunate that the marketing fans the flames of these console wars.

The reality is that both FLOPS and bandwidth are important. However, in most throughput-optimized systems on the market today, memory bandwidth is the most critical resource. Both consoles address this: Xbox has a split RAM solution that averages out to 476 GB/s, and PS5 has 448 GB/s. It's important to keep in mind, though, that with more CUs you need more bandwidth.
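For what it's worth, that 476 GB/s is just the capacity-weighted average of the two pools. A quick sketch (the 10 GB @ 560 GB/s plus 6 GB @ 336 GB/s split is the publicly quoted Series X figure):

```python
# Series X split memory: capacity-weighted average bandwidth.
pools = [(10, 560), (6, 336)]   # (size in GB, bandwidth in GB/s), publicly quoted figures

total_gb = sum(size for size, _ in pools)
avg_bw = sum(size * bw for size, bw in pools) / total_gb
print(f"{avg_bw:.0f} GB/s")     # -> 476 GB/s, versus PS5's flat 448 GB/s pool
```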
 
Of course not.
Because 520GB/s is bigger than 5GB/s

They would use memory bandwidth and not the slower SSD throughput.
But if they put 520GB/s against 5500MB/s they may get away with it!

Or even better, add up all the bus speeds and the frequencies to get a performance metric. You don't even have to use the same unit for everything; just take the numbers and make a big one for marketing!
 

SonGoku

Member
I think it's happening now because even with the new PC GPUs, the marketing focus hasn't been on TFLOPs.
That's because up until RDNA, AMD beat Nvidia in TFLOPs even though their cards performed worse.
Nvidia PR focuses on benchmarks, CUDA cores and architecture features.
 

DaGwaphics

Member
TF rating is a valid measure of performance when comparing cards of the same arch, as is GB/s when measuring memory or storage bandwidth. These things are here to stay. You have to look at the entire system to get an idea of overall performance; I think both of the new boxes will be close.
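The TF number itself is just peak FP32 math: shaders × 2 ops per clock (FMA) × clock speed. Quick sketch using the publicly quoted console specs (treat the exact CU counts and clocks as my assumption):

```python
# Peak FP32 TFLOPs = CUs * 64 shaders per CU * 2 ops per clock (FMA) * clock in GHz / 1000
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

print(f"Series X: {tflops(52, 1.825):.2f} TF")  # ~12.1 TF
print(f"PS5:      {tflops(36, 2.23):.2f} TF")   # ~10.3 TF
```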

Bits never really translated into performance the way the general public thought they did; they weren't comparable to the performance metrics used today.
 

UnNamed

Banned
TF has little use, but it's still a better measurement than bits.

When "bit" was a thing, people didn't know shit about what type of bit was inportant: data bus? Address? Internal registers? And why nobody talked about ALU? For example, Motorola 68000 was actually a 32bit processor but MD CPU had no match with a PSX or even a GBA CPU, PC Engine was 8bit but capable of games similar to SNES and MD (even better with the arcade card), Jaguar was 32bit and it was shit, PS4 have a 64BITCPU but it has no sense to compare it with the N64CPU.

TF makes a bit of sense nowadays; "bit" never made sense.
 

martino

Member
I was going to say something about the I/O chip being Fuel Injection but I didn't want to feed into the car thing lol

It works better with a carburetor.
You have a small one and you add an even smaller one in front of it in the pipeline.
Cheaper, but the best solution would be to have a bigger carburetor.
 

Onocromin

Banned
The jump from 64-bit to 128-bit was widely appreciated and discussed by most, in my recollection, which is actually a legitimate recollection, unlike others here speaking out of their asses.

And most with little knowledge wildly speculated that we could not, in fact, go much higher than 128 bits, but loved to discuss the possibility. Which demonstrates that graphical bit rate was a well-appreciated annotation when manufacturers advertised that attribute.

Personally, I am a computer scientist. So this has always been important to me, and it was to my friends who loved gaming too.

I wasn't a computer scientist then and still very much appreciated bit rate discussion and... *GASP* imagining a world with 512-bit graphics. 1024-bit graphics. Etc.

We have far exceeded both of those milestones today and graphical bit rate performance still keeps climbing. The common consumer lacks the ability to keep up with bit rate. The average gamer relies on news/websites for this, and the average non-gamer never cared about graphical bit rate to begin with.
 

Texas Pride

Banned
Flops ain't shit without the games to back 'em up. One company has learned this lesson and one hasn't. And in Nintendo's case their games are so desirable that the flops don't matter. MS bought studios, yes, we all know. But at this point they're just more promises of hopes and dreams than anything of substance. Or, more likely, just Game Pass filler for next gen.
 

Onocromin

Banned
After reading further

"When Bit was a thing" people didn't know what kinda bits... What??

It was widely considered "128bit Graphics" as in "the Graphical Bit Rate is 256bits" and people understood clearly that it meant "Graphical Bit rate" which is in fact the correct label for the term many complaining are in fact grasping for their-self. And Gamers then, always, always used the correct term.

64bit graphics, 128 GRAPHICS. Theres that word. GRAPHICS. 128BIT GRAPHICS

So I really don't understand why others here insist the term was difficult for the average gamer to grasp.

It's actually laughable to me that there are gamers today who complain about this - yet cannot their-self use the correct term and come up short/with a loss for words when grasping for the correct term (that term being Graphical Bit Rate) when discussing the Graphical Bit Rate of those consoles Era's.

And hammer in the fact that they theirself, did not know it was considered the graphical bit rate by proceeding to not mention this fact when trying to wrongly describe consumers who might mistake graphical bit rate with memory bit rate ect.
 

Onocromin

Banned
In fact, after reading THE ENTIRE F'ING THREAD, NO ONE, not ONCE, married the term 128-bit with the word GRAPHICS, or 256-bit with the word GRAPHICS... or even mentioned 32-bit GRAPHICS and 64-bit GRAPHICS... nor was there a SINGLE MENTION OF GRAPHICAL BIT RATE. HOW PATHETIC, GAF! WTF! I bet this is creepily a blatant issue across this forum, now that I think about it: shying away from using the term 256-bit graphics... GRAPHICS... and FORGETTING that graphical bit rate WAS EVER important, or even... *gasp* A LEGITIMATE TERM!
 

nosseman

Member
The jump from 64-bit to 128-bit was widely appreciated and discussed by most, in my recollection, which is actually a legitimate recollection, unlike others here speaking out of their asses.

And most with little knowledge wildly speculated that we could not, in fact, go much higher than 128 bits, but loved to discuss the possibility. Which demonstrates that graphical bit rate was a well-appreciated annotation when manufacturers advertised that attribute.

Personally, I am a computer scientist. So this has always been important to me, and it was to my friends who loved gaming too.

I wasn't a computer scientist then and still very much appreciated bit rate discussion and... *GASP* imagining a world with 512-bit graphics. 1024-bit graphics. Etc.

We have far exceeded both of those milestones today and graphical bit rate performance still keeps climbing. The common consumer lacks the ability to keep up with bit rate. The average gamer relies on news/websites for this, and the average non-gamer never cared about graphical bit rate to begin with.

Has there been a true 128 bit console?

I don't think so.

The word "bit" in consoles referred to the CPU.

NES - 8 bit
SNES - 16 bit
Nintendo 64 - 64 bit
Gamecube - 32 bit
Wii - 32 bit
Wii U - 32 bit

The term "bit" has in i way lost its meaning.
 

Onocromin

Banned
Has there been a true 128 bit console?

I don't think so.

The word "bit" in consoles referred to the CPU.

NES - 8 bit
SNES - 16 bit
Nintendo 64 - 64 bit
Gamecube - 32 bit
Wii - 32 bit
Wii U - 32 bit

The term "bit" has in i way lost its meaning.
Are you stupid or just dead?

People always referred to the Nintendo as either an 8-bit system or 8-bit graphics.

Sega/SNES: 16-bit GRAPHICS.

Nintendo: 64-bit GRAPHICS.

The CPU was NEVER even considered. EVER. Hell, I have countless old game magazines to attest to this fact. Unless they are articles focusing on console teardowns and specs, every other instance or citation insists that the SNES had 16-bit GRAPHICS, the N64 had 64-bit GRAPHICS, and the Dreamcast and PS2 had 128-bit GRAPHICS.

WHAT WORLD HAVE I ENDED UP ON WHERE THIS HAS BEEN FORGOTTEN?

The Xbox 360 was widely considered to be almost capable of 300-bit graphics. Maybe 283 bits when the hardware communities dug into it.

The Dreamcast was ANNOUNCED AS THE FIRST 128-BIT CONSOLE.

Wii and Wii U are both above 128 GRAPHICAL GPU bits.

This is why I quit posting on forums: nonsense claims and blatantly bad, CRINGE-WORTHY/wrong information in discussions.
 

Sosokrates

Report me if I continue to console war
The only people saying this are salty PlayStation fans.
TFLOPs are just as good a performance indicator as they were at the beginning of this gen lol.
That gen also saw efficiency gains in the GPU, yet no one had an issue back then lol.
 

nosseman

Member
Are you stupid or just dead?

People always referred to the Nintendo as either an 8-bit system or 8-bit graphics.

Sega/SNES: 16-bit GRAPHICS.

Nintendo: 64-bit GRAPHICS.

The CPU was NEVER even considered. EVER. Hell, I have countless old game magazines to attest to this fact. Unless they are articles focusing on console teardowns and specs, every other instance or citation insists that the SNES had 16-bit GRAPHICS, the N64 had 64-bit GRAPHICS, and the Dreamcast and PS2 had 128-bit GRAPHICS.

WHAT WORLD HAVE I ENDED UP ON WHERE THIS HAS BEEN FORGOTTEN?

The Xbox 360 was widely considered to be almost capable of 300-bit graphics. Maybe 283 bits when the hardware communities dug into it.

The Dreamcast was ANNOUNCED AS THE FIRST 128-BIT CONSOLE.

Wii and Wii U are both above 128 GRAPHICAL GPU bits.

This is why I quit posting on forums: nonsense claims and blatantly bad, CRINGE-WORTHY/wrong information in discussions.

Yes, people used the terms 8-bit graphics and 16-bit graphics, but the 8/16-bit referred to the CPU.

In the 8-bit/16-bit era everything was done on the CPU, since dedicated graphics hardware wasn't there; so with an 8-bit CPU you produced 8-bit graphics and with a 16-bit CPU you produced 16-bit graphics.

When 3D arrived, "x-bit" graphics lost its meaning.

Also, don't hold back. Say what you really mean.
 

Onocromin

Banned
As a computer scientist, and an Xbox fan, I assure you, you are wrong.

TFLOPs, intellectually, are not a terrible indicator of performance; however, I am no PS fan. But are we talking gen 3 RDNA teraflops, or?

One original... ORIGINAL teraflop... was equal to 93 thousand state-of-the-art PCs in '99. That is the original measure of FLOPS.

That measure of TFLOP performance has not changed, but the number of computers and the era they were built in have. The second generation of TFLOP was equal to 235,000 state-of-the-art desktop PCs in 2003.

After examining the Xbox Series X specs, the Series X will in fact have 4,193-bit graphics. Legitimately.
 

Onocromin

Banned
As a computer scientist, and an Xbox fan, I assure you, you are wrong.

TFLOPs, intellectually, are not a terrible indicator of performance; however, I am no PS fan. But are we talking gen 3 RDNA teraflops, or?

One original... ORIGINAL teraflop... was equal to 93 thousand state-of-the-art PCs in '99. That is the original measure of FLOPS.

That measure of TFLOP performance has not changed, but the number of computers and the era they were built in have. The second generation of TFLOP was equal to 235,000 state-of-the-art desktop PCs in 2003.

After examining the Xbox Series X specs, the Series X will in fact have 4,193-bit graphics. Legitimately.
Yes, people used the terms 8-bit graphics and 16-bit graphics, but the 8/16-bit referred to the CPU.

In the 8-bit/16-bit era everything was done on the CPU, since dedicated graphics hardware wasn't there; so with an 8-bit CPU you produced 8-bit graphics and with a 16-bit CPU you produced 16-bit graphics.

When 3D arrived, "x-bit" graphics lost its meaning.

Also, don't hold back. Say what you really mean.


When the Dreamcast was being heralded and touted as the first 128-bit graphics console, most everyone had already disagreed with you.
Also, is your next argument going to insinuate the Dreamcast was incapable of 3D graphics... or... are you just going to accept the facts?
 

Neo Blaster

Member
The only people saying this are salty PlayStation fans.
TFLOPs are just as good a performance indicator as they were at the beginning of this gen lol.
That gen also saw efficiency gains in the GPU, yet no one had an issue back then lol.
And Xbox fans are desperately attached to TF because that's the only proven thing MS have so far; they're still lacking in the games department.
 

Sosokrates

Report me if I continue to console war
And Xbox fans are desperately attached to TF because that's the only proven thing MS have so far; they're still lacking in the games department.

What?

No, it's just a fact that the XSX is more powerful. It's not about being attached to anything, it's just acknowledging reality.

Also, game tastes are subjective.

The XSX is more powerful overall and the PS5 has an SSD advantage; now it's time to move on and be hyped for what these consoles will do.
 
I think teraflops are a decent enough barometer for a GPU's power, so long as people realise that it's a theoretical peak.

Previous generations saw people getting laughed at for taking these numbers as gospel truth, but lately that doesn't seem to be the case.
 
"Bits" as a marketing tool was always kind of stupid.
Who would care about this except some poor sods still coding in assembler?
 

Alphagear

Member
It will be interesting to see what happens if one of Sony or Microsoft goes Nvidia and the other sticks with AMD.

Heard Nvidia cards have lower TFLOP values relative to their performance.
 

Clear

CliffyB's Cock Holster
The cap is labor time and cost.

No matter how good the tech, no tiny outfit is going to be able to create a GTA/RDR-scale game in anything approximating a realistic time-frame.

Stop fan-wanking over tech, because without investment to utilize that power, its value is limited.
 

Alphagear

Member
Like 95% of games are 3rd party...
And MS, I think, have more devs than Sony now...
More doesn't mean better.

It's the exclusive games that make gamers like me choose one console over another, since, like you said, both consoles play 95% of the games.

Would I choose PlayStation over Microsoft? Right now, yes. MS have a lot to prove on the games side of things.

That's why, for me, PlayStation and Nintendo are must-buys every generation.
 

DGrayson

Mod Team and Bat Team
Staff Member
When you buy a car do you research for spec, reliability, performance and mpg?


Of course, but it's not quite comparable IMO.

Ultimately I want to play games, so I care about console exclusives. Until we start talking about games I don't really care which system is looking slightly better, etc. Once games are announced it might make a difference.

Your example only makes sense if you can only go to certain places with certain cars.
 

Alphagear

Member
Any PC experts here?

Would TFLOPs still be comparable if one of Sony or Microsoft eventually went Nvidia and the other stuck with AMD?

Will TFLOPs become irrelevant?

Heard Nvidia cards have lower TFLOPs but perform better than AMD cards with higher TFLOPs.
 

Sosokrates

Report me if I continue to console war
More doesn't mean better.

It's the exclusive games that make gamers like me choose one console over another, since, like you said, both consoles play 95% of the games.

Would I choose PlayStation over Microsoft? Right now, yes. MS have a lot to prove on the games side of things.

That's why, for me, PlayStation and Nintendo are must-buys every generation.
I never said they did.
Why do you have to add on something completely out of context?

Bottom line: the XSX power advantage is an advantage, same as it was with the PS4 or the 1X. Will it determine overall success? No, it won't, but that doesn't mean it's not an advantage, and no amount of mental gymnastics will change this. Games are more important, I never said they weren't, but one aspect of a console doesn't make the other aspects unimportant.
I don't know what the fuck is wrong with some people; I think some may actually have mental illness here.
 