
So we have a 30TF GPU now. Do you still think it's a good idea to launch a 4TF "NG" console alongside it in 2020?

Spukc

always chasing the next thrill
I played Fallout 4 the other day on my PC. What a boring, generic game, with a boring story and ugly art direction. But then I put a 30TF graphics card in my PC and boom, all of a sudden the game was so fun and the story so interesting. I even have a crisp 4K resolution now to show off the ugly art direction. Who needs consoles now that we can render an 8K, 1000fps pile of horse shit.
Exactly. Textures and art style are overrated as fuck.

Framerate and resolution, however, are not.
They just make the game sharper and better to play.

Sub-4K and sub-60 is kinda pointless in 2020.
Especially when the 3090 is all about 8K gaming.
 

FranXico

Member
12 is better than 10


What is your point?
🤷‍♂️

Oh nm, you wanted to jab Xbox lmao 🤣
My point is obvious.
30 is the best there will be. Even 10.9 would be better than 10.3, which is already much better than 4. Who cares?

Oh, you have to jab at PlayStation as usual, OK. Have the tallest midget.
 
Last edited:

BluRayHiDef

Banned
30TF GPU

This figure is highly misleading.
E.g. the 3070 is a "20TF GPU" but is touted as merely "= 2080 Ti", which is, wait for it, 13.7TF.

Perhaps teraflops and performance don't scale linearly? Or maybe Nvidia factored DLSS scaling into its calculations.
 
Was thinking more of graphics professional usage myself, you know productivity like 3D modelling and whatnot.
Its just so far above the current minimum, I'd expect it to be rarely required or really factored in.

Got to be honest too, the recent advancements in AI image processing have got me wondering if the diminishing returns on increasing resolution aren't going to kick in even harder going forwards.
Fuck 8k tbh.

Gimme more AI and Physics.
 

llien

Member
Perhaps teraflops and performance don't scale linearly? Or maybe Nvidia factored DLSS scaling into its calculations.
More likely Huang simply decided to double the number of claimed CUs at the last minute (perf/CU was steadily rising until it sharply dropped with Ampere). All leaks were spot on with half the claimed CUs.
 

Ellery

Member
Perhaps teraflops and performance don't scale linearly? Or maybe Nvidia factored DLSS scaling into its calculations.

Yeah, they are different. It's complicated, and no, DLSS is not factored in. However, DLSS benefits more from the new way of doing things, which is probably why Nvidia did it, realizing how important RTX and DLSS are going to be.

TFLOPS aren't TFLOPS. That's why it's a flawed way of measuring, and seeing people on forums comparing TF all day is extremely pointless, especially against older architectures.
The AMD Vega 64 has around 13 TF and is a lot slower than the 9 TF 5700 XT.

So, in essence, I'd recommend people pay less attention to TF and more to actual gaming graphics, framerate, resolution, etc. Usually, as soon as someone screams "this has more TF so it must be better", you can safely assume that they have absolutely no clue what they are talking about.
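The headline figure is pure arithmetic: shader cores × clock × 2 (a fused multiply-add counts as two floating-point operations). A quick sketch with approximate boost clocks (the specs below are illustrative, not exact) shows how Vega 64 "wins" on paper while losing in games:

```python
# Theoretical FP32 TFLOPS: shader_cores * clock_GHz * 2 ops (FMA) / 1000.
# This is the only math behind the marketing number; it says nothing about
# how well an architecture keeps those shaders fed.
def theoretical_tflops(shader_cores: int, boost_clock_ghz: float) -> float:
    return shader_cores * boost_clock_ghz * 2 / 1000

# Approximate specs, for illustration only.
vega_64 = theoretical_tflops(4096, 1.546)     # ~12.7 TF on paper
rx_5700_xt = theoretical_tflops(2560, 1.905)  # ~9.8 TF on paper

# Vega 64 "wins" on paper yet loses in real games: utilization,
# not peak FLOPS, decides actual performance.
```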
 

Entroyp

Member
Switch is portable first and foremost. You can't compare it. Even then, it sells because content is king, not TFs.

I agree, but I'm referring to price. A cheap 4TF console will surely find customers in the console market.

As for content, I couldn't agree more. If MS wants to venture into next gen without good content, then the failure is on them, not on the TF number or price.
 

BluRayHiDef

Banned
More likely Huang simply decided to double the number of claimed CUs at the last minute (perf/CU was steadily rising until it sharply dropped with Ampere). All leaks were spot on with half the claimed CUs.

How could he just double the number of claimed CUs? Do you mean that he's basing the purported TFLOPS of the cards on the idea that each CU can do double the work of a Turing CU?
 

llien

Member
How could he just double the number of claimed CUs? Do you mean that he's basing the purported TFLOPS of the cards on the idea that each CU can do double the work of a Turing CU?
CUs are not just dumb "do this floating point operation" units; they are more like mini CPUs (okay, fairly dumb ones compared to normal CPUs). You could allow them, in certain circumstances, to do twice the FP ops per clock (note that FLOPS is generic: floating point operations, without specifying which ones).
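As a sketch of how elastic that headline number is: double the FP32 lanes you claim per SM and the paper TFLOPS doubles on the spot, regardless of what the hardware can sustain (the SM count and clock below are illustrative, not exact specs):

```python
# Paper TFLOPS = SMs * FP32 lanes per SM * clock_GHz * 2 (FMA) / 1000.
# The formula doesn't care whether the extra lanes can actually be kept busy.
def paper_tflops(sms: int, fp32_lanes_per_sm: int, clock_ghz: float) -> float:
    return sms * fp32_lanes_per_sm * clock_ghz * 2 / 1000

single_issue = paper_tflops(68, 64, 1.7)   # ~14.8 TF (Turing-style counting)
double_issue = paper_tflops(68, 128, 1.7)  # ~29.6 TF (lanes "doubled")
```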

3090 is all about 8k gaming.
Oh dear. You've missed that lovely and not totally misleading "DLSS" letter combination, haven't you?
 
Last edited:

EverydayBeast

thinks Halo Infinite is a new graphical benchmark
That's an absurd overreaction by the OP. Yes, 30TF is more, but $1,500 is a problem for the masses.
 

DaGwaphics

Member
Perhaps teraflops and performance don't scale linearly? Or maybe Nvidia factored DLSS scaling into its calculations.

I suspect that the comparisons between GPU generations are based on RTX-on scenarios; many will probably be disappointed when they see comparisons with RTX off. RT and tensor ops are likely the catalysts here more than anything else.
 

Spukc

always chasing the next thrill
CUs are not just dumb "do this floating point operation" units; they are more like mini CPUs (okay, fairly dumb ones compared to normal CPUs). You could allow them, in certain circumstances, to do twice the FP ops per clock (note that FLOPS is generic: floating point operations, without specifying which ones).


Oh dear. You've missed that lovely and not totally misleading "DLSS" letter combination, haven't you?
to make it run even F A S T E R
 

IntentionalPun

Ask me about my wife's perfect butthole
Hundreds of millions of people play games on devices that still measure performance in gigaflops. I don't think a single >1TF phone exists.

That's the market MS is chasing. And it's not entirely about home consoles; I think they are targeting the XSS for xCloud over using the more expensive, hotter-running, somewhat overpowered XSX.

MS thinks cloud gaming is going to be about streaming to cell phones. The XSS is likely positioned for the cloud; selling it as a casual home console is a bonus for them, and somewhat of a no-brainer if they are expecting devs to target that profile for xCloud.

(Personally, I think they are barking up a tree that's gonna fall on their face.)
 

DaGwaphics

Member
wth is an NG console?

[image: atari-vcs-ataribox-controller-9106.jpg]
🤷‍♂️
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
4 TF Series S is still a good idea. It's a cheap console suitable for many console buyers.

It will be the cheapest and highest-selling Xbox Series.

They just gotta keep AAA games at 900-1080p and keep the Series X for the higher resolutions.

Let's hope you are wrong about this.
 

Spukc

always chasing the next thrill
2020, the year gamers discovered that running games at lower resolutions improves framerates.


You are mixing up DLSS with old-fashioned resolution scaling :pie_roffles:
According to DF, DLSS looks better than normal 4K in Death Stranding.
 
Last edited:

Journey

Banned
My point is obvious.
30 is the best there will be. Even 10.9 would be better than 10.3, which is already much better than 4. Who cares?

Oh, you have to jab at PlayStation as usual, OK. Have the tallest midget.


No, this issue has never been raised by the Xbox side; it's the PlayStation fanboys trying to claim that Microsoft's 4TF console will pose a problem, when it won't be any more of a problem than the GeForce 2060 is to the Nvidia lineup. After arguing until their faces turned blue, they finally concluded that the CPU would be the scaling issue and finally had a leg to stand on; then we learned that the CPU in the Xbox Series S is just as impressive as the PS5's or XSX's. You would think the issue would be quashed at that point, but NO, we get these stupid threads trying to start a problem where there is none.
 
Last edited:

Papacheeks

Banned
You are mixing up DLSS with old-fashioned resolution scaling :pie_roffles:
According to DF, DLSS looks better than normal 4K in Death Stranding.

Isn't that part of what DLSS is, though? It uses an AI-based algorithm to reconstruct an image from a lower resolution up to 4K. It's scaling, but the image is also reproduced from a blueprint set in the driver by an AI-based algorithm.

So you sometimes get an amazing-looking reconstructed image, as in Death Stranding. And that would be because of the possible limitations of the engine itself when scaling to 4K. Decima is an engine primarily written for console hardware, so it wasn't natively designed around 4K rendering or 4K assets, which is obviously the case here.

Newer engines like Unreal Engine 5, and updated versions of engines like the new Decima, will be using 4K+ or even movie-quality 8K assets that are scaled down. So image density is going to look amazing natively.

So DLSS may not even need to be used. I bet my account that next year, once Cyberpunk is updated for next gen, we'll get an extreme bump in asset quality, unless on PC they are already using 4K+ assets.

A good example of not needing DLSS, if the engine has high-quality assets built into its rendering pipeline, is a game like Gears 5. That game looks amazing and plays great at 4K without needing DLSS.
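For reference, DLSS reconstruction starts from a much lower internal resolution; the per-axis scale factors below are the commonly cited approximate ones for DLSS 2.x modes (listed here as an assumption, not an official spec):

```python
# Approximate DLSS 2.x per-axis scale factors (Quality ~2/3, Balanced
# ~0.58, Performance 0.5). The internal resolution is what the GPU
# actually renders before AI reconstruction to the target resolution.
TARGET_4K = (3840, 2160)
MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_res(target, scale):
    width, height = target
    return round(width * scale), round(height * scale)

for mode, scale in MODES.items():
    print(mode, internal_res(TARGET_4K, scale))
# Performance mode renders 1920x1080 internally and reconstructs to 4K.
```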
 

Spukc

always chasing the next thrill
Isn't that part of what DLSS is, though? It uses an AI-based algorithm to reconstruct an image from a lower resolution up to 4K. It's scaling, but the image is also reproduced from a blueprint set in the driver by an AI-based algorithm.

So you sometimes get an amazing-looking reconstructed image, as in Death Stranding. And that would be because of the possible limitations of the engine itself when scaling to 4K. Decima is an engine primarily written for console hardware, so it wasn't natively designed around 4K rendering or 4K assets, which is obviously the case here.

Newer engines like Unreal Engine 5, and updated versions of engines like the new Decima, will be using 4K+ or even movie-quality 8K assets that are scaled down. So image density is going to look amazing natively.

So DLSS may not even need to be used. I bet my account that next year, once Cyberpunk is updated for next gen, we'll get an extreme bump in asset quality, unless on PC they are already using 4K+ assets.

A good example of not needing DLSS, if the engine has high-quality assets built into its rendering pipeline, is a game like Gears 5. That game looks amazing and plays great at 4K without needing DLSS.
srsly it's about 8k not 4k tbh.
 

whyman

Member
I can get a nice 4K TV and a PS5, or a graphics card that will be outdated in 2-3 years, for the same price... I think the graphics card is the stupid option.
 

JLB

Banned
Console owners should really start to internalize that the visual gap between the next-gen consoles and the 3080 will be dramatic. It's not only memory or TFs; it's mainly DLSS.
Where PC will have true 4K, full ray tracing, and at least 60fps, consoles will go for partial ray tracing, 30-60fps, and checkerboarding.
And that's before we get to 8K or 120fps competitive games.
 
Last edited:

Papacheeks

Banned
srsly it's about 8k not 4k tbh.

Well, in all seriousness, I think that with the change in how engines use assets, as seen in the Unreal 5 demo, we could be seeing games use movie-quality assets without taking a hit to overall game performance, because of the decompression and SSD I/O changes in the hardware.

So you could take an 8K texture asset and downscale it to 1440p or 4K with minimal hit to performance, because the shaders and the throughput available to render assets would not have to work as hard.
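The raw numbers behind that asset jump are easy to sketch. Uncompressed, texture memory grows with the square of the resolution (the sizes below ignore mipmaps and block compression, which reduce them considerably in practice):

```python
# Uncompressed RGBA8 texture size (4 bytes per texel). Real games use
# block compression (e.g. BC7) and mipmaps, which shrink these numbers,
# but the quadratic growth with resolution is the point.
def texture_mib(side_px: int, bytes_per_texel: int = 4) -> float:
    return side_px * side_px * bytes_per_texel / (1024 * 1024)

print(texture_mib(4096))  # a 4K texture: 64.0 MiB uncompressed
print(texture_mib(8192))  # an 8K texture: 256.0 MiB, four times as much
```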

GOD this gen is so fucking exciting.
 

JLB

Banned
DF, right?
The guys who got that exclusive "super early preview of Ampere", right?
The guys who, by coincidence, didn't notice artifacts that GAFers spotted 10 minutes into the discussion?

I think it's legit. :D


You'll be a grandpa.

It sounds like you are really into downplaying Nvidia. See, I'm a console owner and play mostly on console as well.
No need to elaborate conspiracy theories about DF and such.
 

Spukc

always chasing the next thrill
DF, right?
The guys who got that exclusive "super early preview of Ampere", right?
The guys who, by coincidence, didn't notice artifacts that GAFers spotted 10 minutes into the discussion?

I think it's legit. :D


You'll be a grandpa.
I don't know why you have to be such a negative nancy about it all.

8K gaming, the fuck...
A week ago everybody was talking about 4K :pie_roffles:
 

llien

Member
A week ago everybody was talking about 4K :pie_roffles:
8K is 4 times 4K.
So you'd need something roughly 4 times the 2080/2080S (PS5/XSeX) to run it.
The 3090 is somewhere halfway there.

More to the point, 4K is becoming the baseline with the next gen (the way 1080p did in the past). That was a console push, not a desktop GPU one (most PC gamers sit on a 1060 or slower, lol).
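The pixel math checks out:

```python
# 8K (7680x4320) is exactly four times the pixels of 4K (3840x2160),
# since both dimensions double.
def pixels(width: int, height: int) -> int:
    return width * height

uhd_4k = pixels(3840, 2160)  # 8,294,400 pixels
uhd_8k = pixels(7680, 4320)  # 33,177,600 pixels
print(uhd_8k / uhd_4k)       # 4.0
```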

See, I'm a console owner and play mostly on console as well.
I don't know why it matters, but so do I, since it does.

No need to elaborate conspiracy theories about DF and such.
Excuse me, which part of what I've said is a "conspiracy theory"?
DF hyping DLSS? DF not seeing the obvious artifacts? Maybe DF getting a super early "exclusive preview" of Ampere?
Were you there when Huang taught Anand a lesson, such that the latter went as low as benching a golden AIB Fermi sample against a stock TeraScale card?
Heck, wasn't GPP only back in 2018, and happily signed by most reviewers, bar a handful of brave ones? (I'll remember that, computerbase.de)

Maybe I'm crazy and GPP existed only in my imagination? Conspiracy theories tend to come in groups, I was told.
 
Last edited:

Spukc

always chasing the next thrill
8K is 4 times 4K.
So you'd need something roughly 4 times the 2080/2080S (PS5/XSeX) to run it.
The 3090 is somewhere halfway there.

More to the point, 4K is becoming the baseline with the next gen (the way 1080p did in the past). That was a console push, not a desktop GPU one (most PC gamers sit on a 1060 or slower, lol).


I don't know why it matters, but so do I, since it does.


Excuse me, which part of what I've said is a "conspiracy theory"?
DF hyping DLSS? DF not seeing the obvious artifacts? Maybe DF getting a super early "exclusive preview" of Ampere?
Were you there when Huang taught Anand a lesson, such that the latter went as low as benching a golden AIB Fermi sample against a stock TeraScale card?
Heck, wasn't GPP only back in 2018, and happily signed by most reviewers, bar a handful of brave ones? (I'll remember that, computerbase.de)

Maybe I'm crazy and GPP existed only in my imagination? Conspiracy theories tend to come in groups, I was told.



you should really watch their newest video
 

JLB

Banned
8K is 4 times 4K.
So you'd need something roughly 4 times the 2080/2080S (PS5/XSeX) to run it.
The 3090 is somewhere halfway there.

More to the point, 4K is becoming the baseline with the next gen (the way 1080p did in the past). That was a console push, not a desktop GPU one (most PC gamers sit on a 1060 or slower, lol).


I don't know why it matters, but so do I, since it does.


Excuse me, which part of what I've said is a "conspiracy theory"?
DF hyping DLSS? DF not seeing the obvious artifacts? Maybe DF getting a super early "exclusive preview" of Ampere?
Were you there when Huang taught Anand a lesson, such that the latter went as low as benching a golden AIB Fermi sample against a stock TeraScale card?
Heck, wasn't GPP only back in 2018, and happily signed by most reviewers, bar a handful of brave ones? (I'll remember that, computerbase.de)

Maybe I'm crazy and GPP existed only in my imagination? Conspiracy theories tend to come in groups, I was told.

Oh, got it. You're one of those. Thanks!
 

iHaunter

Member
No, it was a bad idea even when it was released. Just drop the price of the Xbox One X? They should've done what Sony did: a regular version and a disc-less version of the XSX. Makes more sense.
 

Lone Wolf

Member
No, it was a bad idea even when it was released. Just drop the price of the Xbox One X? They should've done what Sony did: a regular version and a disc-less version of the XSX. Makes more sense.
The One X is already discontinued, and its Jaguar cores wouldn't cut it long term for next gen.
 

EDMIX

Member
It was never a good idea, even before this, tbh.

If gamers think they are getting the short end of the stick with the XONE still being used as a porting base for the Series X, they really need to worry about that Series S BS. THAT'S the thing that will fuck them over in the long run. Never mind the XONE, that's just for a few years; but if they legit release the Series S, that's what they will be making their games from. Sounds like a very bad idea...
 

RockstarBeaver

Neo Member
Can I plug a DualSense into this beautiful graphics card and play Spider-Man by just pressing the button and loading the game in 0.83 seconds?

Can I plug in a Logitech G Saitek Pro Flight Yoke System and play Flight Simulator 2020 while witnessing true next-gen graphics on Sony's new PS5 with last-gen computer parts?
 

Ascend

Member
The XSX GPU, which has 12TF, shouldn't be too far off from the RTX 3070, yet the latter is advertised as 20TF.

Nvidia is inflating its TF numbers.
 

DaGwaphics

Member
The XSX GPU, which has 12TF, shouldn't be too far off from the RTX 3070, yet the latter is advertised as 20TF.

Nvidia is inflating its TF numbers.

They are just reporting the basic mathematics, as does everyone.

What is left to be seen is how efficiently the architecture utilizes the stream processors. Historically, Nvidia has had the edge in this area, with AMD having higher SP counts (and TF numbers) but ending up on the short end of performance.

I'm really curious about AMD's pricing now. In theory, they should be on much smaller dies for even their widest designs. If they can muster decent RT performance without all the purpose-built silicon, that could really put them in the driver's seat in terms of profitability and pricing.
 
Last edited:

llien

Member
They are just reporting the basic mathematics, as does everyone.
It would be fine if they just said FLOPS, but they made it about CUs.
I'd call it lying.
A CU is more than just a number cruncher; the fact that it can do more ops per clock doesn't turn it into 2 CUs.

With Turing, each shader contained 1 FP32 pipeline and 1 INT32 pipeline. With Ampere, they're shifting to 1 FP32 pipeline and a second pipeline capable of both INT and FP32, but only one at a time.

It is akin to Bulldozer "cores".
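A toy model of why that shared pipeline doesn't double real-world throughput: any cycle the second pipe spends on INT32 work is a cycle it can't spend on FP32. Treating the INT share of the instruction mix as a free parameter (a deliberate simplification, not a measured figure):

```python
# Toy model of an Ampere-style SM: one dedicated FP32 pipe plus one pipe
# shared between FP32 and INT32. The shared pipe spends int_fraction of
# its cycles on integer work, so FP32 throughput relative to a single
# dedicated pipe sits between 1x and 2x, never the full 2x in mixed code.
def effective_fp32_ratio(int_fraction: float) -> float:
    return 1.0 + (1.0 - int_fraction)

print(effective_fp32_ratio(0.0))   # pure-FP case: the marketing 2x
print(effective_fp32_ratio(0.35))  # ~1.65x with a games-like INT share
```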
 
Last edited: