So? It's still unclear how high the PC requirements for pure (not cross gen) next gen games will be. These numbers mean shit.
Cards coming out around the release of new consoles have always been outdated more quickly than the ones released later.
The circle jerk on Ree over Nvidia's new cards is disgusting!
The casuals on there don't even understand what they're looking at, e.g. thinking a 3080 is '2X the performance of a 2080 Ti', which I've actually seen posted there!!
No, the rasterization performance looks exactly as MLID and others leaked:
The 3080 is 'only' 25% faster than the 2080 Ti, according to DF's numbers. It's using Samsung's inferior 8nm process, and it only has 10 GB. This leaves AMD a very good chance to beat it.
How did you come up with 25% faster than the 2080 Ti, when the 3080 had areas where it reached 205% over the 2080? The 2080 Ti is only around 20-25% faster than the 2080. The 3080 destroys the 2080 Ti by a bigger margin than we expected the 3090 to.
And it mops the floor with the PS5 in a way I hadn't even dreamed of. The $500 3070 wipes the floor with the PS5. That means a $300 3060 will be equal to or better than a PS5. This makes building a PC of comparable power at close to the same price as the consoles a reality.
Imagine The Last of Us Part II, which looks fantastic on the PlayStation 4 Pro, running on a 3090 at native 4K. I would love to see that. 100% going with the PS5, because that is where the games that I want to play are, and the new controller looks absolutely amazing.
There is a chance I'll consider the RTX 3080, but my graphics card is still decent, and third-party games with slightly prettier visuals and Xbox Game Pass games are not my priority; the RTX 3080 wouldn't open any doors for me.
I do really like PC hardware a lot and would love to slot that card in, but that is more of a materialistic weakness of mine and doesn't correlate as much with pure gaming enjoyment since the additional performance mostly goes to overly taxed Ultra settings.
You should wait for benchmarks to see real performance numbers. No chance in hell a 3080 is anywhere near 205% faster than a 2080 on average; that's fantasy stuff.
Stop worrying about the PS5; it's not competing directly with a gaming PC. The Series X and its modest 12 TFLOPS is. You either play all those PC games on XSX at lower settings or you get a next-gen card.
That's why it was smart of Sony to focus on games, I/O, the SSD, and games again instead of fighting a losing battle over raw GPU hardware.
I mean, read what I said more carefully. Not 205% faster on average. There were scenes in Doom Eternal where it reached 205% over the 2080. On average it seems to be around 90%; in some games it's 70%, in some 80%. Either way, it's an enormous lead over both the 2080 and the 2080 Ti. Just enormous. Until reviews hit, DF's video showing those leads in percentages is just as good as a benchmark with average framerates. Better, even, because the percentage difference is how you measure it, not by counting frames out of context.
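To be pedantic about the numbers being thrown around: "reaching 205% of a 2080's framerate" (2.05x) and being "205% faster" (3.05x) are very different claims, and the thread keeps conflating them. A quick sketch with made-up framerates (the fps values below are hypothetical, not taken from DF's video):

```python
def percent_of(base_fps: float, new_fps: float) -> float:
    """New card's framerate as a percentage OF the old card's."""
    return new_fps / base_fps * 100

def percent_faster(base_fps: float, new_fps: float) -> float:
    """How much FASTER the new card is, as a percentage uplift."""
    return (new_fps / base_fps - 1) * 100

# Hypothetical scene: a 2080 at 60 fps, a 3080 at 123 fps.
print(round(percent_of(60, 123), 1))      # 205.0 -> "reaches 205% of the 2080"
print(round(percent_faster(60, 123), 1))  # 105.0 -> "105% faster", i.e. just over 2x
```

So a card that "reaches 205%" of another in one scene is about 105% faster there, not 205% faster; the two phrasings differ by a whole GPU generation.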
Lisa wants to tell you two words, my senses tell me: one starts with f and another with o.

Not even faster than the RTX 3080?
It better be cheaper than the RTX 3070 then..
RTRT is supported (for whoever is enamored with those RT effects in WoW and about a dozen games), but there are no serious perks to displace RTX I/O, DLSS and RTX.
Lisa wants to tell you two words, my senses tell me: one starts with f and another with o.
Seriously, why would the much faster card need to be cheaper?
RTRT is supported (for whoever is enamored with those RT effects in WoW and about a dozen games)
"RTX IO"... is misleading and quite akin to what has been done in consoles.
So it likely went: AMD implements it => Microsoft (hey, games gotta run on Windowz) creates the corresponding API => Microsoft tells Huang to move it, move it. I still wonder what needs to be there in hardware for the SSD to be able to send data straight to the GPU; perhaps so far it's about decompression alone.
As for DLSS, I guess it's there mainly to show that DF is full of shit, for Huang to claim the 3090 is an 8K GPU, and for other ways of misleading customers.
Anyway, at least the XSeX's AMD chip does support AI inference (although it wasn't meant to be used mainly for upscaling images).
Seems a waste to invest in this early generation pc shit when devs won't be using it for a while. Might as well wait for 2nd or 3rd gen of the tech.
Is this what having a stroke while typing is like?
AMD haven't shown us their GPU decompression; when they do, I'll put a tick on it.
Their big Navi would need to be price competitive to get me to move from CUDA/RTX features.
DLSS is mainly to show that DF is full of shit......hahahaha.
You mean like not buying 1st generation RT GPUs....and buying 2nd generation RT GPUs......as in Ampere RTX 30 GPUs?
You think devs aren't going to use DX12U DirectStorage?

I'm talking more about the I/O tech in the RTX 30 cards; the RT tech has clearly matured. I'm guessing the 2nd gen of this I/O tech will put it more in line with PS5 speeds and have actual games that actually use it available at that point.
Dude, what's wrong with you?
Don't want to have a civil conversation, just move on.
You think devs aren't going to use DX12U DirectStorage?
If DirectStorage has a minimal hit on the GPU/CPU and SSD speeds keep increasing, PCIe 5 DirectStorage speeds will easily be 28+ GB/s.
Microsoft supporting DirectStorage in the DX API makes it almost a given that the adoption rate will be quite high.
I take it you mean the compression ratios getting better or something?
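For what it's worth, a 28+ GB/s figure only works if you assume compressed data: a raw PCIe 5.0 x4 link tops out around 15.75 GB/s, so the rest has to come from decompression. A back-of-the-envelope sketch, where the drive speed and compression ratio are assumptions for illustration, not vendor numbers:

```python
# Back-of-the-envelope effective-throughput estimate for DirectStorage-style I/O.
# All inputs below are assumptions for illustration, not measured figures.

PCIE5_X4_RAW_GBPS = 15.75   # theoretical PCIe 5.0 x4 link bandwidth, ~15.75 GB/s
ASSUMED_DRIVE_GBPS = 14.0   # hypothetical PCIe 5.0 NVMe drive raw read speed
ASSUMED_COMPRESSION = 2.0   # hypothetical average compression ratio for game assets

def effective_gbps(drive_gbps: float, compression_ratio: float) -> float:
    """Data delivered to the game per second after GPU-side decompression."""
    return drive_gbps * compression_ratio

print(effective_gbps(ASSUMED_DRIVE_GBPS, ASSUMED_COMPRESSION))  # 28.0
```

The same arithmetic is behind Sony's own PS5 numbers, if I remember them right: 5.5 GB/s raw, "8-9 GB/s typical" after Kraken compression.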