Sosokrates
Report me if I continue to console war
A 2080 RTX will be roughly equal to PS5/XSX performance. A few years later it will lag because of bigger optimisations for the RDNA architecture.
I’m sorry but it will.
The 2080 will be replaced just before the consoles get launched, to sucker people in; then, shortly after, they will launch a ‘better’ next-gen GPU that will be more in line with the next gen.
Take the PS5 game trailer, for example: it seems to be 4K 60fps with some form of ray tracing.
The 2080 RT card can barely do this with Minecraft.
The new consoles, being unified, will eat current PCs for breakfast. Of course PCs will then get better, but that’s not to say your 2080s will cut it when they launch - far from it.
How’s Red Dead running for you guys?
Xbone X is running at medium / high settings at 30fps at 4K for £299
How are your £299 rigs going?
Did you forget to mention the lower-than-low 1X settings?
You put a lot of work into this post showing us how stupid you are. I commend you.
Oh are we back in this again....
Facts are, you can’t get better bang for your buck than with a console, and next gen this is going to be even more apparent.
Okay. You have a 2080 Ti now with an SSD and a kick-ass CPU.
That will be about 2k with the case and motherboard etc., and STILL you are struggling with RT / 4K / 60fps on current-gen games.
Fact is, for a £459 - £499 fee in 2020 you will have 4K, RT and 60fps gaming, or even BETTER graphics at 30fps locked.
PC gamers always whinge, yet the only graphical advantage comes from brute forcing, and the price difference for that is laughable.
We got 60fps and 4K - yeah bro, that cost you 1.5k you dipshit
Monkeygourmet in a nutshell:
Sorry, I forget PC gamers can install nude mods of lollis.
I see now why they all protest PCs are the way forward - it’s the pervert’s choice.
Only the 12 TF is not confirmed.
Well, they said four times as powerful as the X, so since the X is 6 TF you arrive at 24 TF. But since Microsoft marketing are going to do their thing, it's obvious they forgot to mention it's 24 at half precision, so you end up with a realistic expectation of 12 TF with the ability to use FP16 calculations (the same technology Sony has in the Pro).
Phil said the GPU is TWO times more powerful than the X, which, if he means number of TF, would be 12. It's still up in the air whether it's double the TF, or double the power of the previous GPU taking into account considerations other than just TF. There is no actual confirmation that it has 12 TF.
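For what it's worth, the two readings can be checked with back-of-the-envelope arithmetic (a minimal sketch in Python; the 6 TF figure is the Xbox One X number quoted in the posts, and the 2:1 FP16-to-FP32 rate is an assumption about packed half-precision math):

```python
# Back-of-the-envelope check of the two readings of "four times the X".
# Assumes packed FP16 runs at twice the FP32 rate (as on the PS4 Pro GPU).

XBOX_ONE_X_TF = 6.0  # FP32 teraflops, per the posts above

# Reading 1: "4x the X" measured at half precision.
fp16_tf = 4 * XBOX_ONE_X_TF      # 24.0 TF at FP16
fp32_equiv = fp16_tf / 2         # 12.0 TF FP32-equivalent

# Reading 2: Phil's "two times more powerful" taken literally at FP32.
fp32_tf = 2 * XBOX_ONE_X_TF      # 12.0 TF FP32

print(fp32_equiv, fp32_tf)       # both readings land on 12.0
```

Either way the arithmetic converges on 12 TF, which is why both camps in the thread quote the same number.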
Is there a metric other than TF for raw power? What is it?
I'm not tech-knowledgeable enough to answer you intelligently. I'm repeating what I saw in the DF video discussion: that less than 12 TF with a better(?) GPU would add up to 2x the power of the X.
Raw performance is the only meaningful metric. We'll have a much better picture when specs are revealed and RDNA2 GPUs are actually released on PC some time next year, so we can see whether RDNA2 has better performance per flop than first-gen RDNA.
People on this forum should forget that teraflops even exist as it causes too much confusion...
Who knew the PS3 was more powerful than the PS4.
Ah. Sorry missed the bit about PC.
Obviously it's wrong to measure a console GPU by using a PC API. You just cannot compare.
You can compare, it seems you're just not understanding how the comparisons can be made...
The 780 Ti had about 2.8x the TFLOPs of the PS4 (5 and 1.8 respectively), while the 2080 Ti is only around 1.16x the PS5. It's going to become a dog at 4K almost instantly.
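The launch-gap ratio can be recomputed directly (a minimal sketch using only the 5 TF and 1.8 TF figures given in the post; no other figures are assumed):

```python
# Quick check of the 780 Ti vs PS4 launch gap quoted above.
GTX_780_TI_TF = 5.0  # figure from the post
PS4_TF = 1.8         # figure from the post

ratio = GTX_780_TI_TF / PS4_TF
print(round(ratio, 2))  # ~2.78x, i.e. about 178% higher, not "277% higher"
```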
The Xbox One X GPU is not exactly an RX 580; it's a customised and more efficient GPU. For example, in RDR2 at close-to-Xbox-One-X settings an RX 580 can only manage around 20fps, while the Xbox One X runs the same game at a solid 30fps (so we must assume its average fps is at least around 35). MS improved the Xbox One X architecture to the point that even the PS4 Pro GPU was clearly inferior: the Xbox One X can render over 2x more pixels than the PS4 Pro GPU in the same games, and the PS4 Pro also uses a Polaris GPU.
Based on my observations, a 10 TF Navi would be able to match 2x Xbox One X power, but you only need an 8 TF Navi to match 8x the Xbox One (first GCN architecture). Those two numbers, 8 TF and 10 TF, aren't the same, so it's obvious Phil wasn't talking about performance but only about the TFLOPS metric. Everything matches perfectly if you consider a 12 TF GPU, because that's just over 8x the Xbox One S (1.4 TF) and exactly 2x the Xbox One X.
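The claimed multiples can be sanity-checked with the figures quoted in the thread (a sketch; the 1.4 TF Xbox One S and 6 TF Xbox One X numbers come from the posts, and the 12 TF next-gen figure is the speculation being tested):

```python
# Sanity check of the "over 8x One S" and "2x One X" multiples above.
XBOX_ONE_S_TF = 1.4   # figure from the post
XBOX_ONE_X_TF = 6.0   # figure from the post
NEXT_GEN_TF = 12.0    # the speculated Series X figure

print(NEXT_GEN_TF / XBOX_ONE_S_TF)  # ~8.57 -> "just over 8x" the One S
print(NEXT_GEN_TF / XBOX_ONE_X_TF)  # 2.0   -> exactly 2x the One X
```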
I do understand; essentially you're claiming that if no PC multiplatform game or benchmark can extract that much power, then it can't possibly be extracted.
Well I have some news for you.
Hint: the vast majority of PC games are bottlenecked either by the CPU or by ROP/VRAM bandwidth.
Yes, you can't match all settings perfectly on PC with the Xbox One X version, but the most expensive settings are already on low or medium, and therefore you can't expect a drastic performance difference from there.
PC settings are different than console settings. The Xbox One X version of that game you mention, RDR2, has presets lower than the lowest preset on PC; it basically goes below that. That's why it performs better than a PC at low settings in benchmarks. PC games, however, can be downscaled into infinity until they run on potato PCs; Control, for example, can run on a 2-core integrated-Vega 200GE Athlon that's 48 bucks new right now, lol.
Anyway, it's also not uncommon for games to be down-mastered for consoles; this happens all the time to keep performance up. Good examples are BDO, Divinity 2 and They Are Billions.
About the TFLOP part: yes, that's what I mentioned a while back and that's what Phil also states. 12 TFLOPs is what the Xbox Series X delivers if you look at it from an Xbox One X perspective, performance-wise. It's double the GPU power, which will get butchered really quickly with higher, PC-like settings.
I'm talking about PLSL bro.
GNM is the API for the PS box. I get that. It's a thin layer because it's not multiplatform. I get that too. We are in agreement there.
All I'm saying is that GNM on a PS compared to DX on a PC isn't really a comparison because the PC will still outperform it thus mitigating the argument that GNM gives the PS an edge.
Yet everybody was saying the 780 Ti was dead because of its 3GB of VRAM, same with the Titan and its "only 6GB" vs 8GB on the PS4 at the time. See how that went. People are applying the same kind of logic here. Yet it outperforms the PS4 every day without breaking a sweat.
The 2080 Ti is also a lot faster on average in games than 117%; I would say it's about 135% faster. And that's actually a lot when you look at it at 4K. If PC gamers sit at 1440p (which most who own that card most likely do, as 4K isn't much of a thing in the PC segment) there will be even bigger gains that consoles can't keep up with, as their GPUs get butchered at 4K resolutions.
About performance, however: a lower-TFLOP GPU, the 750 Ti 2GB, has been outperforming the PS4 for a long time now, unless the game on the PS4 is butchered to run less stuff than the 750 Ti through lower presets.
On average, the 750 Ti wins.
There is no need to wait for Death Stranding. There are tons of games that run on both platforms, as the consoles have already been out for a while.
Also, PC presets in games are different than consoles'. If devs decide to give the lowest preset 2x the draw distance, then it's not really comparable any more, as Red Dead Redemption, for example, shows.
Just because a PC can run games faster with way faster hardware does not mean thinner APIs don't give consoles an advantage. GNM is faster compared to DirectX in a like-for-like comparison, as is DX11.x, which is the XBO version.
I find it highly unlikely that someone who actually works at Naughty Dog would make such a huge gaffe as saying that the PlayStation runs DX. The only thing close to reality in your statements I can find is that, iirc, PSSL is quite close to GLSL or HLSL (can't remember which).
The PlayStation does not run DX, nor is it capable of it: being a BSD-derived operating system it doesn't even support it, and it would also likely violate many patents and copyrights that Microsoft holds if it did.
The only thing close to correct you got is that the programmers use Windows, because it's the default IDE supported by the PS4 for debugging and profiling. But even then it doesn't use MSVC but Clang as the compiler, because the code has to run on the actual hardware.
You are right, of course. But it doesn't seem to matter when the games get compared: the PS4 Pro is hardly ever running games as fast or at as high a resolution as the X1X. I'm trying to show that brute-force power will always win out over the thin-layered API when comparing multiplat games.
I never said that the PS runs on that. They have separate boxes. The PC is the main dev machine, running Windows, VS and PLSL.
Yes. You are correct.
Of course if your PC card is multiple times faster than the console, it's going to win out. BTW, the X1X also has the same type of API, so the comparison you are making is invalid.
4K@60fps with ray tracing on a $500 box? Well, believe what you want.
LANGUAGE!!!!!!!
Yeah, so many people larping here.
Tell me what you are actually disagreeing with?
So you are saying a PC costing £499 (assuming that’s the price of next gen) will be able to display better graphics than a PS5 or Sex Bone?
Fuck oooooofff...