
If Senua's Saga: Hellblade II is actually what we can expect from next-gen consoles, does that mean the RTX 2080 will be outdated by 2020?

kraspkibble

Permabanned.
lol.

the 2080 will have aged and not be as capable, but outdated? no.

plus remember on PC we have higher refresh rates, and our CPUs are much more powerful. no way is my 9900K at 5.1GHz, 32GB RAM at 3300MHz, RTX 2080 with 8GB VRAM gonna be outdated any time soon. it might struggle, but it'll run any game as well as or better than a console.
 

kraspkibble

Permabanned.
Next gen consoles are going to be outdated when they are released next year. They are already locked to using a 7nm node when there is already a better, more efficient 7nm+ EUV node out there, and the 5nm node will be out next year. PC CPUs/GPUs are going to get a lot stronger next year on top of that. Some of the stuff may sound cool this year, but not so great next year.
so true. by the time these consoles come out there will be new CPUs from AMD + Intel, and more powerful GPUs from Nvidia (and hopefully AMD too).

also people seem to forget that these CPUs + GPUs will be low-power chips. the Zen 2 CPUs in next gen will NOT be the same as the Zen 2 CPUs currently in PCs.
 
My point is that I think all of those things are being used now. Not all in one game, mind you, but I'm sure there is some game out there that can say "hey, we did that!" running on a PC.
No. There's not one single game out there that is built from the ground up to use any of those features.

Also, what game is using mesh shaders atm?
 

nowhat

Member
I guess the reason for "in-engine" for Hellblade is because they don't have final silicon yet, so the video is a baseline of what they expect to deliver.
*cough*Killzone 2*cough*

(kidding, it was indeed a clip of what they wanted to do, but it wasn't supposed to represent the end product - that it ended up in the E3 reel was a fuckup by someone at Sony's marketing, much to Guerrilla's disbelief)
 

RaySoft

Member
Again, let's see some multiplat titles that run twice as fast on console than on a PC with similar hardware. If what Carmack says is still true, that would apply to almost every game out there, so coming up with dozens of examples should be a breeze for you.
First off, you're looking at it backwards. It's more a case of "the PC needs 2x the power of the consoles to produce comparable results." And you can't use multiplats for this comparison; you need code that pushes the system, like a first party title.
 

nowhat

Member
NVM, I looked it up. No, it doesn't do anything special. What I read and what I was told are the same thing. Just a wrapper around DX11.
Just because you skimmed a Wikipedia article (which doesn't include any up-to-date or in-depth info; it's all behind an NDA) doesn't mean you understand what you're talking about. There's no DX11 on PS4. There has never been any variant of DirectX on any PlayStation.
 
The 2080 will not be outdated at all, but the downside is that devs don't code to a specific GPU; that's why you won't see any major gains from that card.
 
First off, you're looking at it backwards. It's more a case of "the PC needs 2x the power of the consoles to produce comparable results." And you can't use multiplats for this comparison; you need code that pushes the system, like a first party title.
How awfully convenient. The only games that count are the ones that don't allow for a comparison because they're not available on PC. If your entire argument rests on claims that are untestable by definition, then it's not much of an argument at all.

It's not like most games these days are developed with PC as the lead platform. Even third party multiplat titles are optimized for consoles, and even if they're less optimized than first party exclusives on average, it would take some truly spectacular incompetence on the part of the developers to end up with a game so unoptimized that it would completely nullify a supposed 2x advantage in processing power.
 

pawel86ck

Banned
I'm guessing it's going to end up between 2060 Super and 2070 Super.

Phil Spencer said 2x Xbox One X. Many people (including myself initially) took this as being 12 Tflops.

But 12 Navi teraflops is actually more like 2.5x Xbox One X, which uses Polaris. DF concluded that Navi is about 1.28x Polaris per flop, so I'm using that for this calculation.

I'm guessing it ends up actually being 9-10 Navi TFLOPS, which has the performance of ~12 Polaris TFLOPS.

And 9-10 Navi Tflops is in the range of 2060 Super to 2070 Super.
GCN TFLOPS doesn't make any sense if you really think about it. Not only from a marketing perspective; the actual numbers will not add up.

Remember Phil has said the Xbox SX will deliver 8x the Xbox One's power, so if you want to base your conclusions on GCN TFLOPS performance then you need around 10-11 Vega TFLOPS, and probably even less, because Vega was already way more efficient than the customized first-gen GCN found in the standard Xbox One.

So let's assume you would need 10TF Vega. In order to match 10TF Vega you need 8TF Navi, and because the Xbox SX will use an even better architecture than first-gen RDNA (new features like VRS and HW RT are already confirmed), we can assume even 7TF of Navi 2 would already deliver 8x Xbox One performance.

However, the problem with 7-8TF Navi is that it will not deliver 2x the Xbox One X's GPU performance no matter how you look at it. The Xbox One X uses a Polaris architecture on steroids; just for comparison, the Xbox One X with its 6TF can deliver 2x as many pixels as the 4.2TF Polaris found in the PS4 Pro. Obviously the Xbox One X GPU architecture was already a clear step up from other Polaris-based GPUs. So even taking the Navi architecture into account, you would need many TFLOPS to reach 2x Xbox One X GPU performance; IMO you would need 10TF Navi at minimum to double it.

So obviously Phil wasn't thinking about GCN performance, but about the raw TFLOPS number in general, like he always has. 2x 6 TFLOPS is 12TF, and that particular number perfectly explains what Phil said about Xbox SX power: 1.4TF x 8 is 11.2TF, so 12TF would be just over 8x the Xbox One and exactly 2x the Xbox One X.

With the Windows Central leak on top of that, I'm certain MS is aiming at 12TF of Navi. Of course, thanks to the RDNA2 architecture, the performance gap over the Xbox One X will be even bigger than the numbers suggest, but that's something casuals will never think about.
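For anyone who wants to sanity-check the arithmetic in the quoted estimate and the reply above, here's a quick sketch in Python. The figures are the posters' own (DF's ~1.28x Navi-vs-Polaris per-flop factor and the 1.4TF/6TF console baselines); this is just the forum math written out, not a performance model:

```python
# Quoted estimate: convert Navi TFLOPS into "Polaris-equivalent" TFLOPS
# using the ~1.28x per-flop factor attributed to Digital Foundry above.
NAVI_VS_POLARIS = 1.28

def polaris_equivalent(navi_tf):
    return navi_tf * NAVI_VS_POLARIS

for navi in (9.0, 10.0, 12.0):
    print(f"{navi:.0f} TF Navi ~ {polaris_equivalent(navi):.1f} TF Polaris")

# Reply's "plain TFLOPS" reading of Phil Spencer's claim: 8x Xbox One
# and 2x Xbox One X should converge on roughly the same number.
XBOX_ONE_S_TF = 1.4
XBOX_ONE_X_TF = 6.0
print(f"8x Xbox One   = {XBOX_ONE_S_TF * 8:.1f} TF")   # 11.2 TF
print(f"2x Xbox One X = {XBOX_ONE_X_TF * 2:.1f} TF")   # 12.0 TF
```

A ~12 TF part is the one number that roughly satisfies both multipliers at once, which is the crux of the reply's argument.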
 

DJ12

Member
I guess the reason for "in-engine" for Hellblade is because they don't have final silicon yet, so the video is a baseline of what they expect to deliver.
i.e. it could be rendered on an RX 5700 (or whatever is currently inside the devkit) at lower fps, but it is at least rendered with the game engine.
Phil and others having the console at home suggests otherwise.

It doesn't fit the narrative being pushed that they are behind, but facts are facts.
 

Nickolaidas

Member
Hellblade II (as well as Godfall) is just a small taste of the Dark Side.

We have yet to see what the two powerhouses will pull out of their asses next gen. It's going to be amazing.
 

RaySoft

Member
Phil and others having the console at home suggests otherwise.

It doesn't fit the narrative being pushed that they are behind, but facts are facts.
Do you really think Phil has a final retail unit that looks like the revealed box at home? It's probably not a final unit he has. All those pictures of the new box are renders anyway. I'm not saying they don't have real units that look like it, but it's certainly not confirmation that they do.
 

DJ12

Member
Do you really think Phil has a final retail unit that looks like the revealed box at home? It's probably not a final unit he has. All those pictures of the new box are renders anyway. I'm not saying they don't have real units that look like it, but it's certainly not confirmation that they do.
It may not be in the same form factor, although I suspect it is, but it will be final silicon; otherwise, what's the point of people like Phil having it at home?
 

Woo-Fu

Banned
People think that was actually running on an X? lol. I've seen "rendered in-engine", but they don't tell you what the engine was running on at the time. It's either a dev kit or a PC, most likely a PC. Either way, it is running on what is effectively off-the-shelf PC hardware available today.

People throwing out the 2x quote from Carmack are citing an old opinion that doesn't reflect the current industry, particularly when it comes to Microsoft, which releases almost everything on PC too. Just look at the original title in the series and how it performs on Xbox One X vs. PC.

Consoles lost the opportunity to surpass PC when they started using PC hardware themselves. Even when they had the opportunity, it only lasted for a very brief window each generation.
 

Kenpachii

Member
The theoretical performance maybe, but never inside a PC. You just have too many bottlenecks (the PC's biggest problem is its legacy hardware: motherboard layout, buses, southbridge, etc.). A console is more streamlined to deliver maximum peak performance across the system. That's why Carmack's "2x" still applies.

Carmack's 2x doesn't apply, and hasn't applied since consoles shifted over to PC architectures.

I would even say the PC beats consoles in optimisation, because consoles lock down entire cores for security and background tasks, while a PC only uses 1-2% of its CPU for the tasks it needs to push, which is far below that. Add higher clocks on the CPU front and that idea falls dead on its face, big time.

The PS4 was proof of this: an i7-870, released 3 years before the PS4 and with half the cores, would steamroll any PS4 game at higher framerates through the entire generation.

Memory-wise, PC isn't far behind, and memory isn't much of an issue to start with. 2GB of VRAM and 8GB of system RAM covered the 8GB of total shared RAM on the PS4. This is hard to compare because the PC's low presets push higher visual options than what consoles have, which consumes more RAM as a result. Double that and you get 4GB VRAM + 16GB memory for next gen. Hardly worrying for PC.

GPU-wise, a 1.7 TFLOPS 750 Ti beat the 1.84TF PS4 GPU in overall performance for pretty much the entire generation. So I would say GPU-wise consoles certainly don't have any 2x advantage.
And with the 4K/8K focus on those consoles, while PC gamers sit at 1080p/1440p because higher res makes zero sense for them, GPUs on PC will push a ton more fps than consoles will.

These consoles will feature 2x the Xbox One X's performance GPU-wise, double the memory, and 3-4 times the CPU power depending on clocks. As the Xbox One X is already barely able to push 4K in modern PC games at 30 fps and below-low settings, it isn't looking too good.

This is most likely why they call it the X series: they're going to refresh that box every 2 years to keep it updated.
 

FireFly

Member
These consoles will feature 2x the Xbox One X's performance GPU-wise, double the memory, and 3-4 times the CPU power depending on clocks. As the Xbox One X is already barely able to push 4K in modern PC games at 30 fps and below-low settings, it isn't looking too good.
It depends whether they mean 2x the performance or 2x the teraflops. At 2x the teraflops, performance should be around 2080 Ti levels, given where the 7.95 Teraflop 5700 XT sits.
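The scaling behind that guess can be made explicit. Assuming (simplistically) that performance scales linearly with TFLOPS within the RDNA line, a 12TF part would sit about 1.5x above the 5700 XT, which is roughly the kind of gap that separates the 5700 XT from a 2080 Ti:

```python
# Sketch of the claim above: 2x the Xbox One X's 6 TF is 12 TF;
# compare that to the 7.95 TF RX 5700 XT under a linear-scaling assumption.
RX_5700_XT_TF = 7.95
TARGET_TF = 2 * 6.0  # 2x Xbox One X

ratio = TARGET_TF / RX_5700_XT_TF
print(f"target / 5700 XT = {ratio:.2f}x")  # ~1.51x
```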
 
GCN TFLOPS doesn't make any sense if you really think about it. Not only from marketing perspective, but actual numbers will not add up.

Remember Phil has said xbox SX will deliver 8x times xbox one power, so if you want to base your conclusions on GCN TFLOPS performance then you need around 10-11 Vega TFLOPS, and probably even less, because Vega was already way more efficient than customized first gen GCN found in standard xbox one.

So lest assume you would need 10TF Vega. Now in order to match 10TF Vega you need 8TF Navi and because xbox SX will use even better architecture than first gen RDNA (new features like VRS or HW RT are already confirmed) then we can assume even 7TF Navi 2 would already deliver 8x xbox one performance.

However the problem with 8-7TF navi is, it will not deliver 2x xbox x GPU performance no matter how you look at it. Xbox one X is using polaris architecture on steroids and just for comparison Xbox x with it's 6TF can deliver 2x times as many pixels as 4.2TF polaris found in PS4P. Obviously xbox x GPU architecture was already a clear step up from other polaris based GPU's. So even if you take into account Navi architecture you would need many TFLOPS in order to match 2x xbox x GPU performance, IMO you would need 10TF Navi at minimum in order to double xbox x performance.

So obviously Phil wasnt thinking about GCN performance, but pure TFLOPS number in general like he always did. 2x more 6 TFLOPS is 12TF and this particular number can perfectly explain what Phil has said in regards to xbox SX power. 1.4TF x8 is 11.2TF, so 12TF would be exactly over 8x times better compared to xbox one and also exactly 2x more compared to xbox x.

With windows central leak on top of that I'm certain MS is aiming at 12TF Navi. Of course thanks to RDNA2 architecture the performance gap between xbox x will be even bigger than numbers suggest but that's something casuals will never think about.

4 times the CPU x 2 times the GPU = 8 times faster.
It's that simple.

The CPU is easy; Zen 2 is just that much faster than Jaguar.
And a Navi equivalent to the RX 5700 with Variable Rate Shading and more memory bandwidth can also easily be 2 times faster than the GPU in the X.

I believe Phil was being very truthful about the power of this next console.
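The multiplier arithmetic above can be written in a couple of lines. Note this multiplies the poster's claimed CPU and GPU uplifts together as a marketing-style composite; they are the post's own factors, not benchmark results:

```python
# The post's framing of Spencer's "8x" claim: claimed CPU uplift
# multiplied by claimed GPU uplift. Factors are the poster's assumptions.
cpu_uplift = 4.0  # Zen 2 vs Jaguar, per the post
gpu_uplift = 2.0  # RX 5700-class Navi vs Xbox One X GPU, per the post
print(f"combined: {cpu_uplift * gpu_uplift:.0f}x")  # 8x
```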
 

psorcerer

Banned
Yes, that is and always will be true. It's a race they will never win. Nvidia doesn't have to release the 3xxx series next year, but they will, and it will probably be close to 2x faster than the 2080 Ti. Hardware development from MS and Sony can't keep up with that pace of development from Nvidia. AMD is a few years behind Nvidia, and it's not clear they'll ever catch up. Don't forget that Intel will enter the fray next year as well.

I think everything above is false.
1. Hardware these days means much less than in the '00s, mostly because of the complexity and synchronization/latencies hidden inside.
2. What does allow for better games is thin interfaces/APIs. Hardware is "too smart"; if you have "too smart" software on top, it becomes so smart that it shoots itself in the foot every single time.
3. Nvidia hardware is worse than AMD's. That's a fact. But Nvidia has 10x the manpower to optimize for each and every game on the market. They also take a lot of shortcuts inside the drivers to cut performance-hogging stuff to a minimum. If you have ever wondered why there is a new Nvidia driver for each and every game release, and why the driver code (compiled) is well over 100MB in size, it's because of the above.

In the end it means that you cannot directly compare Windows DX12 on PC to GNM on PS5 at all. These are vastly different platforms with a lot of different tradeoffs. So the claim that "consoles are behind" is meaningless.
You can make it meaningful very easily, though: consoles are always behind in PC-first multiplatform titles. That's obviously true and will not change.
 

Kenpachii

Member
It depends whether they mean 2x the performance or 2x the teraflops. At 2x the teraflops, performance should be around 2080 Ti levels, given where the 7.95 Teraflop 5700 XT sits.

This is what he says.

Microsoft says its next-gen console, Xbox Series X, will be twice as powerful as Xbox One X.

The console previously known as Project Scarlett was unveiled on Thursday night at The Game Awards, ahead of its launch during the 2020 holiday season.

“We wanted to have a dramatic upgrade from the Xbox One base console,” Xbox boss Phil Spencer told GameSpot. “So when we do the math, we’re over eight times the GPU power of the Xbox One, and two times what an Xbox One X is.”

Doing the maths, Xbox Series X is targeting around 12 teraflops (TF) of computing power, compared to the Xbox One X's 6TF and the Xbox One S's 1.4TF.

He's talking in plain TFLOPS to compare the performance. That thing is nowhere near 2080 Ti levels.

Xbox One X = RX 580
Xbox Series X = 5700 XT at best.

Here's a benchmark to show where the 580 sits among PC cards; keep in mind the 2080 Ti isn't even the top card in this market, but it's still a somewhat affordable card, and you can SLI it on top of that.

 

VFXVeteran

Banned
I think everything above is false.
1. Hardware these days means much less than in the '00s, mostly because of the complexity and synchronization/latencies hidden inside.

Ok. I'll concede this.
2. What does allow for better games is thin interfaces/APIs. Hardware is "too smart"; if you have "too smart" software on top, it becomes so smart that it shoots itself in the foot every single time.

I'm all for a thin interface. But what interface would you consider thin? GNM isn't necessarily thin. And even if you have a thin interface, that doesn't detract from the ever-growing need for raw power (i.e. more RAM, more cores, etc.).

3. Nvidia hardware is worse than AMD's. That's a fact. But Nvidia has 10x the manpower to optimize for each and every game on the market. They also take a lot of shortcuts inside the drivers to cut performance-hogging stuff to a minimum. If you have ever wondered why there is a new Nvidia driver for each and every game release, and why the driver code (compiled) is well over 100MB in size, it's because of the above.

I don't see it that way, and in practice the numbers just don't show it. I know AMD has very good CPU hardware for sure. But I'm not seeing the GPU as superior at all.

In the end it means that you cannot directly compare Windows DX12 on PC to GNM on PS5 at all. These are vastly different platforms with a lot of different tradeoffs. So the claim that "consoles are behind" is meaningless.

I don't think they are much different. PSSL is literally a wrapper around DX. Guys at Naughty Dog told me this. If they are vastly different, then I'd need some proof.

You can make it meaningful very easy though. Consoles are always behind in PC-first multiplatform titles. That's obviously true and will not change.

Agreed.
 

VFXVeteran

Banned
The biggest differentiations between PC’s and consoles are really cost (feel free to buy that Mac Pro for $52k ;)) and size and thus power requirements (dissipation and consumption) where PC’s can and do use much much bigger boxes.

Yep. Agreed.

MS and Sony, with CY Q4 2020 semi-custom HW and custom OS, stand to raise the bar for PC gaming. Of course after a bit, some users with 32 GB of RAM or more will start to use ram disk solutions to brute force their way out, but that is beside the point.

Also agree. I have 64GB of RAM and I'll just install the entire game into RAM.
 

VFXVeteran

Banned
You have just said that your HW setup can brute-force its way past the current console specs and overhead; you have not proved that overhead is not a thing, or that fixed-HW-specific optimisations are not a thing, and both will also apply to the next-generation launches. There could be DF-sized threads about examining your results and the console results on a level playing field with matching IQ and framerate, but you can start with the fact that you quoted 10 GB of overall memory vs 8 GB total, and that your CPU is likely a much more complex chip (core for core) than the Jaguar in the game consoles, clocked far above 1.6 GHz. A big part of the "overhead" is also on the CPU side of things, where the OS and driver can impact performance the most; the rest is really what devs who have the time and possibility to optimise around a fixed, well-documented HW target can get to.

Given how hungry modern desktop OSes are, and how many apps have switched to that resource-waster tech called Electron... reports of running many background tasks at 1-2% max usage feel dubious :), but that is not the point.

I think the point from the PC gamer's perspective is this: let's say everything you mention is true and factual in every case, and the PC incurs a lot of overhead. The fact remains that whatever optimizations you make to the consoles, they'll never outperform (or include better graphics features than) the highest-end PC in any scenario. For example, if a TLoU2 remake comes out for both the PS5 and the PC, in a comparison the PC will be the better platform for gameplay and looks.

The big question is: which platform will give the best approximation of the director's vision for a game? And that answer will always be a high-end PC.
 

VFXVeteran

Banned
Just because you skimmed a Wikipedia article (which doesn't include any up-to-date or in-depth info; it's all behind an NDA) doesn't mean you understand what you're talking about. There's no DX11 on PS4. There has never been any variant of DirectX on any PlayStation.

You are talking about the actual hardware. I'm not talking about that. I'm talking about what they develop on. And they *do* develop on PCs and run their games on PCs. Those PCs use DX11/DX12. They develop on Windows boxes. They develop in Visual Studio. Compiling down to the PS console devkit is not what I am talking about; at that point, it doesn't matter. Their main machine for creating new content, algos, etc. is NOT the PS devkit. Now, if you come back and tell me that what I'm saying is NOT true, then I need to contact my sources and ask them why they are lying to me.
 

FireFly

Member
There is zero percent chance these consoles will be anywhere even close to 2080 ti levels.
I think it's unlikely but not completely impossible. The 5700 XT has a die size of 251mm^2. Scale by 1.5x and add on 70mm^2 for the CPU cores and you get about 447mm^2. Well, the 360 had a combined die size of 438mm^2 for the CPU + GPU + EDRAM, and the PS3 had a combined die size of 493mm^2.

This is what he says.

He's talking in plain TFLOPS to compare the performance.
Teraflops are a measure of peak compute performance. Hence a 12 TF Navi chip would only have 2X the peak performance of the Xbox One X.
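The die-size feasibility check can be spelled out; the inputs are the post's own figures (251mm^2 for the 5700 XT, a hypothetical 1.5x GPU scale-up, and a ~70mm^2 allowance for the CPU cores):

```python
# Rough SoC area estimate from the figures in the post above.
RX_5700_XT_DIE_MM2 = 251.0
GPU_SCALE = 1.5           # hypothetical 1.5x wider GPU
CPU_ALLOWANCE_MM2 = 70.0  # rough allowance for the CPU cores

soc_mm2 = RX_5700_XT_DIE_MM2 * GPU_SCALE + CPU_ALLOWANCE_MM2
print(f"estimated SoC: {soc_mm2:.1f} mm^2")  # 446.5 mm^2

# Historical combined die sizes cited above, for scale:
for name, mm2 in [("Xbox 360 (CPU+GPU+EDRAM)", 438), ("PS3 (CPU+GPU)", 493)]:
    print(f"{name}: {mm2} mm^2")
```

The point of the comparison is simply that a ~447mm^2 SoC is within the range consoles have shipped at before.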
 

nowhat

Member
You are talking about the actual hardware. I'm not talking about that. I'm talking about what they develop on. And they *do* develop on PCs and run their games on PCs. Those PCs use DX11/DX12. They develop on Windows boxes. They develop in Visual Studio. Compiling down to the PS console devkit is not what I am talking about; at that point, it doesn't matter. Their main machine for creating new content, algos, etc. is NOT the PS devkit. Now, if you come back and tell me that what I'm saying is NOT true, then I need to contact my sources and ask them why they are lying to me.
Please tell me how "GNM is a wrapper for DX11" is true in any way.


GNM is akin to Vulkan (or DX12, or Metal, or Mantle, or whatnot)
 

lukilladog

Member
So why are people treating it as fact? Very odd.

Because AMD doubled chip density at the same power consumption on 7nm. This means MS can design their new console with the same GPU thermal specs as the Xbox One X but twice the GPU power; it would be odd not to do it.

PS: the 2080 Ti will be history at 4K once next-gen games start coming out.
 

VFXVeteran

Banned
No. There's not one single game out there that is built from the ground up to use any of those features.

Also, what game is using mesh shaders atm?

Variable Rate Shading: Wolfenstein II.

Double-rate FP16 has just started back up again; it was used in the past.

Mesh shaders are only in the new Turing GPUs, so I stand corrected: I can't name a game using them yet.

Texture space shading: are you talking about the regular 2D pixel shader pipeline? Or things like POM? Or something else that I completely missed?
 

VFXVeteran

Banned
I guess the reason for "in-engine" for Hellblade is because they don't have final silicon yet, so the video is a baseline of what they expect to deliver.
i.e. it could be rendered on an RX 5700 (or whatever is currently inside the devkit) at lower fps, but it is at least rendered with the game engine.

I've said this a gazillion times here. That trailer was not rendered on an XSX. It was done using a PC. The producer of the trailer told me this yesterday on the phone.
 

VFXVeteran

Banned
Please tell me how "GNM is a wrapper for DX11" is true in any way.


GNM is akin to Vulkan (or DX12, or Metal, or Mantle, or whatnot)

I'm talking about PSSL, bro.

GNM is the API for the PS box. I get that. It's a thin layer because it's not multiplatform. I get that too. We are in agreement there.

All I'm saying is that comparing GNM on a PS to DX on a PC isn't really a comparison, because the PC will still outperform it, thus undercutting the argument that GNM gives the PS an edge.
 

Kenpachii

Member
To check just how 'little' overhead is a thing now, when Death Stranding comes out on PC, see just how well it runs with a GPU that matches the PS4.

There is no need to wait for Death Stranding. There are tons of games that run on both platforms, as the consoles have been out for a while now.
Also, PC presets in games are different from console settings. If devs decide to give the lowest preset 2x the draw distance, then it's not really comparable anymore, like Red Dead Redemption does, for example.
 

nowhat

Member
I'm talking about PSSL, bro.

GNM is the API for the PS box. I get that. It's a thin layer because it's not multiplatform. I get that too. We are in agreement there.

All I'm saying is that comparing GNM on a PS to DX on a PC isn't really a comparison, because the PC will still outperform it, thus undercutting the argument that GNM gives the PS an edge.
We'll see.

dark10x, when Death Stranding is released on PC (and I know DF will really go to town with it), could you please, please, pretty please also build a test rig that matches a PS4 (Pro or not, it doesn't really matter) as closely as possible hardware-wise, and then pit them against each other? A graphics API deathmatch, Vulkan vs. GNM: two APIs enter, one API leaves. This debate must be sorted out somehow, and that would be an excellent chance.
 

Kenpachii

Member
Because AMD doubled chip density at the same power consumption on 7nm. This means MS can design their new console with the same GPU thermal specs as the Xbox One X but twice the GPU power; it would be odd not to do it.

PS: the 2080 Ti will be history at 4K once next-gen games start coming out.

The same way the Titan/780 Ti was history once the PS4 came out? Oh wait. The 2080 Ti ain't going anywhere.
 

psorcerer

Banned
1. I'm all for a thin interface. But what interface would you consider thin? GNM isn't necessarily thin. And even if you have a thin interface, that doesn't detract from the ever-growing need for raw power (i.e. more RAM, more cores, etc.).

2. I don't see it that way, and in practice the numbers just don't show it. I know AMD has very good CPU hardware for sure. But I'm not seeing the GPU as superior at all.

3. I don't think they are much different. PSSL is literally a wrapper around DX. Guys at Naughty Dog told me this. If they are vastly different, then I'd need some proof.

1. Thin = possible to optimize for in a straightforward way.
2. The numbers show only the driver optimization. There are no other numbers.
3. PSSL is just a shader language; it's more or less close to the hardware, and there is nothing DX11-specific there. DX12 is much closer to the hardware than DX11, but the shader language is still the same.
 

Eiknarf

Member
Holy crap! I didn’t even know there was a sequel- So I just found the trailer and love it!!!

Will it be multiplatform?
 

pawel86ck

Banned
This is what he says.



He's talking in plain TFLOPS to compare the performance. That thing is nowhere near 2080 Ti levels.

Xbox One X = RX 580
Xbox Series X = 5700 XT at best.

Here's a benchmark to show where the 580 sits among PC cards; keep in mind the 2080 Ti isn't even the top card in this market, but it's still a somewhat affordable card, and you can SLI it on top of that.
The Xbox One X GPU is not exactly an RX 580; it's a customized, more efficient GPU. For example, in RDR2 with settings close to the Xbox One X's, an RX 580 can only manage around 20fps, while the Xbox One X runs the same game at a solid 30fps (so we must assume its average fps is at least around 35). MS improved the Xbox One X architecture to the point that even the PS4 Pro's GPU was clearly inferior: the Xbox One X can render over 2x more pixels than the PS4 Pro GPU in the same games, and the PS4 Pro also uses a Polaris GPU.

Based on my observations, 10TF of Navi would be needed to match 2x Xbox One X power, but you only need 8TF of Navi to match 8x the Xbox One (first-gen GCN architecture). These two numbers, 8TF and 10TF, aren't the same, therefore it's obvious Phil wasn't talking about performance but about the raw TFLOPS metric. Everything matches perfectly if you consider a 12TF GPU, because it's just over 8x the Xbox One S (1.4TF) and exactly 2x the Xbox One X.
 

lukilladog

Member
The same way the Titan/780 Ti was history once the PS4 came out? Oh wait. The 2080 Ti ain't going anywhere.

The 780 Ti had about 2.8x the TFLOPS of the PS4 (5 and 1.8, respectively); the 2080 Ti will have a far smaller margin over the PS5. It's going to become a dog at 4K almost instantly.
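The "X% higher" figures in posts like this are easy to get wrong; as a quick sketch, percent-higher is (a/b - 1) x 100, which for the post's own 780 Ti and PS4 TFLOPS figures gives:

```python
# Percent-higher helper, applied to the TFLOPS figures quoted above.
def pct_higher(a, b):
    return (a / b - 1.0) * 100.0

# 780 Ti (~5 TF) vs PS4 (~1.8 TF), per the post:
print(f"780 Ti vs PS4: {pct_higher(5.0, 1.8):.0f}% higher")  # ~178% higher
```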
 

Digity

Member
Still falling for in-engine stuff? Ever seen a BF trailer recently? They are in-engine and they look great; get to real gameplay and suddenly it looks nothing like the trailer. This game will still be stunning in the end, but I'm not expecting the level of graphics in the trailer to carry over to actual gameplay.
 
PC gaming will need an overhaul when the new consoles drop to maintain parity.

We can already see Star Citizen pretty much requiring an SSD, and with SSDs standard in the new consoles, along with some form of raytracing, we will see the first significant shake-up, with base consoles overtaking all but high-end PC owners.

I remember getting a GTX 780 just before the PS4 launched: the most high-end GPU at the time.

It had 3GB of VRAM while the PS4 had 8GB of unified GDDR5, and it quickly showed its limitations in games like Shadow of Mordor.

It was great for running PS3/360-era games at 1080p and 60fps, but when the newer games hit, it was already struggling.

Games made specifically for the Series X and PS5 will blow PC titles out of the water on a price/performance basis.

I just booted up The Order: 1886 again the other day, and I've yet to see something that looks like it that could run on a PC costing £199.

This will only increase next gen and become more apparent.
 

rsouzadk

Member
No. It will not.

An RTX 2080 is already better than the 5700/5700 XT or whatever custom GPU they put in next-gen consoles.
 
No. It will not.

An RTX 2080 is already better than the 5700/5700 XT or whatever custom GPU they put in next-gen consoles.

I'm sorry, but it will.

The 2080 will be replaced just before the consoles launch to sucker people in; then shortly after, they will launch a 'better' next-gen GPU more in line with the next gen.

Take the PS5 game trailer, for example. It seems to be 4K 60fps with some form of raytracing.

The 2080 can barely manage that with Minecraft 😂

The new consoles, with their unified designs, will eat current PCs for breakfast. Of course PCs will then get better, but that's not to say your 2080s will cut it when the consoles launch. Far from it 😂😂😂
 