
Discussion: Can the 2080Ti run PS5 & XSX games at ultra 4K?

RaySoft

Member
Reminder: NVIDIA Turing has the superior memory compression hardware.

Titan Pascal's superior memory compression hardware.

[Image: Pascal memory compression slide]


Turing has further memory compression improvements over Pascal.

[Image: Turing memory compression slide]

NVIDIA's real-time memory compression is optimized for color.




If the XSX needs ~560 GB/s of bandwidth to rival the RTX 2080's 448 GB/s, RDNA 2 is already behind Turing TU104.
So you're comparing that old texture compression with the consoles?? That's JUST for textures, and it's also just for textures already residing in VRAM... that's kind of a pointless comparison.
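For what it's worth, the arithmetic behind those figures is easy to reproduce. A back-of-the-envelope sketch in Python; the ~1.25x average compression ratio is an illustrative assumption, not a published spec:

```python
# Rough check of the bandwidth-parity claim above (the 1.25x average
# delta-color-compression gain is an assumed, illustrative figure).
xsx_raw_bw = 560.0        # GB/s, XSX peak bandwidth on its fast memory pool
rtx2080_raw_bw = 448.0    # GB/s, RTX 2080: 256-bit GDDR6 at 14 Gbps
assumed_dcc_ratio = 1.25  # assumed average compression gain

effective = rtx2080_raw_bw * assumed_dcc_ratio
print(f"RTX 2080 effective bandwidth: ~{effective:.0f} GB/s")  # ~560 GB/s
```

If you grant the assumed ratio, 448 GB/s of compressed traffic moves as much data as ~560 GB/s of uncompressed traffic; that is where the comparison comes from. Whether the average ratio really is 1.25x is exactly what the posts above are arguing about.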
 

johntown

Banned
XSex => around 2080 Super
PS5 => around 2080 (7% slower than the Super)

2080Ti at 4K => about 18% faster than the 2080S (half of that at lower resolutions).


If consoles target 30fps at 4K, there is no way you'd get 60fps @ 4K with your only slightly faster GPU, let alone at "ultra" settings (whatever the heck "ultra" does is pretty random anyway). Even at the same resolution and graphics settings as the consoles, your GPU will barely have an edge.


PS
People on gaming forums glorifying DLSS, which is yet another fancy way to upscale, is pathetic.
Typically, consoles don't target native 4k either. They use checkerboard rendering and other techniques to get to 4k. I think it is too early to say if consoles will be able to run native 4k (we need to wait for them to release and run games first). Also, consoles running at 30fps is typically due to CPU limitations to keep the cost down.

A 2080ti and a good CPU should be able to get 4k 60fps on many games. Sure there are plenty of newer ones that won't get to 4k 60fps with max settings but you can typically turn down a few settings without really sacrificing the picture to get there. You won't be able to do that on a console.
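To put a number on the checkerboard point: a quick sketch of the per-frame shading cost (illustrative only; real checkerboard reconstruction adds some overhead on top):

```python
# Pixels shaded per frame: native 4K vs. checkerboard rendering, which
# shades roughly half the pixels each frame and reconstructs the rest.
native_4k = 3840 * 2160
checkerboard = native_4k // 2

print(f"native 4K:    {native_4k:,} px/frame")
print(f"checkerboard: {checkerboard:,} px/frame (~50% of the shading cost)")
```

That halving is why "4K" console modes can exist at all on GPU budgets that could not manage the native pixel count.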
 

Larsowitz

Member
Recently I was able to get my hands on an Asus 2080Ti for about $850 US ($1,200 Canadian is what I paid) in almost brand-new condition from a friend of mine.

I was really hesitant to buy the card considering the 3000 series is probably coming at the end of this year, and I don't know how the 3080 will perform compared to the 2080Ti (assuming the 3080Ti will be way more powerful and more expensive). I mean, the 2080Ti here costs around $2,000 tax included.

But I figured the price is very fair even if the 3080 turns out to be way more powerful than the 2080Ti (which I believe should be around the same range, but correct me if I'm wrong; I can always sell the Ti for the 3000 series).

That being said, from the videos I have seen, the 2080Ti can barely run games like Red Dead Redemption 2 at 4K 60fps with everything on ultra. From the videos it's more like the 50s or so.


Now come the new generation of consoles. On paper, the 2080Ti (and the 2080 Super for that matter) is way more powerful than the GPUs in the XSX and PS5. How will that translate to gaming?

Red Dead Redemption 2 on Xbox One X runs at native 4K and 30fps, but with settings around mid to low compared to PC.

Can, for example, the XSX run the game at 4K on ultra at 60fps?

Can the closed-box nature of console development lead to something higher than that?

My 2080Ti runs well-optimized games like Doom Eternal or Gears 5 at 80-100fps with ultra/max settings in 4K. It is a very capable card for 4K/60+ gaming!
 

Mobilemofo

Member
Honestly, I don't get why some PC gamers seem concerned about the consoles. You have the option to upgrade and will generally always have better graphics. I say this as a console gamer since the SNES days (I gave up building PCs in my mid-20s).
 

diffusionx

Gold Member
Typically, consoles don't target native 4k either. They use checkerboard rendering and other techniques to get to 4k. I think it is too early to say if consoles will be able to run native 4k (we need to wait for them to release and run games first). Also, consoles running at 30fps is typically due to CPU limitations to keep the cost down.

A 2080ti and a good CPU should be able to get 4k 60fps on many games. Sure there are plenty of newer ones that won't get to 4k 60fps with max settings but you can typically turn down a few settings without really sacrificing the picture to get there. You won't be able to do that on a console.

I think the new consoles will hit 4K, at least early on. And no, 30fps is not typically due to CPU limitations; look at any PC benchmark to see how much the GPU affects the framerate. Not to mention many console games have a performance mode that knocks down resolution and detail to get to 60fps, or something close to it.
 

rnlval

Member
So you're comparing that old texture compression with the consoles?? That's JUST for textures, and it's also just for textures already residing in VRAM... that's kind of a pointless comparison.
I'm still waiting for an RTX 2080 Super-level SKU from AMD with a 256-bit GDDR6 bus. LOL

With the XSX, MS is throwing the PCB BOM cost of a wider memory bus against the RTX 2080's lower-cost 256-bit GDDR6 bus. This is why Nvidia has higher profitability.

"I expect more from AMD". LOL.

Nvidia is not Intel.
 
Recently I was able to get my hands on an Asus 2080Ti for about $850 US ($1,200 Canadian is what I paid) in almost brand-new condition from a friend of mine.

I was really hesitant to buy the card considering the 3000 series is probably coming at the end of this year, and I don't know how the 3080 will perform compared to the 2080Ti (assuming the 3080Ti will be way more powerful and more expensive). I mean, the 2080Ti here costs around $2,000 tax included.

But I figured the price is very fair even if the 3080 turns out to be way more powerful than the 2080Ti (which I believe should be around the same range, but correct me if I'm wrong; I can always sell the Ti for the 3000 series).

That being said, from the videos I have seen, the 2080Ti can barely run games like Red Dead Redemption 2 at 4K 60fps with everything on ultra. From the videos it's more like the 50s or so.


Now come the new generation of consoles. On paper, the 2080Ti (and the 2080 Super for that matter) is way more powerful than the GPUs in the XSX and PS5. How will that translate to gaming?

Red Dead Redemption 2 on Xbox One X runs at native 4K and 30fps, but with settings around mid to low compared to PC.

Can, for example, the XSX run the game at 4K on ultra at 60fps?

Can the closed-box nature of console development lead to something higher than that?
Never understood the obsession with ultra settings; you can barely tell the difference from high, especially in motion.

Dial all that garbage down a bit and you can have your 4K cream pie.

But then again, chasing 4K is equally a waste of power.
 

RaySoft

Member
I'm still waiting for an RTX 2080 Super-level SKU from AMD with a 256-bit GDDR6 bus. LOL

With the XSX, MS is throwing the PCB BOM cost of a wider memory bus against the RTX 2080's lower-cost 256-bit GDDR6 bus. This is why Nvidia has higher profitability.

"I expect more from AMD". LOL.

Nvidia is not Intel.
Your GPU could be the biggest monster there is, with a 512-bit memory interface or whatnot, but it won't matter when the data it needs is "miles away", figuratively speaking.
The new consoles are a complete ecosystem where the whole system performs "as one". PC hardware is more like "lonely islands": each island is really powerful on its own, but there's no streamlined solution for feeding them data at the same rate. Consoles have one pool of RAM, which functions as both system and video memory, so data doesn't have to be copied twice, for instance.

For a PC to achieve remotely the same experiences the consoles will bring (after the first wave of cross-gen games) thanks to their massive bandwidth through the whole pipeline chain, it would need NAND storage connected directly to the GPU card itself, or massive amounts of VRAM with the whole game resident in that RAM.
A chain is only as strong as its weakest link.

I've read a few pages here and I'm afraid many of the PC evangelists here will be in for a shock once they finally realize what's coming.

Edit: To all the LOLs: feel free to join the discussion at any time. Please enlighten me with your own views and knowledge! :)
 

Tesseract

Banned
with some adaptive sampling techniques, probably

just going by what i'm working on in the lab, what my hardware looks like (gtx 1080 ti / 2080 ti), what i'm targeting (3080 ti+ / ampere and beyond)
 
Never understood the obsession with ultra settings; you can barely tell the difference from high, especially in motion.

Dial all that garbage down a bit and you can have your 4K cream pie.

But then again, chasing 4K is equally a waste of power.
Yeah, throw that shit on 1800p and High settings and you have a game that performs better and looks basically the same in the majority of titles.
 

Shai-Tan

Banned
I can't even play some current-gen games in 4K with a 2080Ti without drops to well under 60. But the next-gen consoles won't be running native 4K all the time either; I read the other day that the Unreal Engine 5 demo was running at 1440p on the PS5.
 

johntown

Banned
I think the new consoles will hit 4K, at least early on. And no, 30fps is not typically due to CPU limitations; look at any PC benchmark to see how much the GPU affects the framerate. Not to mention many console games have a performance mode that knocks down resolution and detail to get to 60fps, or something close to it.
CPU limitations 100% affect the frame rate on consoles.

PCs are more GPU-bound, but the CPU contributes to the frame rate, and a better CPU will give you a better frame rate.

My main point is the new consoles will be roughly on par with a high-end PC, with the exception of the CPU. The PC will be able to reach a higher FPS thanks to its CPU.
 

Brofist

Member
Well the great thing about PC gaming is that you can tweak the settings a bit and get the performance you are looking for. At least you have the option.
 

Senua

Member
I can't even play some current-gen games in 4K with a 2080Ti without drops to well under 60. But the next-gen consoles won't be running native 4K all the time either; I read the other day that the Unreal Engine 5 demo was running at 1440p on the PS5.
30fps too!
 

Rikkori

Member
Which is why I said that most next-gen games will look relatively the same as current-gen games on PC at ultra settings. If a dev wants to go for true 4K, it'll eat up the GPU TFLOPS very quickly without adding anything new, like the UE5 demo, for example. Raytracing will take a back seat on next-gen consoles because it's just too demanding for them.

As far as RT is concerned, at least on consoles, it's all going to come down to how fast and how far they can push the denoisers. There's a lot of room to innovate there, and that's the major hurdle atm. Otherwise, I think it's clear from the UE5 approach that they will prefer to stay away from proper RT due to its cost, and because there's still a lot of value in mixing traditional approaches, which get you VERY close to an RT image anyway. Frankly, I'm happy to see it done that way; I was more than satisfied with SVOGI. So long as we get light bounces that pick up colour, I'm over the moon.
 

Allandor

Member
Make sure the RTX 2080 Ti is paired with a motherboard and PCIe 4.0 NVMe SSD setup at 5 GB/s or higher.
Why? On PC you can have 32GB of main memory or more, which is actually cheaper than a 1TB high-speed SSD. You just don't need a fast SSD if you have enough main memory. Yes, loading times can be a "problem", but graphical detail isn't.
Btw, loading times should go down even with a normal SSD if games simply required one. Then no more duplicate packing of assets would be needed, and it's just a question of how big the memory buffer can be.
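The "enough main memory beats a fast SSD" idea boils down to caching: pay the disk cost once, then serve repeats from RAM. A minimal sketch, with a hypothetical load_asset helper (real engines use dedicated streaming systems):

```python
# Minimal asset cache: the first request reads from disk, repeats come
# from RAM. `load_asset` is a hypothetical helper for illustration.
_asset_cache = {}  # path -> bytes

def load_asset(path):
    if path not in _asset_cache:        # cache miss: touch the SSD once
        with open(path, "rb") as f:
            _asset_cache[path] = f.read()
    return _asset_cache[path]           # cache hit: RAM speed from here on
```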
 

martino

Member
Why? On PC you can have 32GB of main memory or more, which is actually cheaper than a 1TB high-speed SSD. You just don't need a fast SSD if you have enough main memory. Yes, loading times can be a "problem", but graphical detail isn't.
Btw, loading times should go down even with a normal SSD if games simply required one. Then no more duplicate packing of assets would be needed, and it's just a question of how big the memory buffer can be.

The thing is, you keep the problems and the logic that you could erase once your game can load only what it needs, when it needs it.
I'm curious to see how they will do it on PC. Will devs do all the work this implies? What will be the new solution to avoid it?
If there is none, will this discourage games from releasing on PC because of all the work on I/O optimization?
Also, will hardware be released for it?

I don't have the answers here, but there are a lot of possibilities.
 

Knch

Member
I need to watch YT to see what version 2.0 of an upscaling tech, whose version 1.0 sucked, does to a game that looks like something from 2007 to appreciate its greatness.
Yeah, thanks.

It's like applying RT to games that had no shadows/lighting effects/reflections whatsoever, say, Minecraft, and seeing something unprecedented happen.


Upscaling technology gives better-than-native picture quality... I have underestimated the power of the ultimate buzzwords, it seems...
Inhale slowly, hold it, exhale slowly.

Let the Nvidia fanbois feel good about their graphics card giving double the performance while doing a quarter of the work, and it looking "better" unless it royally fucks up.
 

pawel86ck

Banned
I would like to see its performance against the PS4 with new games released in 2019.
I can't imagine a GTX 750 running demanding games like Metro Exodus or RDR2 at PS4 settings. That GPU is too slow and has only 2GB of VRAM.

To run PS4 ports at similar quality you have to own at least a 3GB GPU like the GTX 780 or AMD 7970, and both are 2x faster than the PS4's GPU.
 
RDR2 struggles to hit 4K 60 on a 2080Ti (averaging 40fps). So what do you think? Console versions just run much better than PC ports on the same level of hardware, for a variety of reasons, even though many people may tell you otherwise. I've seen the proof time and time again. When I had an R9 280X in my PC years ago, it was much stronger than the PS4's GPU, yet the PS4 versions ran and looked better than when I tried the games on PC at the lowest settings I could manage. I can tell you there is NO WAY that card is going to run anything that looks like God of War or UC4 with remotely the same performance. It's console magic.
To give a solid answer: no way a 2080Ti will run any next-gen game at 4K 60, unless it's just a bad-looking game, a cartoony game, or a cross-gen title from a mid-tier studio.
This is the one. Games are optimized more for consoles, so don't be surprised if the XSX outperforms your computer in certain games.
 

Kenpachii

Member
Your GPU could be the biggest monster there is, with a 512-bit memory interface or whatnot, but it won't matter when the data it needs is "miles away", figuratively speaking.
The new consoles are a complete ecosystem where the whole system performs "as one". PC hardware is more like "lonely islands": each island is really powerful on its own, but there's no streamlined solution for feeding them data at the same rate. Consoles have one pool of RAM, which functions as both system and video memory, so data doesn't have to be copied twice, for instance.

For a PC to achieve remotely the same experiences the consoles will bring (after the first wave of cross-gen games) thanks to their massive bandwidth through the whole pipeline chain, it would need NAND storage connected directly to the GPU card itself, or massive amounts of VRAM with the whole game resident in that RAM.
A chain is only as strong as its weakest link.

I've read a few pages here and I'm afraid many of the PC evangelists here will be in for a shock once they finally realize what's coming.

Edit: To all the LOLs: feel free to join the discussion at any time. Please enlighten me with your own views and knowledge! :)


For a PC to achieve remotely the same experiences the consoles will bring (after the first wave of cross-gen games) thanks to their massive bandwidth through the whole pipeline chain, it would need NAND storage connected directly to the GPU card itself, or massive amounts of VRAM with the whole game resident in that RAM.

And what is that pipeline chain going to bring exactly? Go enlighten me. And why would an SSD ever need to be connected straight to a GPU on PC? Just why? And why would a game need to be entirely loaded into RAM to compete with the PS5 setup?

I honestly want to see you stumble on those questions.

I will explain to you how a PC works.

Data sits on an SSD, goes into RAM (which is like a super SSD), and gets fed into VRAM. And the RAM-to-VRAM communication happens in nanoseconds, not the milliseconds the PS5's SSD sits at; it could hop up and down 500 times and still be faster than what the PS5 delivers.

While the PS5 is squeezing data through a crack, swapping it in and out of its memory from the SSD in milliseconds, the PC will have had that data ready to serve ages ago and loads it into the VRAM pool in nanoseconds (practically instantly), because it's not limited to a choking 16GB pool: it has its own VRAM pool and memory pool, which can both store a far larger amount of data and swap it quicker.

While the PS5 is dropping data from its memory pool and loading from its SSD, the PC already has all those rooms in the demo ready to serve from spare VRAM and memory, and its SSD can load the next demo into even more RAM at the same time, while the PS5 has to crawl through endless cutscenes.

I probably don't have to lecture you on how much memory a PC has access to, what memory performance it gets, and what performance an SSD solution can deliver on PC; it's far faster than the PS5 on every front.

But keep believing, mate. You sound exactly like those APU people who thought APUs were the future of PC gaming because PCs could never compete with such an architecture. Then reality hit them the moment games came out, and the hardware turned out to be kinda shit even for its day.

Want to know the performance of the PS5? Go watch some 4K 5700 XT benchmarks.

Wonder why that UE5 demo was partly scripted? And wonder why it only ran at 1440p and 30fps, even while they walk around the area at a seriously slow pace with barely anything going on? There you go.
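For what it's worth, the nanoseconds-vs-milliseconds contrast above is real in order-of-magnitude terms, even if the conclusions are debatable. Ballpark figures, not measurements of any specific system:

```python
# Typical access-latency orders of magnitude (ballpark, illustrative).
dram_access = 100e-9  # ~100 ns: a DRAM access
nvme_read = 100e-6    # ~100 us: an NVMe random read (HDDs are in the ms range)

print(f"an NVMe read costs roughly {nvme_read / dram_access:,.0f}x a DRAM access")
```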

This is the one. Games are optimized more for consoles, so don't be surprised if the XSX outperforms your computer in certain games.

This is what optimisation means:

[animated GIF]
 

Allandor

Member
The thing is, you keep the problems and the logic that you could erase once your game can load only what it needs, when it needs it.
I'm curious to see how they will do it on PC. Will devs do all the work this implies? What will be the new solution to avoid it?
If there is none, will this discourage games from releasing on PC because of all the work on I/O optimization?
Also, will hardware be released for it?

I don't have the answers here, but there are a lot of possibilities.
That is not correct. You don't load the data exactly when you need it; you still must load at least x frames ahead. That is more or less like now, but with a much smaller time window and without loading packets you don't actually need.
All you have to change is the size of the buffering.
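That "x frames ahead" idea can be sketched as a sliding window: keep only the next few frames' worth of data resident instead of a whole level. The window size and per-frame data below are made up for illustration:

```python
from collections import deque

LOOKAHEAD = 30  # frames of data kept resident; ~0.5 s at 60 fps (assumed)

resident = deque(maxlen=LOOKAHEAD)  # oldest frame's data is evicted automatically
for frame in range(120):            # pretend 120 frames of gameplay
    # stream in the data for the frame at the lookahead horizon...
    resident.append(f"assets_for_frame_{frame + LOOKAHEAD}")
    # ...and render `frame` using only what is currently resident.
```

The faster the storage, the smaller LOOKAHEAD can be; that is the whole console-SSD argument in one variable.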
 

pawel86ck

Banned
And what is that pipeline chain going to bring exactly? Go enlighten me. And why would an SSD ever need to be connected straight to a GPU on PC? Just why? And why would a game need to be entirely loaded into RAM to compete with the PS5 setup?

I honestly want to see you stumble on those questions.

I will explain to you how a PC works.

Data sits on an SSD, goes into RAM (which is like a super SSD), and gets fed into VRAM. And the RAM-to-VRAM communication happens in nanoseconds, not the milliseconds the PS5's SSD sits at; it could hop up and down 500 times and still be faster than what the PS5 delivers.

While the PS5 is squeezing data through a crack, swapping it in and out of its memory from the SSD in milliseconds, the PC will have had that data ready to serve ages ago and loads it into the VRAM pool in nanoseconds (practically instantly), because it's not limited to a choking 16GB pool: it has its own VRAM pool and memory pool, which can both store a far larger amount of data and swap it quicker.

While the PS5 is dropping data from its memory pool and loading from its SSD, the PC already has all those rooms in the demo ready to serve from spare VRAM and memory, and its SSD can load the next demo into even more RAM at the same time, while the PS5 has to crawl through endless cutscenes.

I probably don't have to lecture you on how much memory a PC has access to, what memory performance it gets, and what performance an SSD solution can deliver on PC; it's far faster than the PS5 on every front.

But keep believing, mate. You sound exactly like those APU people who thought APUs were the future of PC gaming because PCs could never compete with such an architecture. Then reality hit them the moment games came out, and the hardware turned out to be kinda shit even for its day.

Want to know the performance of the PS5? Go watch some 4K 5700 XT benchmarks.

Wonder why that UE5 demo was partly scripted? And wonder why it only ran at 1440p and 30fps, even while they walk around the area at a seriously slow pace with barely anything going on? There you go.



This is what optimisation means:

[animated GIF]
Go lecture Cerny and Tim Sweeney.
 

martino

Member
That is not correct. You don't load the data exactly when you need it; you still must load at least x frames ahead. That is more or less like now, but with a much smaller time window and without loading packets you don't actually need.
All you have to change is the size of the buffering.

Sure, but you're focusing on loading what's needed at near-instant speed.
It's obvious that the more freedom your game gives you, and the more you need to have loaded in advance, the more complex the buffering becomes.

But don't get me wrong here. I think current PC hardware has untapped potential and can be used better (it just hasn't been needed).
Will it be done? Will it be enough? Will it need more?
Great times ahead of us.
 
And what is that pipeline chain going to bring exactly? Go enlighten me. And why would an SSD ever need to be connected straight to a GPU on PC? Just why? And why would a game need to be entirely loaded into RAM to compete with the PS5 setup?

I honestly want to see you stumble on those questions.

I will explain to you how a PC works.

Data sits on an SSD, goes into RAM (which is like a super SSD), and gets fed into VRAM. And the RAM-to-VRAM communication happens in nanoseconds, not the milliseconds the PS5's SSD sits at; it could hop up and down 500 times and still be faster than what the PS5 delivers.

While the PS5 is squeezing data through a crack, swapping it in and out of its memory from the SSD in milliseconds, the PC will have had that data ready to serve ages ago and loads it into the VRAM pool in nanoseconds (practically instantly), because it's not limited to a choking 16GB pool: it has its own VRAM pool and memory pool, which can both store a far larger amount of data and swap it quicker.

While the PS5 is dropping data from its memory pool and loading from its SSD, the PC already has all those rooms in the demo ready to serve from spare VRAM and memory, and its SSD can load the next demo into even more RAM at the same time, while the PS5 has to crawl through endless cutscenes.

I probably don't have to lecture you on how much memory a PC has access to, what memory performance it gets, and what performance an SSD solution can deliver on PC; it's far faster than the PS5 on every front.

But keep believing, mate. You sound exactly like those APU people who thought APUs were the future of PC gaming because PCs could never compete with such an architecture. Then reality hit them the moment games came out, and the hardware turned out to be kinda shit even for its day.

Want to know the performance of the PS5? Go watch some 4K 5700 XT benchmarks.

Wonder why that UE5 demo was partly scripted? And wonder why it only ran at 1440p and 30fps, even while they walk around the area at a seriously slow pace with barely anything going on? There you go.



This is what optimisation means:

[animated GIF]
I'm thinking more of how the game is built from the ground up. Like the utilization: prioritizing console development over PC, and getting the best out of specific hardware that doesn't change.
 

FireFly

Member
I'm still waiting for an RTX 2080 Super-level SKU from AMD with a 256-bit GDDR6 bus. LOL

With the XSX, MS is throwing the PCB BOM cost of a wider memory bus against the RTX 2080's lower-cost 256-bit GDDR6 bus. This is why Nvidia has higher profitability.

"I expect more from AMD". LOL.

Nvidia is not Intel.
What does any of that have to do with the rate at which data can be streamed into VRAM?

And what is that pipeline chain going to bring exactly? Go enlighten me. And why would an SSD ever need to be connected straight to a GPU on PC? Just
Data sits on an SSD, goes into RAM (which is like a super SSD), and gets fed into VRAM. And the RAM-to-VRAM communication happens in nanoseconds, not the milliseconds the PS5's SSD sits at; it could hop up and down 500 times and still be faster than what the PS5 delivers.

While the PS5 is squeezing data through a crack, swapping it in and out of its memory from the SSD in milliseconds, the PC will have had that data ready to serve ages ago and loads it into the VRAM pool in nanoseconds (practically instantly), because it's not limited to a choking 16GB pool: it has its own VRAM pool and memory pool, which can both store a far larger amount of data and swap it quicker.

While the PS5 is dropping data from its memory pool and loading from its SSD, the PC already has all those rooms in the demo ready to serve from spare VRAM and memory, and its SSD can load the next demo into even more RAM at the same time, while the PS5 has to crawl through endless cutscenes.


 

Reficul

Member
I don't get it. Did you buy the graphics card to brag about it to your console-owning friends, or did you get it to play games?
Why do you care whether the new consoles can play games at this or that resolution?
Just play the damn games.

If you need 60 FPS, just lower some settings. The game is probably still playable.
 

llien

Member
This is why Nvidia has higher profitability.
NV has higher profitability because uneducated folks buy its cards at a premium.
The chip in the XSex is an APU and cannot be directly compared.
The 2080Ti is a 754mm² 12nm monstrosity; chuckle, what were you saying about the profitability of that chip vs RDNA 2? :D

Of the ~360mm² 7nm XSex APU, about 80mm² should be for the CPU alone.
 

Shai-Tan

Banned
RDR2 struggles to hit 4K 60 on a 2080Ti (averaging 40fps). So what do you think? Console versions just run much better than PC ports on the same level of hardware, for a variety of reasons, even though many people may tell you otherwise. I've seen the proof time and time again. When I had an R9 280X in my PC years ago, it was much stronger than the PS4's GPU, yet the PS4 versions ran and looked better than when I tried the games on PC at the lowest settings I could manage. I can tell you there is NO WAY that card is going to run anything that looks like God of War or UC4 with remotely the same performance. It's console magic.
To give a solid answer: no way a 2080Ti will run any next-gen game at 4K 60, unless it's just a bad-looking game, a cartoony game, or a cross-gen title from a mid-tier studio.

lol, no. The PC version just has (mostly useless) ultra settings that have a huge effect on performance. PC gamers just can't help themselves and turn up options that aren't suitable for current graphics cards.

edit: and my opinion of RDR2 is that it uses a lighting system that kills graphical clarity compared to GTA V anyway. Super disappointed in the graphics of that game, coming from how good GTA V looks in 4K; distant objects are way clearer in GTA.
 

diffusionx

Gold Member
RDR2 struggles to hit 4K 60 on a 2080Ti (averaging 40fps). So what do you think? Console versions just run much better than PC ports on the same level of hardware, for a variety of reasons, even though many people may tell you otherwise. I've seen the proof time and time again. When I had an R9 280X in my PC years ago, it was much stronger than the PS4's GPU, yet the PS4 versions ran and looked better than when I tried the games on PC at the lowest settings I could manage. I can tell you there is NO WAY that card is going to run anything that looks like God of War or UC4 with remotely the same performance. It's console magic.
To give a solid answer: no way a 2080Ti will run any next-gen game at 4K 60, unless it's just a bad-looking game, a cartoony game, or a cross-gen title from a mid-tier studio.

Here are some benchmarks for the R9 280X: https://www.techspot.com/article/1592-revisiting-radeon-r9-280x-radeon-hd-7970/

All the games in that first article run at 30fps on PS4 IIRC, except RE7, which runs at 60. However, it's not hard to get that GPU to 60fps, and even over 100fps in RE7.

The AnandTech article is more of a comparison piece, but I can pull out, for example, Battlefield 4, which runs at 900p on PS4 at around 40fps and looks like shit; it seems your particular GPU could handle it at 1080p/60fps without too many problems.

A better GPU is just a better GPU. No secret sauce.
 

Alphagear

Member
The 2080Ti cannot hit 4K/60fps in the majority of AAA games?

Surprised by that. What chance do the next-gen consoles have, then?
 

hyperbertha

Member
Here are some benchmarks for the R9 280X: https://www.techspot.com/article/1592-revisiting-radeon-r9-280x-radeon-hd-7970/

All the games in that first article run at 30fps on PS4 IIRC, except RE7, which runs at 60. However, it's not hard to get that GPU to 60fps, and even over 100fps in RE7.

The AnandTech article is more of a comparison piece, but I can pull out, for example, Battlefield 4, which runs at 900p on PS4 at around 40fps and looks like shit; it seems your particular GPU could handle it at 1080p/60fps without too many problems.

A better GPU is just a better GPU. No secret sauce.
You know what looks like shit? NieR: Automata on PC. I couldn't play that game without constant FPS drops, but it looked just fine on my PS4. Same goes for CoD: Modern Warfare, which I can't run well on my PC even at the lowest settings, yet it is fine on console. Also, they cap most PS4 games at 30fps to avoid the FPS drops you get at 60fps with cards like the R9 280X. Constant 30fps >>> fluctuating 60fps. I'm pretty sure the PS4 can pull higher overall framerates than that card in most games. Are you spreading FUD to fit your PC agenda? That's what it looks like. It's common knowledge that consoles just play games better than the same hardware on PC, and I have concrete proof of it from my own PC gaming experience. Debating console optimization, claiming PC is on the same level, is a joke.
 

Croatoan

They/Them A-10 Warthog
I have a 2080Ti and barely run current-gen games at 4K 60fps. I don't expect next-gen consoles to do more than 4K 30fps at high settings, and I don't think we will have 4K 60fps maxed in next-gen PC games for at least one more GPU cycle.

4K is just that demanding.


LOL, I always wonder why consolers assume PC people mean 60fps max. PC gamers never like to go BELOW 60, and I personally prefer between 77 and 144fps because of my 144Hz G-Sync monitor.
 
You know what looks like shit? NieR: Automata on PC. I couldn't play that game without constant FPS drops, but it looked just fine on my PS4. Same goes for CoD: Modern Warfare, which I can't run well on my PC even at the lowest settings, yet it is fine on console. Also, they cap most PS4 games at 30fps to avoid the FPS drops you get at 60fps with cards like the R9 280X. Constant 30fps >>> fluctuating 60fps. I'm pretty sure the PS4 can pull higher overall framerates than that card in most games. Are you spreading FUD to fit your PC agenda? That's what it looks like. It's common knowledge that consoles just play games better than the same hardware on PC, and I have concrete proof of it from my own PC gaming experience. Debating console optimization, claiming PC is on the same level, is a joke.
Don't want to sound mean or anything, but did you build your computer or buy it? Something sounds funny about it. What are your specs? Something can't be right, as the benchmarks above prove otherwise. I don't recall you saying you had a PC before now, so I'm just curious, as your performance doesn't add up.

Either way, your experience doesn't account for everyone else under the sun getting better performance from that exact GPU. I know, because I once had one. And I ran all games over 60fps back then, with better visuals and resolution than consoles.
 

Rickyiez

Member
Early-gen titles, maybe, but as games demand more use of newer tech, it will struggle. Same as the 980Ti early this gen: it was a beast, but nowadays it can't even do 1080p 60fps at max settings in newer games like Control or RDR2. The 2080Ti will struggle too as the next console cycle matures; this has always been the case.
 

VFXVeteran

Banned
As far as RT is concerned, at least on consoles, it's all going to come down to how fast and how far they can push the denoisers. There's a lot of room to innovate there, and that's the major hurdle atm. Otherwise, I think it's clear from the UE5 approach that they will prefer to stay away from proper RT due to its cost, and because there's still a lot of value in mixing traditional approaches, which get you VERY close to an RT image anyway. Frankly, I'm happy to see it done that way; I was more than satisfied with SVOGI. So long as we get light bounces that pick up colour, I'm over the moon.

I'm not a believer that it's very close. Having shadows cast from very small objects is a necessity to simulate real-world light physics. Those bugs in the demo are proof that shadows are needed; they all looked detached from the scene.
 

Rikkori

Member
I'm not a believer that it's very close. Having shadows cast from very small objects is a necessity to simulate real-world light physics. Those bugs in the demo are proof that shadows are needed; they all looked detached from the scene.

I guess that's a lot more subjective. You work in the industry directly, so your frame of reference is going to be skewed further towards noticing the imperfections, but even I, as a fairly observant, dedicated player, would struggle to really pinpoint it, least of all during gameplay (even in more static scenes). Let alone someone who just isn't aware of all these things or doesn't care (i.e. the 99%).
 

RaySoft

Member
And what is that pipeline chain going to bring exactly? Go enlighten me. And why would an SSD ever need to be connected straight to a GPU on PC? Just why? And why would a game need to be entirely loaded into RAM to compete with the PS5 setup?

I honestly want to see you stumble on those questions.
A GPU with its own storage solution would drastically reduce latency and deliver blazing-fast access to data.
That way it could also be "freed" from the constraint of compatibility with existing interfaces, so the GPU vendor could fine-tune their own I/O logic as they see fit.
The PS5 can sustain an effective 9GB/s of data throughput; PCs are nowhere near that today. Remember, we are talking about effective speeds here, not theoretical ones.
For a PC to achieve something similar, it would actually need to be a bit faster than that, since it also has to decompress the data on the CPU (except textures) and then move everything to GPU memory.
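Putting numbers on that, using the figures Sony has quoted (roughly 5.5 GB/s raw, 8-9 GB/s typical after hardware decompression):

```python
# Implied compression ratio behind the PS5's "effective 9 GB/s" figure.
raw_ssd = 5.5    # GB/s, PS5 raw SSD throughput (Sony's quoted figure)
effective = 9.0  # GB/s, typical effective rate after hardware decompression

print(f"implied average compression ratio: ~{effective / raw_ssd:.2f}:1")
# A PC without a dedicated decompressor must either read ~9 GB/s of
# already-decompressed data, or spend CPU cores decompressing ~5.5 GB/s.
```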

I will explain to you how a PC works.
No need, but if you say so.

Data sits on an SSD, goes into RAM (which is like a super SSD), and gets fed into VRAM. And the RAM-to-VRAM communication happens in nanoseconds, not the milliseconds the PS5's SSD sits at; it could hop up and down 500 times and still be faster than what the PS5 delivers.
So RAM is a super SSD now?
The theoretical bandwidth of a 16x PCIe 5.0 link is around 64GB/s, which is fine, but you still need to get the data off the SSD first.
Moving data from system RAM to video RAM is not the bottleneck here; it's getting the compressed data from the SSD into system memory. That's the slowest part.
And you're still using CPU cycles to decompress that data (see above).

While the PS5 is squeezing data through a crack, swapping it in and out of its memory from the SSD in milliseconds, the PC will have had that data ready to serve ages ago and loads it into the VRAM pool in nanoseconds (practically instantly), because it's not limited to a choking 16GB pool: it has its own VRAM pool and memory pool, which can both store a far larger amount of data and swap it quicker.
That was the old way of getting around the fact that loading assets was slow. It's not needed anymore on next-gen.
For your statement to be somewhat correct, the data you talk about would already have to be sitting uncompressed in the PC's system memory (not coming from the SSD).

While the PS5 is dropping data from its memory pool and loading from its SSD, the PC already has all those rooms in the demo ready to serve from spare VRAM and memory, and its SSD can load the next demo into even more RAM at the same time, while the PS5 has to crawl through endless cutscenes.
This is probably the point most people don't get. The PS5 doesn't need the extra RAM, since at 9GB/s you don't need to pre-load stuff anymore.
Like Cerny said, they only need data for the next 1 second of gameplay; that's effectively just your framebuffer.
This opens up a new way of designing games, where you don't have to pre-load a lot of data which would need to be ditched anyway after player input. (At a crossroads, for instance, you would have to pre-load data for all four paths, then ditch the other three as soon as the player makes their choice.) This is what it means for the storage to be "closer" to the GPU.
The "closer" it is, the less unwanted data you need to pre-load.
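The crossroads example in numbers (the four paths are from the post above; the per-path asset size is invented for illustration):

```python
# Pre-loading every possible branch vs. streaming only the chosen one.
paths = 4
gb_per_path = 2.0  # assumed assets per path

preload_everything = paths * gb_per_path  # slow-storage way: all outcomes resident
stream_on_demand = 1 * gb_per_path        # fast-SSD way: only the chosen path

print(f"pre-load all paths: {preload_everything:.0f} GB held in RAM")
print(f"stream on demand:   {stream_on_demand:.0f} GB (the rest stays on the SSD)")
```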

I probably don't have to lecture you on how much memory a PC has access to, what memory performance it gets, and what performance an SSD solution can deliver on PC; it's far faster than the PS5 on every front.
No, you don't.
So how are you planning to deliver 9GB of uncompressed data per second, sustained, from a storage device, readily available for the GPU to work with? (And don't start with striping.)
I honestly want to see you stumble on that one.

But keep believing, mate. You sound exactly like those APU people who thought APUs were the future of PC gaming because PCs could never compete with such an architecture. Then reality hit them the moment games came out, and the hardware turned out to be kinda shit even for its day.
I'm not one of those, if they even exist at all.

Want to know the performance of the PS5? Go watch some 4K 5700 XT benchmarks.
That's quite a naive claim, and you should know it. I'll leave it at that.

Wonder why that UE5 demo was partly scripted? And wonder why it only ran at 1440p and 30fps, even while they walk around the area at a seriously slow pace with barely anything going on? There you go.
UE5 is a multiplatform engine that probably has some bespoke paths for PS5 optimizations, but the first-party engines will be an even tighter fit for the PS5's hardware, achieving even better results.
 
That being said, from the videos I have seen, the 2080Ti can barely run games like Red Dead Redemption 2 at 4K 60fps with everything on ultra
That's because:
1) RDR2 is a dogshit port
2) Ultra is almost always a waste of GPU resources
3) ...as is 4K...

It's a ~14TF card with enough VRAM to hold a small country. Sure, the 3000 series is around the corner and it will eventually fall behind (as all tech does), but you really have nothing to worry about for a few years yet. I'm still rocking a 1070 and am happy enough with the performance.
 

Pizdetz

Banned
I decided to aim for 1440p and 60-120 FPS for next gen.
I'm pretty sure that with a budget of $500 for the GPU, and a willingness to wait until the new generation of cards drops, it should be sufficient.
With enormous diminishing returns and a huge hit to performance, it doesn't make sense to me to aim for 4K; even 1440p might be a stretch.
If you have money to burn, then 4K 60 FPS could be a fun milestone.
 