
Next-Gen PS5 & XSX |OT| Console tEch threaD


Deleted member 775630

Unconfirmed Member
The devkit
Why would you name the retail device differently from the devkit, with another codename? So the actual retail name will be Xbox Series S, the codename of that device is Edinburgh, and the codename of the devkit is Lockhart? Seems far-fetched.
 

DrDamn

Member
Why would you name the retail device differently from the devkit, with another codename? So the actual retail name will be Xbox Series S, the codename of that device is Edinburgh, and the codename of the devkit is Lockhart? Seems far-fetched.

Yep.

Anaconda => XSX
Lockhart => XSS
Edinburgh? If it's related to Lockhart, i.e. Lockhart => Edinburgh => XSS, then what about Anaconda? What is its retail codename?
 

Nikana

Go Go Neo Rangers!
Decided to ask a buddy of mine in the industry about the Lockhart and Series X vs PS5 and what he thinks so far. He is working on Borderlands 3 for Next Gen right now.

Vetted by Bill O'Rights

When I asked about Lockhart "holding back next gen," he had this to say:

"I really hate this term as it doesn't make any sense in terms of how games are actually made but I understand what people are trying to say. But if you have ever designed a game there isn't much you can and can not do. Its all a matter of what you are willing to put time/resources into to get working correctly. Whenever a new generation of consoles happens it allows creators to get things working faster and easier. Lockhart when we/my team was first briefed on it sounded really bad on paper.

Microsoft failed to provide real dev kits and details on the project. We didn't get any type of Lockhart hardware until very recently. Before we actually had the hardware, we were given a profile on Anaconda dev kits that would mimic what Lockhart would be. But Microsoft never mentioned that it would have the same CPU and an SSD, or how much RAM they intended Lockhart to have. I suspect this was because they themselves hadn't decided. To put it bluntly, they released these profiles far too early. The tools they provided made us hate Lockhart.

That changed once we got Lockhart dev kits. It is indeed the same CPU and SSD, and getting up and running on this device was super easy compared to Anaconda running in the Lockhart profile. We have been able to do the work we want on Anaconda and get it running on Lockhart without a ton of work, but it has required a bit more time to make sure the code runs on both machines in the same fashion. It's not something we are really worried about anymore. As the generation goes on, I feel like this will be the approach for many studios: you start on Anaconda and then optimize for Lockhart. There is nothing the Anaconda can do that the Lockhart can't.

The one thing I have heard that's concerning is that Lockhart dev kits are not common. It seems like Microsoft really wants to be able to use Anaconda to accurately portray Lockhart performance, and that has not been the experience my team has had. The profiles and tools are getting better on Anaconda in terms of mimicking Lockhart, but if you don't have a Lockhart dev kit, I feel like you are not going to be able to see how it accurately runs on Lockhart. Maybe this will change, but as of right now you really need a Lockhart dev kit to understand it. For smaller teams I could see the optimization process being more time-consuming, but the tools provided by Microsoft have come a long way. They make it very easy to jump from one kit to another, and the Lockhart kit is equipped with a lot of tools that help you see exactly where code needs to be looked at. Ray tracing is one area they seem to have focused on, and they have made it very easy to adjust the levels."

I asked about PS5 dev kits vs. Series X dev kits, and which console has the upper hand:

"PS5 dev kit is a bit easier to work with. Its well thought out and designed in ways that make it a bit easier to tweak and change things vs Anaconda. To say I prefer one over the other isn't' really fair because both are very good, but its just a bit easier to work with PS5. But Anaconda has the upper hand in terms of us being able to really push effects. The difference will come down to effects over resolution for us. We have both dev kits pushing 4K/60 on Borderlands 3 and we have almost zero loading times on both kits. Looking at them side by side the image is very similar.
 

FranXico

Member
I asked about PS5 dev kits vs. Series X dev kits, and which console has the upper hand:

"PS5 dev kit is a bit easier to work with. Its well thought out and designed in ways that make it a bit easier to tweak and change things vs Anaconda. To say I prefer one over the other isn't' really fair because both are very good, but its just a bit easier to work with PS5. But Anaconda has the upper hand in terms of us being able to really push effects. The difference will come down to effects over resolution for us. We have both dev kits pushing 4K/60 on Borderlands 3 and we have almost zero loading times on both kits. Looking at them side by side the image is very similar.

So, no major performance differences between the PS5 and XSX. Slightly better IQ on XSX at the same resolution and framerate, that's all.
Sounds like less of a difference than between the PS4 Pro and X1X.
 

Mr Moose

Member
Why would you name the retail device differently from the devkit with another codename? So the actual retail name will be Xbox Series S, the codename of that device is Edinburgh and the codename of the devkit is Lockhart? Seems far-fetched
Could be a shitty all-digital version like the PS5/XBO SAD. (Fuck the all-digital future.)
It's interesting that this is the first time we're hearing about this version, though. Xbox Series Scottish.
 

icerock

Member


I hope the enhanced version will be available on PS5 too as part of the backward compatibility program.


Dynamic foliage looks sweet. Hope they deliver a solid patch for PS5.

It's funny (but not surprising) to see people moan about the pricing, which is $50/€50 in many regions. Yet here we have Sony trying to pierce a PC market that has historically been very tight when it comes to spending on software.

Looks better than HZD2 on PS5.

(For the guys who will remember this kkkkkkkkkkkkkk)

I understood that reference. But be assured that he'll point out certain things happening in the PC version compared to Forbidden West and claim superiority.
 

Bo_Hazem

Banned
I think people are under-appreciating 4K more than I can remember people doing with 1080p... It's like people don't remember that the first couple of revisions of the Xbox 360 didn't even have an HDMI port...

This generation we even saw a console refresh just to support 4K... Yes, the majority of people don't have a 4K TV, but in three more years this won't be the scenario anymore...

Now, this mentality of visualizing the future is something customers should worry about, so they don't invest in something that will be very limited in a couple of years. But companies like Microsoft should only care about making their brand bigger and stronger, so for Microsoft, Lockhart is not a bad console; it even makes them look pro-customer, giving people the right to choose according to their budget.

But for every person who asks me what to buy, I will tell them the same thing I tell them about a 1080p TV/monitor: "save some money and go for the 4K one."

Plus, for Sony it's a smart move to expose more consumers to their TV lineup. I will go 8K in 4-5 years; I've been on 4K since 2015, and I think 10 years is enough. Tech should evolve, and 1080p is already embarrassingly ancient. The first 8K camera was Sony's CineAlta in 2011, so when 8K starts to spread, most of the Netflix content will transfer to 8K support, as they're already partners with Sony CineAlta.

[image: venice-pta.jpg]



Most of them are using 4K instead in the meanwhile, though. 4K or 8K isn't that important for Microsoft; their outdated Surface 2-in-1 laptops are mostly still 1080p! Sony will push for 8K with a PS5 Pro, more than likely a chiplet of two stacked dies, mostly around the 20.6-22 TF range, by 2024-2025, especially if their Crystal LED (their own microLED tech) hits the market and 8K becomes their mid-range or even budget selection for TVs. By then we should expect 16K penetration into the extremely high-tier TVs.
 
Last edited:

Nikana

Go Go Neo Rangers!
Interesting, Nikana, but doesn't having a separate Lockhart dev kit go against all the info so far? The whole point of the profiles/switch on the Dante dev kit is that it services both consoles?

Couldn't tell you; didn't ask that. He mentioned they just recently got Lockhart kits. It's possible they aren't sending them out to everyone, as he said he has heard there aren't many.
 
Last edited:

geordiemp

Member
Plus, for Sony it's a smart move to expose more consumers to their TV lineup. I will go 8K in 4-5 years; I've been on 4K since 2015, and I think 10 years is enough. Tech should evolve, and 1080p is already embarrassingly ancient. The first 8K camera was Sony's CineAlta in 2011, so when 8K starts to spread, most of the Netflix content will transfer to 8K support, as they're already partners with Sony CineAlta.

[image: venice-pta.jpg]



Most of them are using 4K instead in the meanwhile, though. 4K or 8K isn't that important for Microsoft; their outdated Surface 2-in-1 laptops are mostly still 1080p! Sony will push for 8K with a PS5 Pro, more than likely a chiplet of two stacked dies, mostly around the 20.6-22 TF range, by 2024-2025, especially if their Crystal LED (their own microLED tech) hits the market and 8K becomes their mid-range or even budget selection for TVs. By then we should expect 16K penetration into the extremely high-tier TVs.

You really get excited about your 8K, Bo.

For me, getting older, I can't see jack shit difference from about 1440p up on an OLED C7 55", which I game on...

I would rather play 1440p upscaled at 60 than 4K/30, as I can see more of a difference, and 8K? Not a chance I will need it.

 

SlimySnake

Flashless at the Golden Globes
Yep.

Anaconda => XSX
Lockhart => XSS
Edinburgh? If it's related to Lockhart i.e. Lockhart => Edinburgh => XSS then what about Anaconda? What is it's retail codename?
12 tflops => 4K
4 tflops => 1080p
2 tflops => 720p

might as well release a 1 tflop shithole of a console with the same SSD and CPU that runs games at 480p.

we must go lower.

this has to literally be the worst gen ever.
 

Kerlurk

Banned
So, no major performance differences between the PS5 and XSX. Slightly better IQ on XSX at the same resolution and framerate, that's all.
Sounds like less of a difference than between the PS4 Pro and X1X.

That's because it is.

I created a graph that helps visualize those differences.

[image: pcy2z0g.png]


The XSX memory is hard to compare, so I pro-rated its bandwidth by each pool's percentage of total memory:
(560*(10/16)) + (336*(6/16)) = 476 GB/s
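For anyone who wants to check the math, here's a quick Python sketch of that pro-rating (the pool sizes and bandwidths are the stated XSX figures; the weighting itself is just my simplification):

Code:
# XSX split memory pool: pro-rate bandwidth by each pool's share of the 16 GB total
fast_bw_gbs, fast_gb = 560, 10   # GPU-optimal pool: 560 GB/s across 10 GB
slow_bw_gbs, slow_gb = 336, 6    # standard pool: 336 GB/s across 6 GB
total_gb = fast_gb + slow_gb

weighted_bw = fast_bw_gbs * (fast_gb / total_gb) + slow_bw_gbs * (slow_gb / total_gb)
print(f"Pro-rated XSX bandwidth: {weighted_bw:.0f} GB/s")  # 476 GB/s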
 
Last edited:

Nikana

Go Go Neo Rangers!
12 tflops => 4K
4 tflops => 1080p
2 tflops => 720p

might as well release a 1 tflop shithole of a console with the same SSD and CPU that runs games at 480p.

we must go lower.

this has to literally be the worst gen ever.

I literally just posted something from a developer saying Lockhart isn't going to make a difference and won't do a thing to hold next gen back.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
I literally just posted something from a developer saying Lockhart isn't going to make a difference and won't do a thing to hold next gen back.
I saw that. I was just going to reply to it.

The dev is working on a cross-gen game. It is running a last-gen game at 4K/60 fps; of course it would be easy to downgrade it. Even a next-gen native 4K/30 fps game could be reworked to run at 1080p, but how many of those games are we going to get? What happens in cases like the UE5 demo, where the 12 tflops version runs at 1440p? Roughly 3.7 million pixels divided by 3 is about 1.2 million pixels, which is somewhere around 800p. Is that good enough for next gen?
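To show my work, a rough sketch, assuming resolution scales linearly with compute (a simplification, since real scaling is messier):

Code:
import math

base_w, base_h = 2560, 1440        # UE5 demo target on the 12 tflops machine
tf_ratio = 12 / 4                  # Anaconda vs. Lockhart compute

pixels = base_w * base_h           # ~3.7 million pixels at 1440p
scaled_pixels = pixels / tf_ratio  # ~1.23 million pixels on 4 tflops
axis_scale = math.sqrt(scaled_pixels / pixels)

# ~1478 x 831, i.e. somewhere around 800p
print(f"{round(base_w * axis_scale)} x {round(base_h * axis_scale)}")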

The worst part of it all is that this might force devs to stick with native 4K and waste literally half of the tflops rendering pixels instead of improving the quality of pixels. My 2 tflops example was meant to show just how absurd a proposition this is from Microsoft. Technically it can work, but does that mean we should go ahead and launch a 2 tflops 720p console as well? For those poor third-world countries? What a pro-consumer move that would be.
 

Nikana

Go Go Neo Rangers!
I saw that. I was just going to reply to it.

The dev is working on a cross-gen game. It is running a last-gen game at 4K/60 fps; of course it would be easy to downgrade it. Even a next-gen native 4K/30 fps game could be reworked to run at 1080p, but how many of those games are we going to get? What happens in cases like the UE5 demo, where the 12 tflops version runs at 1440p? Roughly 3.7 million pixels divided by 3 is about 1.2 million pixels, which is somewhere around 800p. Is that good enough for next gen?

The worst part of it all is that this might force devs to stick with native 4K and waste literally half of the tflops rendering pixels instead of improving the quality of pixels. My 2 tflops example was meant to show just how absurd a proposition this is from Microsoft. Technically it can work, but does that mean we should go ahead and launch a 2 tflops 720p console as well? For those poor third-world countries? What a pro-consumer move that would be.

You clearly ignored everything of relevance in the thread. He stated the Series S can do everything the Series X will be able to. There is nothing to indicate they will be forced to render at 4K; that's absurd. And he specifically stated he thinks the difference between Series X and PS5 will come down to effects, not resolution. There is so much more that makes up image quality than resolution, and the UE5 demo running at 1440p is a prime example of this.

You are being ridiculous.
 

sircaw

Banned
That's because it is.

I created a graph that helps visualize those differences.

[image: pcy2z0g.png]


The XSX memory is hard to compare, so I pro-rated its bandwidth by each pool's percentage of total memory:
(560*(10/16)) + (336*(6/16)) = 476 GB/s

I like your graphs, easy to read; thank you for the effort. I am not very technical, but where would things like cache scrubbers and SmartShift technology feature in these types of images?

I imagine they would affect certain graphs.

If they do, are we expecting a lot of change in the % numbers?
 
Last edited:
Yep.

Anaconda => XSX
Lockhart => XSS
Edinburgh? If it's related to Lockhart, i.e. Lockhart => Edinburgh => XSS, then what about Anaconda? What is its retail codename?

Maybe it’s something like:

Anaconda => XSX
Lockhart => Initially proposed XSS profile running on Anaconda
Edinburgh => XSS

Also, what about the “Count” codename?
 

Kerlurk

Banned
> I like your graphs, easy to read; thank you for the effort.

Thanks, I enjoy making them for myself and you guys.

but where would things like cache scrubbers and SmartShift technology feature in these types of images?

That is getting into finer granularity, and it is a streaming issue (from the SSD), so it would depend on the developer's game engine and resourcefulness. Very hard to include that.

edit: it also works with data in memory (if a game engine uses LODs, i.e. level-of-detail swaps on models/textures), so it is not only an SSD streaming issue, but same thing: hard to compare between consoles in spec graphs. Thanks, Darius87.

but where would things like cache scrubbers and SmartShift technology feature in these types of images?

That's a power issue on the chip between the CPU and GPU, and Mark Cerny said the maximum clock rates can be reached concurrently, which is reflected in the graphs.

edit: made some changes.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
You clearly ignored everything of relevance in the thread. He stated the Series S can do everything the Series X will be able to. There is nothing to indicate they will be forced to render at 4K; that's absurd. And he specifically stated he thinks the difference between Series X and PS5 will come down to effects, not resolution. There is so much more that makes up image quality than resolution, and the UE5 demo running at 1440p is a prime example of this.

You are being ridiculous.
What in the world are you talking about? Did you just completely ignore my math about games that run at 1440p on a 12 tflops machine ending up around 800p on Lockhart? The difference between the PS5 and Xbox is 18%; the difference between the Xbox and Lockhart is 200%. I literally said it will be fine for games targeting 4K, but what about 1440p? What about effects-heavy games that struggle to hit 1440p? There are several games on the PS4 that don't hit 1080p: Battlefront, Watch Dogs, and some other Ubisoft games. It's entirely possible some games won't even hit 1440p.
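For reference, the percentages worked out (10.28 and 12.15 tflops are the announced PS5/XSX figures; Lockhart's 4 is still a rumor):

Code:
ps5_tf, xsx_tf, lockhart_tf = 10.28, 12.15, 4.0  # Lockhart figure is rumored

print(f"XSX over PS5:      {(xsx_tf - ps5_tf) / ps5_tf:.0%}")            # 18%
print(f"XSX over Lockhart: {(xsx_tf - lockhart_tf) / lockhart_tf:.0%}")  # ~200%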
 

Nikana

Go Go Neo Rangers!
What in the world are you talking about? Did you just completely ignore my math about games that run at 1440p on a 12 tflops machine ending up around 800p on Lockhart? The difference between the PS5 and Xbox is 18%; the difference between the Xbox and Lockhart is 200%. I literally said it will be fine for games targeting 4K, but what about 1440p? What about effects-heavy games that struggle to hit 1440p? There are several games on the PS4 that don't hit 1080p: Battlefront, Watch Dogs, and some other Ubisoft games. It's entirely possible some games won't even hit 1440p.

Again, IQ is made up of so many other things than resolution. If a game dips below 1080p on Lockhart, it isn't suddenly going to look ugly. Your fixation on resolution over overall IQ is clearly a bias you refuse to let go.
 

Great Hair

Banned
There is nothing the Anaconda can do that the Lockhart can't.

At the same resolution, a 12 TFlops machine will always look and perform better than a 4 TFlops one. It's like playing Minecraft with or without raytracing. Once assets take advantage (if at all) of the XSX and PS5 (GPU, VRAM & bandwidth, SSD, 16-thread CPU, etc.), they have to be downgraded so that:

a) they can fit in the 7 to 8 GB of VRAM of the XSS;
b) the 4 TFlops GPU can render them, at lower resolution and with no additional effects, due to
c) smaller VRAM bandwidth, fewer features or none at all (raytracing etc.), less cache, etc., yada yada;
d) the SSD & CPU can tackle the data requested (what if these two are also slower?); OR
e) they actually don't bother/have to, because they've created the assets and the game with the XSS in mind, so that all they have to do is port the code over to XSX and optimize the 2K game code of the XSS to run at 4K on the XSX (and not touch the assets at all).

In other words, there's a risk that we won't be seeing a jump in asset quality: better faces (not just mains, but also "useless" NPCs), animations, worlds (we have been stuck with 2K assets for 15+ years; textures, models, levels, etc. are all optimized for FHD).

Imagine Sony saying "every PS5 game has to run on the PS4 with its 1.8 TFlops & 1.6 GHz CPU".

All of that equates to TIME. TIME spent on optimization/porting, $$$$ wasted on slower, older machines, TIME better spent on other projects.

And I'm pretty sure that DX12 for Xbox and DX12 for PC are similar but also different (two code paths). So it won't be just a simple "one click and create the PC version"; that also takes some time.

Without these slower machines, developers could finally focus on creating assets ready for 4K displays, be it textures, higher-quality cars, levels, models, worlds...

Lockhart vs. Anaconda is the same as an Intel i3 + 1050 Ti system vs. an i9 10900K + 2080 or better. Lockhart, by "nature", will hold back the much stronger machines (8 GB vs. 16 GB of VRAM alone makes a lot of difference). If the CPU is also slower, and the SSD as well... that all requires extra work.
 

Nikana

Go Go Neo Rangers!
At the same resolution, a 12 TFlops machine will always look and perform better than a 4 TFlops one. It's like playing Minecraft with or without raytracing. Once assets take advantage (if at all) of the XSX and PS5 (GPU, VRAM & bandwidth, SSD, 16-thread CPU, etc.), they have to be downgraded so that:

a) they can fit in the 7 to 8 GB of VRAM of the XSS;
b) the 4 TFlops GPU can render them, at lower resolution and with no additional effects, due to
c) smaller VRAM bandwidth, fewer features or none at all (raytracing etc.), less cache, etc., yada yada;
d) the SSD & CPU can tackle the data requested (what if these two are also slower?); OR
e) they actually don't bother/have to, because they've created the assets and the game with the XSS in mind, so that all they have to do is port the code over to XSX and optimize the 2K game code of the XSS to run at 4K on the XSX (and not touch the assets at all).

In other words, there's a risk that we won't be seeing a jump in asset quality: better faces (not just mains, but also "useless" NPCs), animations, worlds (we have been stuck with 2K assets for 15+ years; textures, models, levels, etc. are all optimized for FHD).

Imagine Sony saying "every PS5 game has to run on the PS4 with its 1.8 TFlops & 1.6 GHz CPU".

All of that equates to TIME. TIME spent on optimization/porting, $$$$ wasted on slower, older machines, TIME better spent on other projects.

And I'm pretty sure that DX12 for Xbox and DX12 for PC are similar but also different (two code paths). So it won't be just a simple "one click and create the PC version"; that also takes some time.

Without these slower machines, developers could finally focus on creating assets ready for 4K displays, be it textures, higher-quality cars, levels, models, worlds...

Lockhart vs. Anaconda is the same as an Intel i3 + 1050 Ti system vs. an i9 10900K + 2080 or better. Lockhart, by "nature", will hold back the much stronger machines (8 GB vs. 16 GB of VRAM alone makes a lot of difference). If the CPU is also slower, and the SSD as well... that all requires extra work.

Maybe you should read the actual quote from the developer, because your comparisons are wildly inaccurate.
 

quest

Not Banned from OT
At the same resolution, a 12 TFlops machine will always look and perform better than a 4 TFlops one. It's like playing Minecraft with or without raytracing. Once assets take advantage (if at all) of the XSX and PS5 (GPU, VRAM & bandwidth, SSD, 16-thread CPU, etc.), they have to be downgraded so that:

a) they can fit in the 7 to 8 GB of VRAM of the XSS;
b) the 4 TFlops GPU can render them, at lower resolution and with no additional effects, due to
c) smaller VRAM bandwidth, fewer features or none at all (raytracing etc.), less cache, etc., yada yada;
d) the SSD & CPU can tackle the data requested (what if these two are also slower?); OR
e) they actually don't bother/have to, because they've created the assets and the game with the XSS in mind, so that all they have to do is port the code over to XSX and optimize the 2K game code of the XSS to run at 4K on the XSX (and not touch the assets at all).

In other words, there's a risk that we won't be seeing a jump in asset quality: better faces (not just mains, but also "useless" NPCs), animations, worlds (we have been stuck with 2K assets for 15+ years; textures, models, levels, etc. are all optimized for FHD).

Imagine Sony saying "every PS5 game has to run on the PS4 with its 1.8 TFlops & 1.6 GHz CPU".

All of that equates to TIME. TIME spent on optimization/porting, $$$$ wasted on slower, older machines, TIME better spent on other projects.

And I'm pretty sure that DX12 for Xbox and DX12 for PC are similar but also different (two code paths). So it won't be just a simple "one click and create the PC version"; that also takes some time.

Without these slower machines, developers could finally focus on creating assets ready for 4K displays, be it textures, higher-quality cars, levels, models, worlds...

Lockhart vs. Anaconda is the same as an Intel i3 + 1050 Ti system vs. an i9 10900K + 2080 or better. Lockhart, by "nature", will hold back the much stronger machines (8 GB vs. 16 GB of VRAM alone makes a lot of difference). If the CPU is also slower, and the SSD as well... that all requires extra work.
It's got the same CPU and SSD; it is optimizations around the RAM. The assets will still need to be cut down for the PC version anyway. This won't be a lot of work: just working around the RAM some, maybe cutting out an effect or two.

The concern trolling is nice from people who will never buy a Series X or Series S.
 

Great Hair

Banned
Maybe you should read the actual quote from the developer. Because your comparisons are wildly inaccurate.

Sure thing. Keep thinking that 4 tflops can do everything 12 tflops can. The gap between the two is 300%; add a faster SSD, CPU, etc., and the 4 tflops machine will not run anything optimized for the XSX at all, BUT a 12 tflops machine will run anything a 4 tflops one can.

XSS is an extra step. Extra work; time better spent on DLCs and other projects.
 
Last edited:

Nikana

Go Go Neo Rangers!
Sure thing. Keep thinking that 4 tflops can do everything 12 tflops can. The gap between the two is 300%; add a faster SSD, CPU, etc., and the 4 tflops machine will not run anything optimized for the XSX at all, BUT a 12 tflops machine will run anything a 4 tflops one can.

XSS is an extra step. Extra work; time better spent on DLCs and other projects.

Yeah, I'll take your word over someone who's actually used the device. Sounds legit.

Also, the SSD and CPU are not faster. But hey, keep trying.
 
Last edited:

quest

Not Banned from OT
Sure thing. Keep thinking that 4 tflops can do everything 12 tflops can. The gap between the two is 300%; add a faster SSD, CPU, etc., and the 4 tflops machine will not run anything optimized for the XSX at all, BUT a 12 tflops machine will run anything a 4 tflops one can.

XSS is an extra step. Extra work; time better spent on DLCs and other projects.
Can we cut out the PC versions of games then, since they are extra work? Of course not. This will be a lot less work than that, and some of it will be the same work, like lesser assets for lower-spec PCs and laptops.
 

Kerlurk

Banned
Cache scrubbers work autonomously; you don't need to program for them, so they work on every game. The same also applies to the I/O.

Right, I was thinking about the SSD streaming, but it also works for data in memory. I edited the post above.

It would still be a result of game engines and developers' resourcefulness, as those cache scrubbers would work with LODs (level-of-detail models/textures) in memory, and not every game uses LODs, though most do.

Hard to compare in maximum-spec comparative graphs, as this is more of an efficiency issue.
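For anyone unfamiliar, an LOD swap is usually just a threshold check on distance (or screen size); a toy sketch with made-up numbers:

Code:
# Toy distance-based LOD pick: engines swap to coarser models/textures as objects recede
LOD_THRESHOLDS = [(10.0, 0), (40.0, 1), (120.0, 2)]  # (max distance, LOD index) - illustrative numbers

def select_lod(distance: float) -> int:
    for max_dist, lod in LOD_THRESHOLDS:
        if distance <= max_dist:
            return lod
    return LOD_THRESHOLDS[-1][1] + 1  # beyond the last threshold, use the coarsest LOD

print(select_lod(25.0))   # 1
print(select_lod(500.0))  # 3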
 
Last edited:
At the same resolution, a 12 TFlops machine will always look and perform better than a 4 TFlops one. It's like playing Minecraft with or without raytracing. Once assets take advantage (if at all) of the XSX and PS5 (GPU, VRAM & bandwidth, SSD, 16-thread CPU, etc.), they have to be downgraded so that:

a) they can fit in the 7 to 8 GB of VRAM of the XSS;
b) the 4 TFlops GPU can render them, at lower resolution and with no additional effects, due to
c) smaller VRAM bandwidth, fewer features or none at all (raytracing etc.), less cache, etc., yada yada;
d) the SSD & CPU can tackle the data requested (what if these two are also slower?); OR
e) they actually don't bother/have to, because they've created the assets and the game with the XSS in mind, so that all they have to do is port the code over to XSX and optimize the 2K game code of the XSS to run at 4K on the XSX (and not touch the assets at all).

In other words, there's a risk that we won't be seeing a jump in asset quality: better faces (not just mains, but also "useless" NPCs), animations, worlds (we have been stuck with 2K assets for 15+ years; textures, models, levels, etc. are all optimized for FHD).

Imagine Sony saying "every PS5 game has to run on the PS4 with its 1.8 TFlops & 1.6 GHz CPU".

All of that equates to TIME. TIME spent on optimization/porting, $$$$ wasted on slower, older machines, TIME better spent on other projects.

And I'm pretty sure that DX12 for Xbox and DX12 for PC are similar but also different (two code paths). So it won't be just a simple "one click and create the PC version"; that also takes some time.

Without these slower machines, developers could finally focus on creating assets ready for 4K displays, be it textures, higher-quality cars, levels, models, worlds...

Lockhart vs. Anaconda is the same as an Intel i3 + 1050 Ti system vs. an i9 10900K + 2080 or better. Lockhart, by "nature", will hold back the much stronger machines (8 GB vs. 16 GB of VRAM alone makes a lot of difference). If the CPU is also slower, and the SSD as well... that all requires extra work.

I'm very confused by this whole post. Why do they need 7-8 GB of VRAM for 1080p? VRAM requirements scale down significantly with the target resolution; it's why VRAM is higher on GPUs targeting 4K than on ones that aren't. If I took a 2080 and looked at the VRAM usage at 4K vs. 1080p, they would not be close. Why would we need 13.5 GB of VRAM for 1080p? Can someone please explain to me why 1/4 of the pixels would have the same VRAM footprint? I have yet to hear a coherent argument, because it's obviously not true.
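Rough numbers for the part of VRAM that does scale with resolution, i.e. render targets (the per-pixel size and target count below are illustrative assumptions, not any console's actual G-buffer layout):

Code:
def render_targets_mb(width, height, bytes_per_pixel=8, num_targets=5):
    # e.g. a deferred G-buffer plus depth; sizes here are assumptions for illustration
    return width * height * bytes_per_pixel * num_targets / 2**20

print(f"4K:    {render_targets_mb(3840, 2160):.0f} MB")  # ~316 MB
print(f"1080p: {render_targets_mb(1920, 1080):.0f} MB")  # ~79 MB, exactly 1/4

Static assets (textures, geometry, audio) don't shrink automatically the same way, which is why the cut would be to 7-8 GB rather than to a quarter.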

They are using the same CPU and SSD (speed) and likely equivalent RAM (well, less, because you don't need as much), so where is the bottleneck? It has been said over and over again: CPU and SSD speed will be the same. Expect the bandwidth to be scaled appropriately as well. Your strawman is acting like they weren't considering any of this. Why?

Don't create a strawman out of information that people have reiterated is not accurate. Yes, if they used a significantly slower CPU and SSD and cut the RAM far beyond what was needed, could that have an impact? Of course! But they aren't doing that, so why even bring it up?

LH will have RT.

Assets are always easier to downgrade than to upscale. All games will be made with the PS5 and XSX in mind and downscaled. You can't take a 1080p texture and magically make it look good at 4K, but you can take 4K and make it 1080p. Think of it like a photo: a 2 MB version of a photo blown up will look terrible vs. a 10 MB version, but a 10 MB version of a photo shrunk down will look much better than the 2 MB one, because the data is there.
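A quick way to see the asymmetry for yourself with Pillow (synthetic image, just to illustrate the direction of the scaling):

Code:
from PIL import Image

# A synthetic 4K noise image stands in for a high-res master asset
master = Image.effect_noise((3840, 2160), sigma=64).convert("RGB")

# Downscaling the 4K master to 1080p averages real data, so detail survives...
down = master.resize((1920, 1080), Image.LANCZOS)
down.save("master_1080p.png")

# ...while blowing the 1080p version back up to 4K can only interpolate what's left
up = down.resize((3840, 2160), Image.LANCZOS)
up.save("upscaled_4k.png")  # compare side by side with the original master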
 

SlimySnake

Flashless at the Golden Globes
Again, IQ is made up of so many other things than resolution. If a game dips below 1080p on Lockhart, it isn't suddenly going to look ugly. Your fixation on resolution over overall IQ is clearly a bias you refuse to let go.
You must lack some serious reading comprehension. I am literally advocating for 1440p next-gen games. I have repeatedly pointed out, in responses to your posts, that devs focusing on native 4K is a literal waste of tflops. How you can accuse me of being fixated on resolution over IQ is bizarre, because I literally said I would rather they focus on the quality of pixels. It's like you aren't even reading my posts.

Now, at some point we do start to see resolution hurt video games. If you think 800p is enough, then you clearly haven't played many current-gen games on Xbox One. MS went above and beyond with their tools to get out of the 720p-800p hell they found themselves in this gen. It's going to be an even bigger problem next gen, when base resolutions aren't just 1080p but 1440p. 1080p already adds dithering, shimmering, and jaggies to an image, especially on 4K TVs; what's going to happen with 800p games? Regardless, my initial post was talking about a supposed 2 tflops GPU to point out how ridiculous this concept is. At what point do we stop? I am personally fine with offering a 1080p console, but 4 tflops simply won't be enough when devs start aiming for 1440p or 4K CB.
 
Last edited:
Sure thing. Keep thinking that 4 tflops can do everything 12 tflops can. The gap between the two is 300%; add a faster SSD, CPU, etc., and the 4 tflops machine will not run anything optimized for the XSX at all, BUT a 12 tflops machine will run anything a 4 tflops one can.

XSS is an extra step. Extra work; time better spent on DLCs and other projects.

The gap between the two is 300%, and the targeted resolution is 1/4. What's your point? GPUs scale; they always have, depending on what you are targeting. Game design is not impacted by this.

I swear you are concern trolling, arguing in bad faith, and completely ignoring the fact that this is how the industry has always worked.
 

Nikana

Go Go Neo Rangers!
You must lack some serious reading comprehension. I am literally advocating for 1440p next-gen games. I have repeatedly pointed out, in responses to your posts, that devs focusing on native 4K is a literal waste of tflops. How you can accuse me of being fixated on resolution over IQ is bizarre, because I literally said I would rather they focus on the quality of pixels. It's like you aren't even reading my posts.

Now, at some point we do start to see resolution hurt video games. If you think 800p is enough, then you clearly haven't played many current-gen games on Xbox One. MS went above and beyond with their tools to get out of the 720p-800p hell they found themselves in this gen. It's going to be an even bigger problem next gen, when base resolutions aren't just 1080p but 1440p. 1080p already adds dithering, shimmering, and jaggies to an image, especially on 4K TVs; what's going to happen with 800p games? Regardless, my initial post was talking about a supposed 2 tflops GPU to point out how ridiculous this concept is. At what point do we stop? I am personally fine with offering a 1080p console, but 4 tflops simply won't be enough when devs start aiming for 1440p or 4K CB.

Your posts are not nearly as easy to comprehend as you think, so the insult is not needed. Your posts read like you have a vendetta against teraflops and think 4 TF is going to kill next gen. There was zero indication you were advocating for 1440p until this most recent post.

You can use techniques like CBR to get to 1080p as well. To claim I haven't been paying attention is daft. Again: you can enhance an image to make it look better than it is, especially with new GPUs with features like DLSS. We have come a long way from the simple upscaler in the Xbox One.

If you are someone who buys a Lockhart and has a 4K screen, then you are not someone who cares about dithering and jaggies in an image. 4 TF is more than fine with RDNA 2.
 

Great Hair

Banned
The gap between the two is 300%, and the targeted resolution is 1/4. What's your point? GPUs scale; they always have, depending on what you are targeting. Game design is not impacted by this.

I swear you are concern trolling, arguing in bad faith, and completely ignoring the fact that this is how the industry has always worked.

Yeah, I'll take your word over someone who's actually used the device. Sounds legit.

Are you really saying that a PS4 can render anything a PS5 can? Oh really? It's much easier to port XSS code over to XSX than the other way around.

Games on PC have been optimized for the lowest common denominator since the PS2 days. Your $2,000 RTX 3080 Ti with its 16 GB(?) of VRAM, all it will do is run code and assets optimized for a GTX 1660(?) with 6 GB of VRAM and brute-force them to 4K, 8K, and higher.

Some developers may take the extra time and add 4K textures... but not change the cars, the faces, every asset, etc.; it will all look "last gen". That is why so many PC diehard fans whine about their $5,000 machines never really getting pushed to the limit.

The last game to do that, to serve the "hardcore clientele" first, was Crysis... and because the assets (the player models and faces are still top), the world, essentially everything, was made with high-end GPUs in mind (the tessellation battle).

Crysis SOLD like crap.

EDIT:

Imagine Forza 7 being an XSS game, with 20 cars, each 250,000 polys(?), stereo sound, 2K textures or lower... 5 GB of VRAM + 2 GB for game/OS.
Imagine Forza 8 being an XSX game, with 50 to 200 cars (Le Mans-class design), each car 250,000 to 500,000 polys, 4K textures, 3D sound, yada... it fully eats up 13 GB of VRAM.

XSX Forza 8 needs to be optimized to run on the XSS. The XSS game does not require any fiddling, bar some minor adjustments.
 
Last edited: