
Digital Foundry: Heard that Xbox Series S Is A "Pain" For Developers Due To Memory Issues


OmegaSupreme

advanced basic bitch
It wasn't worded the best, but I understood what he was asking. Of course we all know the Series S version of a game will be held back. That's straightforward. The question is: what PS5/XSX games have been held back due to the Series S's existence? None.

It's hilarious to see so many folks on an enthusiast forum hating on the Series S when it is doing gangbusters. Are you the ones out of touch?
Is it doing gangbusters because people actually want it, or is it because they can't get an X or a PS5? I don't know myself. All I do know is that you can find an S easily. And as you said, this is an enthusiast forum; we like the nice things. PC guys on this forum aren't playing on budget GPUs if we can help it. If you want an Xbox and can find an X, you're going to get that one, not an S.
 
It's funny that in a thread about Digital Foundry, you've apparently never watched a Digital Foundry comparison video before. The S is always compromised, many times beyond just resolution. Here's a recent one for you, DarkMage619, and anyone else in complete denial. Enjoy.


Absolutely no one said that the XSS wouldn't have to have compromises, Mr. Strawman. The device is $200 less than the XSX, with lower specs. Only people arguing completely in bad faith made claims like "MS 'promised' the ONLY thing all games running on the XSS would have different is lower resolution, and that lower resolution would ONLY be 1440p." Very similar to the claim you made, interestingly enough.

As soon as a game runs at a lower resolution than 1440p, or ray tracing is not used, nonsense claims are hurled that MS lied and that the device is holding back the entire generation. Just hyperbolic comments, in most cases from people who do not own the system, or an Xbox in general. It goes so far as taking the word of Alex from DF, a guy most Sony fans here say shouldn't be believed.

Whatever a dev does to get a game working is on that dev. Graphical features like resolution and ray tracing regularly get lowered or removed from the PS5 and XSX versions of games relative to the PC versions, yet again only the budget console is complained about by the usual people. It's all pretty silly.

lol....that's an entirely different question than what you asked.
The commentary shifts around based on who's making the argument. In this very thread people have claimed, wrongly, that MS said there would be NO compromises other than resolution in all software, AND, wrongly, that the XSS has negatively affected OTHER systems. So it's understandable why you might not have understood what Hendrick's was saying.


I was going on your own words. "Still waiting for someone to show evidence that a game had to be compromised in order to get it to run on the S." You mentioned nothing of other hardware. Granted I didn't read the whole thread. I simply came to the last page and saw the usual people spouting nonsense.

I totally agree most of the claims against the XSS are pure nonsense.
 
Most PS5 and Xbox Series games run at 1440p when trying to deliver decent framerates. Why would a Series S deliver 1440p?
That's fine; the XSS is what it is. My stance would be: why should I pay for it when there are much better deals available? MS chose to release an objectively underpowered console. MS chose to try to convince everyone it's just as capable for next gen. I chose to point, laugh, and buy a more capable system.
 

Topher

Gold Member
It wasn't worded the best, but I understood what he was asking. Of course we all know the Series S version of a game will be held back. That's straightforward. The question is: what PS5/XSX games have been held back due to the Series S's existence? None.

If he had worded it properly then I would not have answered as I haven't seen any evidence that games on other consoles have been affected at all by XSS.

It's hilarious to see so many folks on an enthusiast forum hating on the Series S when it is doing gangbusters. Are you the ones out of touch?

This isn't about sales, but I'm not hating on XSS at all so you can direct that quote elsewhere.
 

Hobbygaming

has been asked to post in 'Grounded' mode.
I think they've already said that was before they got hands-on. You can see why, because the "split" memory bandwidth is in fact irrelevant: the 2GB at the slower speed is reserved for the OS only.
Maybe look at the 120fps mode on Doom Eternal and the interview with id after it shipped for clarification.
Even with the "faster" bandwidth, 224GB/s is bad for current gen. They should have scrapped the XSS.
 

Jayjayhd34

Member
It wasn't worded the best, but I understood what he was asking. Of course we all know the Series S version of a game will be held back. That's straightforward. The question is: what PS5/XSX games have been held back due to the Series S's existence? None.

It's hilarious to see so many folks on an enthusiast forum hating on the Series S when it is doing gangbusters. Are you the ones out of touch?

The question is not answerable, because no one on here is a dev. That's what makes this discussion so pointless: neither side can debunk the other.

The fact is the Series S exists and there is nothing anyone can do now. If the Series S is holding back development, it's no different from the 360 era, with the limited disc space of DVD, or the overly complicated Cell processor, or the shitty CPUs of last generation. Developers still made some of the best games to ever come out. Yeah, it's sad that developers might be held back and might have to choose to make sacrifices or extend development, but overall we will get some amazing third-party games, and that's all that matters.
 
Last edited:

OmegaSupreme

advanced basic bitch
Little Beast is the reason why Series is the best selling Xbox ever. I'm sure Microsoft is very happy about its own decision.
You're right on that one. Easier to produce in these difficult times. It probably has much higher profit margins as well. Would it have sold so well if you could get an X? God no, but money made is money made.
 

DaGwaphics

Member
I think you misunderstand my point. The X1X is clearly the better designed console, but only when compared to the Pro. And the XSX is an incredible console for its price-to-tflops ratio. I am sure you have seen my posts where I have listed those charts. I also think the XSX's high CU count and lower clocks are a bottleneck that is holding back its GPU. I am sure you have seen me call the PS5 overengineered and not as good as it could've been for a $500 console. And yet, I also believe both the PS5 and XSX are the most powerful consoles Sony and MS have ever released.

There is nuance to my argument that I'll admit is hard to get across at times. The X1X can be an incredibly designed console, but also not as good as it could've been, arriving a year after the Pro and $100 more expensive. I can forgive Cerny for sticking with Jaguar in the Pro in 2016, when Zens were still relatively new and he had a strict $399 budget to adhere to, but MS? Polaris was already over a year old by then, and they had an extra $100 to play with. Do you remember Senjustsu Sage? He was told by MS people that they would be upgrading the CPU. Poor guy risked his account because he was misled by some MS exec. But to me, that shows Zen was in the discussion at some point, and the X1X, despite its high RAM bandwidth, suffered from the same issues as the Pro when it came to running games at 60 fps. It was bottlenecked by the same shitty Jaguar CPU.

I'm sure the MS engineers who had to get 6 tflops in a console did a bang-up job, but are you telling me MS engineers didn't want a better CPU in there? Of course they did, but they couldn't have gotten 6 tflops and a Zen CPU in one SoC, and had to settle for a big jump in GPU, likely because beating Sony's tflops count was the main driver there.

Same thing with the XSX. It's a phenomenal console, in some ways better designed than the PS5, but clearly those 1.8 GHz clocks are way too low for an RDNA 2 card, which routinely goes over 2.2 GHz; that's where all the gains come from. Someone brought up the 13 tflops 6700 XT in this thread, and that GPU has no problem delivering a linear 1:1 performance increase over the 10.7 tflops 6600 XT. So why don't we see that for the XSX? Because they wanted to hit that 12 tflops, and instead of going with the 40 CU 6700 XT approach, pushing clocks to 2.2 GHz and coming in around 11 tflops, they chose the high CU count just to hit the 12 tflops 'marketing' buzzword and ended up creating a bottleneck.

There are some assumptions in there that might not be quite right. The 6700 is above the 6600 because the product stack was designed like that; the 6700 is stronger in every way possible (more cache, more bandwidth, more CUs, and higher clocks). Could AMD have chosen to make a card that was equal to the 6600 but wider and slower if they had tried? Sure, and the funny thing is that card would probably be more power efficient in the process. These consoles are designed years in advance and are power constrained. Back in 2016, or whenever this project started, someone decided what they thought the power-usage target would be for the console, and decisions would have been made from there. It looks like MS prioritized power draw over die size, whereas Sony was trying to get the smallest piece of silicon they could (and thus the lowest cost), maybe because they wanted to hit the $400 price point in some way. Seems like there would be trade-offs in both cases.
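For reference, the tflops figures being traded back and forth here fall out of simple arithmetic (CUs × 64 shaders per CU × 2 FLOPs per clock × clock speed). A quick sketch, using the commonly cited console clocks and CU counts purely as illustrative assumptions:

```python
# Rough TFLOPs arithmetic for RDNA-style GPUs:
# CUs * 64 shaders/CU * 2 FLOPs/clock * clock (GHz).
# The figures below are commonly cited specs, used only as assumptions.
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

print(round(tflops(52, 1.825), 2))  # XSX-class: 12.15
print(round(tflops(36, 2.23), 2))   # PS5-class: 10.28
print(round(tflops(20, 1.565), 2))  # XSS-class: 4.01
```

This is why the "wide and slow vs. narrow and fast" argument exists at all: very different CU/clock pairs can land on similar headline tflops numbers.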
 

JackMcGunns

Member
I don't recall hearing this when they were developing for the PS2, a console that had half the memory of the original Xbox, or those gen-7 ports to Switch. This is more a complaint about having to optimize for two consoles.


Exactly! But more importantly, it means that they are actually focusing on the Series X. If they dialed back the visuals to cater to the S, then there wouldn't be an issue. The hardship is having to optimize these graphics-intensive games to also run on Series S; the burden here is on the developers, as has always been obvious. Now we have everyone coming out of the woodwork to say "I told you so" when we knew all along it would take some effort. The real concern was whether Series X games would suffer, and I still stand firmly in believing that they won't.

100% of complaints are coming from people who don't own and will never own a Series S and just want to console war. The fact is the Series S was a brilliant idea that's paying dividends. While Sony ramped up production of PS4s to fend off the PS5 shortages, which by the way is much more detrimental to the next-gen transition, MS on the other hand sold the Series S, a console with a Zen 2 CPU, an RDNA 2 GPU, and an SSD. What happened to the SSD and CPU being the most important upgrades this gen? Ah yea, the goal post thingy.
 

DaGwaphics

Member
The issue is not whether the CPU needs more; it is that when you're accessing the slow RAM, the entire bus becomes that speed for that precise moment. This affects the GPU.

For argument's sake, if the slow RAM were to be accessed 50% of the time, the effective memory speed on the system would be the average of 224 and 56 = 140 GB/s.

In a realistic usage scenario, only one processor can access the memory pool in either console at any given nanosecond. On XSX the CPU has slightly slower memory to work with in most cases, while the GPU has faster memory to work with. Since GPUs typically need more bandwidth, it's hard to believe there is anywhere near a 50/50 split in memory access.
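The 50/50 figure in the quoted post is just a weighted average, which makes it easy to see how sensitive the result is to the access split. A tiny sketch (224 and 56 GB/s are the Series S's published fast/slow partition figures; the access fractions are hypothetical):

```python
# Weighted-average model of a split memory pool's effective bandwidth.
# 224 GB/s and 56 GB/s are the Series S's fast/slow partition specs;
# the access fractions below are hypothetical illustrations.
def effective_bandwidth(fast, slow, slow_fraction):
    """Average bandwidth given the fraction of accesses hitting the slow pool."""
    return fast * (1 - slow_fraction) + slow * slow_fraction

print(effective_bandwidth(224, 56, 0.5))   # the quoted 50/50 case: 140.0
print(effective_bandwidth(224, 56, 0.05))  # GPU-dominated traffic stays near 224
```

So whether the split pool "costs" anything in practice hinges almost entirely on how often the slow partition is actually touched, which is exactly the point of contention above.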
 

Topher

Gold Member
Exactly! But more importantly, it means that they are actually focusing on the Series X. If they dialed back the visuals to cater to the S, then there wouldn't be an issue. The hardship is having to optimize these graphics-intensive games to also run on Series S; the burden here is on the developers, as has always been obvious. Now we have everyone coming out of the woodwork to say "I told you so" when we knew all along it would take some effort. The real concern was whether Series X games would suffer, and I still stand firmly in believing that they won't.

100% of complaints are coming from people who don't own and will never own a Series S and just want to console war. The fact is the Series S was a brilliant idea that's paying dividends. While Sony ramped up production of PS4s to fend off the PS5 shortages, which by the way is much more detrimental to the next-gen transition, MS on the other hand sold the Series S, a console with a Zen 2 CPU, an RDNA 2 GPU, and an SSD. What happened to the SSD and CPU being the most important upgrades this gen? Ah yea, the goal post thingy.

Sony did not ramp up production of PS4. That was a bullshit rumor stated by Bloomberg that Sony flatly said was false.
 

Razvedka

Banned
Exactly! But more importantly, it means that they are actually focusing on the Series X. If they dialed back the visuals to cater to the S, then there wouldn't be an issue. The hardship is having to optimize these graphics-intensive games to also run on Series S; the burden here is on the developers, as has always been obvious. Now we have everyone coming out of the woodwork to say "I told you so" when we knew all along it would take some effort. The real concern was whether Series X games would suffer, and I still stand firmly in believing that they won't.

100% of complaints are coming from people who don't own and will never own a Series S and just want to console war. The fact is the Series S was a brilliant idea that's paying dividends. While Sony ramped up production of PS4s to fend off the PS5 shortages, which by the way is much more detrimental to the next-gen transition, MS on the other hand sold the Series S, a console with a Zen 2 CPU, an RDNA 2 GPU, and an SSD. What happened to the SSD and CPU being the most important upgrades this gen? Ah yea, the goal post thingy.
Few things here.

1). Critics don't need to own something in order to criticize it justifiably, nor does anyone need to own it in order to grasp the merits of the item in question (although it helps). This is just gatekeeping behavior and is counterproductive to a discussion. I do not need to own an F-35 Lightning II in order to appreciate that the damn thing has crashed 7 times, is projected to cost US taxpayers some 1.7 trillion dollars, and has been plagued with issues. It is entirely appropriate for me, as an outside observer given the facts, to openly speculate that it is perhaps "a stupid idea", or rather a spectacularly terrible implementation of a series of great ideas frankensteined together.

2). In many cases people in this thread, including myself, have no qualms with the Series X and have openly praised it in the past. It's a bit ridiculous to accuse critics of the S of being 'console warriors' when they praise or even own a Series X (and perhaps other consoles like a PS5, Switch, or a gaming PC). If some people here criticizing the S had an X, what would you say to that? How are they console warriors if they literally own a next-gen Xbox? Some weird brand-based inner turmoil/balkanization right there.

3). Discussing shortcomings of any design does not necessarily render you a partisan for 'the other side'.

4). Memory is being discussed because, since the very beginning with id Software, it's been the sticking point of technical criticism of the S. Digital Foundry brought this discussion back by saying other developers are saying the same thing that id did (and, more generally, Remedy).
 
Last edited:

Shmunter

Member
In a realistic usage scenario, only one processor can access the memory pool in either console at any given nanosecond. On XSX the CPU has slightly slower memory to work with in most cases, while the GPU has faster memory to work with. Since GPUs typically need more bandwidth, it's hard to believe there is anywhere near a 50/50 split in memory access.
That sort of contention doesn’t sound right, happy to read up on it if you have something detailing it.
 

JackMcGunns

Member
Few things here.

1). Critics don't need to own something in order to criticize it justifiably, nor does anyone need to own it in order to grasp the merits of the item in question (although it helps). This is just gatekeeping behavior and is counterproductive to a discussion. I do not need to own an F-35 Lightning II in order to appreciate that the damn thing has crashed 7 times, is projected to cost US taxpayers some 1.7 trillion dollars, and has been plagued with issues. It is entirely appropriate for me, as an outside observer given the facts, to openly speculate that it is perhaps "a stupid idea", or rather a spectacularly terrible implementation of a series of great ideas frankensteined together.

2). In many cases people in this thread, including myself, have no qualms with the Series X and have openly praised it in the past. It's a bit ridiculous to accuse critics of the S of being 'console warriors' when they praise or even own a Series X (and perhaps other consoles like a PS5, Switch, or a gaming PC). If some people here criticizing the S had an X, what would you say to that? How are they console warriors if they literally own a next-gen Xbox? Some weird brand-based inner turmoil/balkanization right there.

3). Discussing shortcomings of any design does not necessarily render you a partisan for 'the other side'.

4). Memory is being discussed because, since the very beginning with id Software, it's been the sticking point of technical criticism of the S. Digital Foundry brought this discussion back by saying other developers are saying the same thing that id did (and, more generally, Remedy).


1) You can go ahead and give your review of the Series S; that's perfectly fine. My message was for those claiming the Series X would suffer from having a lesser brother, with flagship games dumbed down to cater to the Series S, rather than devs focusing on the Series X and then having to deal with the downport, which is the difficult way, hence the complaints from developers. It's actually a GOOD thing: it means games are not being made with the Series S as the base.

2) This message wasn't for you then. Strange that it would trigger you; I wonder why?

3) Again, discussing the shortcomings of the Series S is perfectly fine. There are people who will be okay with the compromise, just like those who bought an RTX 3050 instead of a 3080 or 3090, except in that community no one is constantly criticizing the lower-end GPUs years after they launched. Why continue beating a dead horse unless there's some other agenda?

4) Cool, that's the part I'm interested in, but not everyone is discussing that. Again, you seem to be taking it personally.
 

DaGwaphics

Member
That sort of contention doesn’t sound right, happy to read up on it if you have something detailing it.

It's my understanding that the RAM is striped, just like a RAID array. The accesses just come so fast that it goes unnoticed. An SSD can only deliver a single data stream too; it's just super fast at doing it. A nanosecond might even be too long, LOL. Whatever the shortest period in which data can be read/written to RAM is would be the minimum time needed for any single access to take place. Though I do remember AMD talking about allowing two simultaneous accesses at half speed each; that doesn't sound that great for a GPU either.
 
Last edited:

Razvedka

Banned
1) You can go ahead and give your review of the Series S; that's perfectly fine. My message was for those claiming the Series X would suffer from having a lesser brother, with flagship games dumbed down to cater to the Series S, rather than devs focusing on the Series X and then having to deal with the downport, which is the difficult way, hence the complaints from developers. It's actually a GOOD thing: it means games are not being made with the Series S as the base.
I am saying that the Series S will become the 'ceiling' for this generation for multiplatform games. So I am effectively arguing that by its existence, yes, there will have to be some level of 'dumbing down'. Not necessarily apocalyptic or anything, but it's going to happen.

2) This message wasn't for you then. Strange that it would trigger you; I wonder why?

Doesn't matter if it isn't applicable to me, it was worth being said.

3) Again, discussing the shortcomings of the Series S is perfectly fine. There are people who will be okay with the compromise, just like those who bought an RTX 3050 instead of a 3080 or 3090, except in that community no one is constantly criticizing the lower-end GPUs years after they launched. Why continue beating a dead horse unless there's some other agenda?
This isn't an apt comparison. Consoles are always the 'baseline' for a new generation of gaming, never PCs. Games are typically developed to work on consoles and then ported over to PC as something of an afterthought (relatively speaking). So when one of the machines is significantly weaker than the other SKUs, the baseline becomes diminished/warped. This is exactly what the id guys were saying.

4) Cool, that's the part I'm interested in, but not everyone is discussing that. Again, you seem to be taking it personally.
Nah. At this point in my life, with my disposable income, plastic fun boxes are just plastic fun boxes. I can own multiple. The technology and philosophical underpinnings of what's going on with those fun boxes and gaming at large is what has me still engaged with the community.
 
Last edited:

Shmunter

Member
It's my understanding that the RAM is striped, just like a RAID array. The accesses just come so fast that it goes unnoticed. An SSD can only deliver a single data stream too; it's just super fast at doing it. A nanosecond might even be too long, LOL. Whatever the shortest period in which data can be read/written to RAM is would be the minimum time needed for any single access to take place. Though I do remember AMD talking about allowing two simultaneous accesses at half speed each; that doesn't sound that great for a GPU either.
You'd need more granular control over what goes where, though. Otherwise striping would end up putting GPU-critical assets in the slow RAM. There has got to be some additional mechanism here to manage such a config.
 

DaGwaphics

Member
You'd need more granular control over what goes where, though. Otherwise striping would end up putting GPU-critical assets in the slow RAM. There has got to be some additional mechanism here to manage such a config.

On XSX there would be two striped arrays of memory: one with ten 1GB pieces and one with six 1GB pieces. As far as the striping goes, the array not being accessed doesn't exist for that specific transaction. Even though these apparently present to the OS as a unified pool and are freely accessible by both processors, you'd think they would not have overlapping address space. At some level you would have to have control over what goes where, even if it was a kernel-level function. 🤷‍♂️
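One way to picture the non-overlapping address space being described is a flat map with two regions. This is a toy model for illustration only, not MS documentation: it just assumes the first 10 GB interleaves across all ten chips (the GPU-optimal partition) and the last 6 GB across six.

```python
# Toy model of a split unified pool (an assumption, not MS documentation):
# first 10 GB striped over ten channels, last 6 GB striped over six.
GB = 1024**3

def region(addr: int):
    """Map a physical address to its partition label and channel count."""
    if 0 <= addr < 10 * GB:
        return ("fast", 10)   # interleaved across all ten chips
    if addr < 16 * GB:
        return ("slow", 6)    # interleaved across six chips
    raise ValueError("address out of range")

print(region(1 * GB))    # ('fast', 10)
print(region(12 * GB))   # ('slow', 6)
```

Under a scheme like this, "control over what goes where" reduces to which address range the allocator hands out, which is plausibly the kind of kernel-level mechanism the post speculates about.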
 
Last edited:
Here's how much RAM the Win 11 OS is using on my PC (on a fresh boot). About... 700MB more than the Series S OS usage? That's nowhere near half of 8GB like you falsely claimed, DarkMage619.

[attached screenshot: Task Manager memory usage]
And it's only taking as much because you have 16 GB of memory available for it.
No one in their right mind would use a 2GB card for current-gen titles. The most common VRAM amount according to the Steam survey is 8GB, so total usable memory from system RAM + VRAM would be ~13.5GB, almost 70% more than the Series S. 6GB is next, which would mean 11+GB total (still a lot more than what's available to the XSS).
Generic "PC" Windows with UI is known to be a memory hog in a way that it isn't on consoles. Also bare in mind that unified is different than split in regards to the amount of redundancy.

This goes both ways: people forget that Apple M1 machines don't have dedicated VRAM, so they effectively have only 8 GB for both purposes, and in this case the console likewise has to share. But regardless, it has to be said that unified memory can still "punch" a little above its weight in both scenarios, because data accessed by both processors doesn't have to be moved or duplicated. Both processors also gain cycles by it being so.

Anyway, VRAM is sacred, so most of the console's budget is going there. For the processor, though, especially a processor with access to an SSD, it's going to look a bit different: some games will certainly use a bit of virtual memory. There's a performance penalty in doing that, but it's certainly acceptable if they implement a tiered system.

If anything, I continue to think the Series S and X's biggest design blunder is the internal SSD being stuck with a two-lane PCIe 4.0 limitation (equivalent to PCIe 3.0 with four lanes), as SSD bandwidth is key to offsetting the bottlenecks that happen when you don't have more RAM.
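The two-lane/four-lane equivalence is easy to sanity-check: raw PCIe bandwidth scales linearly with lane count, and gen 4 doubles gen 3's per-lane transfer rate while both use the same 128b/130b encoding. A quick sketch:

```python
# Per-lane payload bandwidth in GB/s:
# transfer rate (GT/s) * 128/130 encoding efficiency / 8 bits per byte.
def pcie_gbps(gt_per_s: float, lanes: int) -> float:
    return gt_per_s * (128 / 130) / 8 * lanes

gen4_x2 = pcie_gbps(16, 2)   # PCIe 4.0 (16 GT/s), two lanes
gen3_x4 = pcie_gbps(8, 4)    # PCIe 3.0 (8 GT/s), four lanes
print(round(gen4_x2, 2), round(gen3_x4, 2))  # identical: ~3.94 GB/s each
```

Both work out to roughly 3.94 GB/s of raw link bandwidth, which is consistent with the 2.4 GB/s sustained figure MS quotes for the Series consoles' SSDs once controller and protocol overhead are accounted for.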
 
Last edited:

adamsapple

Or is it just one of Phil's balls in my throat?
So excusing your whataboutery... you're agreeing with me that some people on here, and in an official capacity at Xbox, said there would be no compromises between systems (y)

The original PR statements were embellished to the extent that every console maker's are; there's not really much more to it.

Just the same three or four users trying to convince everyone that it's a bigger deal than it actually is.
 

SlimySnake

Flashless at the Golden Globes
Series S raw pixel fill rate is 75% higher than that of the PS4 Pro, and that did "OK" with dynamic 4K, so why do you think the system will all of a sudden struggle at 1080p on a 1080p screen? If people run the Series S on a 4K screen, and I am one of them, then that is on us. The system was never marketed as a 4K console; hell, I have the box mine came in right beside me and I don't even see 1440p listed as a feature.
Asked and answered. It's almost impossible to buy a 1080p screen.

And besides, we have pointed out how next-gen games like the Matrix demo and a ray-tracing-only game like Metro do not run at 1080p. They either run at 512p, like Metro, or drop SIGNIFICANTLY below 533p, like the Matrix demo. You can run a 512p game on a 720p TV and it will look blurry. That's why we moved to 720p as the standard in 2005.
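For a sense of scale behind those resolution labels, here are the pixel counts (assuming 16:9 frames throughout):

```python
# Pixel counts for the 16:9 render resolutions mentioned above.
def pixels(height: int) -> int:
    return (height * 16 // 9) * height

for h in (512, 720, 1080, 2160):
    print(f"{h}p: {pixels(h):,} pixels")
# 512p carries roughly half the pixels of 720p and under a quarter of 1080p,
# which is why it reads as blurry even on modest displays.
```

That gap is also why dynamic-resolution drops below ~533p are so much more visible than, say, 1440p vs. 4K: the percentage of missing detail is far larger at the bottom of the scale.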
 

SlimySnake

Flashless at the Golden Globes
Most PS5 and Xbox Series games run at 1440p when trying to deliver decent framerates. Why would a Series S deliver 1440p?
I wish I had bookmarked some of my posts, because that's exactly what I said two years ago: that most next-gen-only games won't even come close to 4K 60 fps; they will be 1440p. I specifically remember bringing up scenarios where PS5 and XSX games might drop to 1080p, or 2 million pixels. I did the math and pointed out that the XSS would have to run such a game at around 500k pixels, since it's only got a third of the raw GPU power of the Series X. Well, we are seeing that now with Metro and Matrix, and those are just the first two next-gen-only games.

Things are going to get far, far worse for the Series S. The only reason they haven't is because devs have literally not made a true next-gen game so far. Even Metro is only doing ray tracing, with no real CPU or physics improvements of the kind we expect to see this gen.
 

STARSBarry

Gold Member
This is extremely wrong my brother.

You can walk into any Best Buy as Topher mentioned above, or Walmart, Target, Costco.

There's tons and tons of 1080p displays on in stores and can be bought online.

He's right, you know... most of them might be off-brand and not using premium tech like the old high-end 1080p sets, but there are still plenty.
 

SlimySnake

Flashless at the Golden Globes
There are some assumptions in there that might not be quite right. The 6700 is above the 6600 because the product stack was designed like that; the 6700 is stronger in every way possible (more cache, more bandwidth, more CUs, and higher clocks).
Not sure what you are trying to say here. Are you saying the XSX was not designed to be stronger than the PS5 in every way possible? They chose 12 tflops for a reason. They wanted the most powerful console next gen, and they advertised it as such.
Could AMD have chosen to make a card that was equal to the 6600 but wider and slower if they had tried? Sure, and the funny thing is that card would probably be more power efficient in the process. These consoles are designed years in advance and are power constrained. Back in 2016, or whenever this project started, someone decided what they thought the power-usage target would be for the console, and decisions would have been made from there. It looks like MS prioritized power draw over die size, whereas Sony was trying to get the smallest piece of silicon they could (and thus the lowest cost), maybe because they wanted to hit the $400 price point in some way. Seems like there would be trade-offs in both cases.

It's clear that both Sony and MS knew about RDNA 2.0 well before we did, and we knew about RDNA 2.0 in 2019 when RDNA 1.0 was first revealed. The proof is in the specs, and Cerny confirmed it would have ray tracing back in April 2019, before RDNA 1.0 was even announced. Yes, they were designed years in advance, but Sony and MS knew what AMD had in store for RDNA 2.0, and a big part of that was ray tracing and the perf-per-watt gains. Otherwise Sony would've never targeted such high clock speeds.
 

yamaci17

Member
Asked and answered. It's almost impossible to buy a 1080p screen.

And besides, we have pointed out how next gen games like Matrix and the ray tracing only game like Metro do not run at 1080p. They either run at 512p like Metro or drop SIGNIFICANTLY below 533p like the matrix demo. You can run a 512p game on a 720p tv and it will look blurry. Thats why we moved to 720p as the standard in 2005.
"You can run a 512p game on a 720p tv and it will look blurry"

Apparently it's not just blurry. That's the part where I get fatal errors.

I cannot even stand 960p on my 1080p screen, and 900p looks horrid in AC Valhalla and Halo Infinite (both games use the exact same temporal upscaler they use on consoles). 720p? The stuff of nightmares.

But others claim these resolutions look fine on a small 1080p screen. I don't know if 24 inches is small or not. I guess small. Or maybe not. I have no idea at this point. I'm pretty sure someone will call me a liar for noticing 900p versus 1080p on my 1080p screen. I mean, even native 1080p rendering does not look good enough, and I have to use 150%-200% scaling to make games look good (1620p-2160p supersampling); current temporal anti-aliasing methods make 1080p look as if it's 720p or something.

For example, in the comparison below, I can see a HUGE difference in sharpness and clarity on my 1080p screen:

Either my 1080p screen is a 4K screen in disguise, or there's something I'm missing, or I'm just too sensitive to native resolution at this point.

Imagine my shock when I can notice 1440p supersampled to 1080p on my 1080p screen, yet some people say 800p looks fine on a 1080p screen. I must be crazy, then.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
What do you mean? I can drive down to Best Buy and buy a brand new 1080p TV right now.
This is extremely wrong my brother.

You can walk into any Best Buy as Topher mentioned above, or Walmart, Target, Costco.

There's tons and tons of 1080p displays on in stores and can be bought online.
I stand corrected, but I still don't think anyone is buying 1080p TVs in 2022 when 4K TVs can be had for as cheap as $250.
 

OmegaSupreme

advanced basic bitch
I stand corrected, but I still don't think anyone is buying 1080p TVs in 2022 when 4K TVs can be had for as cheap as $250.
Very true. Unless you're still rocking a plasma from 10 years ago, all 1080p TVs suck these days and nobody buys them. Maybe for a kids' room.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
" You can run a 512p game on a 720p tv and it will look blurr"

apparently it's not. that's the part where I get fatal errors.

i cannot even stand 960p on my 1080p screen, and 900p looks downright horrid in ac valhalla and halo infinite (i used both games' temporal upscalers to see if they do anything valuable). 720p? the stuff of nightmares. they still looked horrible.

but others claim these resolutions look fine on a small 1080p screen. i don't know if 24 inch is small or not. i guess small. or maybe not. no idea. all i know is that 1080p itself looks ugly, 900p even uglier, and i have to use 150%-200% scaling to make games look good (1620p-2160p supersampling).

either my 1080p screen is a 4k screen in disguise, or there's something i'm missing, or i'm just too sensitive to native resolution at this point.
I forced my brother to buy a giant 55 inch HD TV way back in 2007 just so I could play my PS3 games on a bigger screen than the tiny 21 inch 720p tv I bought myself. I thought 720p PS3 era games looked fine on his screen and had way better picture quality than on my 720p tv. Now that was a $2,000 tv we bought on sale for $1,500, so that might be why, but I never had issues running PS3 or 360 games on a 1080p tv.

When I upgraded my PC and started running every PS3 era game at 1080p 60 fps, I was like omg that looks incredible, but I remember going back to PS3 exclusives and the stunning Halo 4 running at 720p and they more or less looked fine. Low res for sure, but not awful like how DriveClub and Batman Arkham Knight look on my 4k screens. They just look awful. A jaggied, shimmery mess that I did not notice on my 1080p tv. Frankly, it's ruined DriveClub for me. I literally can't play one of my favorite games of last gen because of the poor IQ.

P.S. 1080p monitors in my experience exacerbate the problem because you sit so close to them. I remember trying to run Crysis at 720p on extreme settings way back in 2011 and ended up settling for 1080p at high because it simply looked better on my monitor. I am sure my TV's upscaling would've salvaged the IQ at 720p, but I don't ever remember testing Crysis on my tv.

P.P.S. There was one PS3 game that was a nightmare on my 1080p tv: Castlevania Lords of Shadow. Shimmering everywhere. This was the first time I really truly noticed that problem, and to say it was an eyesore would be an understatement. But most of the other games were fine. KZ2 looked absolutely breathtaking on my brother's 1080p tv.
 

yamaci17

Member
Very true. Unless you're still rocking a plasma from 10 years ago all 1080p TV's suck these days and nobody buys them. Maybe for a kids room.
1080p tvs only suck because tv manufacturers want them to

if they gave quality oled and hdr to 1080p, most people would still choose them. these so called "next gen" consoles are 1080p-60 fps machines. literally, most of the 4K TVs today cost more than the damn consoles. i don't remember that kind of weird thing ever happening before.

people suggest new and shiny 4k screens not because they're 4k but because they have next gen TV goodies such as vrr, allm, oled, high quality hdr and such.

this is why i personally protest and keep on with a 1080p screen. i have no intention of paying big money for pixels i won't use. if i had a 3080 or 3090 system that could run native 4k or at least 4k dlss quality, i'd consider a 4k screen tho.

for ps5 and sx? lmao it's a waste. those people will feel compelled to upgrade once the midgen refresh comes along. and all of a sudden a surplus of second hand sx and ps5 consoles will find their way into homes with 1080p screens and monitors, just like they were always meant to

practically these tv makers hide goodies behind the "entry fee" of buying 8 million pixels instead of 2 million pixels. and then they expect you to pair their 8-million-pixel TVs with consoles that barely push 2.5-3.5 million pixels in most games.

you can make the argument that bigger screens need 4k, but that only holds when you can feed the damn tv 4k content. even most movies nowadays are still mastered at 1080p...
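the pixel math behind those numbers, if anyone wants to check it (a quick sketch; it's nothing fancy, just width times height, and the 2.5-3.5 million figure above is my own estimate of typical console render resolutions):

```python
# pixel counts behind the "8 million vs 2 million" comparison
def pixels(width, height):
    """Total pixel count for a given resolution."""
    return width * height

res_1080p = pixels(1920, 1080)   # ~2.07M pixels
res_1440p = pixels(2560, 1440)   # ~3.69M pixels
res_4k    = pixels(3840, 2160)   # ~8.29M pixels

print(f"1080p: {res_1080p:,} pixels")
print(f"1440p: {res_1440p:,} pixels")
print(f"4k:    {res_4k:,} pixels ({res_4k // res_1080p}x 1080p)")
```

4k is exactly four times the pixels of 1080p, which is why the "entry fee" framing is 8 million vs 2 million.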
 
Last edited:

Topher

Gold Member
I stand corrected but i still dont think anyone is buying 1080p tvs in 2022 when 4k tvs can be bought for as cheap as $250.

No, they still sell. They even have 720p TVs available. If someone is looking to buy a 32" or smaller TV then 1080p makes sense. Actually, I'm not seeing any 4K TVs available under 43" at Best Buy right now.
 

OmegaSupreme

advanced basic bitch
1080p tvs only suck because tv manufacturers want them to

if they gave quality oled and hdr to 1080p, most people would still choose them. these so called "next gen" consoles are 1080p-60 fps machines. literally, most of the 4K TVs today cost more than the damn consoles. i don't remember that kind of weird thing ever happening before.

people suggest new and shiny 4k screens not because they're 4k but because they have next gen TV goodies such as vrr, allm, oled, high quality hdr and such.

this is why i personally protest and keep on with a 1080p screen. i have no intention of paying big money for pixels i won't use. if i had a 3080 or 3090 system that could run native 4k or at least 4k dlss quality, i'd consider a 4k screen tho.

for ps5 and sx? lmao it's a waste. those people will feel compelled to upgrade once the midgen refresh comes along. and all of a sudden a surplus of second hand sx and ps5 consoles will find their way into homes with 1080p screens and monitors, just like they were always meant to

practically these tv makers hide goodies behind the "entry fee" of buying 8 million pixels instead of 2 million pixels. and then they expect you to pair their 8-million-pixel TVs with consoles that barely push 2.5-3.5 million pixels in most games.

you can make the argument that bigger screens need 4k, but that only holds when you can feed the damn tv 4k content. even most movies nowadays are still mastered at 1080p...
Plenty of content in 4k and hdr these days outside of games. I'd understand if you only play games but as a movie watcher I can appreciate the extra pixels hdr etc.
 

yamaci17

Member
Plenty of content in 4k and hdr these days outside of games. I'd understand if you only play games but as a movie watcher I can appreciate the extra pixels hdr etc.
no no, don't get me wrong. for oled and hdr alone, i would still suggest those tvs for anyone who plays games. they're practically a must if you have the budget. they're next-gen tv stuff. i only hate the fact that oled isn't offered in anything between a mobile phone and a 4k tv screen.

even my damn phone has an oled screen, but somehow there's not a single proper oled-hdr 1080p screen. it just makes me sad overall. sadly my budget cannot cover TVs that cost 1.5-2 times more than a console, or more than my PC itself :/ and sadly those TVs being 4k and 120 hz is a big factor in them being so expensive

for movies, you have a point. i'm not much of a movie watcher myself; i try to go to the cinema for new titles. but i can see your point. it's just that the budget is the problem


the lowest-end OLED screen costs like 800-900 bucks, no? just a month ago, my friend built a PC with a 3060ti for 1k bucks. imagine that: a screen that costs as much as the beefy PC itself. it's a bit crazy if you ask me. he ended up buying a 1440p screen instead and he's happy. but the screen is IPS with shitty HDR.
 
Last edited:

baphomet

Member
1080p tvs only suck because tv manufacturers want them to

if they gave quality oled and hdr to 1080p, most people would still choose them. these so called "next gen" consoles are 1080p-60 fps machines. literally, most of the 4K TVs today cost more than the damn consoles. i don't remember that kind of weird thing ever happening before.

people suggest new and shiny 4k screens not because they're 4k but because they have next gen TV goodies such as vrr, allm, oled, high quality hdr and such.

this is why i personally protest and keep on with a 1080p screen. i have no intention of paying big money for pixels i won't use. if i had a 3080 or 3090 system that could run native 4k or at least 4k dlss quality, i'd consider a 4k screen tho.

for ps5 and sx? lmao it's a waste. those people will feel compelled to upgrade once the midgen refresh comes along. and all of a sudden a surplus of second hand sx and ps5 consoles will find their way into homes with 1080p screens and monitors, just like they were always meant to

practically these tv makers hide goodies behind the "entry fee" of buying 8 million pixels instead of 2 million pixels. and then they expect you to pair their 8-million-pixel TVs with consoles that barely push 2.5-3.5 million pixels in most games.

you can make the argument that bigger screens need 4k, but that only holds when you can feed the damn tv 4k content. even most movies nowadays are still mastered at 1080p...



Basically everything in this post is wrong.
 

yamaci17

Member
you are free to think whatever you want

i'm going by the fact that most newer games run at an average of 1080p-1300p, which is a far cry from 4k (8 million pixels)
 
Last edited:

avin

Member
I wish I had bookmarked some of my posts because that's exactly what I said two years ago. That most next gen only games won't even come close to 4k 60 fps. They will be 1440p, and I specifically remember bringing up scenarios where PS5 and XSX games might drop to 1080p or 2 million pixels. I did the math and pointed out that the XSS would have to run the game at 500k pixels since it's only got 1/3rd the raw gpu power of the Series X. Well, we are seeing that now with Metro and Matrix, and those are just the first two next gen only games.

I'm quite happy to take your word for it. But what are games like The Matrix demo supposed to look like on the XSS? Can we see the horrors coming our way? Is there an example detailing this?

Because I tooled about in it on an XSS, and I thought it was fine.

avin
 
Last edited:

baphomet

Member
have fun playing 1080p-1200p games on your console. keep on with the dream that you will have native 4k someday outside of last gen and cross gen games (even in such games, the resolution still hovers between 1440p-1800p, which is... dreadful)

My 3090 and uhd movies say otherwise.

Also, are you just oblivious to those 1440p games on a 4k screen still looking far better than your games on that ancient 1080p set?
 

Topher

Gold Member
My 3090 and uhd movies say otherwise.

Also, are you just oblivious to those 1440p games on a 4k screen still looking far better than your games on that ancient 1080p set?

Yeah... pretty much what I was going to say. I game on an LG C1 OLED 55" 4K with my PS5, XSX, and 3070 ti PC, and it is bizarre to suggest a 1080p TV is comparable in any way.
 

FrankWza

Member
But what are games like The Matrix Demo supposed to look like on the XSS?
The Xbox Series S will play every game that can be played on Xbox Series X, just at a lower resolution with lower quality textures. This means whether you're interested in third-party games like Assassin's Creed Valhalla and Call of Duty: Black Ops Cold War or titles from Xbox Game Studios such as Halo Infinite, they can be played on Xbox Series S with next-generation features like seamless loading, 120 FPS support, and more.
In this video, Jason Ronald, director of program management for Xbox, goes over the finer points of Xbox Series S and how it differentiates from Xbox Series X.

In the video, Microsoft said it wanted to build two consoles with similar next-gen capabilities at a differentiated price point, but states that Xbox Series S will deliver the same experience as Xbox Series X, just at a reduced rendering resolution.
It's Xbox Series S where there's a real story here - just how did the junior Xbox with just four teraflops of compute power pull this off?

First of all, Epic enlisted the aid of The Coalition - a studio that seems capable of achieving results from Unreal Engine quite unlike any other developer.
Multi-core and bloom optimisations were noted as specific enhancements from The Coalition, but this team has experience in getting great results from Series S too, so don't be surprised if they helped in what is a gargantuan effort.

Series S obviously runs at a lower resolution (533p to 648p in the scenarios we've checked), using Epic's impressive Temporal Super Resolution technique to resolve a 1080p output. Due to how motion blur resolution scales on consoles, this effect fares relatively poorly here, often presenting like video compression macroblocks. Additionally, due to a sub-720p native resolution, the ray count on RT effects is also reined in, producing very different reflective effects, for example. Objects within reflections also appear to be using a pared-back detail level, while geometric detail and texture quality are also reduced. Particle effects and lighting can also be subject to some cuts compared to the Series X and PS5 versions. What we're looking at seems to be the result of a lot of fine-tuned optimisation work, but the overall effect is still impressive bearing in mind the power of the hardware. Lumen and Nanite are taxing even on the top-end consoles, but now we know that Series S can handle it - and also, what the trades may be in making that happen.
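The numbers in that excerpt work out like this; a rough sketch (DF only quotes the vertical resolutions, so the 16:9 widths here are my own approximation):

```python
# rough pixel math for the Series S figures quoted above
# (16:9 widths for 533p/648p are approximations; DF quotes only the heights)
def pixels_16_9(height):
    """Return the approximate 16:9 width and total pixel count for a given height."""
    width = round(height * 16 / 9)
    return width, width * height

for h in (533, 648, 1080):
    w, n = pixels_16_9(h)
    print(f"{h}p ~ {w}x{h} = {n / 1e6:.2f}M pixels")

# so TSR is reconstructing a 1080p image from roughly a quarter to a third
# of the output pixel count at these internal resolutions
_, out_px = pixels_16_9(1080)
_, low_px = pixels_16_9(533)
print(f"533p is about {low_px / out_px:.0%} of the 1080p output")
```

Notably, 533p at 16:9 is only about half a million pixels, which lines up with the kind of pixel budgets people were estimating for the Series S in next-gen-only games.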
 
Last edited:

SlimySnake

Flashless at the Golden Globes
I'm quite happy to take your word for it. But what are games like The Matrix Demo supposed to look like on the XSS? Can we see the horrors coming our way, is there an example detailing this?

Because I tooled about in it on an XSS, and I thought it was fine.

avin
westworld-if-you-cant-tell.gif
 