Riky
$MSFT
Nahhhhhh.
Raytracing on the S is a lost cause. They'll just use that memory for textures and assets.
It's already confirmed for Forza Motorsport.
hahaha
In the XsS for next-gen game experiences, going by the resolution and cutbacks of the UE5 Matrix demo, the answer will be no. The game is already running at sub-HD-ready resolutions at times IIRC to hit the 30fps target, so the demo is clearly GPU processing constrained, not CPU constrained - whereas memory could help fps if the demo was CPU/memory bound.
Will having more ram available positively impact performance, specifically fps, in a situation where the software is memory constrained?
I understand that others are using the statement to make some ridiculous claims. I am not in that camp. They think it was because of improvements in development tools/drivers. They directly say that the system getting updated over time is what resulted in the game performance being improved.
Even though you didn't answer my question, you cherry picked a situation instead. I'll digress, as it's really unimportant and the conversation has already been derailed enough.
The statement is to suggest the XsS is getting better for its bottlenecks, which it is, sort of, but the GPU is the bottleneck that they'd need to alter to improve fps "graphic performance" IMO.
Nobody will blame or demand raytracing on a Series S game, so it's actually low on most developers' lists of priorities.
Forza Motorsport is the exception rather than the rule and it's also first party.
It shows it's possible though
I think we have to wait for the results before we know "how possible" it will be. Something I am looking to be pleasantly surprised by if it's competent, and not 640p or almost so bad they should not have included it. Software wise, I am definitely eager to see the fruits of their labor.
So I guess those rumors and DF reporting on devs complaining about the Series S memory issues were true after all. I was told on this forum that it was not an issue.
Consoles are always going to be bound by something. The Switch fails at all levels (CPU, GPU, memory capacity, memory speed, IO speeds), which is why it struggles the way it does. The PS4 had an awful CPU, so bad even the Switch could run PS4 games fine enough with its CPU. The PS5 and Series have two big weaknesses: the lack of dedicated RT hardware on the level of Nvidia GPUs and very weak memory gains.
Too little, too late. Hundreds of extra MB is nothing when you are still bound by memory bandwidth and an extremely low tflops count for a next-gen console. They should've always targeted a 6 tflops console to match the X1X tflops count and given it the same memory allocation as the X1X, with an SSD and CPU upgrade. Too many cost cutting measures just to hit a $299 price point, with no regard for how watered down the experience would be for their consumers. Now they are scrambling and hoping to find RAM, but it won't be enough.
I didn't answer the question directly because I was merely mirroring the lack of sincerity that you had shown by moving the goalposts when your own scenarios weren't going to result in more fps.
Yet again, it just came across as disingenuous posting, and I wasn't bothered by letting you insincerely ask over and over; especially as I've never signed an NDA for such info. I have noticed, though, that for the 3rd time you have failed to answer my question about the "certification requirements MS would be admitting to not enforcing", but we both know that was a trick question, as neither you nor I know what those are. I'm sure you would not post that claim if you could go back.
I'll admit I was wrong in that I was hammering you about statements you made that you may have not clearly thought out. It was unnecessary, as I should have just disagreed and moved on.
Being VRAM constrained suggests you are also going to be overburdening your GPU caches, with the cached data being replaced too frequently from the fully filled VRAM you are using, so the additional VRAM will just be alleviating the VRAM bandwidth updates which are the reason for needing to free up more VRAM - in this fictional scenario - and may even increase GPU cache misses and mildly hamper fps too, because the ratio of VRAM data wanting to be used to GPU caches has actually worsened AFAIK.
Got you: more available memory during memory constrained situations = same or worse performance. Makes sense.
The info is baked into the release documents of the consoles anyway - about the resolutions and frequencies they support - and we can thank outlets like NXGamer and DF for holding a flame to the feet of publishers for the last 15 years, so that even ignoring TRCs we have de facto requirements like stable 30/60, no tearing - unless sellable to DF - stable frame-pacing, and TV-spec upscaled or native resolutions.
Ok, so earlier in the thread you said that by stabilizing frame rates that were dropping below 30 or 60 fps, MS would be admitting to not enforcing their "certification requirements". These "certification requirements" were actually just the listed supported resolutions and display frequencies that MS mentioned in advertisements and press releases. I'm glad that's cleared up.
It is all obvious stuff that on GAF no one would probably be interested in seeing your "secret" document covered by an NDA, because it is a "no shit Sherlock" type situation. It is hardly mind blowing that they have to make games that update at compatible rates (30/60) and don't have artefacts that cause eye strain or epilepsy, like tearing, and resolutions for the TVs that are in all the homes in the world, is it?
I'm not trying to be dismissive, but what are you even going on about? NDAs? Secret documents? Also, I don't think anyone on the entirety of NeoGAF ever questioned whether games should work on modern displays. Why is this even being brought into the discussion? Because it definitely is a "no shit Sherlock" type of situation.
Xbox OS is also 4k.
My scenario wasn't vague; it was a genuine attempt to replicate the hypothetical and put a face on the scenario their sentence describes in the GPU - because they are implying it is GPU memory bound with the "graphic performance", and even your two earlier examples of ray tracing and denoising would logically be getting done on the GPU, so I assumed it was established this memory limitation is VRAM.
I mean, if Metro EE has to drop to 1080p or thereabouts when stressed to get RTGI + 60 FPS on SX and PS5, we should expect a similar level of DRS on Series S too if they want all the same features, right?
Most definitely, if consoles were still targeting a large audience of SD TV owners with S-Video, SCART and component video, but the drop in resolutions you are describing, along with the fx still being compromised outputting on TVs with 1080p or better native panels, just illustrates the XsS-as-a-next-gen-system conundrum.
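For anyone unfamiliar with how DRS reacts to stress like that, here's a rough sketch of the idea in code - a generic frametime-driven heuristic, not any specific engine's implementation; the budget, step sizes and frametime numbers are all made up for illustration:

```python
TARGET_MS = 16.6             # frame budget for a 60 fps mode
MIN_SCALE, MAX_SCALE = 0.5, 1.0

def adjust_scale(scale, last_frametime_ms):
    """Nudge resolution scale down when over budget, back up when under."""
    scale += -0.05 if last_frametime_ms > TARGET_MS else 0.02
    return max(MIN_SCALE, min(MAX_SCALE, scale))

scale = 1.0
for ft in [18.0, 19.0, 17.5, 15.0, 14.0]:  # a heavy stretch, then recovery
    scale = adjust_scale(scale, ft)
print(round(scale, 2))  # 0.89 - still below native after the stress
```

The point being: the scaler trades resolution for frametime automatically, so a GPU-bound game holds 60 fps by rendering fewer pixels, which is exactly why capacity alone doesn't buy fps.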
I guess Road to PS5 never happened?
How in-depth did Road to PS5 really go?
It was a pretty high-level overview imo. Very understandable if you have an interest in tech. It was no SIGGRAPH talk, for example.
Did they discuss APIs?
No, not really, unless you have a broken game where it ends up HDD thrashing like Skyrim on PS3. A memory constraint will not help with fps; it can help with res if the framebuffer size is being held back due to memory size, though I suspect most games are GPU limited and not memory limited there.
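To put rough numbers on the framebuffer point, a quick back-of-envelope calc - the bytes-per-pixel and buffer counts here are typical assumptions, not figures tied to any specific console:

```python
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Size of a triple-buffered colour target in MB."""
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

print(round(framebuffer_mb(1920, 1080), 1))  # ~23.7 MB at 1080p
print(round(framebuffer_mb(3840, 2160), 1))  # ~94.9 MB at 4K
```

Even the jump from 1080p to 4K only costs tens of MB for the colour targets, which is why framebuffers are rarely the thing eating a memory budget.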
My scenario wasn't vague, it was a genuine attempt to replicate the hypothetical and put a face on the scenario their sentence describes in the GPU - because they are implying it is GPU memory bound with the "graphic performance" and even your two earlier examples of Ray tracing and denoising would logically be getting done on the GPU, so I assumed it was established this memory limitation is VRAM.
More VRAM is more memory, but unlike CPU RAM - which is part of a memory hierarchy that impacts the entire system's performance - adding more VRAM only impacts the flow of data in and out of VRAM and the GPU caches which get filled from the VRAM.
Are you saying the exact scenario I described wouldn't be GPU cache limited? And therefore have a different take on it?
If you are now saying that you believe the sentence is more about being system RAM bound - either doing ray tracing/denoising on the CPU, or more logically a task not directly related to "graphic performance" - and the extra memory is just to improve "program performance", then I'm not arguing against that, just the wrong use of the term "graphic" in "graphic performance".
Yeah, those are large system RAM capacity differences and really low frametimes you're dealing with. You might be IO bound for brief moments, with the OS dealing with increased I/O activity and background processes, enough to sway the average (is it a rolling average or total, though?), but notice the instantaneous frametimes are pretty much the same for the scenes. At 16ms or 33ms typical frametimes these I/O spikes would have less of an effect on averages too. You wouldn't typically get noticeable framerate improvements with more RAM in the same configuration. 'Graphics performance' here refers more to resolution and better sampling, I would say. With a 120fps mode it will possibly make things more stable and lower averages very slightly (we aren't talking about 8GB here, even), but not much difference to your instantaneous frametime.
I don't know, man. Comparing the dual channel setups, I see an increase in avg fps with increased capacity. Obviously these are pretty large increases, but a few hundred MB of additional capacity may help stabilize frame rates. MS's statement was pretty simple: these changes "CAN" increase graphical performance in memory constrained situations. MS isn't promising the moon here, they are saying it can help. I don't understand the pushback.
Both the PS3 and 360 had 512MB. The 360 had unified memory, though, while the PS3 had a 256MB+256MB split.
They top out at 16GB, when the One X had 12GB and the PS4/Xbox One had 8GB (due to the price of RAM at the time being super cheap), while the 360 had 512MB and the PS3 had 256MB+256MB. It's a very anemic increase (due to the price of RAM at the time being very expensive), which is why SSD tech is being pushed to cope with it, but as we all know SSDs are incredibly slow compared to RAM, so they can't replace RAM in most scenarios. These two aspects will haunt current-gen consoles through their lifetime and would be significantly improved should we receive enhanced versions.
I can get behind what you're saying, but an fps avg is derived from a collection of instantaneous frame times. I'm not refuting you, just highlighting that one cannot say having more available memory won't help performance in absolute terms. Might I add that MS in their statements were also not speaking in absolutes; they clearly said it "can" help.
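To illustrate the average-vs-instantaneous point with toy numbers (nothing here is measured from real hardware - just a single made-up 100 ms hitch in otherwise steady runs):

```python
def avg_fps(frametimes_ms):
    """Average fps over a capture: frames divided by total time."""
    return len(frametimes_ms) / (sum(frametimes_ms) / 1000.0)

# 120 frames at a steady 33.3 ms (~30 fps) with a single 100 ms hitch
with_hitch_30 = [33.3] * 119 + [100.0]
# the same single hitch in a run of 8.3 ms (~120 fps) frames
with_hitch_120 = [8.3] * 119 + [100.0]

print(round(avg_fps(with_hitch_30), 1))   # ~29.5 - the average barely moves
print(round(avg_fps(with_hitch_120), 1))  # ~110.3 - the average drops visibly
```

Same spike, very different impact on the average: at 33ms frametimes an I/O hitch is lost in the noise, while at 120fps it drags the average well below target - which is both posters' points at once.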
IMHO if the statement is true - without a VRR addendum -, it is an acknowledgment by Xbox that they are letting games wrongfully pass their certification that aren't meeting the targeted performance of 30fps or 60fps well enough
Just for the purpose of clarity,
That's why I took issue with PaintTinJr's claim that the statement MS made to developers was in fact incorrect, along with some other claims, like that if a game fails to lock to 30fps or 60fps well enough (whatever that means) then the game is in violation of MS's "certification requirements" (an absurd statement).
Great update, from the XNC podcast: Colteastwood claims that the Series S will compete with and even surpass the PS5 when it comes to CPU-heavy games, given that the Series S has a faster processor and Fidelity Resolution and full RDNA 2 across all feature sets. Any truth to that?
Are we in 2020 again?
I'm not making any claim about their certification requirements; you did. I pointed out the absurdity of this statement right here.
Are you saying that Xbox doesn't have a certification program ?
Or are you saying they do have a certification program, but there is no means to fail certification by submitting an Xbox 1/XsS Cyberpunk day one experience?
Or are you saying that there is a program and games do fail on performance grounds, but then Xbox would never push back against a publisher wanting to go gold - and block a release - regardless of the technical shortfalls in tearing or judder, which may or may not cause epilepsy and other conditions like the PlayStation bootup warning has mentioned since the end of the PS3 generation?
Under all non-VRR situations games target 30 and 60fps, as is required to pass Xbox certification - and other than the odd percentile dip in analyses shown by NXGamer, etc., games on all consoles typically have stable 30 or 60fps frame-rates, so the frame-rate in games should be locked already with no performance gain to be had - without VRR - because games use dynamic resolution and drop features to hit those frame-rates to match the fixed display refreshes.
This right here is the strawman. You misrepresent what it means to have a target framerate. Both you and I know that a "target framerate" is the maximum number of frames per second that a game should be sending to whatever display device. This does not imply some sort of minimum frequency requirement. If a game fails to hit said target frame rate, then it is typically a less pleasing experience, but by no means does it disqualify it from release on the Xbox platform. This is fact per Microsoft Store policy 10.4.1. All that is required per 10.4.1 is that products must be "compatible with the software, hardware and screen resolution requirements specified by the product".
IMHO if the statement is true - without a VRR addendum -, it is an acknowledgment by Xbox that they are letting games wrongfully pass their certification that aren't meeting the targeted performance of 30fps or 60fps well enough - that this improvement makes a meaningful difference - and are in essence selling goods that should be held back until the performance is stable.
Riky thicc_girls_are_teh_best
The 13.5 GB RAM being available has been known since 2020 when DF first got their hands on the Series X specs from MS.
Inside Xbox Series X: the full specs (eurogamer.net)
PS5 GDDR6 RAM Vs Xbox Series X GDDR6 RAM - Which Is Better? (psu.com)
Xbox Series X Allocates 13.5 GB of Memory to Games (gamingbolt.com)
The point of contention is Rich claiming PS5 OS uses 3.5 GB of the 16 GB of RAM
I've never seen anyone confirm how much RAM the PS5's OS reserves.
I thought the split memory didn't apply to the XSS. The ratio is different, and I believe the full fast memory is usable for games on the XSS, whereas on the XSX both memory pools will be used for games.
The Series S has 10 GB in total. I think 8 GB usable was a given, so maybe they have up to 8.5 now on Series S by disabling unused features if the game doesn't need them. The real question is: if they reduced the OS footprint, maybe they also reduced the OS for Series X, freeing more memory up for that machine also.
I think that - correct me if I'm wrong - OS RAM allocation isn't always 2.5GB on both Series consoles; it's just that the OS can only pull a 2.5GB maximum allocation. So if a game on Series X needs, let's say, 13.8GB, Microsoft "can" allow the extra 0.3GB that comes from the OS allocation if it's really needed, but I think there will be a certain limit on how much allocation can be borrowed from the OS, so it wouldn't crash when running a game that needs the extra memory, to prevent performance hiccups.
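As a toy model of that borrowing idea (the 2.5GB reservation and 0.3GB cap are just the figures floated in this thread, nothing official):

```python
TOTAL_GB = 16.0
OS_RESERVATION_GB = 2.5    # rumored baseline OS reservation
MAX_BORROWABLE_GB = 0.3    # hypothetical cap on reclaimable OS memory

def grant_for_game(requested_gb):
    """Grant up to the baseline game split plus the borrowable slice."""
    baseline = TOTAL_GB - OS_RESERVATION_GB  # 13.5 for games by default
    return min(requested_gb, baseline + MAX_BORROWABLE_GB)

print(round(grant_for_game(13.8), 1))  # 13.8 - fits once memory is borrowed
print(round(grant_for_game(14.5), 1))  # 13.8 - capped at the hard ceiling
```

The key property is the hard ceiling: the OS can hand back a bounded slice, but a game can never starve it completely, which matches the "certain limit" being described.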
Split RAM is fine. There is nothing wrong with split RAM except the added complexity for devs and cost cutting for platform holders. The alternatives are a more expensive system or a system with lower RAM so why not split RAM. That's a good thing.
I don't see why, when a veteran engine dev says the low RAM was giving them trouble and they released Doom Eternal with missing raytracing on Series S because of it, they should be dismissed as ignorant just for mentioning that it's also split RAM - which it is.
I see you edited to ask this. What do you mean by virtual memory? As in SSD space used as RAM space? Yeah, I'm sure SSD space is reserved by the OS and used as a pagefile to move data for less demanding, not-always-needed processes.
I mean in terms of memory allocation and virtual address space though. This basically
https://docs.microsoft.com/en-us/windows-hardware/drivers/gettingstarted/virtual-address-spaces
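A small Python illustration of the virtual-vs-physical distinction that page describes: mapping anonymous virtual address space doesn't immediately consume that much physical RAM, since pages are only backed when touched (behaviour assumed typical of OS overcommit; the sizes are arbitrary):

```python
import mmap

MB = 1024 * 1024
SIZE = 256 * MB  # reserve 256 MB of anonymous virtual address space

region = mmap.mmap(-1, SIZE)  # 256 MB of physical RAM is not committed yet
region[0] = 1                 # touching a byte backs roughly one 4 KB page

print(len(region) // MB)      # 256 - the virtual size, not physical usage
region.close()
```

That gap between what's mapped and what's resident is exactly why carving physical memory into fixed "this chunk is for the OS" blocks misrepresents how the system actually works.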
People are oversimplifying memory management down to physical memory, trying to suggest this chunk of physical memory is for the game or a component (GPU) and that chunk of physical memory is for that, when that's not really how it is. The OS doesn't need a 2GB chunk of physical memory like that, and it is dynamic even for the game, with a lower limit.
If you are trying to suggest VAS doesn't apply to Xbox due to 'coding to the metal' or something weird, you can look at the other listed improvement, which makes it clear:
Already knew this; that's not the point of contention tho. The point of contention is Rich claiming the PS5 OS uses 3.5 GB of the 16 GB of RAM, which doesn't really make any sense, considering most accounts place it at 2 GB.
PS5
- 256-bit memory interface
- 448 GB/s bandwidth
- 3.5 GB for OS
- 12.5 GB for developers
I'm well aware, but it's not the same. If your PC has a 3090 with 24GB VRAM plugged in and a single stick of 2GB of system RAM, it does not have 26GB of RAM. Video memory is not system memory; you're still gonna have all the limitations that having only 2 GB of RAM brings, no matter how much VRAM you have. The 360 didn't suffer from this issue because it had a single pool of memory (it technically did have eDRAM on top of that).
95% of the games' art is still low res, the icons in the UI are also low res, only live text and basic UI elements are 4K
Game art is low res for the same reason the PS4 Pro had sub-4K game art for games that came out before its launch and some games that came after. IME if devs don't update it, it's not gonna change on its own. Idk about the rest; last I heard the actual OS UI was 4K in the same way your Windows UI is 4K if you switch the resolution to 4K.
If this were the case, then why did you state the 360 as having 512MB of RAM, considering some (most) of it would undoubtedly be used as VRAM? As far as I know, the VRAM could be used as system RAM too with some hacks, but at a drastically reduced bandwidth and higher latency.
Can you please share where these accounts are that place it at 2GB?
Cause I'm also finding sources dated 2020 that have it at 12.5GB.
And the only mentions I can find about 2GB memory usage are all labeled as rumors and lead back to GAF threads.
I would take what Richard says above unconfirmed rumored leaks, personally.
Frankly, I wouldn't trust that site you posted more than anything. Surprised Bernd is so quick to call out id devs as illegitimate but accept and thumbs-up some unknown site that says this.
The PC split comparison (DDR4 + GDDR6X) with the PS3 split (XDR + GDDR3) versus Xbox 360 unified (GDDR3) isn't quite so simple, because the PC graphics card lives on the southbridge, with the GDDR6X memory mapped through the DDR4 which lives on the northbridge, whereas in the other two examples the memory is on the northbridge, and that is why the PS3 was able to use the GDDR3 in place of XDR, just with different access characteristics - which I believe were eventually abstracted, if developers wanted, with the SPURS library IIRC.
Even though I would consider Richard to be - I think the new term is - a hype-man, I still expect that number to be true, because of the wear and tear on the non-replaceable system SSD in the PS5 if they weren't buffering the 4K HDR gameplay recording to the GDDR6. What I suspect is different is that the PS5 offers more memory to devs than a straight 16GB-3.5GB, where a 2GB area gets nuked and reloaded every time the OS menu button is pressed, because the high priority modes of the IO complex can allow for the eviction of - largely static - OS or game data and guarantee reloading quicker than what would produce OS or game lag.
Well Richard is saying it uses 1 GB more than Series X, at least that's what was mentioned somewhere on the first page, which is what I have a point of contention with.
Richard has access to the PS5 SDK, so he would know the reservation; no guesswork required. Plus they have access to tons of developers.
So how much more memory did the Series S get?
But anyways, it's just slow memory. As the Series S only has 8GB fast memory.
I still don't understand why Xbox Series didn't go with 20GB at 320bit for Series X and 12GB at 192bit for Series S
The additional costs would be marginal. The difference it makes for ease of use for developers would be tremendous.
That site also lists the Series X at $600 and claims it has a loud fan.
"[SSD] Expansion module replaces interface one."
Because that was just the silly assumption based on the PS4 hdd. But this unknown site knows things, right?
"marginal" as if these specs weren't locked in during a RAM price skyrocket and when you multiply the cost increase by 100M it becomes a bit cost adjustment. Sure MS could afford it, but they still have to budget things out.
and the amount is flexible. You get more memory from disabling system processes not in use so it can be anywhere from like 100 MB to 300 MB it sounds.
Exactly, I doubt DF would just pick a figure out the air as they know how much crap it would make.
Agreed. That's when Pamela looked her best, until Hollywood ran her around the block a few times.
Nah, I think she ran around the block a few times herself. As well as with Tommy Lee's boat and countless others, no doubt. Bottom line is we all get old; some age differently.
Some of my favorite discussions to lurk were the secret sauce threads. While I kinda secretly hope "secret sauce" discussion comes back, I don't think memory allocation optimizations have enough legs to kick that off.
As for xss, is this the so called "secret sauce" xbone fans clamored for last Gen but never got?
The Xbox series S will remain a backward trough that pulls the two older consoles down.
What in this thread is worthy of as many as six pages of discussion?
The usual culprits: people overstating the significance of this update, people understating the significance of this update, and discussion of how the RAM works in the XSS.
"The Xbox series S will remain a backward trough that pulls two powerful consoles down."
That generally just causes a bunch of back-and-forth shit throwing.
Excellent. It's been a fantastic device for me. I'm using it near daily for eFootball and have Halo and Apex installed, as well as all of the old Rare games.
How's eFootball these days? I haven't touched it since "release" and am waiting on the single player modes. Are they out yet?