
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.

Shmunter

Member
Just because they can push lossless assets doesn't mean they should. Actually, they could reduce the triangle count of this sculpt and use textures and normal maps with minimal to zero effect on quality.
Also, AFAIK, the complex topology of these lossless sculpts may not allow for good animation.
Whichever way the tech is implemented, the tantalising aspect is the ability to bring in such detailed assets in real-time as needed. The camera zooms up to the character and this is what we see. It zooms away, and a reduced LOD is swapped in - or whatever future equivalent of LOD tech there is.

It no longer matters if the model in view takes up 90% of the vram, because it can be flushed out and replaced with world models in an instant. Only the render view is what matters. Game sizes and compression ratios notwithstanding.

I've often said, when describing the prospect of mass streaming to people, that gameplay should look as good as, or very close to, cutscenes.
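The streaming flow described above can be sketched as a toy distance-based LOD selector; the tier names and thresholds are entirely hypothetical:

```python
# Toy sketch of distance-based LOD streaming (all names hypothetical).
# The idea: only the mesh tier needed for the current camera distance
# is resident in VRAM; everything else can be flushed and re-streamed.

LOD_TIERS = [
    (5.0,  "hero_8M_tris"),    # close-up: full "lossless" sculpt
    (25.0, "mid_500k_tris"),   # mid-range
    (1e9,  "far_20k_tris"),    # distant: heavily reduced mesh
]

def select_lod(camera_distance: float) -> str:
    """Return the asset tier to stream in for a given camera distance."""
    for max_dist, asset in LOD_TIERS:
        if camera_distance <= max_dist:
            return asset
    return LOD_TIERS[-1][1]

print(select_lod(2.0))    # hero_8M_tris
print(select_lod(100.0))  # far_20k_tris
```

A real engine would layer prefetching and a streaming budget on top, but the core decision is this simple lookup driven by the camera.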
 
Last edited:

Shmunter

Member
Reports of PS5 using around only 1 GB of RAM for the OS, according to Moore's Law is Dead channel: (timestamped)





Thanks to Rusco Da Vino for the catch; it's a pretty lengthy podcast to dive into.

I'm still not convinced by only a 1GB reserve. The DVR video buffer would be the prime memory hog.

But I can hope for some added DDR4 stashed away somewhere just for that purpose. Or even some cheap flash soldered onto the USB bus, like the late Xbox 360 models.
 

Bo_Hazem

Banned
Bo is a hack

:goog_unsure:

If you call someone a hack, you mean they're not great at what they do — especially writing. A mediocre writer is called a hack. Once upon a time hack was short for “an ordinary horse,” and now it's an insult for writers. No one wants to be a hack!


Man, I'm not sure why you call me that, a bad writer and an ordinary horse (hence: pony). I wrote down those notes so one day I can be a famous writer:

b752a53c1c4bc945602f1d55a2ca19b8.jpg
 
Last edited:

bitbydeath

Member
Whichever way the tech is implemented, the tantalising aspect is the ability to bring in such detailed assets in real-time as needed. . The camera zooms up to the character and this is what we see. Zooms away, a reduced LOD is swapped in - or whatever equivalent future tech to LOD there is.

It no longer matters if the model in view takes up 90% of the vram, because it can be flushed out and replaced with world models in an instant. Only the render view is what matters. Game sizes and compression ratios notwithstanding.

I’ve often used when describing to people the prospect of mass streaming, that gameplay should look as good or very close to cutscenes.

Which should mean that advertising designed to impress through cut-scenes should start to go away.
 
Last edited:

Thedtrain

Member
:goog_unsure:

If you call someone a hack, you mean they're not great at what they do — especially writing. A mediocre writer is called a hack. Once upon a time hack was short for “an ordinary horse,” and now it's an insult for writers. No one wants to be a hack!


Man, I'm not sure why you call me that, a bad writer and an ordinary horse (hence: pony). I wrote down those notes so one day I can be a famous writer.

b752a53c1c4bc945602f1d55a2ca19b8.jpg
Those are also the rules of fake news 😛
 

Ptarmiganx2

Member
:goog_unsure:

If you call someone a hack, you mean they're not great at what they do — especially writing. A mediocre writer is called a hack. Once upon a time hack was short for “an ordinary horse,” and now it's an insult for writers. No one wants to be a hack!


Man, I'm not sure why you call me that, a bad writer and an ordinary horse (hence: pony). I wrote down those notes so one day I can be a famous writer:

b752a53c1c4bc945602f1d55a2ca19b8.jpg
I would take it as a compliment...an ordinary horse is no insult if it’s referring to one’s manhood...he might be off topic for a tech discussion but he’s calling you a stud. 😁
 
Last edited:

B_Boss

Member
Reports of PS5 using around only 1 GB of RAM for the OS, according to Moore's Law is Dead channel: (timestamped)





Thanks to Rusco Da Vino for the catch; it's a pretty lengthy podcast to dive into.


That is one of Tom's best shows concerning next-gen, period. I would even encourage folks to make sure they listen to the first 10 or so minutes, because he presents a sort of disclaimer there; it's great.
 

Sinthor

Gold Member
Really really high quality assets which is where the SSD comes into play.

Didn't Microsoft make a similar claim?

We're not seriously parsing Mark Cerny's words to mean that 8K resolution and output is NOT supported by the PS5, are we? He clearly stated that 8K graphics (namely resolution) are supported, but that most TVs don't output that, and they were using a 4K TV.

Still, this generation will be the 4K generation, period. Most everything will be native 4K @ 30fps, some at 60fps. There may be a few titles at 8K, but they will be few and far between. Figure it will be like how the current generation was the first to fully adopt 1080p as a standard and then dipped its toes into 4K. Even after the mid-gen refresh, native 4K wasn't really standard and stable at 30fps or more. That will be 8K for this coming generation. Probably the generation AFTER this coming one will be where 8K is standard, assuming graphics processing power continues to evolve at the current pace.

Or we could be using quantum processors by then and have everything we know today fly out the window!
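For scale, the resolution steps being discussed are easy to quantify: each jump (1080p to 4K, 4K to 8K) quadruples the pixel count:

```python
# Pixel counts for the common resolutions discussed in the thread.
# Each generation step quadruples the pixels: 8K = 4x 4K = 16x 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels")

ratio_8k_vs_4k = (resolutions["8K"][0] * resolutions["8K"][1]) / \
                 (resolutions["4K"][0] * resolutions["4K"][1])
print(ratio_8k_vs_4k)  # 4.0
```

That 4x factor per step is why native 8K is such a heavy ask compared to the 4K transition.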
 

Sinthor

Gold Member
That looks current gen to me.
C55MlJpWUAAouMO


CmP5u6vWYAAXj20

Well, to be fair... I think a lot of what we keep hearing about diminishing returns on graphical fidelity will be what we see in NORMAL cases. Games like this one, on a non-4K monitor, may not be easily distinguishable. Maybe more effects to be seen? I think the real difference this generation is native 4K at a rock-solid 30+fps in games. Just my opinion though, we shall see. That's why I think the major thing average consumers may notice is NOT the flashy teraflops number but instead one box or the other "instantly" turning on and/or loading games. That may be a lot easier difference for people to tell than the 15 or so percent difference in processing power. After all, with PS4 and Xbox One the difference was at least 40%... and it took Digital Foundry videos to "prove" that the PS4 was displaying higher resolution.

I'm hoping we'll start to see some answers here in the next week or so! The speculation and the waiting is worse with this damn quarantine still dragging on!
 

Bo_Hazem

Banned
I was thinking about the next-gen consoles' abilities and it just clicked how perfect Silent Hill's world would be on these consoles.
We know Homecoming did the famous world change from the movie
giphy.gif
And others have done the transformation of worlds
But I think these new consoles will blow us away with how much more impressive those transformations will be, and on what scale.
What do you guys think?

I'm not a fan of horror games; I only finished Resident Evil 2 and got stuck at the piano puzzle in Silent Hill 1 due to my limited English comprehension back on PS1. RE2 was the source of most of my nightmares going forward:messenger_fearful: especially the lickers and Mr. X.

3e28de086d31d9d2f9e67362df08372f.jpg


We are entering a new generational leap in terms of graphics and game design. I'm more than ready for a shock, a good one, early June.
 
Last edited:

ToadMan

Member
Great questions, and it's time to give this matter more attention. Let's forget 5.5GB/s vs 2.4GB/s, or even the 7GB/s max of PCIe 4.0, for a moment:

Current NVMe m.2 uses 4 channels vs 12 channels on PS5:





While the NVMe m.2 has 2 priority levels:

0Yd5f.jpg


PS5's custom SSD has 6 to make 6 different orders:

infrastructure-660.jpg


And while PS5's SSD uses 12 channels to fully reach its optimal 5.5GB/s raw, the XSX SSD cramps into 4 channels.

More educated people in the matter may contribute or correct me here.

Both the major points you make here are incorrect - 12 vs 4 (see edit) and 6 priority levels vs 2.

Sony's SSD is 12 channels, not 12 lanes - this is not the same thing and these terms are not interchangeable. Sony's SSD has 4 Lanes and in that regard is identical to the Xsex. This is made clear in the very timestamp from Cerny's presentation if you'd watched about 10 seconds of it.

We don't know how many channels the xsex SSD has but I'd assume 8 since that is the current high end SSD standard. Channels are the buses used to access NAND and they are within the SSD assembly. Again Cerny's presentation makes this clear.

Which brings us on to priority. Yes, Sony has 6 levels and MS presumably just uses 2, but your analogy of road lanes isn't what the priority levels mean.

MS and Sony have identical numbers of lanes - Sony just categorises its traffic with more levels. That has nothing to do with the amount of data and everything to do with how quickly that data can jump the queue and get on to the NVME bus.

EDIT: So the original post referred to Channels which I mistakenly interpreted as the OP confusing PCIe lanes with SSD channels. Actually the OP really did mean 12 vs 4 channels for the xsx SSD. It may be accurate (that the xsx has 4 channel NAND architecture) but is based on inferring the specs of xsx - not known data at this stage
 
Last edited:

SgtCaffran

Member
We don't know how many channels the xsex SSD has but I'd assume 8 since that is the current high end SSD standard. Channels are the buses used to access NAND and they are within the SSD assembly. Again Cerny's presentation makes this clear.
Xbox probably only uses 4 channels, if the leaked Phison controller info is true. On top of that, they use extra NAND chips, so PS5 has 1 chip per channel and Xbox has 4 chips per channel.

I would really like to hear an expert opinion on how this could influence speeds but from what I have read we can expect improvements in random IOPS.
 

Bo_Hazem

Banned
Both the major points you make here are incorrect - 12 vs 4 and 6 priority levels vs 2.

Sony's SSD is 12 channels, not 12 lanes - this is not the same thing and these terms are not interchangeable. Sony's SSD has 4 Lanes and in that regard is identical to the Xsex. This is made clear in the very timestamp from Cerny's presentation if you'd watched about 10 seconds of it.

We don't know how many channels the xsex SSD has but I'd assume 8 since that is the current high end SSD standard. Channels are the buses used to access NAND and they are within the SSD assembly. Again Cerny's presentation makes this clear.

Which brings us on to priority. Yes, Sony has 6 levels and MS presumably just uses 2, but your analogy of road lanes isn't what the priority levels mean.

MS and Sony have identical numbers of lanes - Sony just categorises its traffic with more levels. That has nothing to do with the amount of data and everything to do with how quickly that data can jump the queue and get on to the NVME bus.

Great, thanks for your reply. But in that case, if priority levels aren't advantageous, why was Mark Cerny concerned that a top-end 7GB/s NVMe M.2 drive might not keep up with the internal one due to its 2 priority levels, and would need extra work from the I/O unit to compensate? As discussed previously, those lanes are good for up to 8GB/s, 2GB/s each, and I didn't talk about them at all, but thanks for bringing them up. And shouldn't having 12 vs. 8 channels give it better performance, given that Mark Cerny said 8 aren't enough, and that he didn't want to add more than 12 because of cost?

So far I'm not convinced, as I will follow what Mark says here, as a veteran expert, with all due respect. Maybe go back and watch all the timestamps? He actually didn't talk about 4 lanes, but feel free to timestamp where exactly he said so.

I hope you're serious and not trolling.
 
Last edited:

Sinthor

Gold Member
Both the major points you make here are incorrect - 12 vs 4 and 6 priority levels vs 2.

Sony's SSD is 12 channels, not 12 lanes - this is not the same thing and these terms are not interchangeable. Sony's SSD has 4 Lanes and in that regard is identical to the Xsex. This is made clear in the very timestamp from Cerny's presentation if you'd watched about 10 seconds of it.

We don't know how many channels the xsex SSD has but I'd assume 8 since that is the current high end SSD standard. Channels are the buses used to access NAND and they are within the SSD assembly. Again Cerny's presentation makes this clear.

Which brings us on to priority. Yes, Sony has 6 levels and MS presumably just uses 2, but your analogy of road lanes isn't what the priority levels mean.

MS and Sony have identical numbers of lanes - Sony just categorises its traffic with more levels. That has nothing to do with the amount of data and everything to do with how quickly that data can jump the queue and get on to the NVME bus.

Wow. I am confused where the increased performance comes in then. Just four more channels and maybe a few more levels of priority can mean such a performance differential? I'm probably being dense because it's late but....I'm not understanding this. I THOUGHT I had a handle on it all...till this post. :)
 
Wow. I am confused where the increased performance comes in then. Just four more channels and maybe a few more levels of priority can mean such a performance differential? I'm probably being dense because it's late but....I'm not understanding this. I THOUGHT I had a handle on it all...till this post. :)
I am missing something too.

Is it normal that a "high end SSD" has 2.4GB/s of raw speed?

Edit: lol, I just checked it out.

A high-end SSD has around 5GB/s of raw speed right now, double what the Series X can deliver.
 
Last edited:

Bo_Hazem

Banned
Xbox probably only uses 4 channels, if the leaked Phison controller info is true. On top of that, they use extra NAND chips, so PS5 has 1 chip per channel and Xbox has 4 chips per channel.

I would really like to hear an expert opinion on how this could influence speeds but from what I have read we can expect improvements in random IOPS.

I understand that each of those 12 modules is 68.75GB in size; having 12 channels is probably helpful as data can very possibly be scattered across more than 1 module. Is that correct?

What Cerny was talking about with the priority levels is sending orders in parallel; it sounded pretty clear that having only 2 priority levels while streaming loads of textures could make sound/lip synchronization lag or stutter.
 

Bo_Hazem

Banned
I am missing something too.

Is it normal that a "high end SSD" has 2.4GB/s of raw speed?

My 1TB NVMe PCIe 3.0 SSD (Samsung 970 Pro) does 3.5GB/s over 8 channels, and that's not even 4.0. So 2.4GB/s is definitely not high end by any stretch.

Anyway, about priority levels from EuroGamer:

The controller itself hooks up to the main processor via a four-lane PCI Express 4.0 interconnect, and contains a number of bespoke hardware blocks designed to eliminate SSD bottlenecks. The system has six priority levels, meaning that developers can literally prioritise the delivery of data according to the game's needs.

And the 12-channels:

Delivering two orders of magnitude improvement in performance required a lot of custom hardware to seamlessly marry the SSD to the main processor. A custom flash controller marries up to the SSD modules via a 12-channel interface, delivering the required 5.5GB/s of performance with a total of 825GB of storage. This may sound like a strange choice for storage size when considering that consumer SSDs offer 512GB, 1TB or more of capacity, but Sony's solution is proprietary, 825GB is the most optimal match for the 12-channel interface and there are other advantages too. In short, Sony had more freedom to adapt its design: "We can look at the available NAND flash parts and construct something with optimal price performance. Someone constructing an M.2 drive presumably does not have that freedom, it would be difficult to market and sell if it were not one of those standard sizes," Mark Cerny says.
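The Eurogamer figures above divide out neatly: 825GB across 12 channels is exactly 68.75GB per channel, and the rated 5.5GB/s works out to roughly 0.46GB/s per channel. A quick arithmetic check:

```python
# Sanity-checking the Eurogamer numbers: both the NAND capacity and the
# rated raw bandwidth divide evenly across the 12 channels.
total_capacity_gb = 825
raw_bandwidth_gbps = 5.5
channels = 12

per_channel_capacity = total_capacity_gb / channels    # GB per channel
per_channel_bandwidth = raw_bandwidth_gbps / channels  # GB/s per channel

print(per_channel_capacity)              # 68.75
print(round(per_channel_bandwidth, 3))   # 0.458
```

Which is presumably why 825GB looks odd next to consumer 512GB/1TB drives: it is simply 12 equal NAND modules rather than a power-of-two capacity.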

 
Last edited:

ToadMan

Member
Wow. I am confused where the increased performance comes in then. Just four more channels and maybe a few more levels of priority can mean such a performance differential? I'm probably being dense because it's late but....I'm not understanding this. I THOUGHT I had a handle on it all...till this post. :)

The NVMe 4.0 spec is capable of supporting up to 7GB/s transfer rates, so there's no limit being hit using 4 lanes for either Sony or MS.

For the NAND storage, Sony and MS will be buying commercially available NAND from whichever memory partners they're working with. Speed improvements will come from how they've specced the NAND/SSD they're using, coupled with what they can do with custom controllers and firmware. I'm assuming both companies are using a commercially available SSD as the basis for their solution. MS just chose one and put it in their system with little customisation; Sony chose one and took it apart to install their own channel architecture and controller to boost the access times.

What happens "inside" the SSD - that's between the controller and the NANDs - is where the speed gain is made. 12 vs 8 channels is one way to boost access times.

Then there's the type of NAND being used - SLC, MLC, TLC. SLC is fastest, with access times less than half those of TLC, and it's more resistant to high thermal output. SLC is also more expensive, and perhaps beyond Sony's budget. MLC is about a third faster than TLC and not so expensive. Maybe Sony went that way and MS used TLC NAND - this plus 12 channels would account for the latency difference.

It could be Sony are just brute forcing it - they went with TLC and 12 channels, then bumped the clock rates on the SSD and installed heavy cooling so it can run at those speeds. This would map onto the issues around CPU/GPU power consumption, variable clock speeds and allocating power dynamically. I don't think Sony has done that, but until we see the PS5 we won't know for sure.
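As a rough illustration of the channel discussion above: aggregate throughput scales roughly linearly with channel count until some other limit (NAND speed, controller, host link) is hit. A toy model with a made-up per-channel figure, chosen so that 12 channels lands near 5.5GB/s:

```python
# Toy model: aggregate SSD read bandwidth vs. channel count.
# The per-channel NAND rate below is a made-up illustrative figure, and
# the PCIe 4.0 x4 link (~7 GB/s practical) caps the total.

PER_CHANNEL_GBPS = 0.46    # hypothetical per-channel NAND read rate
PCIE4_X4_CAP_GBPS = 7.0    # approximate practical NVMe 4.0 x4 ceiling

def aggregate_bandwidth(channels: int) -> float:
    """Aggregate read bandwidth, capped by the host link."""
    return min(channels * PER_CHANNEL_GBPS, PCIE4_X4_CAP_GBPS)

for n in (4, 8, 12):
    print(f"{n:2d} channels -> ~{aggregate_bandwidth(n):.2f} GB/s")
```

Under these assumed numbers, 4 channels lands under 2GB/s while 12 lands near 5.5GB/s, which is roughly the shape of the PS5-vs-XSX comparison being debated (actual NAND speeds per channel are unknown).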
 
D

Deleted member 775630

Unconfirmed Member
Also, I will say, without any basis, that I expect between 1-2 GB for the OS.
I think it's more likely to be 2 than 1. They've put some new tech in the controller which is also part of the OS (mic for voice commands, and stuff like that). At the same time their SSD is so fast they might not need that much ram to feed the GPU, so they might allocate more to the OS.
 

ToadMan

Member
Great, thanks for your reply. But in that case, if priority levels aren't advantageous, why was Mark Cerny concerned that a top-end 7GB/s NVMe M.2 drive might not keep up with the internal one due to its 2 priority levels, and would need extra work from the I/O unit to compensate? As discussed previously, those lanes are good for up to 8GB/s, 2GB/s each, and I didn't talk about them at all, but thanks for bringing them up. And shouldn't having 12 vs. 8 channels give it better performance, given that Mark Cerny said 8 aren't enough, and that he didn't want to add more than 12 because of cost?

So far, I'm not convinced as I will follow what Mark says here as a veteran expert, with all due respect. Maybe go back and watch all the timestamps? He actually didn't talk about 4 lanes, but feel free to timestamp where exactly he said so.

I hope you're serious and not trolling.

Priority isn't about data rate - it's about deciding the order that data gets on the bus. Higher-priority data will cut ahead of lower-priority data, but the data rate doesn't change.


And going back to your original post I see I may have misunderstood - you really were comparing 12 channels to 4 channels. Except we don't know how many channels the MS SSD is using, so that would be an assumption on your part. Perhaps you have a source for this assumption....
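The point that priority only reorders traffic, without adding bandwidth, can be sketched with a priority queue: requests drain one at a time in priority order, but the drain rate itself never changes. All names here are illustrative:

```python
import heapq

# Sketch: priority levels only reorder requests on the bus; they don't
# add bandwidth. Lower number = higher priority (e.g. 0 might be audio
# that must not stutter, 5 bulk texture streaming).

queue = []
for prio, name in [(5, "texture_chunk_A"), (0, "audio_buffer"),
                   (5, "texture_chunk_B"), (2, "animation_data")]:
    heapq.heappush(queue, (prio, name))

# Requests leave the queue strictly in priority order, one at a time:
order = [heapq.heappop(queue)[1] for _ in range(len(queue))]
print(order)
# ['audio_buffer', 'animation_data', 'texture_chunk_A', 'texture_chunk_B']
```

With only 2 priority levels, audio and animation would share a tier with other traffic; more levels let latency-critical data cut the queue more finely, which matches Cerny's stutter argument without changing the 5.5GB/s figure.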
 
Last edited:

Bo_Hazem

Banned
Priority isn't about data rate - it's about deciding the order that data gets on the bus. Higher-priority data will cut ahead of lower-priority data, but the data rate doesn't change.


And going back to your original post I see I may have misunderstood - you really were comparing 12 channels to 4 channels. Except we don't know how many channels the MS SSD is using, so that would be an assumption on your part. Perhaps you have a source for this assumption....

Microsoft hasn't been transparent in that regard, but the leaked Phison controller suggested for Xbox Scarlett (XSX) and its low-end-tier speed for PCIe 4.0 both point to 4 channels so far, until Microsoft confirms the SSD specs. Don't pay much attention to that 7.5GB/s, as it was only wishful thinking:



PS5019-E19


To build M.2 SSDs supporting a PCIe 4.0 x4 interface, new controllers are (obviously) needed, and right now the only company that has them ready is Phison. In fact, the company is readying two such controllers: the PS5016-E16 for high-end drives, as well as the PS5019-E19 for mainstream drives. The E16 has eight 800 MT/s NAND channels for ultimate parallelism and sequential read performance of up to 5 GB/s (write speeds depend on the actual chips/SSD capacity, in the best case scenario it is said to reach up to 4.4 GB/s), whereas the E19 has four NAND channels to minimize the die size and cost.



It's also DRAM-less, and this one is cheap and has a shorter lifespan compared to regular SSDs. For more details:





I didn't say priority levels are data rate, but that they are orders critical to game design, to avoid lag and awkward moments in the game. Listen to Mark Cerny so you can understand why it's crucial for gaming to have more than 2 priorities: (timestamped)

 
Last edited:

ToadMan

Member
Microsoft hasn't been transparent in that regard, but the leaked Phison controller suggested for Xbox Scarlett (XSX) and its low-end-tier speed for PCIe 4.0 both point to 4 channels so far, until Microsoft confirms the SSD specs. Don't pay much attention to that 7.5GB/s, as it was only wishful thinking:




I didn't say priority levels are data rate, but that they are orders critical to game design, to avoid lag and awkward moments in the game. Listen to Mark Cerny so you can understand why it's crucial for gaming to have more than 2 priorities: (timestamped)



Ah, OK, so you're taking a LinkedIn profile as tying the E19 spec to Scarlett. That's quite a leap - I mean, this could've been prototyping, a tech demo, an initial run for dev kits, or Phison demonstrating their solution to MS to get an order - but OK, let's see what MS put in their box.


For the priorities, you used a highway analogy. That is the wrong analogy altogether to explain the priority mechanism.
 
Last edited:

Bo_Hazem

Banned
Ah, OK, so you're taking a LinkedIn profile as tying the E19 spec to Scarlett. That's quite a leap - I mean, this could've been prototyping, a tech demo, an initial run for dev kits, or Phison demonstrating their solution to MS to get an order - but OK, let's see what MS put in their box.


For the priorities, you used a highway analogy. That is the wrong analogy altogether to explain the priority mechanism.

They've introduced two, one for high end that's already doing 5GB/s (E16) here:

812z%2BZt7CyL._AC_SL1500_.jpg



So you're only left with one, the E19, unless you entertain the brain-gymnastics idea that Microsoft took the E16 and simply cut its speed by more than half just for fun.

About that street-lanes example for priority levels, let's scrap it if it bothers you. It doesn't need any further explanation at this point, as it has been wonderfully explained and simplified by Mark Cerny.
 
Last edited:
I nearly bought a Samsung Q900R and almost went broke. Thankfully, I stopped. 8K TVs are insane, but there isn't much video content supporting them yet. I'm planning to get one when microLED hits the market. Although I have HDR 4K now, I'll get Sony's new XH90 for 4K@120Hz, VRR and ALLM. Not sure about the size, between 65-85" (65" is the most sensible so far), depending on how stunned I am when I sit right in front of it, as I go crazy near new electronics. I sit ~2 meters away when gaming and on PC now:lollipop_grinning_sweat:. I fell in love with 85" when I saw a Sony 85" that stood nearly up to my chest standing on the ground:lollipop_smiling_hearts:!
Yeah, same here. I'm waiting for microLED to be widespread on the market and for prices to get in check before making the jump to 8K. Besides, I've got a C9, so it's not like I want to part ways with it any time soon.
 
I found something interesting.

Priority levels are part of the host-interface protocol (e.g. NVMe) and not the SSD hardware.
That means it is software logic on the host.

So any SSD put in the PS5 will work with six priority levels, while motherboards using standard NVMe will have only two.

T5aBY5e.jpg


That basically explains it... the six priority levels are the custom software implemented by Sony in the PS5's SSD interface protocol, probably inside the custom I/O processor.

All makes sense now.

This directly contradicts what Cerny was saying, and his reason for needing 7GB/s instead of just 5.5GB/s for the m.2 slot.

I bet the controller built into an NVMe SSD uses NVMe protocol natively, and it’s not just down to the motherboard.
 

THE:MILKMAN

Member
This directly contradicts what Cerny was saying, and his reason for needing 7GB/s instead of just 5.5GB/s for the m.2 slot.

I bet the controller built into an NVMe SSD uses NVMe protocol natively, and it’s not just down to the motherboard.

Mark Cerny didn't say it needed to be 7GB/s though? He just laid out where commercial M.2 NVMe drives are today (4-5GB/s) and where they'll be by year's end (7GB/s - saturating PCIe 4.0).

He then went on to say that any drive ratified for use in the PS5 will require a 'little more speed' so that the custom I/O unit can arbitrate the extra priority levels. What that amounts to, we'll have to wait and find out. It might turn out a 7GB/s drive will be needed in the end, but he didn't actually say that here.

That was my understanding FWIW.
 
Mark Cerny didn't say it needed to be 7GB/s though? He just laid out where commercial M.2 NVMe drives are today (4-5GB/s) and where they'll be by year's end (7GB/s - saturating PCIe 4.0).

He then went on to say that any drive ratified for use in the PS5 will require a 'little more speed' so that the custom I/O unit can arbitrate the extra priority levels. What that amounts to, we'll have to wait and find out. It might turn out a 7GB/s drive will be needed in the end, but he didn't actually say that here.

That was my understanding FWIW.

After re-watching, I agree your summary is correct. His point seems to be that commercial M.2 PCIe 4.0 drives should eventually be up to the task.

But he also said these commercial drives will need extra speed precisely because the architecture is different (with 6 vs. 2 queues as an example). This implies that a commercial M.2 NVMe drive won't change architecture just because it's now in a PS5 motherboard, as was implied by the post I replied to.
 

Hyphen

Member
Here:






What it means is that at 4K@60Hz you're pretty much fine, but at 4K@120Hz you're not getting full color depth, which is critical for HDR at least. So instead of chroma 4:4:4 you'd get 4:2:0, I think. It might be just the CX, as you've noted there, but I heard in one video that it was also the case before that, and that could be wrong. If you could do me a favor and find out more about it (C9), I would edit my post to avoid misinformation.

Regards.


Oh, please don't get me wrong, I hope you didn't think I was trying to claim you were spreading misinformation. I'm just curious about the capabilities of the LG OLED models, on the off chance that a console game runs at 4K@120Hz.

Ok, so the furore surrounding the downgrade of bandwidth (48 down to 40) is only applicable to the 2020 OLED models. Those from 2019 have the full 48Gbps from their HDMI 2.1 ports.

However, disregarding bandwidth and color depth, both 2019 and 2020 LG OLED TVs can do 4K@120Hz. The difference is that the 2019 models can only do it when paired with another HDMI 2.1 device, whereas the 2020 models can do it over HDMI 2.0 -

--TIMESTAMPED--



"The big difference for 2020 is that we support up to 4K 120p. So the VRR range is from 40 to 120. Whereas in the 2019 TVs it was limited to 60p at 4K resolutions. That was actually an HDMI limitation. So once we start to see HDMI 2.1 graphics cards coming out, we'll be able to still utilise up to 4K (between 40 and 120) on the 2019 models as well. But now it's possible with HDMI 2.0 on the 2020 models."

Neil Robinson, LG Electronics
(Senior Director Strategic Projects)
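The bandwidth figures in this discussion can be sanity-checked with quick arithmetic: the active-pixel payload for 4K@120Hz at 10-bit 4:4:4 is already around 30Gbps before blanking intervals and link-encoding overhead, which is why tight port budgets force trade-offs like 4:2:0 chroma or lower bit depth. A rough calculator (payload only; real HDMI link requirements are somewhat higher):

```python
# Rough HDMI bandwidth sanity check: active-pixel video payload for a
# given mode. Blanking intervals and link encoding add overhead on top,
# so the real link requirement exceeds these figures.

def payload_gbps(width, height, fps, bits_per_channel, channels=3):
    """Video payload in Gbps for active pixels only (no blanking/encoding)."""
    bits_per_second = width * height * fps * bits_per_channel * channels
    return bits_per_second / 1e9

print(round(payload_gbps(3840, 2160, 120, 10), 1))  # 4K@120 10-bit 4:4:4 -> 29.9
print(round(payload_gbps(3840, 2160, 120, 12), 1))  # 4K@120 12-bit 4:4:4 -> 35.8
print(round(payload_gbps(3840, 2160, 60, 10), 1))   # 4K@60  10-bit 4:4:4 -> 14.9
```

The gap between the 10-bit and 12-bit payloads shows why a 40Gbps port versus a 48Gbps port matters mostly at the very top end of the format table.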
 

Bo_Hazem

Banned
Oh, please don't get me wrong, I hope you didn't think I was trying to claim you were spreading misinformation. I'm just curious about the capabilities of the LG OLED models, on the off chance that a console game runs at 4K@120Hz.

Ok, so the furore surrounding the downgrade of bandwidth (48 down to 40) is only applicable to the 2020 OLED models. Those from 2019 have the full 48Gbps from their HDMI 2.1 ports.

However, disregarding bandwidth and color depth, both 2019 and 2020 LG OLED TVs can do 4K@120Hz. The difference is that the 2019 models can only do it when paired with another HDMI 2.1 device, whereas the 2020 models can do it over HDMI 2.0 -

--TIMESTAMPED--



"The big difference for 2020 is that we support up to 4K 120p. So the VRR range is from 40 to 120. Whereas in the 2019 TVs it was limited to 60p at 4K resolutions. That was actually an HDMI limitation. So once we start to see HDMI 2.1 graphics cards coming out, we'll be able to still utilise up to 4K (between 40 and 120) on the 2019 models as well. But now it's possible with HDMI 2.0 on the 2020 models."

Neil Robinson, LG Electronics
(Senior Director Strategic Projects)


Not accusing you of anything, my friend. I was actually more concerned about people taking misinformation from me, as that can be bad for someone's reputation, even for an amateur like me. Wonderful details there, very appreciated! Thanks a lot 🙌
 

THE:MILKMAN

Member
Oh, please don't get me wrong, I hope you didn't think I was trying to claim you were spreading misinformation. I'm just curious about the capabilities of the LG OLED models, on the off chance that a console game runs at 4K@120Hz.

Ok, so the furore surrounding the downgrade of bandwidth (48 down to 40) is only applicable to the 2020 OLED models. Those from 2019 have the full 48Gbps from their HDMI 2.1 ports.

However, disregarding bandwidth and color depth, both 2019 and 2020 LG OLED TVs can do 4K@120Hz. The difference is that the 2019 models can only do it when paired with another HDMI 2.1 device, whereas the 2020 models can do it over HDMI 2.0 -

--TIMESTAMPED--



"The big difference for 2020 is that we support up to 4K 120p. So the VRR range is from 40 to 120. Whereas in the 2019 TVs it was limited to 60p at 4K resolutions. That was actually an HDMI limitation. So once we start to see HDMI 2.1 graphics cards coming out, we'll be able to still utilise up to 4K (between 40 and 120) on the 2019 models as well. But now it's possible with HDMI 2.0 on the 2020 models."

Neil Robinson, LG Electronics
(Senior Director Strategic Projects)


I'll tell you, a pet hate of mine is how much of a mess the HDMI "standard" has become. Manufacturers now just arbitrarily include (or exclude) features.

Worrying about HDMI is right at the bottom of my list for my TV, and certainly for next-gen consoles. No developers are going to put much effort into this given the mess it is in right now, IMO.
 
Whichever way the tech is implemented, the tantalising aspect is the ability to bring in such detailed assets in real-time as needed. The camera zooms up to the character and this is what we see. It zooms away, and a reduced LOD is swapped in - or whatever future equivalent of LOD tech there is.

It no longer matters if the model in view takes up 90% of the vram, because it can be flushed out and replaced with world models in an instant. Only the render view is what matters. Game sizes and compression ratios notwithstanding.

I've often said, when describing the prospect of mass streaming to people, that gameplay should look as good as, or very close to, cutscenes.
I agree. What I mean is that developers can optimize for performance without dumbing down quality, so that games can perform well without dips to unplayable framerates.
I still believe that characters will be rendered in the traditional way, but with a much higher poly count (if we take the UE5 demo as a guide).
 

PaintTinJr

Member
I bought an external SSD because I didn't want to spend double on 1TB, so I just keep some games I play on there. Also, a comparison video I had watched showed that a USB SSD is faster than the internal drive. I know it sounds illogical, but there were benchmarks. And finally, when the time comes, I'll just switch the drive to PS5 and move my games over with it.

If I'd known the heat difference was so huge, I think I'd have gone for an internal one, but now it's too late. I have 3 other drives in my PC but they're also smaller.
You can (in 99% of cases) still use the existing drive if it isn't on its death bed. It just needs (with saves backed up first) to be formatted outside the PS4 - on a PC, where you remove the partitions and maybe run a full format for the first ~100GB before cancelling and doing a quick format. When the drive is empty, the PS4/Pro requires you to do a full firmware install from USB. This is probably the real part that fixes it: a full refresh, versus incremental updates, is probably the means by which the system's gremlins are removed. Nice clean-slate file database too.

If you have the technical skills and the time, it is well worth a go IMHO.
 
The Xbox Series X could secretly have the best-quality NAND chips connected over 16 channels with a premium flash controller, but what would all of that matter if it's only getting 2.4GB/s sequential read speed?
Isn't it more likely, and doesn't it actually make sense, that it's not using all of that?

Microsoft calling their SSD "high-end" is probably accurate once you factor in SATA SSDs. For an NVMe drive it's pretty average/budget, as far as what the sum of its parts ends up delivering when all weighed up.
 