Next-Gen PS5 & XSX |OT| Console tEch threaD


marvifrom

Member
You've got me lost, or I'm looking at it wrong.
I thought it would have to come after RDNA 1 to be based on it:
RDNA1 = GFX10_1
RDNA2 = GFX10_3
This makes it look like Oberon is a prototype. Or maybe "Lite" in general means prototype.
[image attachments: 2VHH1h6.png, XSX-10.jpg]




I don't think we have info about the PS5...? The XSX has 5MB of L2 and RDNA1 has 4MB. That's my understanding.

Then the PS5 would have 5MB of L2 if that's a requirement for RDNA2, but I think it's based more on the CU count than anything.

Honestly, I think the lower clocks are down to the limitations of the cooling solution more than anything.
 

ethomaz

Banned
RDNA L2 shared cache.

“The L2 cache is shared across the whole chip and physically partitioned into multiple slices. Four slices of the L2 cache are associated with each 64-bit memory controller to absorb and reduce traffic. The cache is 16-way set-associative and has been enhanced with larger 128-byte cache lines to match the typical wave32 memory request. The slices are flexible and can be configured with 64KB-512KB, depending on the particular product. In the RX 5700 XT, each slice is 256KB and the total capacity is 4MB.”


So depending on the configuration, that's anywhere from 256KB (a single 64-bit controller with 64KB slices) up to 8MB (four controllers with 512KB slices).
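A quick back-of-the-envelope check of that math; a minimal sketch, assuming four L2 slices per 64-bit memory controller as the whitepaper describes, and a 256-bit bus on the 5700 XT:

```python
# Sanity-check the RDNA whitepaper's L2 numbers.
# Assumption: 4 L2 slices per 64-bit memory controller.
def l2_total_kb(bus_width_bits, slice_kb):
    controllers = bus_width_bits // 64
    return controllers * 4 * slice_kb

print(l2_total_kb(256, 256))  # RX 5700 XT: 16 x 256KB = 4096KB (4MB)
print(l2_total_kb(256, 512))  # biggest slices on a 256-bit bus: 8MB
print(l2_total_kb(64, 64))    # smallest config: 256KB
```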
 

yewles1

Member
The spec listing is in the Apple macOS beta drivers:

/System/Library/Extensions/AMDRadeonX6000HWServices.kext/Contents/PlugIns/AMDRadeonX6000HWLibs.kext/Contents/MacOS/AMDRadeonX6000HWLibs.

The L2 sizes are estimates based on that data.

That's all we've got, and all the tech YouTubers may also be using the Apple macOS data for their "sources" lol :messenger_beaming:.
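For anyone who wants to dig through that driver themselves, here's a minimal sketch of the usual approach: pull printable-ASCII strings out of the binary, the way the `strings` utility does. The path is the one quoted above, and the "gfx"/"navi" filter is just my guess at useful keywords:

```python
import re

# Scan the macOS AMD driver binary for runs of printable ASCII
# and keep the GPU-looking ones.
path = ("/System/Library/Extensions/AMDRadeonX6000HWServices.kext/Contents/"
        "PlugIns/AMDRadeonX6000HWLibs.kext/Contents/MacOS/AMDRadeonX6000HWLibs")

with open(path, "rb") as f:
    data = f.read()

for m in re.finditer(rb"[\x20-\x7e]{6,}", data):  # 6+ printable chars
    s = m.group().decode("ascii")
    if "gfx" in s.lower() or "navi" in s.lower():
        print(s)
```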

Note AMD have patented Infinity Cache, so it is coming together.

So it's real information, but is it correct and up to date? Who knows, but it's a decent source at least.

It is a speculation thread, and we use the credible info we have.
I have an idea but lack the resources to experiment: is there a way to compare the transistor count of an RDNA1 CU to a GCN 1.2 CU?
 

THE:MILKMAN

Member
That is not correct. RDNA can have from 512KB to 8MB of shared L2 cache.

The RX 5700 and RX 5700 XT have 4MB.

Not sure I follow? The 4MB L2 was referring to the 5700/5700 XT. What RDNA1 cards beyond those have >4MB of L2? Maybe you're right that 8MB is possible, but there are no RDNA1 cards with such a config, right?
 

ethomaz

Banned
Not sure I follow? The 4MB L2 was referring to the 5700/5700 XT. What RDNA1 cards beyond those have >4MB of L2? Maybe you're right that 8MB is possible, but there are no RDNA1 cards with such a config, right?
I posted the RDNA doc... that is exactly what it says: RDNA can have up to 8MB of shared L2 cache.

4MB is a "particular product" choice, in AMD's own words.
 

Bo_Hazem

Banned
8K, in my opinion: a purist cinemagoer will ultimately want at least 16K (digital) to match film grain detail, but 8K is needed now, as assets are 8K and above, texture maps included, so it's relevant. I think poking fun at 1080p usage within the system's UI, in this day and age, is totally acceptable; anything else is damage control.

That being said, AI is getting better at upscaling and adding pixels. It's been in use for years, Philips Pixel Plus TVs being the first, not merely sharpening the picture but upping the pixel count; the better the pixel density, the better the detail. Are you a purist, or a boxist? :)

Don't know what you mean by purist and boxist. But scientifically, 16K is near the limit of what someone can perceive on traditional TV sizes. People who think 8K is a gimmick "after" seeing it in person just don't have sharp vision to begin with.

Sony is making the Xperia 1 II smartphone with a 4K screen, and has been doing so for some 6-7 years now, and yes, you can see the upgrade over 1080p even on that tiny 6.5" screen.

I remember when I went from the OG PS4 to the PS4 Pro in 2016, I was so happy looking at the 4K UI; it's simply so much cleaner and sharper. A 1080p UI in 2020 is just unacceptable.
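Where that perceptual limit actually falls depends on screen size and seating distance. A rough sketch, assuming 20/20 vision resolves about 60 pixels per degree (one pixel per arcminute); the 65"/8-foot setup is just an example:

```python
import math

# Pixels-per-degree for a 16:9 screen, given horizontal resolution,
# diagonal size (inches) and viewing distance (inches).
def pixels_per_degree(h_pixels, diag_in, dist_in, aspect=16 / 9):
    width = diag_in * aspect / math.hypot(aspect, 1)
    fov_deg = 2 * math.degrees(math.atan(width / (2 * dist_in)))
    return h_pixels / fov_deg

# 65" TV viewed from 8 feet; ~60 ppd is the 20/20 threshold.
for name, px in [("1080p", 1920), ("4K", 3840), ("8K", 7680), ("16K", 15360)]:
    print(name, round(pixels_per_degree(px, 65, 96)))
# Sit closer (or go bigger) and the resolution needed to stay
# above the threshold climbs accordingly.
```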
 

martino

Member
Now I think I will wait for games using mesh shaders to start shipping before choosing what to replace my 1080 Ti with.
I have more than enough to play on PS5, and launch games will have become cheap enough by the time that happens.
 

Stooky

Member
8K, in my opinion: a purist cinemagoer will ultimately want at least 16K (digital) to match film grain detail, but 8K is needed now, as assets are 8K and above, texture maps included, so it's relevant. I think poking fun at 1080p usage within the system's UI, in this day and age, is totally acceptable; anything else is damage control.

That being said, AI is getting better at upscaling and adding pixels. It's been in use for years, Philips Pixel Plus TVs being the first, not merely sharpening the picture but upping the pixel count; the better the pixel density, the better the detail. Are you a purist, or a boxist? :)
Most movies in the theatre are 2K upsampled and projected at 4K. 16K is decades out; there is no system available to move that much content around. 4K is more than enough to handle any film grain detail.

8K is only good on screen sizes of 80"+. You could have a smaller 8K set, but the current technology used to achieve this (LCD) has artifacts (blooming, black levels, ghosting) that kill the image's resolution advantage. 8K OLED has heat issues with densely packed pixels at high nits in smaller sizes; while it might be able to hit 8K, the light output would be dim. QD-OLED and QNED have light leakage and blue-light issues. Once that's figured out, we'll get 12-bit panels with true RGB subpixels like a CRT, OLED black levels and high nits; that's the future, probably about 5 years out.

Until then 4K is king and 8K is a waste. 12-bit panels and 4:4:4 content would be a bigger upgrade than 8K imo. Most content is 8-bit 4:2:2, and we haven't even maxed that out at 2K yet.
 

Bo_Hazem

Banned
I've heard a lack of RAM can cause SSD data to stutter through from time to time. The PS5's SSD is said to have DRAM, and I hear the Xbox's lacks it. If this is true and Dirt streams data, the pop-ins could be down to SSD data transfer; issues that wouldn't be present on PS5.

You mean DRAM, yes. The pop-ins in Dirt 5 on XSX are just outrageous, showing up as close as a few metres away! Glad Cherno pointed that out; the environment also looks extremely artificial compared to older Dirt games.
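For context on why DRAM-less drives can stutter: the usual explanation is that the SSD's flash translation layer (the logical-to-physical map) lives in onboard DRAM when there is some, and otherwise has to be paged in from NAND. A toy model with my own illustrative numbers, not measurements:

```python
# Toy model of random-read latency on an SSD.
NAND_READ_US = 50.0    # fetching a page from NAND (assumed)
DRAM_LOOKUP_US = 0.1   # FTL map lookup when the map sits in DRAM (assumed)

def read_latency_us(dramless, map_entry_cached):
    # A DRAM-less drive that misses its small on-controller map cache
    # pays an extra NAND read just to find the data.
    if dramless and not map_entry_cached:
        lookup = NAND_READ_US
    else:
        lookup = DRAM_LOOKUP_US
    return lookup + NAND_READ_US

print(read_latency_us(False, False))  # with DRAM:      ~50us
print(read_latency_us(True, False))   # DRAM-less miss: ~100us
```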
 

Bo_Hazem

Banned
Yes it is. Lacking DRAM was obviously done to cut costs. We will see if it has any stuttering issues because of it; avoiding those is the biggest advantage of having DRAM.



With the speeds they are going with, they only need two.

Things aren't that simple, especially when you see cheaper SSDs still using x4 lanes. It's a very weird decision, and it'll definitely have an impact, just as dumb a move as going DRAM-less.
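On the "they only need two" point, the raw numbers do work out. A quick sketch, assuming PCIe 4.0 (16 GT/s per lane, 128b/130b encoding) and ignoring protocol overhead:

```python
# Peak PCIe 4.0 link throughput in GB/s.
def pcie4_gb_s(lanes):
    per_lane_gbit = 16 * (128 / 130)  # 16 GT/s minus encoding overhead
    return lanes * per_lane_gbit / 8

print(round(pcie4_gb_s(2), 2))  # x2: ~3.94 GB/s
print(round(pcie4_gb_s(4), 2))  # x4: ~7.88 GB/s
# XSX's quoted 2.4 GB/s raw fits within x2;
# PS5's quoted 5.5 GB/s raw is why it uses x4.
```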
 

Dodkrake

Banned
Hope everybody has their popcorn ready. From the looks of things, it seems more and more likely that the "PS5 is RDNA 1.5" FUD was just projection from the green camp. Just wait for confirmation that the PS5 has at least 8MB of L2 cache.

Also, please remember that Xbox claims 25% IPC gains over the One X, which is GCN. RDNA 2 is 25% over RDNA 1, which in turn is 25% over GCN. I'll probably get moderated over this, and that's OK, but in approximately 10 days we'll know more.
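Spelling out the arithmetic that post implies, taking the quoted 25% figures at face value (AMD's actual claims mix per-clock and per-watt numbers, so treat this as the premise, not gospel):

```python
# Compounding two generational 25% gains over GCN.
gcn_to_rdna1 = 1.25
rdna1_to_rdna2 = 1.25
print(gcn_to_rdna1 * rdna1_to_rdna2)  # 1.5625: ~56% over GCN, not 25%
```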
 

Three Jackdaws

Unconfirmed Member
Oh boy (timestamp)


Assassin's Creed 1 and 2 (Ezio's trilogy) are my all-time favourites; I liked 3 and 4, but nothing mind-blowing like the earlier ones. I gave up on the series after AC 4 but tried playing Origins, and I really didn't like it. For me the games have lost the "spirit" of the first and second, which is such a shame; they're now just shitty RPGs, plagued with bugs and microtransactions, that the Ubisoft machine churns out every year or two. Off topic, but I had to vent about what they did to one of my favourite franchises.
 

gojira96

Member
I'm not liking what I'm seeing here; maybe it's because it's 1080p, but it looks horrendous:




Here it looks much better; not sure what hardware it's running on, but most likely PC:




But still, the draw distance looks like current gen; muddy in the far distance. I'm a big Assassin's Creed fan and have played all of them except Rogue, as it came to PS4 pretty late. I usually preorder the gold edition, but this time I might wait, since I'm not liking what I'm seeing.


I liked Origins and Odyssey a lot, and will probably pick this up too!
But knowing Ubisoft, this game will be on sale for a third of the price by February.
I'd recommend waiting for the next-gen version: while the core game will stay the same, at least 4K 60fps will look better.
 

Nowcry

Member


Someone has polished the floor of the parade ground until it reflects like a mirror, and all the granular detail of the floor has been lost; it's become totally flat. Totally realistic... the PC shot with the 3090 looks more real. It's another thing if you simply prefer the super mega ultra bright, non-realistic reflection.

PS: In that shot it could be SSR and just a configuration with more reflections; nothing to do with greater hardware capability.

I need to see it in motion to know whether it is SSR or RT: SSR can only reflect what's visible on screen, so off-screen detail vanishing from reflections as the camera moves gives it away.
 
Hope everybody has their popcorn ready. From the looks of things, it seems more and more likely that the "PS5 is RDNA 1.5" FUD was just projection from the green camp. Just wait for confirmation that the PS5 has at least 8MB of L2 cache.

Also, please remember that Xbox claims 25% IPC gains over the One X, which is GCN. RDNA 2 is 25% over RDNA 1, which in turn is 25% over GCN. I'll probably get moderated over this, and that's OK, but in approximately 10 days we'll know more.

You're saying you're happy to have this opportunity to spread FUD with something everyone should know is a lie, but to pretend it might be true for a few more days?
 

TLZ

Banned
We have seen many games running without such problems on PS5, with far higher quality and RT. Isn't the pattern obvious so far?
I honestly either don't remember or haven't seen any games with or without pop-ins on the PS5. Can you show me some? I also don't remember seeing a multiplat game on both systems to compare. So I'm hoping it's an optimization thing rather than a system one.
 

Bo_Hazem

Banned
I honestly either don't remember or haven't seen any games with or without pop-ins on the PS5. Can you show me some? I also don't remember seeing a multiplat game on both systems to compare. So I'm hoping it's an optimization thing rather than a system one.

Maybe you weren't paying attention, mate. Just go to the PlayStation YouTube channel and search "PS5". And yes, no multiplat has been shown on both sides, but multiplats on PS5 are miles ahead in terms of quality and ray-tracing support.
 
I honestly either don't remember or haven't seen any games with or without pop-ins on the PS5. Can you show me some? I also don't remember seeing a multiplat game on both systems to compare. So I'm hoping it's an optimization thing rather than a system one.
Well, we can't do a 1:1 comparison, but we have videos of Yakuza: Like a Dragon running on both PS5 and XSX.
 

martino

Member
Assassin's Creed 1 and 2 (Ezio's trilogy) are my all-time favourites; I liked 3 and 4, but nothing mind-blowing like the earlier ones. I gave up on the series after AC 4 but tried playing Origins, and I really didn't like it. For me the games have lost the "spirit" of the first and second, which is such a shame; they're now just shitty RPGs, plagued with bugs and microtransactions, that the Ubisoft machine churns out every year or two. Off topic, but I had to vent about what they did to one of my favourite franchises.
The assassin gameplay is back: you will have to assassinate people from the Order (still not Templars at this point in time, apparently) in missions like the first game. Infiltration using social blending (in villages or cities) will be present.
The problem is it was only partially shown in the demo, and we have to take Ubisoft's word on it for now.
See here and here
 