
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.

onesvenus

Member
Sure it does. But do you run your OLED at max brightness? HDR requires high peak brightness to show its full potential. New HDR standards go over 1000 nits brightness, and then OLED users play at 100-200 nits to save their displays.
That's interesting. Thanks, I did not know that.
 

Evilms

Banned
Black Edition
d8rGEwk.jpg
 

geordiemp

Member
I think Xbox will be fine, the bosses have had units in their homes for almost an entire year now, so problems should have been flagged up. But I'm not convinced by PS5. This cooling tech seems to be very space age, what with metallic paste or something.

Keep your silly FUD thoughts to yourself; nobody cares, and it reeks of desperation.

Not totally out of thin air. More like a statistical probability.

Care to share your mathematical statistics, and how you did your t-test and hypothesis?

 

kyliethicc

Member
Could those Big Navi values be boost clocks?
AMD GPUs tend to have 3 clocks: boost clock, game clock, and base clock.

I imagine the 2.2 GHz for the 80 CU card (Navi 21) is the boost clock. Same with the 2.5 GHz for the 40 CU card (Navi 22) in that leaked info.

But that's a different kind of boost clock than the PS5 uses. And I don't think AMD could use SmartShift for a discrete graphics card.
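For reference, those CU counts and clocks map to theoretical FP32 throughput with the usual RDNA formula (64 shaders per CU, 2 FLOPs per shader per clock) - a quick back-of-envelope sketch, not official numbers:

```python
# Theoretical FP32 throughput: CUs x 64 shaders/CU x 2 FLOPs/clock x GHz
def tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000  # result in TFLOPS

print(round(tflops(80, 2.2), 2))    # rumored 80 CU card: ~22.53 TFLOPS
print(round(tflops(40, 2.5), 2))    # rumored 40 CU card: ~12.8 TFLOPS
print(round(tflops(36, 2.23), 2))   # PS5: 36 CUs at up to 2.23 GHz -> 10.28
print(round(tflops(52, 1.825), 2))  # Series X: 52 CUs at 1.825 GHz -> 12.15
```

The last two lines reproduce the well-known 10.28 / 12.15 TFLOPS console figures, which is a decent sanity check on the leak's numbers.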
 
So the PS5 GPU has not been overclocked at all; it was the Series X GPU that was underclocked! Interesting 🤔

Ha ha. Yes, it looks like it, to keep it running cool, as they didn't want to go expensive with cooling like PS5.

But obviously now we know RDNA2 scales well with frequency past 2 GHz, so XSX has left some performance on the table.
 
It's not that; it's just that it is an argument that takes you down a paranoid path where you believe that multi-billion dollar corporations, which are governed by a massive number of international laws, are lying about what is in the boxes.

If I am a company and I put out a press release saying console Y uses a certain technology, and it comes to light that that specific technology isn't there, I would be liable to a number of lawsuits. If AMD declares that XSX and PS5 are based on the RDNA2 architecture, it is true.
Ah, where were you when the PS5 RDNA 1.5 FUD was around? Man, I had just drafted a nice article about how the XSX GPU is actually RDNA 1.1, based on the clock speed, shader layout and yields. Took me so much time, bro. Can't shut me down this easily :lollipop_sad_relieved:

/jk
 

zaitsu

Banned


My guess: OLEDs have fewer nits, and are dimmer than LEDs. OLEDs in general reach 300 nits (full screen) and peak at 700 (center), while LEDs reach 2000+ nits depending on your budget. The Sony costs 1/3rd of the LG and, except for poor viewing angles, it's up there (kinda) with the LG.

In a dark room you really don't care.
And blacks and viewing angles are important.
 

PaintTinJr

Member
Cerny said it gave less performance due to logic; it could be that the performance tails off a bit. It does seem excessive.

Where have all the frequency concern posters gone?
No, I think it is because for PS5 they want a minimal differential between the clock speed at maximum occupancy (say 90%?) and normal occupancy (say 40-70%?) - within the thermal and noise envelope they needed after the PS4 - which is where they are getting the 2.23GHz clock speed.

On PC they have more CUs (4 more), so at full occupancy they will have to either add more cooling, accept more noise, or most likely drop the full-occupancy clock further to stay at constant power, which in turn means the maximum clock at normal occupancy can go a little higher - but it means the clock differential is wider, which may suit PC, where they aren't getting maximum optimised-to-the-metal occupancy.
 

duhmetree

Member
What's all this DLSS hype? PS4 Pro was doing supersampling before Nvidia even brought it to the PC space (2 years before, IIRC). People are hyping up old tech that's been in consoles for years. How do you think PS4 games look so good?

Sony does image quality in their sleep (camera division). For the record, PS5 already showcased the upgraded supersampling in the UE5 demo. It's just that Sony ain't got time to be naming everything they do. What's the point if it's just a basic tool? Can't be naming everything.

Is this a US thing? Where everything has to have a fancy name, badge, logo, marketing PR spin, for people to be sold on how good something is or works?
DLSS 2.0 is superior to checkerboard rendering.

We're looking for performance equivalent to, OR BETTER than, DLSS 2.0. 1440p with GOOD supersampling seems to be the way to go. We don't know what AMD has, but checkerboard rendering, as is, is not a suitable answer.
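For scale, the raw pixel arithmetic behind the 1440p-vs-4K tradeoff (nothing vendor-specific, just the numbers):

```python
# Native 4K pushes 2.25x the pixels of 1440p - that gap is the headroom
# a good reconstruction technique (DLSS 2.0, checkerboard, etc.) frees up.
pixels_4k = 3840 * 2160      # 8,294,400 pixels
pixels_1440p = 2560 * 1440   # 3,686,400 pixels
print(pixels_4k / pixels_1440p)  # 2.25
```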
 

Imtjnotu

Member


My guess: OLEDs have fewer nits, and are dimmer than LEDs. OLEDs in general reach 300 nits (full screen) and peak at 700 (center), while LEDs reach 2000+ nits depending on your budget. The Sony costs 1/3rd of the LG and, except for poor viewing angles, it's up there (kinda) with the LG.

LG. Better colors. Better blacks. Better contrast. Better HDMI. ALLM. VRR. Brightness is fine on an OLED.
 

mitchman

Gold Member
Sure it does. But do you run your OLED at max brightness? HDR requires high peak brightness to show its full potential. New HDR standards go over 1000 nits brightness, and then OLED users play at 100-200 nits to save their displays.
New OLEDs from the last couple of years have various methods to reduce the OLED brightness for static parts of the screen, like HUDs and logos, to avoid burn-in.
 


My guess: OLEDs have fewer nits, and are dimmer than LEDs. OLEDs in general reach 300 nits (full screen) and peak at 700 (center), while LEDs reach 2000+ nits depending on your budget. The Sony costs 1/3rd of the LG and, except for poor viewing angles, it's up there (kinda) with the LG.

Wow, I was actually eyeing the X900H. The price is attractive, but I didn't realize it is so close to OLED. I may pull the trigger this holiday season.
 

Nickolaidas

Member
Ha ha. Yes, it looks like it, to keep it running cool, as they didn't want to go expensive with cooling like PS5.

But obviously now we know RDNA2 scales well with frequency past 2 GHz, so XSX has left some performance on the table.
Even if this turns out to be 100% true and the Xbox Series X's CUs are underperforming, it still has more CUs than the PS5. So the thing kind of balances itself out.

I think most of our legit insiders are right. Both machines took different paths, but in the end both realized their goal: they're monsters.
 

sircaw

Banned
Even if this turns out to be 100% true and the Xbox Series X's CUs are underperforming, it still has more CUs than the PS5. So the thing kind of balances itself out.

I think most of our legit insiders are right. Both machines took different paths, but in the end both realized their goal: they're monsters.

The only way to judge would be to know the actual precise costs to build both machines.

Who is taking the bigger loss at 450 bucks?

People might be "OMG this one does 3 more fps than that one", but in the grand scheme of things, if it costs 100 dollars more to build, it's not worth it.

I know console war people are only interested in the fps counter or the resolution, but there is so much more to which machine is better. We only really scratch the surface; very shallow of us, if I am being honest.
 

Nickolaidas

Member
The only way to judge would be to know the actual precise costs to build both machines.

Who is taking the bigger loss at 450 bucks?

People might be "OMG this one does 3 more fps than that one", but in the grand scheme of things, if it costs 100 dollars more to build, it's not worth it.

I know console war people are only interested in the fps counter or the resolution, but there is so much more to which machine is better. We only really scratch the surface; very shallow of us, if I am being honest.
To be frank, I don't care about that, Sircaw. All that matters to me is to see both consoles running multiplats almost identically.
 
Or was Cerny discussing the state of the architecture and the devkits at the time, and AMD may have improved logic stability since then, as Road to PS5 was originally slated for earlier in the year?

That could also be true. And if true, it means they didn't increase the clock speed since then because of other issues.
 

THE:MILKMAN

Member
Or was Cerny discussing the state of the architecture and the devkits at the time, and AMD may have improved logic stability since then, as Road to PS5 was originally slated for earlier in the year?

I'm thinking it has more to do with the fact that the PS5 is an APU that has a CPU and all that substantial I/O hardware in one chip. This might make it even harder to push clocks beyond 2.23GHz, and is why a dGPU might be able to go a little higher.
 

TheGejsza

Member
OLED vs LED - I spent about 1.5 years searching for a TV for next gen.

Summed-up conclusion - OLED is way better for movies. LED is a bit better for gaming.

As someone mentioned, LEDs have way higher brightness. And in fact, in HDR mode YOU HAVE TO put brightness on max. A properly prepared image should make use of the whole spectrum of brightness - normal things on screen will have 300-500 nits, but when you look at the sun it will go to 1000+ nits.

OLEDs have lower brightness but infinite contrast - in dark scenes they are superb. The magic of OLED is that it can totally "turn off" individual pixels, so we get true blackness. Currently an OLED with good HDR should go above 600 nits.

But the problem with OLEDs in gaming is that I personally did not find many totally black scenes. Mostly devs try to show you something, and even in night scenes in games everything is properly lit. You can't see much of the OLED advantage. And LEDs are way better in overall bright scenes (which I think are far more common in games). As I mentioned previously - in RDR2 on a bright day, when you look at the sun it blinds you on an LED. LEDs can go way brighter on a way bigger part of the screen. OLED can give a way better "point of light" surrounded by darkness.

So after many hours of testing I came to the conclusion that for ONLY gaming, LED is better. I can't say much about burn-in (I did not use any OLED for more than a week), but it is real, especially in games with a lot of HUD.
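On the 1000-nits point: HDR10 signals are encoded with the SMPTE ST 2084 (PQ) curve, which maps a normalized code value to absolute luminance up to 10,000 nits. A quick sketch of the EOTF using the constants from the standard:

```python
# SMPTE ST 2084 (PQ) EOTF: normalized signal in [0,1] -> luminance in nits.
m1 = 2610 / 16384          # ~0.1593
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal: float) -> float:
    e = signal ** (1 / m2)
    return 10000 * (max(e - c1, 0) / (c2 - c3 * e)) ** (1 / m1)

print(round(pq_eotf(1.0)))   # 10000 nits at full signal
print(round(pq_eotf(0.75)))  # just under 1000 nits - roughly the highlight
                             # level described above for looking at the sun
```

This is why a display that tops out at 600-700 nits has to tone-map the brightest highlights rather than showing them as mastered.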
 

Bo_Hazem

Banned
UPDATED:

Just for fun guys, use your eyes, trust your eyes. I'll use 2 screenshots, one claimed by DF to be native 4K (Halo Infinite), and another claimed to be 1440p with no evidence but BS (Demon's Souls), and I will chop the screenshots:

Demon's Souls:

vlcsnap-2020-09-28-00h01m46s240.png


Halo Infinite:

vlcsnap-2020-09-28-00h18m25s787.png


800%:

vlcsnap-2020-09-28-00h18m25s787.png


vlcsnap-2020-09-28-00h01m46s240.png


Why is DS showing more, smaller pixels? Either DS is 8K or Halo is 1440p, but DF were too tired to acknowledge it!
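For what it's worth, the 800% zoom trick used here is just nearest-neighbour upscaling: each source pixel becomes a crisp block, so the native pixel grid stays countable. A toy pure-Python sketch (the 2x2 "screenshot" is made up for illustration):

```python
def zoom(pixels, factor=8):
    # Nearest-neighbour upscale: every source pixel turns into a
    # factor x factor block, same as an 800% zoom in an image viewer.
    return [[row[x // factor] for x in range(len(row) * factor)]
            for row in pixels for _ in range(factor)]

tiny = [[0, 255], [255, 0]]   # hypothetical 2x2 image
big = zoom(tiny)
print(len(big), len(big[0]))  # 16 16 - each pixel is now an 8x8 block
```

Pixel counters use exactly this: if an edge in the zoomed crop steps in blocks larger than the zoom factor, the image was rendered below the output resolution.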
 

sircaw

Banned
Just for fun guys, use your eyes, trust your eyes. I'll use 2 screenshots, one claimed by DF to be native 4K (Halo Infinite), and another claimed to be 1440p with no evidence but BS (Demon's Souls), and we'll chop the screenshots:

Halo Infinite:

vlcsnap-2020-09-28-00h00m55s733.png


Demon's Souls:

vlcsnap-2020-09-28-00h01m46s240.png


Now let's crop 300x300 pixels out of 3840x2160

dsc1.jpg


dsc1-1.png


Now Halo, the same:

hic1.jpg


hic1-1.png


Where is the fucking native 4K here in that blurry mess?

Ok, let's try another one.

vlcsnap-2020-09-28-00h18m25s787.png


300x300

hic2.jpg


hic2-1.png


Yup, you can count that easily.

Now again, Demon's Souls

dsc1-1.png


How the fuck you gonna measure that?

Go back to the original PNG's posted above, and zoom for yourself. Yes, DF 1440p is BS, they're not used to this level of quality.


HALO LOOKS BETTER. :lollipop_disappointed:

I am trying to make xbox friends, leave me alone.

@X-Fighter @DarkMage619 @Doncabesa @Bill O'Rights can I have some hearts plz.
 

kyliethicc

Member
Just for fun guys, use your eyes, trust your eyes. I'll use 2 screenshots, one claimed by DF to be native 4K (Halo Infinite), and another claimed to be 1440p with no evidence but BS (Demon's Souls) …
This just shows that resolution doesn't matter. It's the quality of the art, the assets, etc.

I'm playing The Last of Us Part II on a 1080p screen using PS4 Pro supersampling, and the game looks beautiful. I don't care what resolution the game is; there are moments where the characters look borderline photoreal in cutscenes.

Demon's Souls looks so, so good. I don't care what resolution it renders at internally; it's already so good looking.
 

Bo_Hazem

Banned
This just shows that resolution doesn't matter. It's the quality of the art, the assets, etc.

I'm playing The Last of Us Part II on a 1080p screen using PS4 Pro supersampling, and the game looks beautiful. I don't care what resolution the game is; there are moments where the characters look borderline photoreal in cutscenes.

Demon's Souls looks so, so good. I don't care what resolution it renders at internally; it's already so good looking.

I'm here to provide evidence of their BS, for them and their blind apologists. I can do this endlessly, and it'll only get more embarrassing for the green side. Low-effort FUD again from DF.
 

kyliethicc

Member
I'm here to provide evidence of their BS, for them and their blind apologists. I can do this endlessly, and it'll only get more embarrassing for the green side. Low-effort FUD again from DF.
Well, the thing is, Demon's Souls might actually only be rendering at 1440p. But it doesn't matter. The fidelity is too high. Resolution is overrated and can be misleading.

Same with the Unreal Engine 5 demo. It doesn't matter what the internal resolution is, because our eyes can't see any difference.

Digital Foundry built their entire brand around counting pixels and frames. Heading into next gen, they will end up becoming irrelevant if they don't shift their focus to other things.

No one cares what resolution the UE5 demo is. Same with a lot of next-gen games. Such high fidelity defies pixel counting, leaving it useless.
 
The SSD expansion card being so expensive practically killed the XSS. Its storage size is too low, and if you want to upgrade it, it will end up costing more than an XSX.

This deal is getting worse all the time...


You have seen the prices of the new PCIe 4.0 NVMe drives, right? Their base prices start at the same price as that expensive expansion card.

Personally, I hope you can move games from an external drive to internal as you use them, as both solutions cost almost half the price of the consoles.
 
Just for fun guys, use your eyes, trust your eyes. I'll use 2 screenshots, one claimed by DF to be native 4K (Halo Infinite), and another claimed to be 1440p with no evidence but BS (Demon's Souls) …

Bro, your DS screenshot is actually in a lower quality format. But anyway, here is the 800% view of both. I tried very hard to find evidence of image reconstruction in DS. I just couldn't. Thus, a few dozen pages back, I asked whether it is indeed 4K/60.

edit: okay, I fked it up. How do I actually upload an image?
 

duhmetree

Member
I'm here to provide evidence of their BS, for them and their blind apologists. I can do this endlessly, and it'll only get more embarrassing for the green side. Low-effort FUD again from DF.
To be fair, maybe it is 1440p but is being supersampled with AI (yet-to-be-released AMD secret sauce comparable to DLSS 2.0).

1440p DLSS 2.0 looks better and sharper in Death Stranding than native 4K... while ALSO having a better framerate. I'd honestly prefer 1440p supersampled with higher frame rates over native 4K.

But I don't get the 'No RT'. It looks like they have raytracing, but I'm not an expert.
 

Bo_Hazem

Banned
Well, the thing is, Demon's Souls might actually only be rendering at 1440p. But it doesn't matter. The fidelity is too high. Resolution is overrated and can be misleading.

Same with the Unreal Engine 5 demo. It doesn't matter what the internal resolution is, because our eyes can't see any difference.

Digital Foundry built their entire brand around counting pixels and frames. Heading into next gen, they will end up becoming irrelevant if they don't shift their focus to other things.

No one cares what resolution the UE5 demo is. Same with a lot of next-gen games. Such high fidelity defies pixel counting, leaving it useless.

The difference is, the UE5 demo has some slight softness indicating below-4K resolution, but the polygon count was so high that it makes you hesitant; you can see and feel some slight softness/blurriness in the UE5 demo. Here with Demon's Souls it's razor-sharp. Could it be AI reconstruction? No one knows; we need officials to say so, or solid evidence. So far, all the evidence shows that it's native 4K, though some assets, like the grass, aren't as high quality.

To be fair, maybe it is 1440p but is being supersampled with AI.

1440p DLSS 2.0 looks better and sharper in Death Stranding than native 4K.

But I don't get the 'No RT'. It looks like they have raytracing, but I'm not an expert.

It 100% has raytracing. And if it's the new patented AI image reconstruction from Sony, then that makes sense. But the final image/resolution is 4K, if not native 4K. It's pretty easy to spot below-4K when viewing on a big screen.
 