Neo Blaster
Member
> So the PS5 GPU has not been overclocked at all; it was the Series X GPU which was underclocked! Interesting

Could those Big Navi values be boost clocks?
> Sure it does. But do you run your OLED at max brightness? HDR requires high peak brightness to show its full potential. New HDR standards go over 1000 nits of brightness, and then OLED users play at 100-200 nits to save their displays.

That's interesting. Thanks, I did not know that.
> Does anyone else feel like some massive cock-up is going to happen in November? Like a huge shipping shortage, or mass hardware failures. I don't know why. I just feel it in my bones. Like a sixth sense.

Oh, new FUD out of thin air...
> Could those Big Navi values be boost clocks?

There is no such thing as a boost clock in RDNA2.
I think Xbox will be fine, the bosses have had units in their homes for almost an entire year now so problems should have been flagged up. But I'm not convinced by PS5. This cooling tech seems to be very space age what with metallic paste or something.
Not totally out of thin air. More like a statistical probability.
> Could those Big Navi values be boost clocks?

AMD GPUs tend to have three clocks: a boost clock, a game clock, and a base clock.
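For reference, those three tiers can be sketched with RDNA1's advertised reference numbers; the RX 5700 XT figures below are AMD's published reference clocks, used here purely as an illustrative example (they are not from this thread):

```python
# AMD's three advertised clock tiers, illustrated with the RX 5700 XT
# reference-card numbers (MHz). "Game clock" is the expected typical
# clock under gaming load; "boost clock" is an opportunistic peak,
# not a sustained rate.
rx_5700_xt_mhz = {
    "base": 1605,   # guaranteed floor under load
    "game": 1755,   # typical sustained gaming clock
    "boost": 1905,  # short-burst "up to" peak
}

# The tiers are strictly ordered: base < game < boost.
assert rx_5700_xt_mhz["base"] < rx_5700_xt_mhz["game"] < rx_5700_xt_mhz["boost"]
```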
> There is no such thing as a boost clock in RDNA2.

The RX 5000 series has a boost clock. Why wouldn't the 6000 series?
So the PS5 GPU has not been overclocked at all; it was the Series X GPU which was underclocked! Interesting
I'll just say HDR and OLED don't like each other.
Wtf? They love each other.
> It's not that; it's just that it's an argument that takes you down a paranoid path, where you believe that multi-billion-dollar corporations, governed by a massive number of international laws, are lying about what is in the boxes. If I were a company and I put out a press release saying console Y uses a certain technology, and it came to light that that specific technology isn't there, I would be liable to a number of lawsuits. If AMD declares that the XSX and PS5 are based on the RDNA2 architecture, it is true.

Ah, where were you when the PS5 "RDNA 1.5" FUD was around? Man, I just drafted a nice article arguing the XSX GPU is actually RDNA 1.1, based on the clock speed, shader layout, and yields. Took me so much time, bro. Can't shut me down this easily.
> That's marketing for you, nothing else.

That's Xbox marketing. Hardware focused.
> I'll just say HDR and OLED don't like each other.

What?
My guess: OLEDs have fewer nits; they are dimmer than LEDs. OLEDs in general reach 300 nits (fullscreen) and peak at 700 (center), while LEDs reach 2000+ nits depending on your budget. The Sony costs a third of the LG, and except for poor viewing angles it's up there (kinda) with the LG.
> Cerny said it gave less performance due to logic; could be the performance tails off a bit. It does seem excessive.

No, I think it is because for the PS5 they want a minimal differential between the clock speed at maximum occupancy (say 90%?) and at normal occupancy (say 40-70%?), within the thermal and noise envelope they needed after the PS4; that is where they are getting the 2.23GHz clock speed.
Where have all the frequency-concern posters gone?
> What's all this DLSS hype? The PS4 Pro was doing supersampling before Nvidia even brought it to the PC space (2 years before, IIRC). People are hyping up old tech that's been in consoles for years. How do you think PS4 games look so good?

DLSS 2.0 is superior to checkerboard rendering.
> For the record, the PS5 already showcased its upgraded supersampling in the UE5 demo. It's just that Sony doesn't have time to name everything they do. What's the point if it's just a basic tool? They can't name everything.

Sony does image quality in their sleep (camera division).
Is this a US thing, where everything has to have a fancy name, badge, logo? Marketing PR spin, for people to be sold on how good something is or works.
> In a dark room you really don't care. And blacks and viewing angles are important.

So is burn-in. That's my fear with OLED. LCDs cost less and are a safer investment. Plus, I don't sit off-axis, so viewing angles don't matter to me.
> Sure it does. But do you run your OLED at max brightness? HDR requires high peak brightness to show its full potential. New HDR standards go over 1000 nits of brightness, and then OLED users play at 100-200 nits to save their displays.

New OLEDs from the last couple of years have various methods to reduce brightness for static parts of the screen, like HUDs and logos, to avoid burn-in.
> My guess: OLEDs have fewer nits; they are dimmer than LEDs. OLEDs in general reach 300 nits (fullscreen) and peak at 700 (center), while LEDs reach 2000+ nits depending on your budget. The Sony costs a third of the LG, and except for poor viewing angles it's up there (kinda) with the LG.
Wow, I was actually eyeing the X900H. The price is attractive, but I didn't realize it is so close to OLED. I may pull the trigger this holiday season.
> Ha ha. Yes, it looks like it; they kept it running cool because they didn't want to go expensive with cooling like the PS5.

Even if this turns out to be 100% true and the Series X's CUs are underperforming, it still has more CUs than the PS5, so the thing kind of balances itself out. But obviously we now know RDNA2 scales well with frequency past 2GHz, so the XSX has left some performance on the table.
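The CU-count versus clock trade-off being argued here is simple arithmetic. A quick sketch using the standard RDNA FP32 formula (TFLOPS = CUs × 64 shaders per CU × 2 ops per clock per shader × GHz) and the publicly stated console specs:

```python
# FP32 throughput from CU count and clock, per the standard RDNA formula:
# TFLOPS = CUs * 64 shaders/CU * 2 FP32 ops per shader per clock * GHz
def tflops(cus: int, ghz: float) -> float:
    return cus * 64 * 2 * ghz / 1000.0

ps5 = tflops(36, 2.23)    # ~10.28 TFLOPS at the PS5's peak variable clock
xsx = tflops(52, 1.825)   # ~12.15 TFLOPS at the Series X's fixed clock
print(f"PS5: {ps5:.2f} TF, XSX: {xsx:.2f} TF")
```

Fewer CUs at a much higher clock versus more CUs at a lower clock: the gap between the two machines comes out to roughly 18% on paper.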
I think most of our legit insiders are right. Both machines took different paths, but in the end both realized their goal: they're monsters.
> The only way to judge would be to know the actual precise costs to build both machines. Who is taking the bigger loss at 450 bucks? People might go "OMG, this one does 3 more fps than that one," but in the grand scheme of things, if it costs 100 dollars more to build, it's not worth it. I know console-war people are only interested in the fps counter or the resolution, but there is so much more to which machine is better. We only really scratch the surface; very shallow of us, if I'm being honest.

To be frank, I don't care about that, Sircaw. All that matters to me is to see both consoles running multiplats almost identically.
> I remember Cerny said they could go higher, but then the logic would fall apart. Is it falling apart in the PC GPUs at those clock speeds?

Or was Cerny speaking to the state of the architecture and the devkits at the time? AMD may have improved logic stability since then, as Road to PS5 was originally slated for earlier in the year.
So you put your greasy fingers afterwards on that white, holy controller known as the DualSense.
> Seems like the PS5 Pro will be an easy job, having 72 CUs. More proof that Sony was thinking ahead!

If it comes out in 2023, it will be much bigger than 72 CUs thanks to a better node by then. The PS5 Pro will be made to market 8K TVs. That's why I won't waste my money on those HDMI 2.1 TVs.
> You can't take the grass and call it a day. If that were the case, then Halo Infinite should be 240p or 480p according to their trees.

Those Ark: Survival Evolved Switch trees...
Just for fun, guys: use your eyes, trust your eyes. I'll use two screenshots, one claimed by DF to be native 4K (Halo Infinite), and another claimed to be 1440p with no evidence but BS (Demon's Souls), and we'll crop the screenshots:
Halo Infinite:
Demon's Souls:
Now let's crop 300x300 pixels out of 3840x2160:
Now Halo, the same:
Where is the fucking native 4K here in that blurry mess?
Ok, let's try another one.
300x300
Yup, you can count that easily.
Now again, Demon's Souls
How the fuck are you gonna measure that?
Go back to the original PNGs posted above and zoom for yourself. Yes, DF's 1440p claim is BS; they're not used to this level of quality.
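For anyone who wants to reproduce the crops above, a minimal sketch using Pillow. The filename is a placeholder and the crop offsets are arbitrary; any 300x300 region of the 3840x2160 frame works for eyeballing pixels:

```python
# Cut a 300x300 patch out of a 3840x2160 screenshot so individual
# pixels are easy to inspect. Requires Pillow; the filenames below
# are placeholders, and the default offsets just pick a central region.
from PIL import Image

def crop_patch(path, left=1770, top=930, size=300):
    """Return a size x size crop of the image at (left, top)."""
    img = Image.open(path)
    return img.crop((left, top, left + size, top + size))

# crop_patch("halo_infinite_4k.png").save("halo_crop_300.png")
```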
> Just for fun, guys: use your eyes, trust your eyes. I'll use two screenshots [...]

This just shows that resolution doesn't matter. It's the quality of the art, the assets, etc.
I'm playing The Last of Us Part II on a 1080p screen using PS4 Pro supersampling and the game looks beautiful. I don't care what resolution the game is; there are moments where the characters look borderline photoreal in cutscenes.
Demon's Souls looks so, so good. I don't care what resolution it renders at internally; it's already so good looking.
> I'm here to provide evidence of their BS, for their blind apologists. I can do this endlessly, and it'll only get more embarrassing for the green side. Low-effort FUD again from DF.

Well, the thing is, Demon's Souls might actually only be rendering at 1440p. But it doesn't matter; the fidelity is too high. Resolution is overrated and can be misleading.
The SSD expansion card being so expensive has practically killed the XSS. Its storage size is too low, and if you want to upgrade it, the total ends up costing more than an XSX.
This deal is getting worse all the time...
> I'm here to provide evidence of their BS, for their blind apologists. I can do this endlessly, and it'll only get more embarrassing for the green side. Low-effort FUD again from DF.

To be fair, maybe it is 1440p but is being supersampled with AI (yet-to-be-released AMD secret sauce comparable to DLSS 2.0).
> Well, the thing is, Demon's Souls might actually only be rendering at 1440p. But it doesn't matter; the fidelity is too high. Resolution is overrated and can be misleading.
Same with Unreal Engine 5 demo. It doesn't matter what the internal resolution is because our eyes can't see any difference.
Digital Foundry built their entire brand around counting pixels and frames. Heading into next gen, they will end up becoming irrelevant if they don't shift their focus to other things.
No one cares what resolution the UE5 demo is. Same with a lot of next gen games. Such high fidelity defies pixel counting, leaving it useless.
> To be fair, maybe it is 1440p but is being supersampled with AI.
1440p with DLSS 2.0 looks better and sharper in Death Stranding than native 4K.
But I don't get the "No RT". It looks like they have ray tracing, but I'm not an expert.
Still think it's crazy how we are six weeks from launch and still haven't seen any Series X gameplay.
> Still think it's crazy how we are six weeks from launch and still haven't seen any Series X gameplay.

Well, technically speaking, we have. But it was this...