
Next-Gen PS5 & XSX |OT| Console tEch threaD


Dibils2k

Member
Well, according to you it will look even better on my monitor. So if a YouTube video looks good, then the game on my monitor will look even better. Then there are those screenshots captured from a PS5 that I've seen on my screen.
i mean if you play on a monitor then yeah 1440p would be good enough, not on 55"+ TVs though
 
I think Cerny said there's enough power to run both at max clocks. I believe he said the variation in clockspeed is dependent on the workload. Plus, from what I've read here, it appears the PS5 can change its clocks extremely quickly, so it might not even be noticeable to the player.
Pretty much what I was thinking. Lots of people misunderstood what he was saying so they assumed PS5 dOwNcLOcKs. There's a difference between running at max frequency and max load. Any hardware running max load will eventually have to downclock which certainly includes Series X.

Also, from the looks of it, seems PS5 has better overall cooling than Series X to begin with. Big boy console for big boy cooling.
 

DaGwaphics

Member
Well, no one said it's totally RDNA1, it's custom, both are. But everyone went with a direction, and PS5 seems in line with RDNA2 with leaks of better custom RT and Geometry Engines, along with the exclusive GPU cache scrubbers. 10 days ;)

Better custom RT you say, on the silicone?
 
Pretty much what I was thinking. Lots of people misunderstood what he was saying so they assumed PS5 dOwNcLOcKs. There's a difference between running at max frequency and max load. Any hardware running max load will eventually have to downclock which certainly includes Series X.

Also, from the looks of it, seems PS5 has better overall cooling than Series X to begin with. Big boy console for big boy cooling.

Plus there's the misunderstanding about what SmartShift actually does. When the GPU is under stress and requires more power, it can get it from the CPU if the CPU has power to spare.
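
If it helps to picture it, it's basically one shared power budget that gets re-split based on which block is busier at the moment. A rough sketch (purely illustrative, made-up numbers, not AMD's actual algorithm):

```python
# Illustrative sketch of SmartShift-style power sharing (made-up numbers,
# not AMD's real algorithm): a fixed SoC power budget is split between
# CPU and GPU based on which one is currently under more load.

SOC_BUDGET_W = 200          # hypothetical total package budget
CPU_MIN_W, CPU_MAX_W = 30, 60
GPU_MIN_W, GPU_MAX_W = 120, 170

def split_budget(cpu_load: float, gpu_load: float) -> tuple[float, float]:
    """Return (cpu_watts, gpu_watts) given 0..1 utilisation of each block."""
    # Start from the minimums, then hand the spare watts to whichever
    # block is busier, without exceeding either block's ceiling.
    spare = SOC_BUDGET_W - CPU_MIN_W - GPU_MIN_W
    gpu_share = gpu_load / max(cpu_load + gpu_load, 1e-6)
    gpu_w = min(GPU_MAX_W, GPU_MIN_W + spare * gpu_share)
    cpu_w = min(CPU_MAX_W, SOC_BUDGET_W - gpu_w)
    return cpu_w, gpu_w

# GPU-heavy frame: the CPU's spare headroom flows to the GPU.
print(split_budget(cpu_load=0.3, gpu_load=0.95))
```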
 

ToadMan

Member
Hot Chips is a place where you go to give technical details and be honest!
There are questions that they may not answer because of AMD's NDA, but to think that they would hide and obfuscate this way is a ludicrous idea.

Sorry, but this is a ridiculously naive idea.

You may - yes, may - get genuine objective accuracy (truth, if you wish to call it that) in a scientific setting. A peer-reviewed journal, for example.

Beyond that, for big business and certainly tech, everything is about presentation to generate revenue and profits.

The companies that weren’t good at that are out of business.

I remember a senior IBM computer scientist saying “when people ask me what industry I work in, I tell them it’s the fashion industry”.

Right now the fashion is cloud computing and recurring licenses in things like Game Pass and PS Now. There's computing power and the promise of experiences not possible on a competitor's devices.

These aren’t new ideas - they’re just recycled for this “season”.

Hot Chips is controlled by marketing just as much as Cerny's presentation was, just as much as the Xsex YouTuber marketing, just as much as Sony bringing YouTubers to their offices.

The only way you’ll differentiate all this is by understanding the tech and being able to determine for yourself what truth you prefer.

If you can’t do that, you’ll have to wait until you get your hands on the console for yourself.
 
Last edited:

Bo_Hazem

Banned
Demon's Souls is at 1440p, yet it doesn't look like it to me. Whatever reconstruction they are doing, it's working pretty well for the game.

Credit to Bo_Hazem for the capture.

[screenshot]

It's not 1440p, it's not fucking 1440p! A clown at DF who had no idea and was blatantly lying/guessing 1440p is being taken as holy words. It looks more convincingly 4K than most so-called 4K games shown on other platforms; it's just on another level. Is it reconstructed? We don't fucking know, no one knows but Bluepoint and Sony. So unless you have something official, please drop this BS, with all due respect.

[screenshot]

That's fidelity mode and framerate mode.

And here, this is the better version of that framerate mode screenshot:

[screenshot]


 
Last edited:

geordiemp

Member
Better custom RT you say, on the silicone?

We have seen better RT on Sony games so far, and RT has a big performance cost, so posters are fine to speculate about why there is a void so far (we await Ubisoft, but I would not place all my eggs in that basket).

What are your thoughts, instead of challenging others?

I don't buy the DX12 API not ready / emulation argument. Nvidia didn't have issues with their new cards, I bet AMD doesn't either, and DX12 is not exactly new.

Also when you type silicone, I keep reading chest enlargement :messenger_beaming:
 
Last edited:
It's not 1440p, it's not fucking 1440p! A clown at DF who had no idea and was blatantly lying/guessing 1440p is being taken as holy words. It looks more convincingly 4K than most so-called 4K games shown on other platforms; it's just on another level. Is it reconstructed? We don't fucking know, no one knows but Bluepoint and Sony. So unless you have something official, please drop this BS, with all due respect.

[screenshot]

That's fidelity mode and framerate mode.

And here, this is the better version of that framerate mode screenshot:

[screenshot]



I wasn't talking about the 4K 30fps mode with RT, btw.

Just going with what Digital Foundry says until we get official confirmation that's all.
 
Last edited:

DaGwaphics

Member
We have seen better RT on Sony games so far, and RT has a big performance cost, so posters are fine to speculate about why there is a void so far (we await Ubisoft, but I would not place all my eggs in that basket).

What are your thoughts, instead of challenging others?

I don't buy the DX12 API not ready / emulation argument. Nvidia didn't have issues with their new cards, I bet AMD doesn't either, and DX12 is not exactly new.

Not challenging others at all. Was just pondering the idea out loud.
 

Bo_Hazem

Banned
I wasn't talking about the 4K 30fps mode with RT, btw.

Just going with what Digital Foundry says until we get official confirmation that's all.

I know; DF is just throwing FUD. It's FUD if not scientifically proven. Imagine calling Halo Infinite 720p on XSX because VRS gimped a big chunk of the screen.

It could be a new advanced AI image reconstruction. This is what's official so far:

  • Stunning visuals: See the dark, gritty world of Demon’s Souls come to life on the PS5 console with beautifully enhanced visuals. Players can choose between two graphics modes while playing Demon’s Souls: 4K Mode (play in 4K resolution) and High Frame Rate Mode (play with a higher targeted frame rate).

Meaning it'll sacrifice 4K (dynamic resolution) when needed, the grass is lower res if you look closely, and RT is reduced or not used.

That's what's official; we stick with that until we see hard evidence or are told otherwise by Bluepoint/Sony. Sacrificing RT alone could easily double the framerate, while it still may have RT at 4K@60fps just like GT7.
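
For what it's worth, dynamic resolution is usually just a feedback loop on GPU frame time. A toy sketch of the general idea (my own illustration with made-up numbers, not Bluepoint's code):

```python
# Toy dynamic-resolution loop (my own illustration, not Bluepoint's code):
# if the last frame missed its budget, drop the render scale; if there's
# headroom, creep back up toward native 4K.

TARGET_MS = 16.7            # 60 fps budget
NATIVE = (3840, 2160)
MIN_SCALE, MAX_SCALE = 0.6, 1.0

def next_render_scale(scale: float, last_frame_ms: float) -> float:
    # GPU cost scales roughly with pixel count, i.e. with scale squared,
    # so adjust by the square root of how far off budget we were.
    error = last_frame_ms / TARGET_MS
    scale /= error ** 0.5
    return max(MIN_SCALE, min(MAX_SCALE, scale))

scale = 1.0
for frame_ms in (19.0, 18.2, 16.5, 15.0, 14.8):   # pretend GPU timings
    scale = next_render_scale(scale, frame_ms)
    w, h = int(NATIVE[0] * scale), int(NATIVE[1] * scale)
    print(f"{frame_ms:>5.1f} ms -> render {w}x{h} ({scale:.2f})")
```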
 
Last edited:

gmoran

Member
The 'lite' refers to the lower spec process I believe, hence the lower 1.9 GHz, not the size of the IF cache. PS5 has a bigger bus than Navi 22 (only 192-bit vs 256-bit on PS5), but PS5 will have a smaller L2 for sure, no way it's 96 MB.

Let's hope Cerny has done his calcs and balanced it for ray tracing performance per die cost. What do you think?

Sorry if this has already been posted, too many pages on this thread, but this seemed relevant:

[embedded tweets]

These tweets are part of the mega-thread around the KittyYYuko leak thread.
 

ToadMan

Member
After losing two laptop HDDs in the span of a year, I coughed up the money for an SSD. A cheap one, but still, it is lasting longer than the HDDs so far. Moving parts are just inherently going to die faster.

I don’t disagree at all.

The article I was referring to (from Tom's Hardware) suggested that a DRAM-less SSD will instead write small amounts of data to the NAND more often, and that will reduce the lifespan.

Rather alarmingly, the article suggested TLC NAND could expire in a little over a year (just outside the warranty period) and that some companies put this type of SSD in their machines to reduce cost while avoiding warranty claims.
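
The maths behind a claim like that is just endurance arithmetic: rated program/erase cycles give you a total-bytes-written figure, and write amplification (worse on DRAM-less drives) eats into it. A back-of-envelope sketch with made-up numbers (not the article's):

```python
# Back-of-envelope SSD lifespan estimate (illustrative numbers, not the
# article's): endurance is rated in total drive writes, and DRAM-less
# designs inflate the actual NAND writes via write amplification.

capacity_gb = 500
rated_pe_cycles = 300          # pessimistic figure for cheap TLC
host_writes_gb_per_day = 70    # heavy user
write_amplification = 5.0      # DRAM-less drives tend to be worse here

rated_tbw = capacity_gb * rated_pe_cycles / 1000          # terabytes written
nand_writes_tb_per_day = host_writes_gb_per_day * write_amplification / 1000
lifespan_years = rated_tbw / nand_writes_tb_per_day / 365

print(f"Rated endurance: {rated_tbw:.0f} TBW")
print(f"Estimated lifespan: {lifespan_years:.1f} years")   # ~1.2 years here
```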

I’m sure you’ve not cheaped out too much on your SSD though so you should be fine 👍
 
Last edited:
I know that since the RTX cards debuted, people think all things shinier are better, but that's not really the case. The settings on PC are at insane quality. The shinier effect on Series X is not necessarily better; it's not using RT for reflections here, btw, they've simply decreased the opacity levels. This has nothing to do with Series X doing something the 3090 can't, or something extraordinary. I suspect the PC game will be patched with the same enhancements the Series X is receiving. I don't believe all the effects are enhancements though. Some of the tweaks we see on the patched Gears 5 would be done to improve framerate to 4K/60 and 4K/120fps levels. So you might see some aesthetic changes that may look more appealing at first glance, but not necessarily more GPU intensive. Optimization is a weird word...
Can’t you see the XSX is doing the heavy job of mopping the floor? /s
 
Sorry if this has already been posted, too many pages on this thread, but this seemed relevant:

[embedded tweets]

These tweets are part of the mega-thread around the KittyYYuko leak thread.


Interesting that he says this. However, this could all just be BS to get PlayStation fans' hopes up. It's not like we haven't had that kind of thing before.

Anyways I'm just happy this is going to be over soon for the most part.
 

Antelope

Member
How much sound quality is lost using headphones plugged into the DS4/5 compared to wireless with a USB dongle?

I'm trying to choose between the Pulse 3D headset or buying a pair of Sony XM3s to use with my current V-MODA mic.
 

gmoran

Member
Agreed, I've asked him if it's speculation, or if he has a source.

RedGamingTech has a video up on the XSX RDNA1 leak, so I'll be interested to see his analysis of that.

Obviously both companies had very different goals for this new gen, both rational from their points of view. Looking forward to getting the low-down on RDNA2 and, at some point, details on the PS5's SoC, so we can get better insight into why both went the ways they did - fascinating stuff.
 
Sorry if this has already been posted, too many pages on this thread, but this seemed relevant:

[embedded tweets]

These tweets are part of the mega-thread around the KittyYYuko leak thread.


I'm not coming down on either side, but remember those strange rumours that Sony and AMD were working together to create Navi:


Forbes said:
According to my sources, Navi isn't just inside the Sony PS5; it was created for Sony. The vast majority of AMD and Sony's Navi collaboration took place while Raja Koduri -- Radeon Technologies Group boss and chief architect -- was at AMD.

The other interesting aspect to all of this is that my sources never mentioned Microsoft in the Navi conversations. This is pure speculation, but maybe Microsoft's next Xbox devices -- code-named "Scarlett" -- won't use Navi at all. Perhaps it will use a separate semi-custom solution incorporating Vega, or something else entirely that we're not privy to. Either way, the conversations I had referred to Navi in the past tense, as if it was already finished.

Interesting; this is from all the way back in June 2018, remember.
 

Dibils2k

Member
Hmm maybe you should game on a monitor instead since 4K is really only standard on PCs. I mean both my One X and Pro look great on my monitor. Could be a good idea if you did the same.
nah i find small displays lack immersion and also i wanna play on my comfy couch

thanks for your concern though
 

Dibils2k

Member
What about hooking a PC up to your TV? You can still game on a controller; that's how I do my gaming on PC.
I tried that early this gen, it just doesn't work as well. Gave up on it after a year; it was one of my worst purchasing decisions. Wasn't a total waste though, as I at least got to play Witcher 3 at 1080p/60fps.
 

Bo_Hazem

Banned
I liked Origins and Odyssey a lot, and will probably pick this up too!
But knowing Ubisoft, this game will be on sale for a third of the price by February.
I'd recommend waiting for the next-gen version, because while the core game will stay the same, at least 4K/60fps will look better.

I've played all the AC games since the beginning. I'm a big fan. They shifted to this new RPG BS, but they're still great games. Might pick it up later after Spider-Man MM and other games.
 
The last few pages we are back at it with the 'which console is better' rhetoric. It'd be better if we save the energy for that debate until both consoles launch along with the third-party multiplatform games. Until then let's stick to how an upcoming awesome game looks on a given platform :)
 

Bo_Hazem

Banned
The PS5 version is leaked, unofficial. So we'd better wait for an official one.

DMC5 is running with RT on PS5, with no RT on XSX so far, for example. It's all over the place, with PS5 having the same or better version so far, but the FPS comparisons are gonna be interesting, very interesting.
What's another example?
 
This is too much techno-babble for me. I really don't understand what's going on these last few pages.

For a layman, am I to infer that the current rumor is that the PS5 utilizes more advanced technology than the XSX? Something like that? It's weaker, but not necessarily because of the things it has from RDNA2, supposedly?
 
I tried that early this gen, it just doesn't work as well. Gave up on it after a year; it was one of my worst purchasing decisions. Wasn't a total waste though, as I at least got to play Witcher 3 at 1080p/60fps.

Well, if you're going to hook a PC up to a 4K TV you have to make sure it's powerful enough to output your games at 4K. At least with a PC you can guarantee that, but with consoles you're limited by whatever resolution the developers want to give you.

I'm mostly picking up a PS5 for the exclusives and the much quicker I/O but I know it won't handle everything at 4K.
 
2K21, COD, and Avengers will all have RT on PS5, but not yet on XSX. COD/2K21 showed it; Xbox took PS5 gameplay and used it as the "next gen build" on their channel instead of showing it running on XSX.

It's gonna be a rough ride, fasten your seatbelts.

Xbox pretty much uploaded the PS5 trailer but asked 2K to remove the "This has been captured from PS5" line and change it to "this has been captured from next gen". Same goes for the COD Cold War Xbox trailer.
 
The console with an inferior GPU, a slower CPU, lower memory bandwidth, and a larger size is now better thought out and developed 🤔

Looking at the power supply and the fact that the PS5 has to downclock at a certain power threshold, a point a lot of people are conveniently forgetting, I'd be pretty surprised if the XSX isn't also cooler and quieter on top.
You're trolling, aren't you? I wonder when people will realize that TFs aren't the best metric for this shit. I guess you missed Mark Cerny's Road to PS5; please watch it.

This generation is nothing like previous generations; we are not talking about having a magical chip like Cell or just mid/low-range GPUs with awful laptop CPUs. We are talking about innovation from both sides. What you reduce to 'has to downclock at a certain power' is something previously unheard of on a console: variable frequencies.

What Cerny is doing with those variable frequencies is getting more compute with a smaller (and probably cheaper) chip, which is what everyone should strive for: more with less. Not only the PS5 had a lot of tought on how to make these changes in frequencies deterministic, they are also visible to the devs. Developers are not working with a fucking black box and trying to figure out if their games will run at X, Y or Z frequency, like many people think the PS5 will downclock like a PC card.

If Cerny's approach proves to be good and doable (spoiler: we already know it is! Just look at PS5 games), this is the future for this industry. 2 GHz base clocks are unheard of on cards, but now we are getting a 2.23 GHz console!! If you really want to downplay that, go ahead. Just please don't imply the PS5 isn't the better thought-out or developed console, because for this fundamental change in clocking alone, it is the most engineered console in the room.

Try this thought experiment: you go to the engineers at Microsoft and make the following offer:
Do you want a smaller chip that can give you more compute 99% of the time, where, when the 1% situation comes, you will know how to work around it?
Do you want a cooling system that was developed for two years and has the same performance as a vapour chamber at a fraction of its price?

If Xbox is what floats your boat, enjoy it mate. Just don't downplay the PS5 like it is a piece of electronic trash because some numbers are smaller.
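
To make the "deterministic, workload-based clocks" point concrete: the clock is picked from a model of the work the game is issuing, not from the chip's temperature, so the same scene lands on the same clock on every PS5. A crude sketch (my own simplification of the Road to PS5 talk, with made-up numbers):

```python
# Crude sketch of deterministic, workload-based clocking (my simplification
# of the Road to PS5 explanation, made-up numbers): the clock is a function
# of estimated power for the work being issued, never of the silicon's
# temperature, so the same scene picks the same clock on every PS5.

POWER_CAP_W = 180                      # hypothetical GPU power budget
MAX_GHZ = 2.23

def gpu_clock(activity: float) -> float:
    """activity: 0..1 estimate of how power-hungry the issued work is."""
    # Dynamic power grows roughly with frequency cubed (f * V^2, with V
    # tracking f), so solve for the highest clock that fits the budget.
    est_power_at_max = 60 + 140 * activity        # made-up power model
    if est_power_at_max <= POWER_CAP_W:
        return MAX_GHZ
    return MAX_GHZ * (POWER_CAP_W / est_power_at_max) ** (1 / 3)

for activity in (0.5, 0.8, 1.0):       # light scene ... worst-case load
    print(f"activity {activity:.1f} -> {gpu_clock(activity):.2f} GHz")
```

With numbers like these it sits at the cap most of the time and only drops a couple of percent in the worst case, which is the behaviour Cerny described.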
 

Hashi

Member
I'm not coming down on either side, but remember those strange rumours that Sony and AMD were working together to create Navi:

[quoted Forbes article]

Interesting; this is from all the way back in June 2018, remember.
Hardware is created by programmers.
Look at G....... side ;)
 


[screenshots]

Thanks to zaitsu for sharing. Taken from that Twitter thread.


That could explain Sony's strange silence over their GPU. Or maybe they just don't want to talk about it because it's inferior.

I guess that AMD event will clear this up.

Edit: I just read through Digital Foundry's Hot Chips analysis and they did confirm the CUs to be RDNA2 ones.
 
Last edited: