
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.

DaGwaphics

Member
It might be possible that RDNA1 and RDNA2 are quite compatible. That would explain why there are hybrid GPUs halfway between the two.

It could just be that the PS5 team realized they needed higher clocks, changed the CUs from RDNA1 to RDNA2 at some iteration, and found they had gained a lot.

The overall architecture was clear, but it is possible that the update from RDNA1 CUs to RDNA2 CUs was not as complex as we thought.

Or maybe the RDNA1 CU was going to be the basis of the RDNA2 CU, just with RT added. However, Cerny decided to modify it, and AMD ultimately adopted that architecture for RDNA2.

That is precisely why MS had to settle for the previous design, since it came out of a collaboration.


It's also very likely that both camps picked clocks based partly on the power envelope they were looking to hit. MS had a wider design, and that, combined with the stated goal of matching the One X's power consumption 1:1, likely limited them to the chosen clocks. Though MS appears to pick clocks very deliberately, based on the TF target.
 

B_Boss

Member
We have seen many games running without such problems on PS5, at much higher quality and with RT. Isn't the pattern obvious so far?

The one thing we have not seen yet, to my knowledge, is the literal same 3rd-party game running on both consoles to be directly compared. If we have, then I must've missed it, honestly. It will be interesting from a scientifically technical perspective in terms of understanding the hardware 🧠.

Edit: I've just read that Yakuza has been shown running on both? Surprised I haven't seen a topic (or post) dedicated to breaking down the differences and similarities.

I honestly either don't remember or haven't seen any games with or without pop-in on the PS5. Can you show me some? I also don't remember seeing a multiplat game on both systems to compare, so I'm hoping it's an optimization issue rather than a system one.

I thought someone here mentioned that GT for the PS5 reveal had pop-in? I don't remember if it was ever proven; only that it was claimed, and even then I can't remember by whom 😅.
 

yewles1

Member
The one thing we have not seen yet, to my knowledge, is the literal same 3rd-party game running on both consoles to be directly compared. If we have, then I must've missed it, honestly. It will be interesting from a scientifically technical perspective in terms of understanding the hardware 🧠.

Edit: I've just read that Yakuza has been shown running on both? Surprised I haven't seen a topic (or post) dedicated to breaking down the differences and similarities.



I thought someone here mentioned that GT for the PS5 reveal had pop-in? I don't remember if it was ever proven; only that it was claimed, and even then I can't remember by whom 😅.
It was mainly the shadows: the same gradual shadow warp-in seen in GT6, just not as pronounced.
 

DaGwaphics

Member
Things aren't that simple, especially when even cheaper SSDs still use 4x lanes. It's a very weird decision, and it'll 100% have an impact; it's just as dumb as going DRAM-less.

Using more than two lanes for 3.5 GB/s is a waste. Better to go 4.0 x2 and let the primary drive and the expansion drive each have their own lanes than to make them both 4-lane drives that have to share.
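A quick back-of-the-envelope check of that lane math (a sketch in Python; the 16 GT/s per-lane rate and 128b/130b encoding are the PCIe 4.0 spec figures, the function name is mine):

```python
# Rough theoretical PCIe 4.0 bandwidth per lane count.
GT_PER_LANE = 16e9      # PCIe 4.0: 16 gigatransfers/s per lane
ENCODING = 128 / 130    # 128b/130b line-encoding overhead

def pcie4_bandwidth_gbs(lanes: int) -> float:
    """One-way theoretical PCIe 4.0 bandwidth in GB/s for `lanes` lanes."""
    return GT_PER_LANE * ENCODING * lanes / 8 / 1e9  # bits -> bytes

print(pcie4_bandwidth_gbs(2))  # ~3.94 GB/s: already covers a 3.5 GB/s drive
print(pcie4_bandwidth_gbs(4))  # ~7.88 GB/s
```

So a Gen4 x2 link alone has headroom over 3.5 GB/s, which is the point being made about not needing four lanes per drive.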
 

Kagey K

Banned
The one thing we have not seen yet, to my knowledge, is the literal same 3rd-party game running on both consoles to be directly compared. If we have, then I must've missed it, honestly. It will be interesting from a scientifically technical perspective in terms of understanding the hardware 🧠.

Edit: I've just read that Yakuza has been shown running on both? Surprised I haven't seen a topic (or post) dedicated to breaking down the differences and similarities.



I thought someone here mentioned that GT for the PS5 reveal had pop-in? I don't remember if it was ever proven; only that it was claimed, and even then I can't remember by whom 😅.
The only other head-to-head comparison we have is Rock Band 4 announcing backwards compatibility and describing load times as "on par" with each other. No video, text only.
 

AeneaGames

Member
Yes, it comes with the system. I am assuming it's included in the box?

If you want to play PSVR, PS4 titles included, on your new PS5, you're going to need to keep your PS4 camera. You'll also need to use the included PS Camera adaptor, which thankfully comes with the system.

Ehmm, the blog post says that there's no purchase required for the camera adapter, but I remember an earlier official statement that you needed to contact Sony so that they would send you one free of charge, so it's not included with the system.

Of course, they might have changed their mind and decided to include it in the box of every PS5.
 

ToadMan

Member
We don't need to wait; we already know that what he wrote there is absurd, completely wrong, and in places simply impossible.
Just remember that this year's Hot Chips has already happened.

Let’s also remember that MS was the company that was so worried customers might think PS3 was more advanced than Xbox 2 they named the box 360 instead.

What I'm saying is that MS knows as well as anyone that the information they published at Hot Chips would be analysed, just as Cerny's presentation was for Sony. Neither company is going to give technical specifics if those specifics go against the marketing message and show a disadvantage in their product.


I don't think the SSD's DRAM is used to cache files...


“SSDs will keep all or a portion of the map in DDR2 or DDR3 (usually). DRAM is much faster than NAND, so the SSD can access the map quickly to increase performance. DDR-type memory loses data when there isn't any power, but NAND stores it even when the power goes out. The SSD keeps a copy of the map on the NAND to reduce the chance that it will lose the map from a power loss. (Technically, the SSD doesn’t lose the data, it just loses the map, so it can't find it).

There are a few different approaches to eliminating DRAM. We only know of a few because the deep inner workings and algorithms tend to be closely held trade secrets. A common method is to build a small amount of memory into the controller. The Phison S11 controller we have in our test pool features 32MB of SRAM built into the controller, but that is a very small amount of memory compared to an external module. Other techniques include compressing the flash translation layer map (essentially, it is a complicated spreadsheet, so it compresses easily) or caching a portion of the map in system memory (HMB). The SSD controller accesses as little as 8 percent of the map frequently, so there are ways to reduce the performance loss.

Unfortunately, DRAMless SSDs also have a sinister side. Updating the map directly on the flash requires small random writes, which takes a bite out of the SSD's endurance. This is a particularly vexing issue with low endurance planar 2D TLC NAND flash. At Computex last June, one SSD vendor told us about an OEM 2D TLC SSD that will burn through the rated endurance in a little over a year. The SSD has to last a year because of the notebook's one-year warranty, but anything beyond a year's worth of use is up to the user to fix. Tactics like that are the driving forces behind putting cheap DRAMless SSDs in $500 notebooks.”

https://www.tomshardware.com/reviews/dramless-ssd-roundup,4833.html
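The map-caching trade-off the article describes can be sketched as a toy model (purely illustrative; the `TinyFTL` class, the slot count, and the page numbers are all made up, not how any real controller works):

```python
from collections import OrderedDict

class TinyFTL:
    """Toy flash translation layer: logical page -> physical page.
    A DRAM-less drive can only keep a small slice of this map close to
    the controller (SRAM, or a host memory buffer); the rest lives in
    NAND and is much slower to reach."""

    def __init__(self, cache_slots: int):
        self.full_map = {}          # "on NAND": the complete map
        self.cache = OrderedDict()  # "in SRAM": hot slice, LRU-ordered
        self.cache_slots = cache_slots
        self.nand_lookups = 0       # slow-path counter

    def write(self, logical: int, physical: int):
        self.full_map[logical] = physical
        self._cache_put(logical, physical)

    def read(self, logical: int) -> int:
        if logical in self.cache:
            self.cache.move_to_end(logical)  # cache hit: fast path
            return self.cache[logical]
        self.nand_lookups += 1               # miss: fetch map entry from NAND
        physical = self.full_map[logical]
        self._cache_put(logical, physical)
        return physical

    def _cache_put(self, logical: int, physical: int):
        self.cache[logical] = physical
        self.cache.move_to_end(logical)
        if len(self.cache) > self.cache_slots:
            self.cache.popitem(last=False)   # evict least recently used

ftl = TinyFTL(cache_slots=4)
for lp in range(8):
    ftl.write(lp, 100 + lp)
ftl.read(7); ftl.read(6)    # hot pages: cache hits
ftl.read(0)                 # cold page: has to go to NAND
print(ftl.nand_lookups)     # 1
```

The point matches the article's 8 percent figure: if most reads hit the small cached slice, a DRAM-less drive loses less performance than the spec sheet suggests, but every miss costs an extra NAND access (and, on writes, extra endurance).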
 
Last edited:

Kagey K

Banned
Let’s also remember that MS was the company that was so worried customers might think PS3 was more advanced than Xbox 2 they named the box 360 instead.
It’s funny you can remember this, while also forgetting that almost everyone who was in charge of making decisions back then is gone from the company.

This is like trying to take the “Crazy Ken” quote from the PS3 era, about how they like to make it hard for developers so games can look better in 5 years, and applying it against the PS5.

Things change and you can either grow with it or cling to the past.
 

ToadMan

Member




Vampire mod?

None of the characters are casting reflections on the floor ... 🤔🤔🤔🤔

Or the reflections are baked in - this is a cutscene screen cap.
 
Let's see what happens come 28th October... Because technically the XSX might not be the most powerful console 🤔
I don't even know what "powerful" is supposed to mean anymore. They should change it to "The console which has the GPU with the most theoretical Teraflops" :messenger_sunglasses:

It's really a pretty useless measure for "power" in the context. IO speeds, custom GPU features and a lot of other stuff will matter, simple as that. Of course, the XSX may very well still prove to be "most powerful" in the end but that Teraflop difference alone is borderline irrelevant.
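For what it's worth, the teraflop figure everyone quotes is just arithmetic on public specs (a sketch; it assumes 64 shaders per CU and 2 FLOPs per shader per clock via FMA, which is how these marketing numbers are derived):

```python
# Theoretical FP32 TFLOPS = CUs * 64 shaders/CU * 2 ops (FMA) * clock (GHz) / 1000.
# Public figures: XSX has 52 CUs at 1.825 GHz; PS5 has 36 CUs at up to 2.23 GHz.
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

print(round(tflops(52, 1.825), 2))  # 12.15 (XSX)
print(round(tflops(36, 2.23), 2))   # 10.28 (PS5, variable-clock peak)
```

Which is exactly why it's a thin metric: it's peak shader math throughput only, and says nothing about IO, caches, or how often the GPU can actually sustain those clocks.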
 
I'm not liking what I'm seeing here. Maybe it's because it's 1080p, but it looks horrendous:



Here it looks much better. Not sure what HW it's running on, but most likely PC:



But still, the draw distance looks current-gen, dirty in the far distance. I'm a big Assassin's Creed fan; I've played all of them except Rogue, as it came to PS4 pretty late. I usually preorder the gold edition, but this time I might wait, as I'm not liking what I'm seeing.

Yeah, to me it doesn't look any better than AC on the last-gen consoles. The animation looks incredibly amateurish after playing TLOU2, but that goes for almost any other game as well, to be fair. It really screams cross-gen or last-gen. The faces, skin rendering, and many other details look average at best, which is disappointing.

Not that I'm surprised or anything, but yeah.
 

FranXico

Member
Just a theory.

It's possible that GitHub was real. But it's also possible that someone manipulated the data to put what they wanted into it. And then they deleted it because they didn't want anyone to review it and see that it was manipulated.

Could explain why it looks off.
He deleted it because he wanted people on ERA to think it was AMD that deleted it... to make it look real.

He achieved that goal, because people took the deletion as AMD sending ninjas after an internal leak, when in reality AMD never gave a shit about the data.

It was real, but outdated even by the time it "leaked": very early tests, many features disabled. CU count isn't everything. The people who spread it tried to make it sound more recent than it was.
 

Kagey K

Banned
I don't even know what "powerful" is supposed to mean anymore. They should change it to "The console which has the GPU with the most theoretical Teraflops" :messenger_sunglasses:

It's really a pretty useless measure for "power" in the context. IO speeds, custom GPU features and a lot of other stuff will matter, simple as that. Of course, the XSX may very well still prove to be "most powerful" in the end but that Teraflop difference alone is borderline irrelevant.
This is my favorite argument on the internet.

Teraflops don’t matter.... Unless you say a magic number of teraflops (8 was scared of 7 because of what 7 ate) then you get banned.

So Teraflops don’t matter, until they do, and then they don’t again.

A whole lot of tera flip flopping when it comes to this argument.
 

ToadMan

Member
It’s funny you can remember this, while also forgetting that almost everyone who was in charge of making decisions back then is gone from the company.

This is like trying to take the “Crazy Ken” quote from PS3 about how they like to make it hard for developers so games can look better in 5 years and applying against the PS5.

Things change and you can either grow with it or cling to the past.

I recall this because I find it hilarious that a corporation can have such disrespect for the intelligence of its audience and let marketing messaging dominate the technology.

It’s also funny because despite that being the 360 era, that legacy of “2-phobia” is being felt so hard today they’re still using “Xbox one series x” and confusing themselves.
 
I recall this because I find it hilarious that a corporation can have such disrespect for the intelligence of its audience and let marketing messaging dominate the technology.

It’s also funny because despite that being the 360 era, that legacy of “2-phobia” is being felt so hard today they’re still using “Xbox one series x” and confusing themselves.
It’s funny. They were afraid of Xbox 2 vs PS3, and then released the Xbox One vs PS4.

And at the end of the day, all that consumers wanted was a pattern. People were expecting the Xbox 720 because... patterns. Humans like patterns.
 

onesvenus

Member
Vampire mod?

None of the characters are casting reflections on the floor ... 🤔🤔🤔🤔

Or the reflections are baked in - this is a cutscene screen cap.
Look at the character on the left. It is obviously casting a reflection. Looking at the others, they do too, although those are not as clear (maybe due to the marble material?)
It's simulating marble (I think?), not a perfectly polished mirror. Also, that could be done using SSR in that instance.
Notice the cans in the lower left portion of the screen. If it were SSR, wouldn't those be reflected?
 
This is my favorite argument on the internet.

Teraflops don’t matter.... Unless you say a magic number of teraflops (8 was scared of 7 because of what 7 ate) then you get banned.

So Teraflops don’t matter, until they do, and then they don’t again.

A whole lot of tera flip flopping when it comes to this argument.
Yeah, to be fair, most people probably don't really know what a teraflop is or what is actually being measured. But it's convenient for people to have some kind of metric to compare, and then they assume a higher number must be "better", right? Like the bit wars, or, even better, megapixels when it comes to cameras.

Many people believe that a camera with a higher-resolution sensor will give better image quality, but then ignore things like the quality of the lens you're using, pixel size, dynamic range, and what not. Of course a 24 MP professional DSLR with a high-quality lens will give you professional images, while a 40 MP smartphone camera will give you utter crap in comparison, etc. etc.
 

Kagey K

Banned
you can expect Jon Snow himself to get on his knees and blow you. :messenger_heart:
Bullshit.

The community put up with a lot of shit from you, and if it all turns sour next week there needs to be a real apology.

This “funny“ fish thing is done. A bunch of you have turned the community toxic.

Seriously now if next week doesn’t go as you planned, how do you expect to make it up to them?
 

ToadMan

Member
Look at the character on the left. It obviously is casting a reflection. Looking at the others they also do, although those are not so clear (maybe due to the marble material?)

I don’t see a reflection from any character.

I think you’re looking at the one character that happens to line up with the reflection from the scenery.
 
Last edited:

sircaw

Banned
Bullshit.

The community put up with a lot of shit from you, and if it all turns sour next week there needs to be a real apology.

This “funny“ fish thing is done. A bunch of you have turned the community toxic.

Seriously now if next week doesn’t go as you planned, how do you expect to make it up to them?

Listen bud, you stood by while 52 points of fucking FUD were spread by your friends and comrades in the Xbox community on this forum. Not once did you try to set them straight, not once did you ask them to fucking apologize for their bullshit in TURNING THE COMMUNITY TOXIC.

Don't play the fucking victim card with me, or give me your holier-than-thou bullshit. You're upset because other people's technical views aren't lining up with yours.

This is a speculation thread, 10 days to go TICK TOCK.
 
I think so... But the floor is so much shinier on the XSX... However, the character models don't appear to be reflected. 🤷‍♂️

I wonder if RTX 3090 image is set to RT off.
And the XSX is just screen space. 🤷‍♂️

Yeah, I'm not sure. If they were pre-rendered, you'd wonder why they would look different. Why would they decide to go back through and render them again? However, their pipeline may be to generate the render images within the engine at ultra, ultra settings (something that would run at far less than even 30fps) and then composite them together to make 30 or 60 fps video.

However, from playing the game in the past, I'm pretty sure this section (and other sections like it) are pre-rendered in some form.

My gut says they just rendered them again with some slightly different settings...
 
Last edited: