
Next-Gen PS5 & XSX |OT| Console tEch threaD


On Demand

Banned
I called it. Next-gen Xbox consoles got extra features that previous-gen Xbox and PS consoles did not.

Bizarre to say the least, downright BS at worst. Most people will be playing this on a PS console, and CDPR actively fucked them over with a shitty version.

I was shocked in the other video when DF called out the marketing deal as a reason why the PS5 BC version doesn't have a quality mode and is lacking in world detail. That's the first time I've seen any publication say that about a 3rd-party game, much less DF.

I expect the same lack of effort for the full PS5 version next year.
 

Ptarmiganx2

Member
For sure, love that game. I've put way more hours into Days than I expected to. Just finished all the hordes after completing the campaign months ago. NG+ time now.

Really impressed by the update too; it looks amazing at 4K checkerboard/60fps.

If I ever get my darn PS5 I want to do a second playthrough. At this rate the PS5 Pro will be out before I get one.
 

Bo_Hazem

Banned
I remember reading you were a teacher. Teach a specific subject?

Arabic teacher. But somehow they force us to go to school (25 km away from home) even though there's mostly only 1 class per day over the internet (Sundays have 2), with shit internet as well. So after that ~45 min I come here to GAF, because I'm currently having a blast with AC Valhalla. The side missions, the ones with short stories from randos, are extremely cringy, written by someone with no idea what a sense of humor is. They are so cringy that you feel insulted.

The Order zealots and members, legendary beasts, and mythical bosses are wonderful. Combat is good once you get the hang of it. The upgrade tree is so satisfying, and the loot is meaningful: you get fewer drops but upgrade them instead and get attached to them. Exploration is satisfying and the lore is beautiful. The main story starts rushed, then gets better and better. Still only 60 hours in and only 4 regions finished out of, like, 9-10? Not sure how many. The dream maps, with their uniqueness, are wonderful.

Overall I think this game is gonna sit around 8 or 9 out of 10 to me.
 

Garani

Member
Eh... it seems pretty likely at this point that Sony did not update the PS4 SDK to give devs the ability to detect that code is running on a PS5.

For Xbox it's all one GDK, so "BC games" can detect the XSX and offer different settings.

In the PS4 SDK there are provisions to enable a PS4 mode (Original, Pro, PS5 Boosted), but we have been shown that the capabilities of PS5 Boosted don't really go further than unlocking the framerate to 60 fps. Everything else is just as PS4 Pro was intended (features, CPU, etc.).
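A hypothetical sketch of the difference being described here (the real PS4 SDK and GDK interfaces are proprietary, so every name, enum and function below is made up): a title queries which hardware tier it is running on and picks a settings profile accordingly. For a PS4 BC title the highest tier visible is effectively a boosted Pro, so there is no branch for PS5-only quality modes the way a GDK title can branch for the XSX.

```cpp
// Hypothetical illustration only -- not the PS4 SDK or GDK API.
#include <cstdio>

enum class HardwareTier { Base, Pro, Ps5Boosted };  // made-up tier names

struct RenderSettings {
    int   targetFps;
    float resolutionScale;
    bool  qualityMode;
};

// Stand-in for an SDK query. On a PS4 BC title the visible tiers stop at
// "boosted", so there is no PS5-specific value to branch on.
HardwareTier QueryHardwareTier() { return HardwareTier::Ps5Boosted; }

RenderSettings PickSettings(HardwareTier tier) {
    switch (tier) {
        case HardwareTier::Base:       return {30, 1.0f, false};
        case HardwareTier::Pro:        return {30, 1.5f, true};
        case HardwareTier::Ps5Boosted: return {60, 1.5f, true};  // framerate unlock only
    }
    return {30, 1.0f, false};
}

int main() {
    const RenderSettings s = PickSettings(QueryHardwareTier());
    std::printf("fps=%d scale=%.1f quality=%d\n", s.targetFps, s.resolutionScale, s.qualityMode);
}
```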

lol - nah - having too much fun currently with Spidey. Though I'm surprised that Tech Jesus hasn't X-rayed his yet...

Has he ever done something like that? I can't remember him doing any die X-rays in the past (but I am not an avid consumer of his videos, so I may very well be wrong).
 

chilichote

Member
I was shocked in the other video when DF called out the marketing deal as a reason why the PS5 BC version doesn't have a quality mode and is lacking in world detail. That's the first time I've seen any publication say that about a 3rd-party game, much less DF.

I expect the same lack of effort for the full PS5 version next year.

If the Series X can only beat the PS5, or merely draw level, thanks to a marketing deal, what would it be without one... ^^
 

Duchess

Member
so does that mean they'll also use PS5 RDNA 3 features for its upgrade?

I'm hoping they make the most of the PS5 tech. That's why I'm holding off on getting the game until I can play it on PS5.

RDNA 3 is basically built from PS5 tech, right..? There were mumblings about that on Twitter back in May, but it could've just been people having pissing contests back then.
 

LiquidRex

Member
I'm hoping they make the most of the PS5 tech. That's why I'm holding off on getting the game until I can play it on PS5.

RDNA 3 is basically built from PS5 tech, right..? There were mumblings about that on Twitter back in May, but it could've just been people having pissing contests back then.
It's believed the Geometry Engine and Cache Scrubbers are either RDNA 3 or could be part of the RDNA 3 feature set. The CPU is also rather unique; its cache is handled similarly to the cache on Zen 3 CPUs.
 

Rea

Member
So apparently next year CDPR will use Xbox Series X RDNA 2 features for the Cyberpunk 2077 upgrade; does that mean they'll also use PS5 RDNA 3 features for its upgrade?
Don't get your hopes up. CDPR always prioritizes Microsoft's Xbox and PC; PlayStation has always been treated as second. Their game engine is built upon DirectX and its features, so porting the game to PlayStation needs extra time and effort, and it sometimes doesn't run very well; see The Witcher 3, for example. When that game launched, the PS4 version ran worse than the Xbox one. Don't expect the Cyberpunk PS5 version to run better than or equal to the Series X version when the next-gen patch first arrives. But I believe that, as time goes by, the PS5 version of Cyberpunk will be on par with, or slightly exceed, the Series X version.
 

Three Jackdaws

Unconfirmed Member
Some people are saying that CDPR might skip heavy optimisation for Cyberpunk 2077 on the PS5, but I would add that CDPR know very damn well that a large chunk of their sales will be on the lead platforms, which are the PS5 and PC, so it would be foolish of them to give the Series X/S priority over the PS5 when the next-gen patch comes. This is just speculation on my part, but who knows. Here are The Witcher 3 sales by platform; I'm sure Cyberpunk will follow a similar pattern.


[Image: The Witcher 3 sales by platform]
 

SkankHunt420

Neo Member
I'm back, guys. After a week of playing Cyberpunk 2077 on PC I want to say a few things:
- Guys, remember The Witcher 3? How it ran on PS4? Good? How polished it was when it came out? If you thought the PS4 would do a stable 30fps you were delusional. Don't get me wrong, it sucks for console-only players, but the game needed this; it looks like they didn't compromise their vision just to make it run well on a 7-year-old piece of hardware.
- I'm glad to inform you guys that my 6-year-old graphics card (GTX 980 Ti) runs this game at 30fps on high settings, and I'm having the time of my life! What an awesome game. Anyone who expected a Deus Ex experience, let me tell you, we got it!
- Definitely the launch disaster makes the game a flawed one (like every other game), and for me right now it's a 9/10. 50 hours in, and not even 10% of the main story 😂
- Last but not least, guys, don't take this out on CDPR. I get it, it sucks, they kinda lied to us because we never saw actual PS4 or Xbone gameplay before release, but the game is seriously good, and we are moving away from the PS4 anyway. The game will scale beautifully on next-gen systems and better graphics cards in the future.

PS: nice to be back, boiz! Any news about those PS5 die shots?
 

HoofHearted

Member
Has he ever done something like that? I can't remember him doing any die X-rays in the past (but I am not an avid consumer of his videos, so I may very well be wrong).

Not sure about CPU/APU/GPU die shot breakdowns - I know he works with Buildzoid on GPU card breakdowns, though.
 

saintjules

Member
What's the issue exactly? You can't use it like that?

Well, you're more likely to put the disc in upside-down when the console is lying horizontally.

The PS5 has always been advertised the other way, i.e. vertically. Does laying it flat damage the console? Probably not.
 

ethomaz

Banned
Well, you're more likely to put the disc in upside-down when the console is lying horizontally.

The PS5 has always been advertised the other way, i.e. vertically. Does laying it flat damage the console? Probably not.
Well, I used my PS1 that way... because it read CDs better (faster) that way.
I had to flip it over to change discs, btw.
 

azertydu91

Hard to Kill
About freaking time! My wife finally scored a PS5 for me! She’s had hers for a month. I don’t get it until the 20th, but that‘s a lot better than 2021.
Great, having 2 PS5s... have you thought about those kids that couldn't get theirs?

I'm just playing with you, congrats. She probably had to kill a few other customers to get it.
If you have the opportunity, can you give us your feedback on streaming your game's video to another PS5?
None of my friends could get their hands on one, so I couldn't try.
 
It's the one I use. Out of the 6 I owned it was the only one that stayed. Worth every dollar
Cool, I wanted it for my PS5 and Series X. If I were to get the Sony Pulse ones I don't think I could use them for the Xbox as well, plus it comes with the Razer Base Station V2 Chroma (a USB hub headset stand with a USB 3.1 hub and 7.1 surround sound powered by Razer Chroma™ RGB, in black) for free.
 

GAF machine

Member
Because the PS5 and Series X have price parity, it's more justified for people to make direct comparisons. It just seems like it was not an accident that the original Xbox and the Xbox One were launched at a higher price than the equivalent PlayStation. And the only reason the 360 was cheaper than the PS3 was because of the Cell.

Not because of CELL, but because of Blu-ray, which was estimated to initially cost over $400 to produce...


CELL initially cost around $100 per chip (timestamped)...



The suspicion that the heart of the Series X was meant to work in servers also seems to reflect the mistake Sony already made with the PS3. The Cell was meant to be used for everything, and ended up not being that good at running games.

Depends on who you ask...





Asynchronous compute on AMD GPUs (Volcanic Islands) and the 64 queues came from the SPU task manager for the Cell processor.


https://patents.google.com/patent/US20070074207A1/en (Cerny was one of the inventors)
:messenger_smiling_with_eyes:

The number of ACEs/compute queues on a GPU has nothing to do with the SPU task manager (STM). According to Cerny it has everything to do with how much asynchronous compute a vendor (or customer in the case of SIE) wants a particular GPU to do for game systems (PS game systems) and middleware (PS middleware):

"For PS4, we’ve worked with AMD to increase the limit to 64 sources of compute commands -- "... "The reason so many sources of compute work are needed is that it isn’t just game systems that will be using compute -- middleware will have a need for compute as well." -- Mark Cerny

What Volcanic Islands can do is schedule tasks and put them in the same number of compute queues that PS4's GPU has. What it (and every other AMD GPU for PC) can't do is use those queues as efficiently as PS4's GPU can. That's where STM-derived customizations to the cache and bus (to reduce overhead associated with context switching) of PS4's GPU come into play. The 'volatile bit' in the article you linked relies on the STM-derived customization (the addition of a second bus) to effectuate a knockoff SPURS on the GPU. As Cerny put it:

"First, we added another bus to the GPU that allows it to read directly from system memory or write directly to system memory, bypassing its own L1 and L2 caches." -- Mark Cerny

"Next, to support the case where you want to use the GPU L2 cache simultaneously for both graphics processing and asynchronous compute, we have added a bit in the tags of the cache lines, we call it the 'volatile' bit. You can then selectively mark all accesses by compute as 'volatile,' and when it's time for compute to read from system memory, it can invalidate, selectively, the lines it uses in the L2."... "-- in other words, it radically reduces the overhead of running compute and graphics together on the GPU." -- Mark Cerny

"We're trying to replicate the SPU Runtime System (SPURS) of the PS3 by heavily customizing the cache and bus," -- Mark Cerny

One of the people who worked on Volcanic Islands had no idea what the 'volatile bit' was until Cerny mentioned it:

"I worked on modifications to the core graphics blocks. So there were features implemented because of console customers. Some will live on in PC products, some won't. At least one change was very invasive, some moderately so. That's all I can say. Since Cerny mentioned it I'll comment on the volatile flag. I didn't know about it until he mentioned it, but looked it up and it really is new and driven by Sony. The extended ACEs are one of the custom features I'm referring to." -- anonymous AMDer

Without STM-derived customization and the 'volatile bit', an AMD GPU for PC has higher context switching overhead. Increasing ACE/compute queue counts won't change that. With STM-derived customization and the 'volatile bit', an AMD GPU becomes something akin to the PS3's SPURS driven SPU+GPU hybrid graphics system with 'radically' reduced context switching overhead, regardless of how few or many ACEs/compute queues it has.
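As a toy illustration of the behaviour those Cerny quotes describe (my own simplified model; the names, structure and granularity bear no resemblance to the actual GPU hardware), the value of the 'volatile' tag is that only compute-marked lines get invalidated, so graphics data sharing the same L2 stays warm:

```cpp
// Toy model of a 'volatile'-tagged L2: compute accesses mark their lines so
// that only those lines are dropped when compute re-reads system memory,
// leaving graphics lines untouched. Illustrative only, not real hardware.
#include <cstdint>
#include <vector>
#include <iostream>

struct CacheLine {
    uint64_t tag        = 0;
    bool     valid      = false;
    bool     isVolatile = false;  // set when the line was filled by an async-compute access
};

class ToyL2 {
public:
    explicit ToyL2(size_t lines) : lines_(lines) {}

    void Fill(size_t idx, uint64_t tag, bool fromCompute) {
        lines_[idx] = {tag, true, fromCompute};
    }

    // Selective invalidation: only compute-tagged lines are dropped, so the two
    // workloads can share the cache with far less context-switching overhead.
    size_t InvalidateVolatile() {
        size_t dropped = 0;
        for (auto& line : lines_) {
            if (line.valid && line.isVolatile) { line.valid = false; ++dropped; }
        }
        return dropped;
    }

private:
    std::vector<CacheLine> lines_;
};

int main() {
    ToyL2 l2(4);
    l2.Fill(0, 0x100, /*fromCompute=*/false);  // graphics line stays resident
    l2.Fill(1, 0x200, /*fromCompute=*/true);   // compute line
    l2.Fill(2, 0x300, /*fromCompute=*/true);   // compute line
    std::cout << "invalidated " << l2.InvalidateVolatile() << " compute lines\n";
}
```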

Entries [0012] and [0013] of the patent you linked explain that SPURS is a type of SPU task system; and this patent credits Keisuke Inoue, Tatsuya Iwamoto and Masahiro Yasue with its invention. On CELL, the SPU task manager is a software component of SPURS that reduces context switching overhead within SPURS software.

I understand that Leadbetter at DF did an amazing job of slighting the PS3 through that generation, and hence your view of the PS3's Cell processor - analysis that certainly wouldn't hold up under any technical scrutiny now, because the 360 was effectively designed around the de facto standard PC 1024x768 resolution (hence the eDRAM size) and shipped without HDMI output in its first revision because of that.

Throughout the generation the 360 had trouble even properly double buffering with AA and depth-cueing fog at a full 720p resolution in most games, including Gears 3, and used reduced frustum sizes (with reduced z-buffer precision all the time), even in the infamous Bayonetta comparison, because of the eDRAM size. Yet every time the eDRAM gave an advantage in foliage (RDR as an example), that became the focal point of all analysis conclusions, and the PS3 version wasn't credited for all the other things it did better.

But the most telling thing about the Cell is that they weren't able to perfectly port AA games like Journey to the PS4 with all FX intact, because it was very good at gaming algorithms - even if completely unfriendly to developers. Sony were a year late to market and unlucky with the Cell fabrication issues at the beginning, so the PS3 with unified memory and 2 Cells - 1 for the CPU and 1 for graphics - was compromised down to 1 Cell with split memory and an expensive RSX GPU that was very good at pushing more polygons and accurate h/w gamma correction, but weaker in raw pixel rate than Xenos, so the affordable Cell solution Sony envisaged wasn't what we got.

From the beginning Leadbetter set a narrative about blur from the free AA the RSX provided, and framed the input lag of the free triple buffering the PS3's VRAM could provide as worse than the screen-tearing double-buffering solution of the early 360 games, so Sony/publishers dropped both of those free advantages.

Thankfully, the anti-PS3 narrative really helped push the advanced tooling on PS3 beyond what developers might have expected the Sony ICE team to provide them, and the results of Sony's first-party games from that generation really do show the Cell's capability in a good light IMHO, even the way they were doing real-time zlib decompression on SPUs and adding 3D to PS3 games in the back half of the gen with such a small performance overhead, IIRC. Killzone 3 in 3D with Move controls wouldn't have been possible looking like that if the Cell wasn't adept at gaming algorithms, IMO.

The PS4, IMO, didn't really take a lot of the PS3 Cell DNA into its design, other than the asynchronous compute and heterogeneous chip design, largely because of cost restrictions after losing market share to the 360, and Sony's own tough finances at the time. But looking at all the advanced customizations in the PS5 - the I/O complex, decompression engine, coherency engines, cache scrubbers, the Geometry Engine, the Tempest Engine and patents by the likes of Cerny - IMHO it feels like all of those advances have been informed by the work done with the Cell SPUs in the PS3 era. It is possible that Sony's largest financial failure, the PS3 using the Cell BE, will have directly led to a massive success two generations later.

I share many of your sentiments. In my opinion the decompression engine, DMAC, I/O co-processors and coherency engines of the I/O complex amount to what is basically an imitation CELL, designed to do in hardware what SPUs did in software, as, according to IBM, CELL is similar to an I/O processor (IOP)...

[Image: IBM Systems and Technology Group slide on CELL]

The dissimilarities are that it has an integrated CPU (the PowerPC-based PPE) and that it isn't limited to just directing inputs/outputs, since one or more of its SPEs (IOP-like in their ability to relieve the PPE of I/O duties) can be partitioned into groups and assigned specific tasks if necessary. A CELL with a sufficient number of PPEs and SPEs could replace all the specialized hardware inside PS5's I/O complex...

[Image: "The PS5 features a custom I/O unit and a custom flash controller" slide]

- SPUs extend to the GPU shader pipeline and maintain coherence with the GPU they assist via GPU <--> SPU synchronization techniques
- as you mentioned, SPUs do decompression (also seen on the above CELL slide; see the sketch after this list)
- each SPE has its own DMAC to transfer data into/out of system memory and storage
- SPEs handle memory mapping and file I/O (handled by PS5's two I/O co-processors)
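As flagged in the list above, here's a minimal sketch of the kind of software decompression work those SPU jobs performed (and which PS5's dedicated decompression block now reportedly handles in hardware for zlib and Kraken streams). It's just the standard zlib one-shot API doing a compress/decompress round trip on the CPU; it isn't SPU code or the PS5 I/O interface:

```cpp
// Software zlib round trip -- the sort of work offloaded to SPUs on PS3 and
// to the hardware decompressor on PS5. Link with -lz.
#include <zlib.h>
#include <cstring>
#include <iostream>
#include <vector>

int main() {
    const char* text = "The quick brown fox jumps over the lazy dog.";
    const uLong srcLen = static_cast<uLong>(std::strlen(text)) + 1;

    // Compress the buffer, then inflate it again.
    std::vector<Bytef> packed(compressBound(srcLen));
    uLongf packedLen = static_cast<uLongf>(packed.size());
    if (compress(packed.data(), &packedLen,
                 reinterpret_cast<const Bytef*>(text), srcLen) != Z_OK) {
        return 1;
    }

    std::vector<Bytef> out(srcLen);
    uLongf outLen = static_cast<uLongf>(out.size());
    if (uncompress(out.data(), &outLen, packed.data(), packedLen) == Z_OK) {
        std::cout << reinterpret_cast<char*>(out.data()) << "\n";
    }
}
```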

Yeah, the popular opinion is that using CELL for PS3 was a "mistake" that resulted in Sony's worst financial failure and a loss of marketshare; but in my opinion the success of PS4/pro and PS5 is directly and largely attributable to the future-oriented nature of CELL/PS3.

In '07 it was reported that Ken Kutaragi viewed PS3's losses as investments. It's apparent to me that PS3's "ROI" (as it relates strictly to gaming hardware) has come mostly in the form of knowledge that's been leveraged across two gens to beef up scrawny budget-conscious APUs. Prior expertise in heterogeneous system architecture (HSA) design for PS3 and the transplanting of CELL-like asynchronous compute onto the GPU put some meat on their bones. Software techniques used to improve the image quality of PS3 games were built up to give the custom APU some muscle definition.

One of the best past examples of this is PS4 pro's hardware-based ID buffer (probably emulated by PS5's Geometry Engine) used for checkerboard rendering. Its origins can be traced back to PS3's software-based SPU MLAA implementation which used object IDs, color info and horizontal splitting to detect and smooth out aliased edges of objects in scenes...

- quick overview of SPU MLAA
- why edge detection is necessary
- horizontal MLAA with splitting illustrated on a checkerboard pattern
- capture of horizontal MLAA with splitting in KZ3
- process of object IDs and color info for edge detection in SOCOM4
- cover for Frostbite Labs checkerboard rendering slide deck
- the history of checkerboard rendering rooted in SOCOM4 object IDs for MLAA edge detection
- history continued to PS4 pro ID buffer hardware and Frostbite Labs support

In Cerny's explanation of the PS4 pro's ID buffer, he said:

"As a result of the ID buffer, you can now know where the edges of objects and triangles are and track them from frame to frame, because you can use the same ID from frame to frame,"... "And I'm going to explain two different techniques that use the buffer - one simpler that's geometry rendering and one more complex, the checkerboard."... "Second, we can use the colours and the IDs from the previous frame, which is to say that we can do some pretty darn good temporal anti-aliasing." -- Mark Cerny

I think your feelings are justified, and the investments (or losses, if you prefer) related to PS3's R&D specifically, and CELL's R&D generally, have been paying dividends in different ways for a long time now.


BASED Ken Kutaragi-sama.
 