
Kotaku: Sony is working on a 'PS4.5', briefing devs on plans for a more powerful PS4

I don't know what hardware Sony will integrate into this new PS4, but current-gen games are very demanding, especially PS4 exclusives; running them at 4K with ease is hard even for currently released GPUs, unless Sony is opting for a new prototype GPU or APU not released yet. I bet the devs who are working with its devkits don't know the exact specs of the final product. But first, UHD TVs should get more affordable.
 

hesido

Member
Yes to everything but anti-aliasing, which developers could just skip and allow the OS to do using the Xtensa DPU. Same with upscaling to 4K.

There are several types of anti-aliasing, and many games now resort to custom software solutions anyway (full post-process anti-aliasing, temporal anti-aliasing, multi-sampling within pixel shaders, MSAA, etc.), so I don't think the OS can provide a be-all, end-all solution for that. There are even techniques now, thanks to console APIs, that render at 540p with 4xMSAA and output at 1080p with almost native-quality results, which provides massive bandwidth and computation reductions; interestingly, the same approach could be used to get 4K renders much more efficiently by employing 1080p with 4xMSAA. You cannot take all those intricate options away.
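To put rough numbers on why that trick saves so much work (illustrative arithmetic only; the exact saving depends on how the MSAA samples get reconstructed):

Code:
# Sample-count arithmetic behind the 540p + 4xMSAA -> 1080p idea (rough numbers,
# not taken from any particular engine).
def pixels(w, h):
    return w * h

native_1080p_shaded = pixels(1920, 1080)      # 2,073,600 pixel-shader invocations
msaa_540p_coverage  = pixels(960, 540) * 4    # 2,073,600 coverage samples -> same grid density
msaa_540p_shaded    = pixels(960, 540)        # ~518,400 shader invocations, roughly 1/4 the work

# The same ratio scales up: 1080p + 4xMSAA matches the coverage-sample count of native 4K.
native_4k_shaded    = pixels(3840, 2160)      # 8,294,400
msaa_1080p_coverage = pixels(1920, 1080) * 4  # 8,294,400

print(native_1080p_shaded, msaa_540p_coverage, msaa_540p_shaded)
print(native_4k_shaded, msaa_1080p_coverage)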
 

onQ123

Member
Truth? There is no high end market, just a console market as it stands.

PS4 & Xbox One are high-end consoles while FireTV, AppleTV & nVidia Shield TV are the lower-end consoles.

And from the looks of things, Polaris 11 is going to bring on even more "consoles" / set-top boxes.
 

Uhyve

Member
I don't know what hardware Sony will integrate into this new PS4, but current-gen games are very demanding, especially PS4 exclusives; running them at 4K with ease is hard even for currently released GPUs, unless Sony is opting for a new prototype GPU or APU not released yet. I bet the devs who are working with its devkits don't know the exact specs of the final product. But first, UHD TVs should get more affordable.
As someone who runs games at 4K on my 1080p monitor, you'd be surprised by how awesome supersampled games look; you wouldn't need a UHD TV to appreciate a PS4K. It's just a little unrealistic to expect it to transparently enable 4x the resolution for all current PS4 games, even with this being the year of awesome new GPUs. But then I was one of the people who said even 4GB of GDDR5 in the PS4 was probably unrealistic, so who knows.
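For anyone wondering what supersampling onto a 1080p screen actually does, here's a minimal sketch (a plain box filter; real downsampling filters are usually smarter than this):

Code:
# Render at 2x in each dimension, then average each 2x2 block down to one output pixel.
def downsample_2x(frame):
    """frame: list of rows of grayscale values, with dimensions divisible by 2."""
    out = []
    for y in range(0, len(frame), 2):
        row = []
        for x in range(0, len(frame[y]), 2):
            total = frame[y][x] + frame[y][x+1] + frame[y+1][x] + frame[y+1][x+1]
            row.append(total / 4)   # every output pixel is built from 4 rendered samples
        out.append(row)
    return out

rendered_at_4x = [[0, 0, 255, 255],
                  [0, 0, 255, 255],
                  [255, 255, 0, 0],
                  [255, 255, 0, 0]]
print(downsample_2x(rendered_at_4x))  # [[0.0, 255.0], [255.0, 0.0]]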
 

onQ123

Member
onQ123, in the PS4 the Xtensa DPU is in the Southbridge and would be massively easier to upgrade than an APU. The problem then is the PCIe interface bandwidth and latency between the APU and Southbridge.

The video you referenced is about apps, not games. You and I know something about the power of the DPU as a stream processor with hundreds of CPUs, something like 20X more efficient than a CPU at those video and audio tasks: from ultra-low-power key-phrase audio recognition to gesture/face/head tracking, depth-map generation from stereo video, codecs, encryption and decryption, up- and down-scaling resolution (digital bridge), optical distortion for VR goggles, frame-rate doubling for VR, and anti-aliasing/video processing.

The XB1, on the other hand, has the Xtensa DPU inside the APU, and while it's less efficient at this task it could act as a GPGPU-style accelerator for games. Offloading anti-aliasing from the GPU could result in a potentially large increase in game performance if the freed GPU cycles could be used for the game.

From what I can see, the DPU is part of Starsha, and Starsha is part of the GPU because it has the DCE (Display Controller Engine).



https://fail0verflow.com/media/32c3-slides/#/6


From the leak

PS4:

New Starsha GNB 28nm TSMC
Milos
Southern Islands

DX11
SM 5.0
Open CL 1.0
Quad Pixel pipes 4
SIMD’s 5
Texture Units 5TCP/2TCC
Render back ends 2
Scalar ALU’s 320

EDIT: Some of those were crossed out; maybe they were updated/changed at a later date, I have no idea.
Quote:
Couple of more updates

Graphic North Bridge(GNB) Highlights
Fusion 1.9 support
DCE 7.0
UVD 4.0
VCE

IOMMU
ACP
5x8 GPP PCIE cores
SCLK 800MHz/LCLK 800MHz
 

jeffram

Member
Even if it runs at a lower resolution (1920 x 2160, or 1440p) it's still going to look noticeably better on UHD or HD screens than the vanilla PS4.

I know that we are looking at 2-2.5x efficiency gains compared to 28nm, but isn't that for the entire APU? Couldn't they shrink the other bits of it and actually expand the GPU footprint in the same-size APU for the same cost as the launch APUs?
 
As someone who runs games at 4K on my 1080p monitor, you'd be surprised by how awesome supersampled games look; you wouldn't need a UHD TV to appreciate a PS4K. It's just a little unrealistic to expect it to transparently enable 4x the resolution for all current PS4 games, even with this being the year of awesome new GPUs. But then I was one of the people who said even 4GB of GDDR5 in the PS4 was probably unrealistic, so who knows.

Supersampled 4K on a 1080p screen? I'd have to see that, even though it would be optimal on a UHD TV, hence the UHD Blu-ray and the UHD-resolution upgrade for games.
Anyway, I am almost sure that by the time of this new PS4's release, UHD TVs will be more affordable. I wonder if the new console will have a new design and a new controller, but I doubt it.

You do see that their source is the redditor with no proof, right? The one who said he was crappy at art in the drawing, claimed it was DDR5, doubled down on it, then said sorry he's an art guy.

It's fake, and that website is dumb to run with it.

Yeah, the source felt fake to me too. But logically, Nintendo is able to release a console slightly more powerful than the PS4 these days. That is to be expected.
 

LordOfChaos

Member
Yeah, the source felt fake to me too. But logically, Nintendo is able to release a console slightly more powerful than the PS4 these days. That is to be expected.


Oh it's easily possible technically, especially if they ship on 14nm finfets after the huge 28nm stall. But it's all a matter of how much Nintendo reflected and changed.

I remember threads full of people saying 600 Gflops was the absolute worst-case scenario for the Wii U, that Nintendo could not even order a part less powerful. Yet here we are with the 160-shader box coming in under 200 Gflops.

After all the Revolution and Project Cafe speculation, I'm just ready to wait till launch and see teardowns to determine what's in it, rather than speculate.
If one thing is true in the universe it's that Nintendo's gonna Nintend, and could surprise us either way.
 

Metfanant

Member
Even if it runs at a lower resolution (1920 x 2160, or 1440p) it's still going to look noticeably better on UHD or HD screens than the vanilla PS4.

Shouldn't 1080p upscale perfectly to 4k as it is, and look the same?

1px(1080p) = 4px(UHD)

1920x2160 or 1440p would require legit upscaling and introduce all the related artifacts etc...no?
 

Ushay

Member
I'm assuming many of you are in the hardware business? Just curious; some of you have really good knowledge of these topics, and I'm glad to see it.
 

Vashetti

Banned
Shouldn't 1080p upscale perfectly to 4k as it is, and look the same?

1px(1080p) = 4px(UHD)

1920x2160 or 1440p would require legit upscaling and introduce all the related artifacts etc...no?

1920x2160 would upscale perfectly (I think) as 1920 is exactly half of 3840.
 

vpance

Member
Shouldn't 1080p upscale perfectly to 4k as it is, and look the same?

1px(1080p) = 4px(UHD)

1920x2160 or 1440p would require legit upscaling and introduce all the related artifacts etc...no?

Perfect upscaling from 1080p just means that there are no artifacts from upscaling but there's still less unique detail than 1440p.
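As an aside, a toy example of why the 1080p-to-2160p case is the clean one (nearest-neighbour pixel doubling; not how any particular console's scaler is actually implemented):

Code:
# Integer 2x upscale: each 1080p pixel maps to an exact 2x2 block of UHD pixels,
# so no blending between source pixels is needed. A 1440p -> 2160p scale (factor 1.5)
# would have to interpolate instead.
def nearest_neighbor_2x(frame):
    out = []
    for row in frame:
        doubled = [px for px in row for _ in range(2)]  # repeat each pixel horizontally
        out.append(doubled)
        out.append(list(doubled))                       # repeat each row vertically
    return out

tiny_patch = [[1, 2],
              [3, 4]]
print(nearest_neighbor_2x(tiny_patch))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]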
 
One thing is for sure: Microsoft had better have something in place to save the Xbox One, because if this becomes the standard PS4 it's going to make the Xbox One look really bad when games are 4K vs 900p.

Maybe this is the reason MS is making the move to get devs to make UWP games right now, so they will have the high-end PC to fall back on and Xbox games don't look bad when compared to an upgraded PS4. At the same time, Sony could be making this move so PS4 games don't look bad next to UWP games.


Haven't MS been showing UWP games running in 4K lately?

Why do you think MS won't have an upgraded box as well? There were rumors about MS going this route even before the generation started, and as you pointed out, they already have the tools to make it possible. Plus, older Xbone games running over a VM makes them easy to support on heftier consoles as well.

We also had reports that MS is mandating Xbone development switch over to UWP, so they are already taking the next step in decoupling development from the hardware; after that, releasing a newer box would likely be among the next steps.
 

GameSeeker

Member
New article from the Wall Street Journal:

http://www.wsj.com/articles/sony-plans-new-playstation-for-graphics-heavy-games-1459152941

No new information beyond what is already rumored, but more proof that Sony is planning an improved PS4 model. The WSJ tends to have a very good track record in these matters.

Sony Corp. is planning to sell a more powerful version of its PlayStation 4 machine to handle higher-end gaming experiences, including virtual reality, people familiar with the matter said, while continuing production of its existing console that has so far sold more than 36 million units world-wide.

Existing PlayStation 4 owners would need to buy the new model to take full advantage of the enhanced graphics and power, though it is likely that the current model and the coming one would share the same software catalog, one of the people said.

The new console would be announced before the planned October release of the PlayStation VR, Sony’s new virtual-reality headset, the people said. It would be able to handle ultra-high-definition resolution graphics. The upgraded console would also provide more power for running the PlayStation VR, whose main competitors, Facebook Inc.’s Oculus Rift and HTC Corp.’s Vive, are designed to work with top-shelf computers.
 

dumbo

Member
I know that we are looking at 2-2.5x efficiency gains compared to 28nm, but isn't that for the entire APU? Couldn't they shrink the other bits of it and actually expand the GPU footprint in the same-size APU for the same cost as the launch APUs?

Kind of - but if you make an APU with 4x the performance, you'd basically need 4x the memory bandwidth.

In general, it seems memory is the big question. For 4K you'd want higher-res textures and larger render targets... for 60fps you'd just want more bandwidth...
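Just to sketch the scale of that 4x figure (back-of-the-envelope only; the ~176 GB/s number is the commonly cited peak for the launch PS4's GDDR5, and real bandwidth demand doesn't scale exactly 1:1 with resolution):

Code:
# If bandwidth had to grow in line with pixel count, 1080p -> 4K implies 4x.
ps4_peak_bandwidth_gbs = 176                      # widely cited launch spec, approximate
pixel_scale = (3840 * 2160) / (1920 * 1080)       # = 4.0
print(ps4_peak_bandwidth_gbs * pixel_scale)       # ~704 GB/s if scaled 1:1
# Caches, texture reuse, and framebuffer compression soften this in practice,
# which is why GDDR5X/HBM2 keep coming up in this thread.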
 

LordOfChaos

Member
ARM? What? o_O

That site is full of it for citing the redditor for reasons I pointed out above, but I think ARM is perfectly feasible. An AMD designed ARM SoC, why not?

A Cortex A72 in the stationary console, for instance, is certainly in the same ballpark as the Jaguar cores, with some ups and downs to either. x86 may have been easier for PS4/XBO toolchains, but ARM has a performance-per-area (die size) and performance-per-watt advantage, which Nintendo seems to care about.


8 A72s in the stationary console would be damn fine to me.
 
I find it really hard to believe it will be able to run games at UHD. It'd have to be a MASSIVE GPU/CPU increase.

Indeed, but seeing how they are handling VR, I'm thinking they will have something similar in place; for instance, using the same tech KZ did, applied automatically whenever the render is not 4K native (in a way that's transparent to the developer).

It wouldn't be the same as native 4K, but it would be better than just 1080p, and I would assume that going from 1080p to 4K is going to give better results than going from 720p or below to 1080p (to our eyes at least) due to the already high pixel density.
 

Uhyve

Member
Kind of - but if you make an APU with 4x the performance, you'd basically need 4x the memory bandwidth.

In general, it seems memory is the big question. For 4K you'd want higher-res textures and larger render targets... for 60fps you'd just want more bandwidth...
If they're just looking at render resolution and not bothering with higher-resolution textures, I wouldn't be surprised if they go with 8GB again and just claim back 1GB from the reserved OS memory. In that case, improved memory bandwidth would actually be an easy win (especially with HBM2 on the horizon); targeting only render resolution would mean they only need to improve the bandwidth enough to deal with the extra hundreds of MBs of render target data.
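A rough sizing sketch of where those extra MBs come from (the buffer count and formats here are hypothetical, just to show the order of magnitude):

Code:
# How much render-target memory grows going from 1080p to 4K with a handful of buffers.
def target_mb(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel / (1024 * 1024)

# e.g. four 4-byte G-buffer targets plus a 4-byte depth buffer = 20 bytes per pixel
for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    print(name, round(target_mb(w, h, 20), 1), "MB")
# 1080p ~39.6 MB, 4K ~158.2 MB: roughly 120 MB more for these buffers alone,
# before counting intermediate/post-processing targets.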
 

slash3584

Member
Probably a stupid question and also probably already answered, but let's say they release a PS4.5 with improved specs.

Would these changes improve the performance of older games without them being patched or something?

I don't mean jumps from 900p to 1080p or 30fps to 60fps but more like stable framerate in games that suffer some drops like TW3.
 

THE:MILKMAN

Member
Probably a stupid question and also probably already answered, but let's say they release a PS4.5 with improved specs.

Would these changes improve the performance of older games without them being patched or something?

I don't mean jumps from 900p to 1080p or 30fps to 60fps but more like stable framerate in games that suffer some drops like TW3.

They would have to be patched I think. It shouldn't be a problem, though. Games now have multiple performance patches post release (even day 1!)
 
A beefed-up APU with an enhanced CPU and dual Polaris GPUs is the only way I can see Sony hitting 4K 30fps on standard PS4 games. To meet TDP, heat, and cost requirements it would have to be an off-the-shelf, lower-end 14nm FinFET part with GDDR5 or GDDR5X. Just my 2 additional cents. Dual Polaris 10s a la 480X???

Edit: In other words, 4K at 30fps isn't happening mid-life-cycle. Gosh, I hope they don't try to pull this off.
 

onQ123

Member
Why do you think MS won't have an upgraded box as well? There were rumors about MS going this route even before the generation started, and as you pointed out, they already have the tools to make it possible. Plus, older Xbone games running over a VM makes them easy to support on heftier consoles as well.

We also had reports that MS is mandating Xbone development switch over to UWP, so they are already taking the next step in decoupling development from the hardware; after that, releasing a newer box would likely be among the next steps.


I'm not sure if it's going to be a much higher-specced console from them at the moment. I think they will match the PS4 specs with a small STB like FireTV, have other Windows 10 devices, rely on the PC for the high-end UWP games, and maybe release a higher-end STB in a year or so.
 

benzy

Member
Including dynamic resolution titles that didn't stick reliably to native resolutions the list in question is pretty close to 50% sub-720p, which is actually better than I remembered but still not exactly stellar. A lot of high-profile titles, especially cross-platform ones, wound up scaling from lower resolutions.

The point stands that native HD was being touted in generation 6, and was still not a given in generation 7. Expecting a sudden leap to 4K native half-way through generation 8 is absurdly optimistic. Not that I wouldn't be happy to see the breakthrough happen, but you can count me as deeply skeptical. I'm expecting upscaling to play a dominant role in 4K console gaming for years to come.

Around 80% of the games on that list are 720p native or above... Very few games last gen had dynamic resolution scaling; the only one I know of is Wipeout HD, which dynamically scaled from full 1080p with 1280x1080 as the bare minimum resolution. Where are you getting the dynamic-res information from that would account for almost half of that list being sub-720p? 6th gen wasn't touted as HD; a select few games boasted a faked 1080i output, but that's it.
 
Party Chat already uses TrueAudio for VoIP encoding/decoding, IIRC.
Yes, Sony is using the DSP but hasn't given developers APIs to use the DSP or accelerators. Same with apps and HTML5 <video> MSE/EME with embedded PlayReady. When that happens, apps drop in size. There is no reason Netflix is 1.1 GB or YouTube is 147 MB; YouTube should be less than a few KB in size, as it's completely a web browser app and the PS4 has a browser desktop UI.

Edit: The Southbridge ARM SoC has its own 256MB of DDR3. At the time of the PS4's release, Cadence didn't have a GDDR5 memory controller, and there is no VCE in the XB1 or PS4 APU. The XB1 uses Xtensa DPUs for both codec encode and decode, including H.264 and HEVC. When Sony decided to use GDDR5 they had to move the ARM block out of the APU. Both the XB1 and PS4 use Xtensa DPUs as accelerators, and they need an ARM AXI bus, which supports NoC with DDR3 memory.



With minor changes the PS4 APU can use GDDR5X for more bandwidth, and the XB1 could use DDR4, as that is supported by Cadence.
 

LordOfChaos

Member
Yes, Sony is using the DSP but hasn't given developers APIs to use the DSP or accelerators. Same with apps and HTML5 <video> MSE/EME with embedded PlayReady. When that happens, apps drop in size. There is no reason Netflix is 1.1 GB or YouTube is 147 MB; YouTube should be less than a few KB in size, as it's completely a web browser app and the PS4 has a browser desktop UI.

So you've said a few times; can you provide a source? From Cerny interviews it sure sounded like devs could already offload to the TrueAudio-like DSP. It was touted as an advantage over the 7th gen where, say, the 360 would often have half or more of the resources of its third core tied up with audio.
 

duhmetree

Member
The problem is not whether there's already a DPU or not, or where it is on the chip. We're talking about a 4x performance increase over the PS4 APU to be able to render 4K without developer intervention and optimization.

Sony could quadruple the PS4's 140 GB/s bandwidth. Add in a better CPU and GPU... then there's the dev optimization...

It's not impossible, it just comes down to cost.
 
So you've said a few times; can you provide a source? From Cerny interviews it sure sounded like devs could already offload to the TrueAudio-like DSP. It was touted as an advantage over the 7th gen where, say, the 360 would often have half or more of the resources of its third core tied up with audio.
What he says is contradictory.

We already know that the initial OS RAM allocation is around ~3GB. 15min H.264 720p30 videos require ~1GB of space. With further optimization and the latest (v3.50) SDK, then perhaps the total OS RAM allocation could be brought down to ~1.5-2GB (half of that is just for the PVR). Ideally FreeBSD shouldn't need more than 512MB of RAM (the PS4 OS/UI is extremely barebones for the time being).

The H.264 decoder/encoder is inside the APU, otherwise it wouldn't be able to function, since it needs direct access to the GDDR5 pool and the GPU framebuffer.

It's the same thing with the TrueAudio DSP: it has an allocation of 64MB GDDR5 RAM and it needs the GPU for audio processing (GPGPU).

It doesn't make any sense to put these dedicated co-processors in the southbridge, not to mention the bus between the APU and the SB would cause a severe bottleneck.

The SB has 256MB of DDR3 (or is it LPDDR3?), which is too little and too slow for these kinds of multimedia tasks.
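For what it's worth, the "~1GB for 15 minutes of 720p30 H.264" figure quoted above back-solves to a plausible bitrate (the actual PVR bitrate isn't public as far as I know):

Code:
# Sanity check: what bitrate does ~1GB per 15 minutes imply?
size_bytes = 1 * 1024**3                        # ~1 GB of recorded video
duration_s = 15 * 60                            # 15 minutes
bits_per_second = size_bytes * 8 / duration_s
print(round(bits_per_second / 1e6, 1), "Mbps")  # ~9.5 Mbps, reasonable for 720p30 H.264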
 

ZoyosJD

Member
Even if it runs at a lower resolution (1920 x 2160, or 1440p) it's still going to look noticeably better on UHD or HD screens than the vanilla PS4.

I know that we are looking at 2-2.5x efficiency gains compared to 28nm, but isn't that for the entire APU? Couldn't they shrink the other bits of it and actually expand the GPU footprint in the same-size APU for the same cost as the launch APUs?

To answer your questions and give a little general background:

1920x2160 would scale more cleanly to 4K than 2560x1440, but the scaling would be done on the system as it is a nonstandard resolution.

The "2-2.5x" gains being quoted are from AMD's 300 series cards to Polaris. Those gains are a combination of architecture and the node change. The PS4 is based on 7000 series architecture. Since then we have seen the introduction of the 200 and 300 series cards with minor architectural changes themselves.

Thus we come to this situation where the node shrink alone is enough for a resolution like the 1920x2160 you mentioned, but not full 4K. It's the architectural improvements, in addition to the die shrink, that would put 4K within spitting distance.
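The raw pixel counts make the gap clear (pure arithmetic; how far a given GPU actually gets obviously depends on the game):

Code:
# Pixel-count comparison for the resolutions discussed in this thread.
resolutions = {
    "1080p (PS4 target)": (1920, 1080),
    "1440p":              (2560, 1440),
    "1920x2160":          (1920, 2160),
    "4K / UHD":           (3840, 2160),
}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    print(f"{name:20s} {w * h:>9,d} px  {w * h / base:.2f}x 1080p")
# 1080p = 1.00x, 1440p = 1.78x, 1920x2160 = 2.00x, 4K = 4.00x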

Herein lies the problem: architectural revisions (as minor as they may be) and "adding little bits" onto the APU complicate the backward/forward compatibility potential, particularly in a gen where some games no longer have devs to support and potentially patch them. Thus some are trying to imagine a system in which all games would be natively rendered at 4K without any effort from devs.

We will also need increases in CPU, RAM capacity, and bandwidth to support 4K-resolution games.

16nm is an entirely new process node built around 3D FinFETs, and it guarantees that yields will be lower for the same-size APU die, and thus more costly per area than 28nm.

But first, UHD TVs should get more affordable.

The Vizio M series is an amazing value for the price.
 
What he says is contradictory.

We already know that the initial OS RAM allocation is around ~3GB. 15min H.264 720p30 videos require ~1GB of space. With further optimization and the latest (v3.50) SDK, then perhaps the total OS RAM allocation could be brought down to ~1.5-2GB (half of that is just for the PVR). Ideally FreeBSD shouldn't need more than 512MB of RAM (the PS4 OS/UI is extremely barebones for the time being).

The H.264 decoder/encoder is inside the APU, otherwise it wouldn't be able to function, since it needs direct access to the GDDR5 pool and the GPU framebuffer.

It's the same thing with the TrueAudio DSP: it has an allocation of 64MB GDDR5 RAM and it needs the GPU for audio processing (GPGPU).

It doesn't make any sense to put these dedicated co-processors in the southbridge, not to mention the bus between the APU and the SB would cause a severe bottleneck.

The SB has 256MB of DDR3 (or is it LPDDR3?), which is too little and too slow for these kinds of multimedia tasks.
1) First, the Xtensa DPU is a stream processor which has memory of its own and needs very little system memory to process a video stream.
2) You have a hard disk connected to the Southbridge.
3) The video buffer is in the Southbridge. For DRM reasons the video buffer has to be in the TEE with the codec, player, DRM, etc.; ARM recommends the Southbridge be in the same TEE too.
4) Since the video buffer is in the Southbridge, the Southbridge can read that buffer, H.264-encode it, send it to the hard disk, and at the same time send video to the HDMI. Video only needs to move one way, from the APU to the Southbridge, through the PCIe.

The OS is a combination of the APU and Southbridge. My understanding is that with full-screen video the GPU is off and the GDDR5 memory is in self-refresh/standby; it can wake in a few clock cycles when video ends or some condition requires it. There is a requirement that media playback draw less than 21 watts for a Blu-ray player or computer IPTV device, and there is no way to meet that 21-watt requirement with the GPU on, nor to meet network-standby requirements with GDDR5 memory. That is one of several reasons for an ARM block in the XB1 APU using DDR3 memory, with power islands that allow the GPU and other components to be turned off, and for the PS4's Southbridge being an ARM block with its own DDR3 memory, with the APU and GDDR5 able to be "turned off" (put into self-refresh).

And before you jump on me, the above is what is supposed to happen but hasn't been implemented yet... The PS4 draws more power in video playback mode than the PS3. With HTML5 <video> MSE/EME enabled in the Southbridge, Sony will enable the media power mode, apps will drop in size, and three or four apps including video chat can be in memory at the same time. Since it uses a zero-memory-move embedded scheme where the player software is always loaded and used by all apps, only registers, pointers, and copies of stacks need to be kept in memory for each application.

The PS4's commercial WebMAF framework uses a 2-megabyte Mono engine that calls web browser native libraries, and everything is loaded into memory at boot by the ARM block as part of a trusted boot.

You are really saying the XB1 with DDR3 memory can't support HEVC with the Xtensa stream processors when in June they announced both HEVC encode and decode?
 

Screaming Meat

Unconfirmed Member
There will always be a cheaper or more powerful console down the road if you wait long enough, and nothing about iterative consoles says they'd be obsolete any faster.

While I understand that, if I have to fork out a huge chunk of money for a console, then I'd rather it be the better performing/latest machine and not have another machine come along a year or so later (or whatever) that is incrementally better.
 