
Next-Gen PS5 & XSX |OT| Console tEch threaD


CyberPanda

Banned
Old, but relevant I think.

PIX 1803.16-raytracing – DirectX Raytracing support

Brian

March 19th, 2018

Today we released PIX-1803.16-raytracing which adds experimental support for DirectX Raytracing (DXR).

As just announced at GDC this morning, Microsoft is adding support for hardware-accelerated raytracing to DirectX 12, and with this release PIX on Windows supports DXR rendering, so you can start experimenting with this exciting new feature right away. Please follow the setup guidelines in the documentation to get started.

PIX on Windows supports capturing and analyzing frames rendered using DXR, so you can start debugging and improving your raytracing code. DXR support is seamlessly integrated with the features of PIX on Windows you already know and use for your D3D12 titles. Specifically, PIX allows you to inspect the following properties of your raytracing rendering:

  • The Events view shows new API calls for DXR such as DispatchRays and BuildRaytracingAccelerationStructure.
  • The Event Details view provides additional insight into each of the DXR API calls.
  • The Pipeline view shows the resources used for raytracing such as buffers and textures as well as visualization of acceleration structures.
  • The API Object view shows DXR API state objects to help you understand the setup of your raytracing.
Raytracing is an upcoming Windows 10 feature and a new paradigm for both DirectX and PIX on Windows; consequently, we plan to evolve PIX on Windows significantly in this area based on input from developers. Please see the documentation for details on the limitations of this initial release, and use the feedback option to tell us about your experience using PIX on Windows with your raytracing code.

Please note: this is an experimental release dedicated to supporting the upcoming DXR features in DirectX 12. We recommend using the latest regular release for non-DXR-related work.
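For orientation, here is a minimal sketch of how the two DXR calls named in the list above are driven from D3D12, assuming the device, command list, state object, shader tables, and buffers already exist; the function and variable names are illustrative, not from the PIX post:

Code:
// Minimal DXR sketch (illustration only; all resource creation,
// root signatures and shader-table setup are assumed to exist).
#include <d3d12.h>

void RenderOneFrameWithDXR(ID3D12GraphicsCommandList4* cmdList,
                           ID3D12StateObject* rtPipeline,
                           ID3D12Resource* asResultBuffer,
                           const D3D12_BUILD_RAYTRACING_ACCELERATION_STRUCTURE_DESC& asDesc,
                           D3D12_DISPATCH_RAYS_DESC raysDesc,
                           UINT width, UINT height)
{
    // Build the acceleration structure; PIX lists this call in the
    // Events view and can visualize the result in the Pipeline view.
    cmdList->BuildRaytracingAccelerationStructure(&asDesc, 0, nullptr);

    // The build writes through a UAV, so fence it before tracing rays.
    D3D12_RESOURCE_BARRIER barrier = {};
    barrier.Type = D3D12_RESOURCE_BARRIER_TYPE_UAV;
    barrier.UAV.pResource = asResultBuffer;
    cmdList->ResourceBarrier(1, &barrier);

    // Bind the raytracing pipeline (a state object, not a normal PSO).
    cmdList->SetPipelineState1(rtPipeline);

    // Launch one ray-generation shader invocation per pixel.
    raysDesc.Width  = width;
    raysDesc.Height = height;
    raysDesc.Depth  = 1;
    cmdList->DispatchRays(&raysDesc);
}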

[Image: Inspecting DirectX Raytracing rendering in PIX on Windows]



Within our gamer bubble, things like this seem important, but console sales over holiday 2020 are not going to be affected by this sort of thing.

As things stand, on the verge of a new generation, there is no doubt that Sony are in pole position. They have sold at least double the number of consoles this gen and have a raft of top-notch AAA exclusives ready to drop over the last year of the current gen and the first year(s) of the next. They have all the momentum; the onus is on Microsoft to show their hand first.

I have no doubt that Sony will keep their powder dry until Microsoft have announced.
Power matters to people. This is an interesting article if you want to read it. :)

No Stranger to the (Video) Game: Most Eighth Generation Gamers Have Previously Owned Consoles
 
"Support of ray tracing via the GPU" was the comment.

The direct quote was ….

The GPU, a custom variant of Radeon’s Navi family, will support ray tracing, a technique that models the travel of light to simulate complex interactions in 3D environments.

…. Unfortunately, that means nothing. It may lead some people to believe it has baked-in raytracing hardware, some Sony version of RT cores. That MIGHT be true. But it might not mean that either.

Currently, Nvidia's Pascal GPUs "support ray tracing" and can do it to some extent. But that just goes to show that ANY GPU can do raytracing. We've seen Vega 56 do raytracing too, and it is just a "normal" GPU.

The PS5 GPU might not have ANY specific raytracing hardware at all.
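To put the point in concrete terms: tracing a ray is ultimately just floating-point math that any shader core, or even a CPU, can execute. A bare-bones ray/triangle test (the classic Möller-Trumbore routine), with no special hardware assumed:

Code:
// Möller–Trumbore ray/triangle intersection: plain float math, nothing
// a GPU's ordinary ALUs (or a CPU) can't do. "Supports raytracing"
// can be claimed by any processor that runs code like this.
#include <cmath>

struct Vec3 { float x, y, z; };
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 cross(Vec3 a, Vec3 b) {
    return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x};
}
static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Returns true (and the hit distance t) if the ray (orig, dir) hits the
// triangle (v0, v1, v2).
bool RayTriangle(Vec3 orig, Vec3 dir, Vec3 v0, Vec3 v1, Vec3 v2, float& t) {
    const float kEps = 1e-7f;
    Vec3 e1 = sub(v1, v0), e2 = sub(v2, v0);
    Vec3 p = cross(dir, e2);
    float det = dot(e1, p);
    if (std::fabs(det) < kEps) return false;  // ray parallel to triangle
    float inv = 1.0f / det;
    Vec3 s = sub(orig, v0);
    float u = dot(s, p) * inv;
    if (u < 0.0f || u > 1.0f) return false;
    Vec3 q = cross(s, e1);
    float v = dot(dir, q) * inv;
    if (v < 0.0f || u + v > 1.0f) return false;
    t = dot(e2, q) * inv;
    return t > kEps;                          // hit in front of origin
}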
 

Ar¢tos

Member
Sony's lead this gen gives them more margin to invest in PS5 hardware and even take a bigger loss per unit than Xbox.
MS is richer, but MS =/= Xbox, and PlayStation means a lot more to Sony than Xbox means to MS.

Anyway, having too much RT hardware is a waste on a console; balance is fundamental on consoles, and wafer/chip sizes impose hard limits.
No manufacturer would implement the equivalent of 6 TF of RT-only hardware, because they would be sacrificing space that could be better used for 2-3 TF of general-purpose GPU that can be used for anything, including lighter RT work.
It's console limitations that lead developers to innovative and creative solutions. I expect developers to use tricks to approximate proper RT illumination with only weak RT hardware. Just looking at what SSM did in God of War with only 1.8 TF, we can only imagine what they will come up with on 10+ TF hardware.
 

SaucyJack

Member

Panajev2001a

GAF's Pleasant Genius
Sony's lead this gen gives them more margin to invest in PS5 hardware and even take a bigger loss per unit than Xbox.
MS is richer, but MS =/= Xbox, and PlayStation means a lot more to Sony than Xbox means to MS.

Anyway, having too much RT hardware is a waste on a console; balance is fundamental on consoles, and wafer/chip sizes impose hard limits.
No manufacturer would implement the equivalent of 6 TF of RT-only hardware, because they would be sacrificing space that could be better used for 2-3 TF of general-purpose GPU that can be used for anything, including lighter RT work.
It's console limitations that lead developers to innovative and creative solutions. I expect developers to use tricks to approximate proper RT illumination with only weak RT hardware. Just looking at what SSM did in God of War with only 1.8 TF, we can only imagine what they will come up with on 10+ TF hardware.

Agreed. Consoles could include acceleration of some widely used RT operations (the ray/triangle intersection test, for example) and some features that make managing and updating the data structures commonly used in raytracing algorithms easier. But spending too many transistors on HW that can only be used for RT operations in a console is only a win if they crack the problem and find a way to deliver fully raytraced 1080p@60 or 4K@30 FPS games with a good enough graphics and interactivity jump over top-of-the-line current games... that is a hard task which I do not see arriving in a console before the generation after Xbox Two / PS5.

I fear this is going to become the dGPU meme from one side unless it is discovered that it was that side that has more FLOPs and less custom RT HW 😂.
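To make that concrete: the data structures in question are typically bounding volume hierarchies, and the loop below is the kind of work RT hardware bakes into fixed function. A simplified sketch with an assumed node layout, not any console's actual design:

Code:
// Simplified BVH traversal: walk boxes, test triangles at the leaves.
// Dedicated RT hardware effectively runs loops like this for you.
#include <vector>
#include <algorithm>

struct AABB { float min[3], max[3]; };
struct BVHNode {
    AABB box;
    int  left = -1, right = -1;      // internal node: child indices
    int  firstTri = 0, triCount = 0; // leaf: range into triangle list
};

// Slab test: does the ray (orig, invDir = 1/dir) hit the box within tMax?
static bool HitAABB(const AABB& b, const float orig[3],
                    const float invDir[3], float tMax) {
    float t0 = 0.0f, t1 = tMax;
    for (int a = 0; a < 3; ++a) {
        float tNear = (b.min[a] - orig[a]) * invDir[a];
        float tFar  = (b.max[a] - orig[a]) * invDir[a];
        if (tNear > tFar) std::swap(tNear, tFar);
        t0 = std::max(t0, tNear);
        t1 = std::min(t1, tFar);
        if (t0 > t1) return false;
    }
    return true;
}

// Depth-first traversal with an explicit stack.
bool AnyHit(const std::vector<BVHNode>& nodes,
            const float orig[3], const float invDir[3], float tMax) {
    int stack[64];
    int sp = 0;
    stack[sp++] = 0;                 // start at the root
    while (sp > 0) {
        const BVHNode& n = nodes[stack[--sp]];
        if (!HitAABB(n.box, orig, invDir, tMax)) continue;
        if (n.triCount > 0) {
            // Leaf reached: a real tracer now runs a ray/triangle test
            // (like the routine sketched earlier) on each triangle in
            // [firstTri, firstTri + triCount). Treated as a hit here to
            // keep the sketch short.
            return true;
        }
        if (n.left  >= 0) stack[sp++] = n.left;
        if (n.right >= 0) stack[sp++] = n.right;
    }
    return false;
}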
 

Panajev2001a

GAF's Pleasant Genius
All will become clear, patience.
Both will likely have some form of HW acceleration of basic RT ops: not enough to make fully raytraced games possible at the quality people expect, but enough to start adding some marked reflection/lighting improvements here and there in next-generation games. At this point they need to run RT algorithms decently for PR purposes and developer market share alone.
 

bitbydeath

Member
The direct quote was ….

The GPU, a custom variant of Radeon’s Navi family, will support ray tracing, a technique that models the travel of light to simulate complex interactions in 3D environments.

…. Unfortunately, that means nothing. It may lead some people to believe it has baked-in raytracing hardware, some Sony version of RT cores. That MIGHT be true. But it might not mean that either.

Currently, Nvidia's Pascal GPUs "support ray tracing" and can do it to some extent. But that just goes to show that ANY GPU can do raytracing. We've seen Vega 56 do raytracing too, and it is just a "normal" GPU.

The PS5 GPU might not have ANY specific raytracing hardware at all.

In that quote it directly states:

The GPU will support Ray Tracing.
 

pawel86ck

Banned
Exactly. As shown in the recent Crytek demo, RT is possible without ANY hardware implementation.
LOL 😂😂 That's just a 1080p 30fps tech demo without a huge game world, characters, or gameplay elements (explosions, AI, etc.). Have you tried UE4 tech demos? Some of them look and run like a dream, but you can't see anything like that in real games. Maybe 30fps 1080p would be possible in some older games with RT, like Minecraft, but PS5 games will probably use 1440p or higher resolution as a standard, and games will use high-quality character models, huge locations, etc., so they need HW RT if they want to use raytracing with reasonable results. A year and a half from now, RT effects in PC games will slowly become the standard. Sony knows that, and they will make sure the PS5 offers a similar experience.
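Some back-of-the-envelope numbers on why resolution matters so much here, assuming one primary ray per pixel: 1920×1080 at 30fps is about 62 million rays/s, 2560×1440 at 30fps is about 111 million (roughly 1.8×), and 3840×2160 at 30fps is about 249 million (4×). Every shadow or reflection ray multiplies those figures, which is why a 1080p30 demo doesn't translate directly into a 1440p game.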
 

Ar¢tos

Member
Exactly, which is why it wouldn't otherwise be mentioned, especially in line with the GPU specifically, if it were software-based only and, again, not GPU-based as per the article.
Just don't expect too much, or you will be disappointed.
Dedicated RT hardware on a console is a waste of space; hardware with some optimizations to help with RT is the most likely outcome.
The jump from 1080p to 4K will eat up most of the TFLOPS difference between PS4 and PS5 (assuming native res), so all the extra flops will be needed to improve the other aspects of graphics.
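Rough math on that point, assuming cost scales linearly with pixel count: 4K is exactly four times the pixels of 1080p, so matching the base PS4's 1.84 TF output at native 4K already takes roughly 7.4 TF, and only whatever is left of the PS5's budget goes to actually better-looking pixels.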
 

Dontero

Banned
Exactly, which is why it wouldn't otherwise be mentioned, especially in line with the GPU specifically, if it were software-based only and, again, not GPU-based as per the article.

It was mentioned because it is a hot new buzzword.
It was also an answer to a question, not something they volunteered themselves to inform everyone about.

If you had asked them the same question back in the PS2 era, they would also have said it "supports" raytracing.
 

MilkyJoe

Member
Sony's lead this gen gives them more margin to invest in PS5 hardware and even take a bigger loss per unit than Xbox.
MS is richer, but MS =/= Xbox, and PlayStation means a lot more to Sony than Xbox means to MS.

MS are already proclaiming that they are going to have the stronger box. You can pretty much guarantee it.
 

thelastword

Banned
Nvidia =/= AMD at the theoretical performance level. For proof:

A GTX 1070 with 6.5 TFLOPS easily rivals the RX Vega 56 and its 10.6 TFLOPS.

So there is no point comparing the next consoles to Nvidia GPUs.
NO, it depends on the API used and the architecture. The performance difference is more about NV's superiority in DX11 and, of course, its many performance-enhancing compression techniques, which lower IQ on NV cards overall.

Three things:

1.) All NV cards you see in benchmarks are OC'd out of the box and are AIB cards.
2.) 99% of Vega cards you see are at stock clocks; forget about OC, they're not even undervolted to stabilize clocks.
3.) They compare an out-of-the-box OC'd GTX 1070 (so obviously more performance than stock for NV) to a stock Vega 56, as they normally do. Compare titles like Dirt 1+2, Strange Brigade, World War Z, and many more, and you will see the difference in their TF counts. Vega 56 even beats the GTX 1080 under the right API, in essence, if a game is tailored to its arch. Right now, Vega even beats the 1070 handily in certain DX11 titles too, like Kingdom Come, etc.

I don't know if you've been following benchmarks, but a Vega 56 is knocking on the GTX 1080's door, and there's the 1070 Ti in between; there's no way a GTX 1070 easily rivals an RX Vega 56.



Again, in the video above, Vega is at stock clocks; there is no OC, which in Vega's case is an undervolt to stabilize clocks. His GTX 1080 is OC'd as usual, and still the result is -9% from a GTX 1080. I can show you another benchmark video where, in titles like Counter-Strike and Hellblade, Vega 56's clocks fall to 1100-1200 MHz, all because no undervolting was done. If you undervolt Vega, you can stabilize clocks at 1550-1600 MHz easily. Also keep in mind that the DX11 titles you see Vega underperforming in are usually those where Vega's clocks drop very low. Even in these tests, something like Fortnite is just not optimized for Vega; that in no way says the 1070 is keeping up with or as powerful as Vega if it beats it in that title. As you can see, Vega beats NV in many recent DX11 titles: RE2, Battlefield, Kingdom Come Deliverance, etc.
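For anyone wondering where these TF figures come from: peak FLOPS is just shader cores × clock × 2 (a fused multiply-add counts as two operations). GTX 1070 = 1920 × 1.683 GHz × 2 ≈ 6.5 TFLOPS; Vega 56 = 3584 × 1.471 GHz × 2 ≈ 10.5 TFLOPS. It is a theoretical arithmetic ceiling, not a benchmark, which is exactly why the two numbers don't compare cleanly across architectures and APIs.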
 

ethomaz

Banned
Xbox could be 10 TF, with a hardware raytracing unit punching well above its weight, making it look like a 16 TF machine, with Sony having the upper hand in TF.
Sorry, but 10 TF doesn't look like 16 TF... there is no such miracle in hardware engineering.

Why do people need to push that "secret sauce" idea every new gen, only to be disappointed by the real hardware?
 

ethomaz

Banned
NO, it depends on the API used and the architecture. The performance difference is more about NV's superiority in DX11 and, of course, its many performance-enhancing compression techniques, which lower IQ on NV cards overall.

Three things:

1.) All NV cards you see in benchmarks are OC'd out of the box and are AIB cards.
2.) 99% of Vega cards you see are at stock clocks; forget about OC, they're not even undervolted to stabilize clocks.
3.) They compare an out-of-the-box OC'd GTX 1070 (so obviously more performance than stock for NV) to a stock Vega 56, as they normally do. Compare titles like Dirt 1+2, Strange Brigade, World War Z, and many more, and you will see the difference in their TF counts. Vega 56 even beats the GTX 1080 under the right API, in essence, if a game is tailored to its arch. Right now, Vega even beats the 1070 handily in certain DX11 titles too, like Kingdom Come, etc.

I don't know if you've been following benchmarks, but a Vega 56 is knocking on the GTX 1080's door, and there's the 1070 Ti in between; there's no way a GTX 1070 easily rivals an RX Vega 56.



Again, in the video above, Vega is at stock clocks; there is no OC, which in Vega's case is an undervolt to stabilize clocks. His GTX 1080 is OC'd as usual, and still the result is -9% from a GTX 1080. I can show you another benchmark video where, in titles like Counter-Strike and Hellblade, Vega 56's clocks fall to 1100-1200 MHz, all because no undervolting was done. If you undervolt Vega, you can stabilize clocks at 1550-1600 MHz easily. Also keep in mind that the DX11 titles you see Vega underperforming in are usually those where Vega's clocks drop very low. Even in these tests, something like Fortnite is just not optimized for Vega; that in no way says the 1070 is keeping up with or as powerful as Vega if it beats it in that title. As you can see, Vega beats NV in many recent DX11 titles: RE2, Battlefield, Kingdom Come Deliverance, etc.

Completely false.

nVidia is one or two generations ahead of AMD in GPU hardware development and implementation, and that shows in how people are now speculating whether Navi will finally add features that nVidia implemented five years ago in their GPUs.

AMD has a series of inefficiencies and limitations in GCN that keep it from using its raw power to the fullest extent... the opposite of nVidia, which still has headroom to launch stronger and more efficient cards.

Vega is a card that was supposed to trade blows with the 2080 Ti, and it has the raw power for that, but reality is cruel, and you're showing Vega cards against a 2070, which pretty much shows how far AMD tech is behind nVidia.

BTW, that is why AMD gave up the high-end GPU market to nVidia... they can't compete in that market... they can, of course, always be the cheaper option in the low- and mid-range market.

AMD desperately needs a Ryzen-style shakeup in their GPU division, but Navi continues to be the old and limited GCN.

Maybe the next-gen GPU after Navi can finally put AMD in the spotlight again, somewhere ATI used to be all the time.
 

MilkyJoe

Member
How can they know that? I find it really weird to make such a statement so far from the consoles being in mass production. Nothing is stopping Sony from making changes until then.

I dunno, they are both buying from the same supplier, so who knows what changes hands. But what they are will be pretty much set in stone soon.
 

Fake

Member
To be honest, I'm not entirely disappointed with the PS4 Pro's native custom checkerboard hardware. It was great, in fact. If they could announce some kind of native hardware for the temporal injection used in the Spider-Man game, I'm on board.
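For reference, checkerboard rendering shades only half the pixels each frame in an alternating pattern and rebuilds the rest from the previous frame. A deliberately naive sketch of that reconstruction step; the PS4 Pro's actual hardware support and Insomniac's temporal injection are far more sophisticated:

Code:
// Naive checkerboard reconstruction: each frame shades only the pixels
// whose parity matches the frame index; holes are filled straight from
// the previous frame. Real implementations reproject along motion
// vectors and reject stale samples instead of copying blindly.
#include <cstdint>
#include <vector>

struct Frame {
    int w = 0, h = 0;
    std::vector<uint32_t> px;   // w * h packed RGBA pixels
};

void Reconstruct(Frame& cur, const Frame& prev, int frameIndex) {
    for (int y = 0; y < cur.h; ++y) {
        for (int x = 0; x < cur.w; ++x) {
            bool shadedThisFrame = ((x + y + frameIndex) & 1) == 0;
            if (!shadedThisFrame)
                cur.px[y * cur.w + x] = prev.px[y * cur.w + x];
        }
    }
}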
 

mckmas8808

Banned
Just don't expect too much, or you will be disappointed.
Dedicated RT hardware on a console is a waste of space; hardware with some optimizations to help with RT is the most likely outcome.
The jump from 1080p to 4K will eat up most of the TFLOPS difference between PS4 and PS5 (assuming native res), so all the extra flops will be needed to improve the other aspects of graphics.

Which is why I want the big dogs like Naughty Dog to make games native 1440p, or the equivalent using CB, but toss in ray tracing for the lighting engine at 30 FPS.
 

Aceofspades

Banned
PS5 being more powerful than Anaconda is not far-fetched at all; hell, the sudden silence from MS "insiders" almost proves it. Not that it matters in the end, since I expect the power difference to be within 1-2 TF, or less than 10%.
 

ethomaz

Banned
BTW more 4chan.

Currently a technical artist at a 3rd-party studio that has just finally received both kits. OP is not lying. PS5 has near-final silicon in the towers. The low-speed version previously lacked the newest PCI-E buses that enable the fastest possible reads from the SSD. Xbox Two (?) is still using Zen 1 and a Vega 64 clocked to 1400 MHz.

So far PS5 is smoking the Xbox alpha kits, but maybe the next version of Xbox kits will fare better. Though my dev friend told me that docs said the dev kits should be close to the final power envelope.
 

ethomaz

Banned
It's hard to believe that the PS5 will really be "THAT" much faster and stronger than the Xbox Next. And only Zen 1? Is Zen2 just not ready yet?
Zen2 is indeed not ready yet.

Devkits (even Sony's) are probably using Zen 1 + Vega... or, in the best-case scenario, they are using a sample APU that is not the final one yet.
 

mckmas8808

Banned
Zen2 is indeed not ready yet.

Devkits (even Sony's) are probably using Zen 1 + Vega... or, in the best-case scenario, they are using a sample APU that is not the final one yet.

Okay. So the only thing more advanced, as far as dev kits go, is probably Sony having their close-to-final PCI-E (probably 4.0) buses?
 

Evilms

Banned
NO, it depends on the API used and the architecture. The performance difference is more about NV's superiority in DX11 and, of course, its many performance-enhancing compression techniques, which lower IQ on NV cards overall.

Three things:

1.) All NV cards you see in benchmarks are OC'd out of the box and are AIB cards.
2.) 99% of Vega cards you see are at stock clocks; forget about OC, they're not even undervolted to stabilize clocks.
3.) They compare an out-of-the-box OC'd GTX 1070 (so obviously more performance than stock for NV) to a stock Vega 56, as they normally do. Compare titles like Dirt 1+2, Strange Brigade, World War Z, and many more, and you will see the difference in their TF counts. Vega 56 even beats the GTX 1080 under the right API, in essence, if a game is tailored to its arch. Right now, Vega even beats the 1070 handily in certain DX11 titles too, like Kingdom Come, etc.

I don't know if you've been following benchmarks, but a Vega 56 is knocking on the GTX 1080's door, and there's the 1070 Ti in between; there's no way a GTX 1070 easily rivals an RX Vega 56.



Again, in the video above, Vega is at stock clocks; there is no OC, which in Vega's case is an undervolt to stabilize clocks. His GTX 1080 is OC'd as usual, and still the result is -9% from a GTX 1080. I can show you another benchmark video where, in titles like Counter-Strike and Hellblade, Vega 56's clocks fall to 1100-1200 MHz, all because no undervolting was done. If you undervolt Vega, you can stabilize clocks at 1550-1600 MHz easily. Also keep in mind that the DX11 titles you see Vega underperforming in are usually those where Vega's clocks drop very low. Even in these tests, something like Fortnite is just not optimized for Vega; that in no way says the 1070 is keeping up with or as powerful as Vega if it beats it in that title. As you can see, Vega beats NV in many recent DX11 titles: RE2, Battlefield, Kingdom Come Deliverance, etc.


The difference between a GTX 1070 and an RX Vega 56 is not huge, considering the latter came out a year later, and there are a lot of games where the GTX 1070 is ahead of the RX Vega 56 despite its age,
so I maintain what I said before:



[Chart: relative GPU performance, 3840x2160]


https://www.techpowerup.com/gpu-specs/radeon-rx-vega-56.c2993
 
So the nextbox dev kits are using Vega? Surprise, surprise.
Considering Navi is not out, and the devkits are most likely beefy PCs for both Sony and Microsoft at this stage, I would say that is an astute observation. One could also assume that they are not 7nm Zen 2 processors either.
 

Fake

Member
Considering Navi is not out, and the devkits are most likely beefy PCs for both Sony and Microsoft at this stage, I would say that is an astute observation. One could also assume that they are not 7nm Zen 2 processors either.
If you take the old rumors together with the recent 4chan post, a "PS5 devkit" using a custom Navi makes a little sense.
 

Fake

Member
There is nothing else to use... unless Sony has already taped out some APU samples, their devkits are using Vega too.
Didn't old leaks speak about an "unnamed" incoming GPU from AMD? Aren't those leaks based on previous/old devkits, then?
 

Ar¢tos

Member
We have to wait for Navi to be out to compare the performance of 7nm Navi vs. 7nm Vega. It might make sense to use Vega in alpha devkits if the difference is not that big.
 