
Rumour: PS5 Devkits have been released (UPDATE 25th April: 7nm chips moving to mass production)

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
I think there is a desire to see Xbox One X and PS4 Pro as machines where the console maker pulled out all the stops and went with the bestest and most expensive system they could build and said "screw profit margins"... while these intentionally stop-gap consoles embody quite the opposite attitude IMHO.

It's not really just your opinion. It's basically almost word for word what Sony said. They barely even advertised that the PS4 Pro exists. It was all about providing a multi-year stop gap so that people wouldn't go the PC route between 2016 and 2019. And Sony specifically wanted a more capable console for PSVR on top of that.

It's weird to see so many people, like you said, actually think the PS4 Pro and Xbox One X were real/true generational upgrades in intent and materials.
 

TheMikado

Banned
I think there is a desire to see Xbox One X and PS4 Pro as machines where the console maker pulled out all the stops and went with the bestest and most expensive system they could build and said "screw profit margins"... while these intentionally stop-gap consoles embody quite the opposite attitude IMHO.

But both have said that's the intent for consoles from here on out.
 

Swizzle

Gold Member
But both have said that's the intent for consoles from here on out.
Could you expand on what you mean here and quote please? It would help :).

For me these consoles are a great way to raise profit margins and put more emphasis on a new product launch than just downward pressure on the previously released HW. Extensions of the original model (especially the PS4 Pro, where Sony quite clearly stated its purpose) meant to cater to premium (high-paying) customers who wanted new HW and to make better use of their 4K screens (paying the margin console makers asked for)... and then be replaced by a proper new generation entry.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
But both have said that's the intent for consoles from here on out.

No they did not. Sony specifically said that the PS4 Pro was an optional system that was just a stop gap. The pressure and energy to make it the best system possible for $400 were not there. That clearly was not the goal. They cared as much about making sure all PS4 games played on the PS4 Pro as they did about making it a powerful system.

For the PS5, BC will probably be a focus, but it certainly won't be as much of a focus as it was with the PS4 Pro. As a matter of fact, PlayStation 4 games playing on the Pro isn't even considered backward compatibility. It's literally the same family of systems. The foundation of the PS4 Pro is the OG PS4. That won't be the case for the PS5. It'll be something clearly different.
 

TheMikado

Banned
Could you expand on what you mean here and quote please? It would help :).

For me these consoles are a great way to raise profit margins and put more emphasis on a new product launch than just downward pressure on the previously released HW. Extensions of the original model (especially the PS4 Pro, where Sony quite clearly stated its purpose) meant to cater to premium (high-paying) customers who wanted new HW and to make better use of their 4K screens (paying the margin console makers asked for)... and then be replaced by a proper new generation entry.

We will no longer see the risks of high margin losses with future consoles. All the major console makers have changed their business model to one where they won't lose tremendous amounts of money on exotic hardware.

https://www.forbes.com/sites/erikka...e-a-loss-on-playstation-4-sales/#2c33d2586f1d

“Perhaps more importantly, the fact that the PS4 is built using standard x86 architecture---essentially the same basic components powering home computers---the cost of manufacturing the PS4 is much less than the PS3, and should go down over the years. As the install base grows and parts become cheaper to manufacture, Sony will benefit.

CEO of Sony Computer Entertainment, Andrew House, said that he believes PlayStation 4 losses won't come close to the losses Sony took on the PlayStation 3. The losses on that console totaled $3.5 billion in 2007 and 2008, largely due to the $599 price-tag and lack of compelling software.”

“Xbox chief marketing officer Yusuf Mehdi told GamesIndustry International that Microsoft is "looking to break even or low margin at worst."

“Meanwhile, Nintendo's Wii U is still selling at a loss, nearly a year after its 2012 launch.

The Wii U broke company precedent by being sold at a loss, though Nintendo of America president Reggie Fils-Aime told Mercury News that the "business model doesn't change dramatically, in that as soon as we get the consumer to buy one piece of software, then that entire transaction becomes profit positive”

The age of taking massive upfront losses on console hardware is dead.
 

bitbydeath

Member
10tflops in 2020 means 7-8 years with an underpowered console.

If the CPU is powerful enough they can pass back some of the CPU oriented GPGPU tasks to free up more GPU.

That, and also God of War is currently said to be the best-looking game ever, and it was built for just 1.84 TF.

I still hope for more as well but I trust Sony knows what the right amount for next-gen is more than I do.
 

octiny

Banned
10tflops in 2020 means 7-8 years with an underpowered console.

Not if they follow the 3-year mid-gen refresh stop-gap cycle, which seems to be what both MS & Sony are going to be doing from now on.

In any case, 11-ish TF is a realistic expectation for both Sony & MS, & a fantastic upgrade (80%+ more than the X). On top of that, the amount of stuff developers will be able to do with the extremely large CPU upgrade and memory bandwidth increases is simply going to be awesome.

People need to stop comparing what's possible with TFs on a PC vs a console. They're completely different in terms of extracting the power and optimizing, even more so given that a console game is developed with 1 spec in mind versus thousands of different combinations on a PC. No doubt the new consoles will have tons of horsepower to push a 4K standard while also improving fidelity (especially after seeing what's currently being done on the X @ 6.2 with a shit CPU).
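Quick sanity check on that "80%+" figure, in Python (the 6 TF X number and the 11-12 TF PS5 guess are this thread's numbers, not confirmed specs):

# TF ratios using the figures floated in this thread (not confirmed specs).
xbox_one_x_tf = 6.0
for ps5_tf in (11.0, 11.5, 12.0):
    gain = (ps5_tf / xbox_one_x_tf - 1) * 100
    print(f"{ps5_tf:.1f} TF vs the X: +{gain:.0f}%")
# 11.0 -> +83%, 11.5 -> +92%, 12.0 -> +100%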
 

Codes 208

Member
You'd be surprised what devs will be able to make with faster CPUs and faster RAM. It's not just about the GPU. Besides, in 2020 most will still have 1080p TVs.
Not sure if that'll still be true. 4K TVs are practically dirt cheap now. You can buy a Samsung or Vizio at Walmart for around $320-$360, and I suspect they're only bound to get cheaper as they slowly become the norm.
10tflops in 2020 means 7-8 years with an underpowered console.
I think you mean 3 years of being relatively on par with mid-to-high-end graphics cards before we get a PS5 Pro.
 

Swizzle

Gold Member
We will no longer see the risks of high margin losses with future consoles. All the major console makers have changed their business model to one where they won't lose tremendous amounts of money on exotic hardware.

https://www.forbes.com/sites/erikka...e-a-loss-on-playstation-4-sales/#2c33d2586f1d

“Perhaps more importantly, the fact that the PS4 is built using standard x86 architecture---essentially the same basic components powering home computers---the cost of manufacturing the PS4 is much less than the PS3, and should go down over the years. As the install base grows and parts become cheaper to manufacture, Sony will benefit.

CEO of Sony Computer Entertainment, Andrew House, said that he believes PlayStation 4 losses won't come close to the losses Sony took on the PlayStation 3. The losses on that console totaled $3.5 billion in 2007 and 2008, largely due to the $599 price-tag and lack of compelling software.”

“Xbox chief marketing officer Yusuf Mehdi told GamesIndustry International that Microsoft is "looking to break even or low margin at worst."

“Meanwhile, Nintendo's Wii U is still selling at a loss, nearly a year after its 2012 launch.

The Wii U broke company precedent by being sold at a loss, though Nintendo of America president Reggie Fils-Aime told Mercury News that the "business model doesn't change dramatically, in that as soon as we get the consumer to buy one piece of software, then that entire transaction becomes profit positive”

The age of taking massive upfront losses on console hardware is dead.

You again are replying to an argument I am not making. Actually, you are making a long argument and adding quotes (thanks for that :)) that mostly agree with what I just said... the thing is, it still does not explain the statement you made before...

Unless, even earlier, you took a small part of my post, which was actually stating that yes, per-HW profitability early on is much more important now than it ever used to be (hint: remember when we talked about manufacturing processes costing a lot, and each jump to the next evolution in manufacturing getting a lot more expensive and needing more and more time... that is what put a stop to that extreme loss-leading mentality), and read it differently from what it meant to say.
 
Not sure if that'll still be true. 4K TVs are practically dirt cheap now. You can buy a Samsung or Vizio at Walmart for around $320-$360, and I suspect they're only bound to get cheaper as they slowly become the norm.

Most people with full HD TVs are not gonna change TVs until their current TV dies. It has nothing to do with price.
 

TheMikado

Banned
You again are replying to an argument I am not making. Actually, you are making a long argument and adding quotes (thanks for that :)) that mostly agree with what I just said... the thing is, it still does not explain the statement you made before...

Unless, even earlier, you took a small part of my post, which was actually stating that yes, per-HW profitability early on is much more important now than it ever used to be (hint: remember when we talked about manufacturing processes costing a lot, and each jump to the next evolution in manufacturing getting a lot more expensive and needing more and more time... that is what put a stop to that extreme loss-leading mentality), and read it differently from what it meant to say.

I don't think I was arguing against anything. I probably shouldn't have said "but", but you asked me to explain what I meant and I did. My point was that console makers are no longer saying "screw profit margins" when developing new consoles.
 

TheMikado

Banned
just like this gen. my xbox one original only has 1.21 tflops.
Which has been a well known bottleneck for developers from the beginning.

If the CPU is powerful enough they can pass back some of the CPU oriented GPGPU tasks to free up more GPU.
That, and also God of War is currently said to be the best-looking game ever, and it was built for just 1.84 TF.
I still hope for more as well but I trust Sony knows what the right amount for next-gen is more than I do.

Games like GOW are exactly the issue at hand. The vast majority of games are multi-platform now, yet the most visually impressive games have been the console exclusives. I don't want a handful of "showcase" games which require developers to spend extraordinary amounts of time, or to be an AAA studio, to justify the effort. The point is twofold: close the gap between PC & console cycles to advance game development, and make it easier to get the maximum performance out of a system faster. There's no secret sauce from simply being a console that prevents a PC from achieving the same or better results beyond having a single system target.

Not if they follow the 3-year mid-gen refresh stop-gap cycle, which seems to be what both MS & Sony are going to be doing from now on.
In any case, 11-ish TF is a realistic expectation for both Sony & MS, & a fantastic upgrade (80%+ more than the X). On top of that, the amount of stuff developers will be able to do with the extremely large CPU upgrade and memory bandwidth increases is simply going to be awesome.
People need to stop comparing what's possible with TFs on a PC vs a console. They're completely different in terms of extracting the power and optimizing, even more so given that a console game is developed with 1 spec in mind versus thousands of different combinations on a PC. No doubt the new consoles will have tons of horsepower to push a 4K standard while also improving fidelity (especially after seeing what's currently being done on the X @ 6.2 with a shit CPU).
This is comparing consoles against themselves. The power/value ratio goes down. In this thread we have people arguing that it's a good thing that we wait until 2020 to get 10 TFLOP consoles at a worse value-to-performance ratio. It makes no sense to wait longer and pay more for relatively less of a jump over the previous generation. The comparison to PC needs to happen, as I'll explain below.

I think you mean 3 years of being relatively on par with mid-to-high-end graphics cards before we get a PS5 Pro.
Except I wouldn't even consider the PS4 mid-high. The PS4's GPU is tentatively based on the HD 7870, which came out in March 2012 at 2.5 TFLOPS. The PS4 was already behind the curve on release.
While this goes back to doing more with less as stated above, the problem is it forces devs to code for the lowest common denominator, which will generally be consoles; and due to their 5+ year lifespan, they remain the target even after a new mid-gen refresh. To AMD's credit, we have an APU which theoretically would have similar performance to the PS4 and surpasses the X1.

The thing about these consoles is that they have been utilizing tech which eventually made its way into general computing systems. The PS3, for instance, focused on a high-clock multi-core architecture at a time when people were debating the value of dual-core processors. The X360, by contrast, pioneered unified shader architecture through AMD. These two elements, multi-threading and unified shaders, are now cornerstones of gaming and computing. Similarly, this generation was supposed to pioneer GPGPU and cloud services, and to an extent these will come into play both in the greater computing market and in games in the next generation. The coming generation is aligning more with PC hardware. With that said, the future comes down to the reduction to 7nm, power efficiency (which allows higher clocks and less heat), ray tracing, and low-level APIs. The reason to get as much parity as possible, IMHO, is to reduce the gap between PC and console to get a higher baseline and allow more developers, and thus more games, to get the most out of the hardware faster.

Games developed faster, requiring less specialization to bridge wide hardware gaps, mean cheaper games and more developers able to take risks. Having more consistent hardware across gaming platforms is a good thing for GAMES, which is why I am against passing off under-powered systems on huge multi-year gaps that set the baseline for almost all games for the next 7 years. It is not conducive to the production of actual games.

While it sounds underwhelming, I'll still buy it for the games :D. Can't skip it. If I went that route, I would've skipped all of Nintendo's consoles since the Wii.

As multi-platform titles have increased, I have less reason to own both. I gave my X1 to my dad so he could play Forza, and I've kept and invested in my PlayStation ecosystem. I'll likely have both again when MS releases their streaming service.
It's less and less appealing to purchase an entire console for a couple of exclusives which I generally don't play anyway.
 

LordOfChaos

Member
If the CPU is powerful enough they can pass back some of the CPU oriented GPGPU tasks to free up more GPU.

Modern GPUs are set up in such a way that compute doesn't reduce graphics performance, thanks to asynchronous compute and even several Sony-specific optimizations (often with regard to buffers and caches). I don't think just removing compute loads from the GPU would free up very much graphics horsepower at all.

[Image: rx480-5b.png]



With QRQ it's filling in gaps in execution hardware (inevitable, even with multi-million-dollar blockbusters). Filling in those gaps is good, and we wouldn't gain much by handing that work back to the CPU. Nvidia hardware may have benefited more from that offload; they can approximate this, but there's still some brute forcing with a very powerful architecture to get there, with coarser granularity (per-clock granularity on AMD vs 100us on Pascal, iirc).



The large increase in CPU performance should be used for what CPUs are still most at home with - branchy, non-sequential code that GPUs can't parallelize well. I'm still thinking of a vastly more responsive world, but leaving things the GPU is good at on the GPU - physics, collisions, etc.
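A toy way to see the point, in Python (illustrative numbers I made up, not measured from any real GPU): if async compute is already slotting GPGPU work into bubbles the graphics workload leaves anyway, pulling that work back to the CPU barely shortens the frame.

# Toy model of async compute (illustrative numbers, not measured GPU data).
graphics_busy_ms = 12.0   # time the graphics queue keeps the shader array busy
idle_gap_ms = 4.0         # stalls/bubbles within the graphics workload
compute_ms = 3.5          # GPGPU work submitted on the async compute queue

# Compute runs inside the idle gaps first; only overflow extends the frame.
overflow = max(0.0, compute_ms - idle_gap_ms)
frame_with_compute = graphics_busy_ms + idle_gap_ms + overflow
frame_without_compute = graphics_busy_ms + idle_gap_ms   # gaps just sit idle

print(f"frame with async compute: {frame_with_compute:.1f} ms")
print(f"frame with compute moved to the CPU: {frame_without_compute:.1f} ms")
# Both come out to 16.0 ms: offloading the compute freed no graphics time,
# because it was running in time the graphics work couldn't use anyway.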
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
10tflops in 2020 means 7-8 years with an underpowered console.

It'd be almost impossible for Sony or MS to sell a new console in 2020 that's only 10 TFLOPS. Unless it's only $299 or something. Didn't someone say that GPU technology should increase by about 33% per year at the same price? So wouldn't that mean that if the Xbox One X is 6 TFLOPS at $500, then a console in 2020 would be around 14 TFLOPS given the 33% increase in power per year?
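For what it's worth, here's that back-of-the-envelope math in Python (the ~33%/year perf-per-dollar figure is the rumour being discussed, not an established law, and the X's 6 TF at $500 in late 2017 is the assumed baseline):

# Compounding the rumoured ~33%/year GPU perf-per-dollar improvement
# from the Xbox One X baseline (~6 TF at $500, late 2017).
baseline_tf = 6.0
annual_gain = 1.33
for year in (2018, 2019, 2020):
    tf = baseline_tf * annual_gain ** (year - 2017)
    print(f"{year}: ~{tf:.1f} TF at the same price")
# 2018: ~8.0 TF, 2019: ~10.6 TF, 2020: ~14.1 TF -- hence the ~14 TF guess.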
 

TheMikado

Banned
It'd be almost impossible for Sony or MS to sell a new console in 2020 that's only 10 TFLOPS. Unless it's only $299 or something. Didn't someone say that GPU technology should increase by about 33% per year at the same price? So wouldn't that mean that if the Xbox One X is 6 TFLOPS at $500, then a console in 2020 would be around 14 TFLOPS given the 33% increase in power per year?

Exactly, but for some reason we are having a discussion about how great 10 tflops is going to be in 2020.
 

Ar¢tos

Member
I'm happy with 10 TFLOPS; it's already a huge jump from the 1.8 TFLOPS of the base PS4, but I'm usually pleased with little, and expecting less leads to a smaller disappointment...
 
Yeah, I will buy the Pro version of the 5 a year after it comes out. Learned my lesson from this gen. In my mind, the graphics expected for this gen only arrived last October. The Pro and XB1X are finally making the games from this gen look and perform the way they should have in the first place, so I'm happy to stick with that until they get the looks and performance of games for the 5 right.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
X will tide me over til PS5 Pro comes

No it won't. I can promise you that. The (potential) PS5 Pro would be coming out around 2023. NO WAY you'd be good with the Xbox One X for another 5 years when you'll be missing out on all kinds of games.
 

rokkerkory

Member
No it won't. I can promise you that. The (potential) PS5 Pro would be coming out around 2023. NO WAY you'd be good with the Xbox One X for another 5 years when you'll be missing out on all kinds of games.

Sure it can, because I'll still be able to play most of the same games, just not at an uber level. I game less and less these days, and that's also a big factor.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
Sure it can, because I'll still be able to play most of the same games, just not at an uber level. I game less and less these days, and that's also a big factor.

Oh, well, yeah, if you game less than ever before, then you probably could wait. You'll really be missing out on the next-gen exclusives though. I don't think MS will make all Xbox 2 games run on the Xbox One X either.
 

rokkerkory

Member
Oh, well, yeah, if you game less than ever before, then you probably could wait. You'll really be missing out on the next-gen exclusives though. I don't think MS will make all Xbox 2 games run on the Xbox One X either.

I know, right... hopefully, when/if the PS5 Pro comes, I can play all the games I missed out on, but in uber quality (much like what I am doing with the X now)...
 

LordOfChaos

Member
If people are still doubting a 'next big thing' for what the 9th gen is looking like so far, I wonder if there will be any baked-in silicon goodies to accelerate this





https://developer.nvidia.com/rtx

The hybrid approaches already make a clear difference helping traditional rasterization; I wonder what more could be done with more performant silicon that took less of a hit doing both at once.
 

TheMikado

Banned
If people are still doubting a 'next big thing' for what the 9th gen is looking like so far, I wonder if there will be any baked-in silicon goodies to accelerate this





https://developer.nvidia.com/rtx

The hybrid approaches already make a clear difference helping traditional rasterization; I wonder what more could be done with more performant silicon that took less of a hit doing both at once.


I don't think fully ray-tracing scenes in real time will be feasible at this point.

I think the implementation will be limited to nearby moving and interactive objects, which shouldn't be overtaxing.
My other theory is that this will be the era of true cloud computing, where ray tracing could theoretically be offloaded to the cloud. Mixing traditional rasterization and cloud ray tracing should allow them to fool many gamers.
Cloud compute was never truly feasible for things like real-time physics, but partial ray tracing of environments is where I would expect it to shine, and when the cloud isn't available it can fall back to traditional rasterization.

Here's what I think the future gen should look like in-game; this was rendered in real time, in-engine.

 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
Here's what I think the future gen should look like in-game; this was rendered in real time, in-engine.



This looks AMAZING!!!! I'm 100% happy if that's what launch games look like on the PS5.

Supposedly this is a game that's being released in 2019 on the Unreal 4 Engine. It's 100% real-time, running on an i7 processor and a GTX 1080 GPU with 12 GB of RAM. The inside of the ship is beyond anything that I've ever seen in a game.

 

TheMikado

Banned
This looks AMAZING!!!! I'm 100% happy if that's what launch games look like on the PS5.

Supposedly this is a game that's being released in 2019 on the Unreal 4 Engine. It's 100% real-time, running on an i7 processor and a GTX 1080 GPU with 12 GB of RAM. The inside of the ship is beyond anything that I've ever seen in a game.



We're technically already there. The one I showed you was rendered in 2015, three years ago.

We have these gems too.



It's why the advancement of the shader model is so important, and why I believe frequent hardware updates matter.
Shader Model 6.0 requires certain hardware features which the X1 and PS4 don't fully support; the longer this gen drags on into subsequent generations, the longer we have to wait until developers can program games for that baseline.
 

LordOfChaos

Member
I don't think fully ray-tracing scenes in real time will be feasible at this point.

I think the implementation will be limited to nearby moving and interactive objects, which shouldn't be overtaxing.
My other theory is that this will be the era of true cloud computing, where ray tracing could theoretically be offloaded to the cloud. Mixing traditional rasterization and cloud ray tracing should allow them to fool many gamers.
Cloud compute was never truly feasible for things like real-time physics, but partial ray tracing of environments is where I would expect it to shine, and when the cloud isn't available it can fall back to traditional rasterization.

Here's what I think the future gen should look like in-game; this was rendered in real time, in-engine.


Not saying full ray tracing in real time; that's why I mentioned the hybrid approaches such as the DF video, which is also what the AMD tech video employs.

Similar to Async compute for computational workloads, maybe something could be employed to reduce the cost of doing both rasterization and ray tracing in a hybrid approach, making it available to higher end games and on more workloads and surfaces.
 

TheMikado

Banned
Not real time; that's why I mentioned the hybrid approaches such as the DF video, which is also what the AMD tech video employs.

Similar to Async compute for computational workloads, maybe something could be employed to reduce the cost of doing both rasterization and ray tracing in a hybrid approach, making it available to higher end games and on more workloads and surfaces.

I disagree; the hybrid approach is exactly what is going to make this possible in real time. Right now, current GPUs excel at rasterization, while cloud systems could technically be used to generate the ray-traced portions of scenes and could go a long way toward handling tasks such as global illumination.

One of the projects Intel was working on, possibly 10 years ago, was cloud-based ray tracing.
https://upload.wikimedia.org/wikipe...raced.ogv/Quake_Wars_Ray_Traced.ogv.720p.webm

You can see it gives a number of advantages in specific situations, but it is not going to be able to make up for lower-end hardware.
Basically, in theory, modern GPUs could be used to focus on rasterization and specific ray-traced objects while leaving the rest of the ray tracing to cloud-based systems. I wish I could draw a diagram, but this is what I would envision.

A cloud ray-traced, globally illuminated scene: every object has a light map pre-calculated in the cloud, which can also be dynamic. Ray tracing tends to trace from the camera, to the object, to the light source. Using cloud ray tracing, the onboard GPU would simply be responsible for calculating the rays from the camera to objects within its sight, while the opposing information, ray tracing from objects to light sources, is already available.

https://en.wikipedia.org/wiki/Quake_Wars:_Ray_Traced
https://en.wikipedia.org/wiki/Wolfenstein:_Ray_Traced

https://www.pcper.com/reviews/Proce...r/Hybrid-Rendering-Combining-Ray-tracing-and-

This isn't a new concept at all; it's just that we have gotten to the point where we have the hardware and shader models to render the rest of the scene properly, while also having the cloud infrastructure (bandwidth/latency/server farms) to make it relevant.
 

TheMikado

Banned
Actually, just found this. Granted, it's a bit outdated since GPUs are even more programmable today, but the basics still apply.

https://www.pcper.com/reviews/Proce...er/Ray-tracing-faster-rasterization-Example-1

[Image: logvslin.jpg - logarithmic (ray tracing) vs linear (rasterization) scaling curves]


The green curve represents the logarithmic behavior of ray tracing when the number of triangles are increased, the red line represents the linear behavior of rasterization. As you can see, initially for ray tracing (when the polygon count is low) ray tracing performance is at a disadvantage compared to rasterization, but quickly the two curves meet, and from that point on, as complexity increases ray tracing is ALWAYS faster than rasterization. This cross-over point depends on many factors: the performance of the CPU, the performance of the GPU etc, but this trend is a mathematical certainty, a logarithmic curve will always intersect a linear curve and the logarithmic curve will always win! Due to the linear scaling of ray tracing performance, doubling the number of CPUs would shrink the height of the green curve by half, moving the intersection point (S) closer and closer to 0, ie throw enough CPU cores at the problem and Ray Tracing would always be faster than Rasterization using a GPU.
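To make the quoted argument concrete, here's a tiny numeric version in Python (arbitrary constants; the shape of the curves, not the absolute numbers, is the point):

import math

# Toy cost model: rasterization touches every triangle -> linear in N;
# ray tracing walks an acceleration structure (e.g. a BVH) -> ~log2(N) per ray.
def raster_cost(n, k=0.001):
    return k * n

def raytrace_cost(n, rays=2_000_000, k=0.000005):
    return k * rays * math.log2(n)

for n in (10_000, 100_000, 1_000_000, 10_000_000):
    print(f"{n:>10} triangles: raster {raster_cost(n):8.1f}  ray trace {raytrace_cost(n):8.1f}")
# Raster cost grows 10x per row while ray tracing barely creeps up, so past
# the crossover the logarithmic curve always wins; halving the ray-tracing
# constant (more cores) just moves the crossover closer to zero, as the quote says.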
 

bluefan9

Neophyte
What do you guys mean by hybrid ray tracing? Ray tracing and rasterization are different ways of rendering, so the only way I can think of using both is in real-time cutscenes: certain sequences run with traditional rendering, and in other scenes, depending on the needs of the sequence, whatever engine is running it switches to ray tracing.

That Star Wars demo, by the way, according to the makers of the demo that talked about it at GDC, ran @ 1080p and 8 fps on a Titan V, so I highly doubt we'll see ray tracing in the next generation, especially considering that things are going 4K and high fps with VR.
 

TheMikado

Banned
What do you guys mean by hybrid ray tracing? Ray tracing and rasterization are different ways of rendering, so the only way I can think of using both is in real-time cutscenes: certain sequences run with traditional rendering, and in other scenes, depending on the needs of the sequence, whatever engine is running it switches to ray tracing.

That Star Wars demo, by the way, according to the makers of the demo that talked about it at GDC, ran @ 1080p and 8 fps on a Titan V, so I highly doubt we'll see ray tracing in the next generation, especially considering that things are going 4K and high fps with VR.

https://www.tomshardware.co.uk/ray-tracing-rasterization,review-31636-8.html

A Hybrid Rendering Engine?
If you've read this far into this article, you may think that ray tracing is still far from being ready to replace rasterization, but that in the meantime it might be a good idea to mix the two techniques. And at first look, they do seem to be complementary. It's easy to imagine rasterizing triangles to determine visibility, taking advantage of the excellent performance that the technique offers, and use ray tracing only on certain surfaces to add realism where it's necessary, such as adding shadow or achieving exact reflections or transparency. (Which is what I predict we will see in real time.) After all, that's the approach Pixar used to make Cars. The geometric models are rendered with REYES and the rays can be cast on demand to simulate certain effects.


LordOfChaos is referring to this problem:
Unfortunately, though it sounds very promising, the hybrid solution is not easy to apply. As we've seen, one of the main disadvantages of ray tracing has to do with the data structure needed to organize objects in such a way as to limit the number of tests for ray/object intersection. Using a hybrid rendering model instead of pure ray tracing doesn't change that. The data structure will still have to be put in place, with all the disadvantages that implies. For example, we might consider ray tracing the static data and rendering dynamic data using rasterization. But with that scenario, we lose all the advantages of ray tracing. Since the dynamic data does not exist for the ray tracer, it'll be impossible to make objects cast a shadow or to see their reflections.
What's more, in terms of performance, the biggest problem is with the memory accesses generated by the secondary rays, which are typically the rays we need to keep in our hybrid rendering engine. So, the performance gain won't be as great as one might think. Since most of the rendering time is dominated by calculating secondary rays, the gain from avoiding calculation of primary rays is negligible.
In other words, by attempting to combine the advantages of both methods, this solution could end up combining the disadvantages, while losing the elegance of ray tracing and the high performance of rasterization.

My counter to this concerns the issue which I believe would cause the most trouble in rendering:
What's more, in terms of performance, the biggest problem is with the memory accesses generated by the secondary rays, which are typically the rays we need to keep in our hybrid rendering engine. So, the performance gain won't be as great as one might think. Since most of the rendering time is dominated by calculating secondary rays, the gain from avoiding calculation of primary rays is negligible.

My solution to this is to leverage cloud computing to pre-calculate the secondary rays in real time, so that the primary rays already have the secondary-ray calculations handy rather than computing them locally. The console then focuses only on primary rays and dynamic objects.

Basically there would be three levels of rendering:

Full rasterized scene.
Cloud ray tracing at the mid and near range for non-dynamic objects.
Real-time ray tracing for near dynamic objects.

Looking at the diagram below, I'm proposing that the only things calculated in real time are the primary rays (red lines) and the rasterization.
The remainder of the near-to-mid scene is calculated via the cloud, and ray tracing is only run on the output of the cloud computations of rays and objects.
Basically, only the red lines are resource-intensive. (A rough code sketch of this three-tier split follows the diagram.)
[Image: BasicRayTracing_1.png - ray tracing diagram; primary rays shown as red lines]
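Here is a rough, runnable Python skeleton of that three-tier split (my reading of the proposal; the function names and especially the cloud interface are hypothetical stand-ins, not any real engine or service):

# Skeleton of the proposed three-tier hybrid renderer. All names and the
# "cloud" interface are hypothetical; the stubs only trace control flow.
class Cloud:
    def __init__(self, online=True):
        self.online = online
    def available(self):
        return self.online
    def fetch_precomputed_rays(self, obj):
        # Stand-in for object-to-light secondary rays pre-traced server-side.
        return f"lightmap({obj})"

def rasterize(objs):
    print("tier 1: rasterize full scene:", objs)

def shade_with_primary_rays(obj, lightmap):
    print(f"tier 2: primary rays only for static '{obj}', using {lightmap}")

def trace_locally(obj):
    print(f"tier 3: full local ray trace for dynamic '{obj}'")

def render_frame(static_objs, dynamic_objs, cloud):
    rasterize(static_objs + dynamic_objs)       # 1. rasterized base pass
    if cloud.available():                       # 2. cloud-assisted static GI
        for obj in static_objs:
            shade_with_primary_rays(obj, cloud.fetch_precomputed_rays(obj))
    for obj in dynamic_objs:                    # 3. near dynamic objects
        trace_locally(obj)
    # With the cloud offline, tier 2 is skipped and the frame falls back to
    # plain rasterization plus local tracing, as described above.

render_frame(["terrain", "buildings"], ["player", "vehicle"], Cloud(online=True))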
 

Blayzedblue

Neo Member
2020 or 2021 sounds much more likely. The PS4 is still selling great. I'm amazed at what Sony's studios are doing now with the current hardware.
 

LordOfChaos

Member
What do you guys mean by hybrid ray tracing? Ray tracing and rasterization are different ways of rendering, so the only way I can think of using both is in real-time cutscenes: certain sequences run with traditional rendering, and in other scenes, depending on the needs of the sequence, whatever engine is running it switches to ray tracing.

That Star Wars demo, by the way, according to the makers of the demo that talked about it at GDC, ran @ 1080p and 8 fps on a Titan V, so I highly doubt we'll see ray tracing in the next generation, especially considering that things are going 4K and high fps with VR.


One of the benefits of ray tracing, and a limitation of rasterization, is that the former is able to account for light sources, or light-changing materials, that are outside of the player's field of view. This is part of why ray tracing feels so much better to our brains - rasterization doesn't account for this in a physically realistic way. A hybrid approach uses some amount of ray casting to apply this benefit to traditional rasterized graphics, taking some of the benefit while keeping existing renderers.

You should watch this video for good demonstrations of this.



That's what I was musing out loud about: in-silicon methods to take away any performance penalty of doing such a hybrid approach to enhance rasterization, allowing more rays to be traced, and in increasingly complex scenes.
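As a minimal Python illustration of the hybrid idea (a toy scene I made up: one sphere occluder and a light deliberately placed where a raster pass can't see it; none of this is from the DF or AMD videos):

import math

# One shadow ray per rasterized surface point, toward a light that sits
# outside the camera's view. Toy code: ignores hits beyond the light.
def ray_hits_sphere(origin, direction, center, radius):
    # Standard quadratic ray/sphere intersection, nearest hit in front of origin.
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c
    return disc >= 0 and -b - math.sqrt(disc) > 0

light = (10.0, 10.0, -5.0)            # behind/above the camera
occluder = ((2.0, 2.0, 0.0), 1.0)     # sphere blocking part of the scene

for point in [(0.0, 0.0, 0.0), (3.5, 3.5, 0.0)]:   # two rasterized surface points
    to_light = [l - p for l, p in zip(light, point)]
    length = math.sqrt(sum(v * v for v in to_light))
    direction = [v / length for v in to_light]
    shadowed = ray_hits_sphere(point, direction, *occluder)
    print(point, "-> in shadow" if shadowed else "-> lit")
# (0,0,0) lands in shadow, (3.5,3.5,0) is lit: lighting the raster pass
# alone could not have derived from an off-screen light.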
 
Last edited:
PS5: better graphics, that's all. Does somebody really believe that if the PS5 costs 600-800 USD it will give you native 4K, with a non-mobile CPU? Probably the CPU will be the weakest Ryzen, at 3 GHz or less. Don't expect big changes in the AI and physics; video games still use only 4 cores. The native 4K thing is still a big question; the Navi GPU won't give you the performance to render every game at native 4K at 30 fps (60 fps for fighting games etc.).
It won't give you power like the new Titan Z card with 8 TFLOPS. What you can expect is some upscaled shit again, a 5400 rpm HDD with 1 TB or 2 TB, and game sizes of minimum 100 GB on the HDD.
 

Swizzle

Gold Member
PS5: better graphics, that's all. Does somebody really believe that if the PS5 costs 600-800 USD it will give you native 4K, with a non-mobile CPU? Probably the CPU will be the weakest Ryzen, at 3 GHz or less. Don't expect big changes in the AI and physics; video games still use only 4 cores. The native 4K thing is still a big question; the Navi GPU won't give you the performance to render every game at native 4K at 30 fps (60 fps for fighting games etc.).
It won't give you power like the new Titan Z card with 8 TFLOPS. What you can expect is some upscaled shit again, a 5400 rpm HDD with 1 TB or 2 TB, and game sizes of minimum 100 GB on the HDD.

I think you will be surprised; pleasantly or not, in your case, I am not sure ;), but I think you will :).
 

octiny

Banned
PS5: better graphics, that's all. Does somebody really believe that if the PS5 costs 600-800 USD it will give you native 4K, with a non-mobile CPU? Probably the CPU will be the weakest Ryzen, at 3 GHz or less. Don't expect big changes in the AI and physics; video games still use only 4 cores. The native 4K thing is still a big question; the Navi GPU won't give you the performance to render every game at native 4K at 30 fps (60 fps for fighting games etc.).
It won't give you power like the new Titan Z card with 8 TFLOPS. What you can expect is some upscaled shit again, a 5400 rpm HDD with 1 TB or 2 TB, and game sizes of minimum 100 GB on the HDD.

1) There will be a 400-500% IPC increase over the Jaguar cores with Zen 2 cores at the same speed, possibly more with a faster clock speed. So yes, AI/physics will get a huge boost.
2) Native 4K gaming is already a thing on the X for the majority of games at only 6.2 TF. So 11 TF+ will no doubt deliver 4K games with ease, on top of the massively upgraded CPU.
3) Developers will use as many cores as necessary to meet their criteria. This is console, not PC, so if it has 8 cores, it will use all 8 cores (if needed), as games are developed with 1 system spec in mind as the baseline. Even on the PC side of things, more & more games are taking advantage of more than 4 cores, especially when it comes to minimum frame rate percentiles.
4) The Titan Z is a dual-GPU card released years ago, & if you mean the Titan V, well, that card is 15 TF & the PS5 will not be 8 TF... expect anywhere between 11-12 TF realistically.
 