
Next-Gen PS5 & XSX |OT| Console tEch threaD


Dr. Claus

Vincit qui se vincit
Believe it or not, I'm not very high up on the totem pole at my studio, so I'm not privy to that information, but both of these consoles are at least 12-teraflop machines for sure. One thing that keeps blowing my mind is how we're able to take destruction to the next level; if you think Battlefield has good destruction, just wait till you see what next-gen is going to bring!!!!
Listen, I am telling you that next-gen graphics are going to blow people's minds!!! The graphics look like CGI from a movie! I wish I could tell you which studio I work at, but I don't want to be fired. And yes, Bonnie Ross is way hotter in person!!! Though lately she's been looking like an old grandma; back in 2013 she looked so gorgeous in person!!!!
Yeah, 12 teraflops, why is that so hard to believe? You guys have no idea what's coming; this is going to be the golden age in video game history. The stuff I see when I go to work every day just blows my mind! Microsoft and Sony know what they are doing; they are rolling out powerful machines and will most likely charge $600, but this is going to usher in a new era where anything can be destructible and anything can be ray-traced with ultra-real textures. I really can't give anything too detailed, otherwise I could get fired.



You are so full of shit that you could fill the Hirakud Dam to bursting.
 
I do think, however, that Spider-Man is in dire need of ray-traced reflections on its skyscrapers; it's one thing that really breaks the illusion in that game.
I would tend to agree with you there. Ray tracing would do that game a world of good, as GI might go unseen in that world.

Yeah, 12 teraflops, why is that so hard to believe? You guys have no idea what's coming; this is going to be the golden age in video game history. The stuff I see when I go to work every day just blows my mind! Microsoft and Sony know what they are doing; they are rolling out powerful machines and will most likely charge $600, but this is going to usher in a new era where anything can be destructible and anything can be ray-traced with ultra-real textures. I really can't give anything too detailed, otherwise I could get fired.

Is there any way to get this guy verified to ensure what he says is truthful?
 

svbarnard

Banned
I'm also a developer for a major studio and can confirm that ps5 will include the grindr app, allowing homosexuals to easily form connections (strands) across America. This is enabled at the hardware level using dedicated gay tracing cores.
Hahahahahahaha!!!!!

Alright, I'm gonna level with you guys: I work at 989 Studios.
 

Audiophile

Member
While I'm still of the opinion that devs will mostly favour eye-candy and spatial resolution over framerate... if you're aiming for 4K/30 or 4K/60, it might make more sense to go ~1440p/60 or ~1440p/120 and use a 4K reconstruction technique such as Insomniac's Jitter/Temporal Injection (the best there is, imo)...

Such techniques are already getting very close to 4K with minimal artifacting, and with twice as many samples (frames) to reference, the reconstruction is only going to get better. The increase in temporal resolution will also increase perceived clarity; you'll have the responsiveness and fluidity that come with higher framerates; and finally, if you stick to the 1440-1500p range, you'll still have some change left over for additional fx/logic.
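Here's a rough toy sketch of the idea in numpy; this is my own illustration with made-up stand-in resolutions and a procedural scene, not Insomniac's actual algorithm:

```python
# Toy temporal reconstruction: jittered low-res frames accumulated into a
# higher-res history buffer. Resolutions and the procedural "scene" are
# made-up stand-ins; real implementations also reproject with motion
# vectors and reject stale history.
import numpy as np

HI_W, HI_H = 256, 144   # stand-in for a 3840x2160 output target
LO_W, LO_H = 170, 96    # stand-in for a ~1440p internal resolution

def scene(x, y):
    # Procedural ground truth, sampled at continuous coordinates in [0, 1).
    return 0.5 + 0.5 * np.sin(40.0 * x) * np.cos(40.0 * y)

history = np.zeros((HI_H, HI_W))   # accumulated colour per output pixel
weights = np.zeros((HI_H, HI_W))   # samples received per output pixel

rng = np.random.default_rng(0)
for frame in range(16):                        # more frames = more samples to reuse
    jx, jy = rng.uniform(-0.5, 0.5, size=2)    # per-frame sub-pixel jitter
    xs = (np.arange(LO_W) + 0.5 + jx) / LO_W   # jittered sample positions
    ys = (np.arange(LO_H) + 0.5 + jy) / LO_H
    lo_frame = scene(xs[None, :], ys[:, None])
    # Splat each low-res sample into the nearest high-res pixel.
    hx = np.clip((xs * HI_W).astype(int), 0, HI_W - 1)
    hy = np.clip((ys * HI_H).astype(int), 0, HI_H - 1)
    np.add.at(history, (hy[:, None], hx[None, :]), lo_frame)
    np.add.at(weights, (hy[:, None], hx[None, :]), 1.0)

reconstructed = np.where(weights > 0, history / np.maximum(weights, 1.0), 0.0)

# Compare against the full-resolution ground truth where we have samples.
gx = (np.arange(HI_W) + 0.5) / HI_W
gy = (np.arange(HI_H) + 0.5) / HI_H
truth = scene(gx[None, :], gy[:, None])
covered = weights > 0
print(f"output pixels covered: {covered.mean():.1%}, "
      f"mean error there: {np.abs(reconstructed - truth)[covered].mean():.3f}")
```

Halve the frame time and you double how quickly that history buffer fills, which is the trade-off described above.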
 
While I'm still of the opinion that devs will mostly favour eye-candy and spatial resolution over framerate.

I've said it (many times) before, but:

Coder: It's running at 70fps, with everything enabled.
Management: Oh, so you've got frames to spare?
Coder: Well, yeah, but ...
Management: Add more bling!
Coder: Okay, we're at 65fps
Management: More bling!
Coder: 62fps
Management: MORE BLING!
Coder: 59fps
Management: Right, cap it at 30fps
 
I've said it (many times) before, but:

Coder: It's running at 70fps, with everything enabled.
Management: Oh, so you've got frames to spare?
Coder: Well, yeah, but ...
Management: Add more bling!
Coder: Okay, we're at 65fps
Management: More bling!
Coder: 62fps
Management: MORE BLING!
Coder: 59fps
Management: Right, cap it at 30fps
I don't think that's the case. Framerate is mostly variable these days. So they usually decide early in the design phase whether or not they want to hit 30 or 60.
 

Tqaulity

Member
Yeah, 12 teraflops, why is that so hard to believe? You guys have no idea what's coming; this is going to be the golden age in video game history. The stuff I see when I go to work every day just blows my mind! Microsoft and Sony know what they are doing; they are rolling out powerful machines and will most likely charge $600, but this is going to usher in a new era where anything can be destructible and anything can be ray-traced with ultra-real textures. I really can't give anything too detailed, otherwise I could get fired.

I believe you :). I've been saying since April, when Sony first unveiled the PS5, that the GPUs in both will likely end up around 12 TFLOPS. For marketing reasons, it makes sense to at least say that the next boxes are 2x the graphical performance of the Xbox One X (even though we all now know that Navi is waaay more efficient, so 12 TFLOPS of Navi is likely ~16 TFLOPS of Polaris). In terms of raw horsepower, we know that the next high-end Navi cards from AMD are aiming to deliver 2080 Super levels of performance as configured for PC. In typical console fashion, the cards will have to be pared down a bit for yields and power considerations, so expect the consoles to be roughly based on the RX 5800 (or whatever it will be called) but with a slightly lower CU count and under-clocked compared to the PC equivalent. With that said, we will still be looking at 2070 Super-level perf at a minimum, but more likely closer to RTX 2080 level (I would not be surprised if one console is slightly less powerful, i.e. 2070 Super vs. a standard 2080). Yes, RTX 2080-level performance in a dedicated closed-box console?! Some amazing things to come! U R NOT E

But here's the breakdown for those that may be confused: the RTX 2080 is ~10 TFLOPs by Nvidia's calculations. However, we can extrapolate what an AMD Navi-based card with similar power will measure in TFLOPs. How? Just look at the specs vs. perf between the RX 5700 XT and the RTX 2070 Super / RTX 2080. The 5700 XT is ~9.7 TFLOPs out of the box, so let's say 10 TFLOPs with an overclock (same as the RTX 2080). Yet, according to the perf benchmarks at sites like TechPowerUp, the RTX 2070 Super is ~10% faster than the RX 5700 XT (assuming a slight overclock to reach the 10 TFLOPs, i.e. the Anniversary edition). Meanwhile the RTX 2080 is roughly 20% faster despite having the same TFLOPS measurement. So that means AMD's Navi architecture is still roughly 20% less efficient than Nvidia's Turing. Thus, for an AMD Navi card to reach the 10 TFLOPs perf of an RTX 2080, it will need to be roughly ~12 TFLOPs (10 TFLOPs + (10 * 0.2)).
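If it helps, here's that arithmetic in a couple of lines of Python (using my rounded figures, which are approximations, not official specs):

```python
# The back-of-envelope math above, using rounded figures
# (approximations, not official specs).
rtx_2080_tflops  = 10.0   # rounded figure for the RTX 2080
turing_advantage = 1.20   # 2080 ~20% faster than a ~10 TFLOP Navi card

navi_tflops_needed = rtx_2080_tflops * turing_advantage
print(f"Navi TFLOPs for 2080-class performance: ~{navi_tflops_needed:.0f}")
# -> ~12, i.e. 10 TFLOPs + (10 * 0.2)
```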

Coincidence!?? I think NOT :) As someone who has worked with PC hardware, software, and games professionally for over 10 years, I assure you it's not rocket science ;)
 

svbarnard

Banned
This user has been removed from thread. Ignored prior warning.
I believe you :). I've been saying since April, when Sony first unveiled the PS5, that the GPUs in both will likely end up around 12 TFLOPS. For marketing reasons, it makes sense to at least say that the next boxes are 2x the graphical performance of the Xbox One X (even though we all now know that Navi is waaay more efficient, so 12 TFLOPS of Navi is likely ~16 TFLOPS of Polaris). In terms of raw horsepower, we know that the next high-end Navi cards from AMD are aiming to deliver 2080 Super levels of performance as configured for PC. In typical console fashion, the cards will have to be pared down a bit for yields and power considerations, so expect the consoles to be roughly based on the RX 5800 (or whatever it will be called) but with a slightly lower CU count and under-clocked compared to the PC equivalent. With that said, we will still be looking at 2070 Super-level perf at a minimum, but more likely closer to RTX 2080 level (I would not be surprised if one console is slightly less powerful, i.e. 2070 Super vs. a standard 2080). Yes, RTX 2080-level performance in a dedicated closed-box console?! Some amazing things to come! U R NOT E

But here's the breakdown for those that may be confused: the RTX 2080 is ~10 TFLOPs by Nvidia's calculations. However, we can extrapolate what an AMD Navi-based card with similar power will measure in TFLOPs. How? Just look at the specs vs. perf between the RX 5700 XT and the RTX 2070 Super / RTX 2080. The 5700 XT is ~9.7 TFLOPs out of the box, so let's say 10 TFLOPs with an overclock (same as the RTX 2080). Yet, according to the perf benchmarks at sites like TechPowerUp, the RTX 2070 Super is ~10% faster than the RX 5700 XT (assuming a slight overclock to reach the 10 TFLOPs, i.e. the Anniversary edition). Meanwhile the RTX 2080 is roughly 20% faster despite having the same TFLOPS measurement. So that means AMD's Navi architecture is still roughly 20% less efficient than Nvidia's Turing. Thus, for an AMD Navi card to reach the 10 TFLOPs perf of an RTX 2080, it will need to be roughly ~12 TFLOPs (10 TFLOPs + (10 * 0.2)).

Coincidence!?? I think NOT :) As someone who has worked with PC hardware, software, and games professionally for over 10 years, I assure you it's not rocket science ;)

I know what you mean. I have worked at 989 Studios for the last twenty years. Who here liked Syphon Filter?
 

Panda1

Banned
I believe you :). I've been saying since April, when Sony first unveiled the PS5, that the GPUs in both will likely end up around 12 TFLOPS. For marketing reasons, it makes sense to at least say that the next boxes are 2x the graphical performance of the Xbox One X (even though we all now know that Navi is waaay more efficient, so 12 TFLOPS of Navi is likely ~16 TFLOPS of Polaris). In terms of raw horsepower, we know that the next high-end Navi cards from AMD are aiming to deliver 2080 Super levels of performance as configured for PC. In typical console fashion, the cards will have to be pared down a bit for yields and power considerations, so expect the consoles to be roughly based on the RX 5800 (or whatever it will be called) but with a slightly lower CU count and under-clocked compared to the PC equivalent. With that said, we will still be looking at 2070 Super-level perf at a minimum, but more likely closer to RTX 2080 level (I would not be surprised if one console is slightly less powerful, i.e. 2070 Super vs. a standard 2080). Yes, RTX 2080-level performance in a dedicated closed-box console?! Some amazing things to come! U R NOT E

But here's the breakdown for those that may be confused: the RTX 2080 is ~10 TFLOPs by Nvidia's calculations. However, we can extrapolate what an AMD Navi-based card with similar power will measure in TFLOPs. How? Just look at the specs vs. perf between the RX 5700 XT and the RTX 2070 Super / RTX 2080. The 5700 XT is ~9.7 TFLOPs out of the box, so let's say 10 TFLOPs with an overclock (same as the RTX 2080). Yet, according to the perf benchmarks at sites like TechPowerUp, the RTX 2070 Super is ~10% faster than the RX 5700 XT (assuming a slight overclock to reach the 10 TFLOPs, i.e. the Anniversary edition). Meanwhile the RTX 2080 is roughly 20% faster despite having the same TFLOPS measurement. So that means AMD's Navi architecture is still roughly 20% less efficient than Nvidia's Turing. Thus, for an AMD Navi card to reach the 10 TFLOPs perf of an RTX 2080, it will need to be roughly ~12 TFLOPs (10 TFLOPs + (10 * 0.2)).

Coincidence!?? I think NOT :) As someone who has worked with PC hardware, software, and games professionally for over 10 years, I assure you it's not rocket science ;)

The problem with this story is that today an RTX 2080 is about £600.

You won't be able to get this card's power (even at half price next year), plus the parts for the rest of the console - SSD, controller, memory, etc. - for £500. No one is going to be stupid enough to launch a base console at £599 (even though I would not mind). I'm thinking they will use another metric rather than FLOPS to compare the machines, to make it seem like more of a jump - but 8TF is more realistic.
 

svbarnard

Banned
The problem with this story is that today an RTX 2080 is about £600.

You won't be able to get this card's power (even at half price next year), plus the parts for the rest of the console - SSD, controller, memory, etc. - for £500. No one is going to be stupid enough to launch a base console at £599 (even though I would not mind). I'm thinking they will use another metric rather than FLOPS to compare the machines, to make it seem like more of a jump - but 8TF is more realistic.
Lol 8TF!!!! I am telling you the next Xbox will be 12TF; Navi will do that.
 

Tqaulity

Member
The problem with this story is that today an RTX 2080 is about £600.

You won't be able to get this card's power (even at half price next year), plus the parts for the rest of the console - SSD, controller, memory, etc. - for £500. No one is going to be stupid enough to launch a base console at £599 (even though I would not mind). I'm thinking they will use another metric rather than FLOPS to compare the machines, to make it seem like more of a jump - but 8TF is more realistic.
Actually, the problem with your point is that you are comparing Nvidia products and pricing to AMD's, AND you are comparing console component pricing to PC pricing, which is absolutely not the same. Fact: AMD's next high-end Navi card will deliver RTX 2080/2080 Super performance and will likely cost less than the current Nvidia cards on the market. Also, fact: console manufacturers purchase components under a very different pricing model, since they are buying in bulk for a multi-year contract term. For example, the PS3 was based on Nvidia's 7800 series of GPUs, and the 7800 GTX for PC launched at $600 back in 2005. Yet the PS3 BOM estimate for its GPU was only $83 per unit at launch.

Rumors and leaks have been pointing to the next-gen consoles having 11 TFLOPS or more since this time last year. I don't know why people are so resistant to the idea now.
 

Tqaulity

Member
Actually, the problem with your point is that you are comparing Nvidia products and pricing to AMD's, AND you are comparing console component pricing to PC pricing, which is absolutely not the same. Fact: AMD's next high-end Navi card will deliver RTX 2080/2080 Super performance and will likely cost less than the current Nvidia cards on the market. Also, fact: console manufacturers purchase components under a very different pricing model, since they are buying in bulk for a multi-year contract term. For example, the PS3 was based on Nvidia's 7800 series of GPUs, and the 7800 GTX for PC launched at $600 back in 2005. Yet the PS3 BOM estimate for its GPU was only $83 per unit at launch.

Rumors and leaks have been pointing to the next-gen consoles having 11 TFLOPS or more since this time last year. I don't know why people are so resistant to the idea now.
Here is another way to think about it: IF the next-gen consoles were only 7-8 TFLOPS, that would put them at the current RX 5700 (non-XT) level. Does anybody here really believe that the next-gen consoles will not be at least as powerful as, if not significantly more powerful than, the current RX 5700 XT (~10 TFLOPS)? Does anyone think Microsoft and Sony would invest billions of dollars to build a machine meant to last 5-7 years and be the future of gaming, only to launch with an entry-level to mid-range GPU released over a year earlier than their box? With the slew of more powerful Navi cards still to launch between now and the end of 2020, would they settle on a small mid-range card from 2019 that is barely a step up from the Xbox One X? Does that even make sense?
 
The problem with this story is that today an RTX 2080 is about £600.

You won't be able to get this card's power (even at half price next year), plus the parts for the rest of the console - SSD, controller, memory, etc. - for £500. No one is going to be stupid enough to launch a base console at £599 (even though I would not mind). I'm thinking they will use another metric rather than FLOPS to compare the machines, to make it seem like more of a jump - but 8TF is more realistic.
Turing uses 12nm FinFET. Roughly the same process as 14-16nm.

7nm offers ~3.3x density scaling at the same die size - roughly three times more transistors:

That means a 754mm² chip (RTX 2080 Ti) could be about a third of the size (~252mm²). Not sure how many people realize this.
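Back-of-envelope, in a few lines of Python (the ~3.3x density figure is the rough claim above; real chips never shrink perfectly):

```python
# Back-of-envelope die-area math (taking the ~3.3x density figure at face
# value; real designs don't scale perfectly with the process).
tu102_area_mm2  = 754.0   # RTX 2080 Ti die size
density_scaling = 3.3     # claimed 7nm vs 12/16nm logic density gain

shrunk_area_mm2 = tu102_area_mm2 / density_scaling
print(f"Same transistor count at 7nm: ~{shrunk_area_mm2:.0f} mm^2")
# -> ~228 mm^2 with 3.3x, or ~252 mm^2 with a more conservative 3x
```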

Don't believe me? Well, just wait for Ampere next year and you'll see firsthand what I'm talking about. A high-end/enthusiast card from 2018 will be 2020's mid-range card.

It's the same reason the "weak" RSX was able to be as fast as 2 x GeForce 6800 Ultra (2004 high-end/enthusiast card). 2 years of lithography progress can do wonders. Console makers are experts on this.

Also: https://www.tsmc.com/tsmcdotcom/PRListingNewsAction.do?action=detail&language=E&newsid=THHIHIPGTH

"The N7+ volume production is one of the fastest on record. N7+, which began volume production in the second quarter of 2019, is matching yields similar to the original N7 process that has been in volume production for more than one year."
 

Munki

Member
It's over for you Sony ponies. Call it a day, boys. Xbox is gonna be stronger.



To be fair, the information in this video is stuff we have already seen in this thread (Zen 2, Arcturus, X1X with a beefier CPU, a new controller for Scarlett with haptic feedback).
 

DeepEnigma

Gold Member
Arcturus Navi is a thing now?

I thought Arcturus is based on the Vega series, while RDNA is a new microarchitecture.


 

llien

Member
The expense of these effects is not in dispute. A hardware implementation that only does one thing is. It's like saying you have a shader that only does water - once you solve for fast shading, a developer can apply it however they see fit within their budget, right? It's the same thing with ray casting. Some effects are more expensive than others, but Mr xmedia is saying it's hardware-limited to shadows, which makes no sense on the face of it.

Guys, please realize how POOR today's RT performance is. A one-ray-per-pixel scene (at best, current gen could do that, and even then only at low resolution) looks like this:

[Image: a scene rendered at one ray per pixel - extremely noisy]


Nvidia blurs/denoises it a lot, of course losing detail, and gets something good enough for some reflections/shadows, but that's it.
We need RT performance dozens of times ahead of what we have now before we can start thinking about scenes being rendered with RT only.
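To put a rough number on how slowly brute-force ray tracing converges, here's a toy Monte Carlo sketch (a made-up uniform "pixel", not a real path tracer) showing that the error only falls with the square root of the ray count:

```python
# Toy Monte Carlo estimate of a single pixel's brightness.
# The "true" pixel value here is just the mean of a uniform distribution;
# a real path tracer integrates incoming light, but the convergence
# behaviour (error ~ 1/sqrt(samples)) is the same.
import numpy as np

rng = np.random.default_rng(1)
true_value = 0.5

for spp in (1, 4, 16, 64, 256):   # rays (samples) per pixel
    # Simulate many pixels, each averaging `spp` noisy samples.
    estimates = rng.uniform(0.0, 1.0, size=(20_000, spp)).mean(axis=1)
    rmse = np.sqrt(np.mean((estimates - true_value) ** 2))
    print(f"{spp:4d} rays/pixel -> RMS error {rmse:.3f}")

# The error only halves every time the ray count is quadrupled, which is
# why 1 ray/pixel looks like static and why denoisers carry so much of
# the load on current hardware.
```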

Even Pixar, which uses server farms to render its movies, at first combined raster and RT, only later switching to pure RT once it became simpler for them to model and their server farms grew powerful enough to handle it. This is what their "render farm" looks like:

[Image: Pixar's render farm]
 

R600

Banned
There's no way we're gonna have an eSRAM situation this time around.

OG PS4 and OG XB1 might have been dissimilar, but PS4 Pro and XB1X converged a lot (since MS copied PS4's successful unified GDDR5 setup).

This trend of technological convergence isn't going to stop here. As I've said before, utilizing economies of scale makes sense. Going exotic and using non-standard tech is risky.


^ Can someone vet this guy?
There was nothing to copy. Unified RAM was in the 360; it was a pretty straightforward and logical way to go IF you didn't care all that much about absolutely having to have 8GB of RAM.

In 2010, when these consoles were designed, that was the only way to guarantee 8GB of RAM being there on day one.

Critical strategic mistake by MS, but it was more of a gamble due to Kinect and TV. Sony were led by a gaming-first design, and won.
 
There was nothing to copy. Unified RAM was in the 360; it was a pretty straightforward and logical way to go IF you didn't care all that much about absolutely having to have 8GB of RAM.

In 2010, when these consoles were designed, that was the only way to guarantee 8GB of RAM being there on day one.

Critical strategic mistake by MS, but it was more of a gamble due to Kinect and TV. Sony were led by a gaming-first design, and won.
What do you mean there was nothing to copy?

They abandoned the eSRAM+DDR3 combo and went with a conventional GDDR5 setup. No huge scratchpad memory wasting transistors that could have been GPU ALUs. That's what the PS4 has had since 2013.

There's a reason for that, as I said. It made sense when fast DRAM was expensive and resolutions were low (the PS2 was an SD machine, and the Xbox 360 was sub-HD, since its 10MB of eDRAM was not enough for 720p).

It doesn't make sense anymore, because eSRAM requirements for native 4K (let alone 8K) would be insane. Imagine having 128MB of eSRAM (4k = 4 * 1080p)!
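Some quick render-target arithmetic behind that figure (assuming plain RGBA8 targets, which is a simplification rather than any console's actual G-buffer layout):

```python
# Quick render-target math (plain RGBA8 targets assumed; a real G-buffer
# has several targets in varying formats).
def target_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / (1024 ** 2)

print(f"1080p RGBA8 target: {target_mb(1920, 1080):.1f} MB")  # ~7.9 MB
print(f"4K    RGBA8 target: {target_mb(3840, 2160):.1f} MB")  # ~31.6 MB
# Xbox One's 32MB of eSRAM fits a few 1080p targets; native 4K needs ~4x
# the space per target, hence the ~128MB figure above.
```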
 

Mass Shift

Member
The problem with this story is that today an RTX 2080 is about £600.

You won't be able to get this card's power (even at half price next year), plus the parts for the rest of the console - SSD, controller, memory, etc. - for £500. No one is going to be stupid enough to launch a base console at £599 (even though I would not mind). I'm thinking they will use another metric rather than FLOPS to compare the machines, to make it seem like more of a jump - but 8TF is more realistic.

MS and Sony do NOT pay retail prices for any of their processors. WE are the ones who pay MSRP.

I'll bet they won't even be paying $150 USD for their next gen APUs.
 

Panda1

Banned
MS and Sony do NOT pay retail prices for any of their processors. WE are the ones who pay MSRP.

I'll bet they won't even be paying $150 USD for their next gen APUs.

Then every single crypto miner will buy the consoles and mine crypto if they can get more compute per dollar than buying a card straight up - like the PS3, where a lot of people bought them to run non-gaming applications.
 

R600

Banned
What do you mean there was nothing to copy?

They abandoned the eSRAM+DDR3 combo and went with a conventional GDDR5 setup. No huge scratchpad memory wasting transistors that could have been GPU ALUs. That's what the PS4 has had since 2013.

There's a reason for that, as I said. It made sense when fast DRAM was expensive and resolutions were low (the PS2 was an SD machine, and the Xbox 360 was sub-HD, since its 10MB of eDRAM was not enough for 720p).

It doesn't make sense anymore, because eSRAM requirements for native 4K (let alone 8K) would be insane. Imagine having 128MB of eSRAM (4k = 4 * 1080p)!
I'm just saying, copying a unified memory arrangement sounds dubious. It was either eSRAM + DDR3 or GDDR5 from a single pool, for both of them.

Designing a console back in 2010, with the main requirement being 8GB of RAM, there was only one choice. If MS had known 100% that GDDR5 could deliver 8GB in 2013, they would have gone with that, thus putting more compute on the chip without raising the BOM.
 

iamvin22

Industry Verified
The problem with this story is that today an RTX 2080 is about £600.

You won't be able to get this card's power (even at half price next year), plus the parts for the rest of the console - SSD, controller, memory, etc. - for £500. No one is going to be stupid enough to launch a base console at £599 (even though I would not mind). I'm thinking they will use another metric rather than FLOPS to compare the machines, to make it seem like more of a jump - but 8TF is more realistic.

Why do people still think it works like this? Every gen we get this quote.
 

Mass Shift

Member
Then every single crypto miner will buy the consoles and mine crypto if they can get more compute per dollar than buying a card straight up - like the PS3, where a lot of people bought them to run non-gaming applications.

Good for them. It's certainly their option to do so. But the average person playing video games doesn't possess such skills. So I don't think the console industry has anything to be concerned about.
 

Panda1

Banned
Good for them. It's certainly their option to do so. But the average person playing video games doesn't possess such skills. So I don't think the console industry has anything to be concerned about.

If I can mine crypto on it for $100 less than a PC alternative, and resell the PS5 after 18 months at a 10-20% loss, then I will save around $200 per PS5 I purchase over those 18 months.
If I buy 100, then I save $20,000, plus I also have my crypto - tell me how the hell you're going to get your hands on one if they are always sold out for the first two years, and Sony won't recoup any profit from anyone doing this - it makes no sense in today's world, when the ability to compute can actually be used to make money.
There was no crypto mania in 2005 with the PS3/X360, and with the X1 and PS4 being underpowered it was not an issue.
If the new consoles deliver more power per dollar than a comparable PC, then it's open season for mining.
 

Mass Shift

Member
If I can mine crypto on it for $100 less than a PC alternative, and resell the PS5 after 18 months at a 10-20% loss, then I will save around $200 per PS5 I purchase over those 18 months.
If I buy 100, then I save $20,000, plus I also have my crypto - tell me how the hell you're going to get your hands on one if they are always sold out for the first two years, and Sony won't recoup any profit from anyone doing this - it makes no sense in today's world, when the ability to compute can actually be used to make money.
There was no crypto mania in 2005 with the PS3/X360, and with the X1 and PS4 being underpowered it was not an issue.
If the new consoles deliver more power per dollar than a comparable PC, then it's open season for mining.

Uh, what can I say? I wish you well on your new business venture.

But this is the console industry's business model, and it's what differentiates it from other gaming platforms. If you're okay with MS, Sony, and Nintendo making the hardware choices for you, and you're willing to buy into their cloistered ecosystems, they pass along their bulk-purchase savings to you as part of their exclusive install base.

You get multiplatform games at decent specs, and optimized exclusives at their best possible performance. There are trade-offs, but it's quite mutually beneficial.
 