bitbydeath
Gold Member
60CU’s lines up with the RX 3090. (Navi20)
The same one I mentioned is the only confirmed hardware to support Ray Tracing.
Actually 3080 (4 disabled)
60 in total though with 4 disabled.
In all fairness, after closer inspection, the user in the screenshot you posted isn't really passing it off as a rumor; it's just his own theory that he's posting.
He is not an insider, he just posted his prediction.
The Vega=Navi thing is from the latest adoredtv video that made some claim about Navi being worse than Vega. Take it with a grain of salt.
btw you should expect 12-13TF max, I don't think we'll see more.
So the 3080 is a 56CU part? I assumed it was 60CU with 4 disabled
edit: My bad, the disabled CUs seem to just be a console thing.
I'm with you in that I predict a 60CU GPU with 4 disabled.
That would suck
We would be back to the PS3 days of inferior ports, 3rd party devs ignoring specialized hw and only exclusives taking advantage of it.
You seem to have a very good understanding of how these things work.
Do you work at an IT company or something?
Have any recommended YouTube channels for PS5 tech rumours?
Not at all lol, I've just read a bunch of forum posts and articles from people more knowledgeable; it's just acquired internet knowledge.
Devs are familiar with CUs and all the tools are ready to take advantage of them out of the box.
Having more CUs would be more like the PS3 SPEs than having an FPGA.
I also heard that the CUs are disabled to be reactive to faults. So if one faulted, it'd be disabled and another locked one would become enabled.
Wow, first time I read that; all I know is CUs tend to be disabled/lasered to improve yields.
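On the yield point, a toy binomial sketch shows why shipping 56-of-60 CUs massively improves the number of usable dies. The 2% per-CU defect rate here is a number I made up purely for illustration:

```python
from math import comb

def p_usable(total_cus, needed, p_defect=0.02):
    # Probability that at least `needed` of `total_cus` CUs work,
    # assuming independent per-CU defects (p_defect is illustrative).
    p_ok = 1.0 - p_defect
    return sum(comb(total_cus, k) * p_ok**k * p_defect**(total_cus - k)
               for k in range(needed, total_cus + 1))

full_die = p_usable(60, 60)     # every CU must work: ~30% of dies
with_spares = p_usable(60, 56)  # 4 spares allowed: ~99% of dies
```

With spares, defective dies that would otherwise be scrapped get sold as the cut-down part, which is exactly the salvage logic behind the 390/390X-style pairs discussed in this thread.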
Game devs wouldn't be programming the FPGA; that would be done by Sony & middleware companies.
Ah that's interesting, so what kind of fixed functions do you have in mind?
One thing for sure, next-gen consoles need to have at least 10TB capacity.
I have the 500GB PS4 and the thing keeps running out of space.
I have a 750GB SSD in mine; I got a new NVMe for my PC so I'm going to put a 1TB SSD in it. Even then it's not enough for me. That's why putting in a 1TB SSD isn't good for next gen, and I'm hoping Sony goes with some new tech with a 2TB HDD.
Maybe an NVMe + HDD combo?
I'd rather not have to move one game to the other drive to utilize the fast loading.
I meant that's what the PS5 might end up doing as a cost-saving measure.
"When you add DDR4 bandwidth the range is 11.whatever Tflops"
"XTflops is the rumor if you only count HBM bandwidth"
They are referencing the reddit "leak" that talked about hitting X GB/s per TF.
GPU Flop calcs don't touch on bandwidth at all, that's the rambling of someone with not quite enough information to be dangerous...
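For what it's worth, the paper-TFLOPs formula really is just CU count, shaders per CU, ops per clock, and clock speed; bandwidth never enters it. A quick sketch (64 shaders/CU and 2 ops/clock are the usual GCN figures; the clocks are the rumored ones from this thread):

```python
def gpu_tflops(cus, clock_ghz, shaders_per_cu=64, ops_per_clock=2):
    # Theoretical single-precision TFLOPs: CUs x shaders x ops x clock.
    return cus * shaders_per_cu * ops_per_clock * clock_ghz / 1000.0

tf_base_ps4 = gpu_tflops(18, 0.8)  # known calibration point: ~1.84 TF
tf_60cu = gpu_tflops(60, 1.8)      # 60 CUs at the rumored 1.8GHz: ~13.8 TF
tf_52cu = gpu_tflops(52, 1.8)      # the "RX 3080" CU count: ~12 TF
```

Note there's no bandwidth term anywhere, which is the point: adding DDR4 bandwidth cannot change the TFLOPs figure.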
Hold on, so Sony is going to revert back to Vega?
Not only that, but it's also Vega 56 (an inferior version compared to the 64?)
A 2020 console's performance matching a 2017 PC?
Wtf, this is disappointing af if it's true.
Is this guy even reliable and trustworthy? Who is he?
Lighting maybe
Wouldn't RT hw be better suited for that?
Where are you hearing that the 3090 is the only one "confirmed" to support ray tracing? You seem to have heard that somewhere and are mixing it with Cerny's ray tracing comment and concluding that the PS5 GPU "must" be the 3090. Extremely thin reasoning.
You are also putting a lot of faith in that Wired article with Cerny. Literally the only thing he said was "will support ray tracing", and then he went on to an example of using ray tracing for SOUND and not graphics. It's going to be far FAR less computationally expensive to bounce a couple of rays around: "If you wanted to run tests to see if the player can hear certain audio sources or if the enemies can hear the players' footsteps, ray tracing is useful for that."
That's nice but that's nothing compared to using ray tracing for actual lighting, shadows and reflections. Cerny doesn't talk at all about using ray tracing for global illumination or anything like that. And what he's not saying is just as important as what he is.
Also, in that article they say the PS5 "supports 8K", but you're not expecting much from that, I assume.
"Supports ray tracing" may not mean ANYTHING at this point. If you wanna take that very small comment from Cerny to mean that the PS5 will be a ray tracing beast, go for it. But you can ray trace on Pascal GPUs now, and they have no RT cores but absolutely do "support ray tracing." They're just not very good at it. My point is that you don't need anything special hardware-wise to claim "ray tracing support."
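To make the "no special hardware needed" point concrete: at its core, ray tracing is just geometry that runs on any processor. Here's a minimal CPU ray-sphere intersection test, the same kind of visibility/audibility query as Cerny's audio example (all names are illustrative, not any real engine's API):

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for t >= 0.
    # `direction` is assumed to be a unit vector.
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * x for d, x in zip(direction, oc))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t >= 0 else None  # hits behind the origin don't count

# Is an obstacle (sphere at x=5, radius 1) between a listener at the
# origin and a sound source along +x?
print(ray_hits_sphere((0, 0, 0), (1, 0, 0), (5, 0, 0), 1.0))  # 4.0
```

RT cores accelerate exactly this kind of intersection test; without them it still works, just slowly, which is why "supports ray tracing" alone says nothing about performance.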
Is it going to be larger than the Pro?
Most likely.
Assuming we're going by adoredtv's "RX 30xx" nomenclature, RX 3080 is 52CUs and 175W+.
Based on that video though, there are tweaks to the CUs in Navi 10 and 20. I'm praying Sony goes with the Navi 20 variant.
The big Navis, RX 3090XT and RX 3090, are Navi 20 with 64CU and 60CU, respectively. Parts with disabled CUs are not new for AMD, and definitely not restricted to consoles. R9 390X was a 44CU part, while R9 390 was a 40CU part. In many cases you could flash the BIOS and enable the full 44CU "R9 390X" using the cheaper R9 390 (mine didn't work). The 6950-to-6970 unlock goes even further back, and the practice has been around even longer.
All that said, RX 3080 in adoredtv's updated chart does in fact look like a decent candidate: 52CUs with 4 more disabled in the retail unit, making 48CUs, imo. Consoles go for GPUs in the 150-160W PC tier.
Pretty much all rumors for PS5 are an amalgam of either Gonzalo (Apisak and Komachi) or adoredtv. The Gonzalo leak is identified as "Navi 10 Lite" and looks to have a GPU core clock that has gone from 1GHz to 1.8GHz.
Digital Foundry even sought out his advice (click to read the whole discussion)...
The 13F8 part is said to mean 13.8TF.
Can Navi 10 reach that figure?
That sounds more like Navi 20 to me.
That portion is the PCI id and doesn't correspond to any compute metric. The DF article has the updated product code with 2 new PCI ids, and it looks like it might now correspond to "Ariel". What that means for whether it's still Navi 10 Lite or Navi 10 is up to you.
“13E9” has been guessed at signifying Navi 10 “lite” – but this is now “13F8” in the new codename.
There is a suggestion that "13F8" refers to the teraflop compute performance of the GPU part of the APU.
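That teraflop reading is easy to sanity-check: PCI device ids are hexadecimal strings, so "13F8" is just the number 5112, sitting right next to the older "13E9" id, not a compute figure:

```python
# PCI-style device ids are hex strings, not compute figures:
new_id = int("13F8", 16)
old_id = int("13E9", 16)
print(new_id, old_id, new_id - old_id)  # 5112 5097 15
```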
The old code suggested Navi 10, but then it changed.
He's referencing a random comment on Reeee-.
New codename for AMD's rumored custom console APU Gonzalo throws up new conundrums
An engineering sample processor has turned up on Twitter with the codename “ZG16702AE8JB2_32/10/18_13F8”. Fortunately, a little bit of light has been shed on what this mystery processor could be, and it is claimed that it is actually AMD’s custom chip Gonzalo, which is expected to appear in the...
www.notebookcheck.net
Your expectations aren't realistic if you expect a 399-499€ console to have the latest high-end PC performance.
Imo 2017 high-end-ish performance sounds really good.
Tech has its limitations; you can't force 20Tflops into a small case with a limited power envelope.
And look what they can do with a "~2011 level PC" aka the PS4. I would not worry about it unless they go Nintendo and cheap out on everything, which isn't likely.
There are plenty of people here expecting and almost demanding at least 13 TFLOPS out of a system that uses an AMD APU enclosed in a small case with particular power requirements. And that APU is still GCN. And it should cost under 499 dollars while even the Radeon VII is, what, over 700 dollars?
And it should cost under 499 dollars while even Radeon VII is, what, over 700 dollars?
But that's buying a single graphics card from a retailer. There are so many points in that chain where entities are taking a cut and thereby elevating the price.
How much would those Radeon VIIs cost if you bought 20 million (or more) of them, directly from AMD, and paid for them upfront? I can tell you right now it's going to be a lot less than $700. Probably even less if there's no sale-or-return clause in the contract.
Just to illustrate, a teardown of the iPhone XS Max (~$1250) revealed that the components cost ~$450. What's the cost if you bought a few million of those components in bulk? You could maybe halve the total bill if you include things like tax deductions for businesses.
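A toy sketch of that chain argument; every margin percentage here is invented purely for illustration, not a real contract figure:

```python
def retail_price(component_cost, margins):
    # Stack each middleman's cut on top of the component cost.
    # All margin values below are made-up illustrative numbers.
    price = component_cost
    for m in margins:
        price *= 1.0 + m
    return price

# $450 of components through assembler, brand, distributor, retailer:
p = retail_price(450, [0.10, 0.60, 0.15, 0.10])  # roughly $1000 at retail
# Buying the bare components in bulk skips every one of those cuts.
```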
I was thinking about the 3080XT, which is listed at 56CU.
Splitting the RAM pool is a good cost-saving measure; these specs sound amazing. 13TFlops, HBM2, 3.2GHz 8-core Zen 2 is pretty high-end stuff.
Not really, HBM2 is more expensive than it's worth; the leak's slow bandwidth is too low to make up for the DDR4 slack.
Prices are going down, apparently. But isn't HBM2 super-fast bandwidth, super-low energy AND super-low latency? Is using GDDR6 for the CPU a waste?
The rumor which talks about HBM2 mentions salvaged parts reaching an anemic 400GB/s.
I wouldn't put any weight to this leak
That's not fast enough?
Not enough to pick up the slack of the DDR4. Also consider that GCN hugely benefits from bandwidth, and that the X already has 320GB/s for 6TFs.
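To put rough numbers on that bandwidth-per-flop point (the 320GB/s and 6TF figures are the Xbox One X numbers from the post above; the 400GB/s and 13TF pairing is the rumor being discussed):

```python
def gb_per_tflop(bandwidth_gbs, tflops):
    # Memory bandwidth available per teraflop of compute.
    return bandwidth_gbs / tflops

xbox_one_x = gb_per_tflop(320, 6)   # ~53.3 GB/s per TF
rumored = gb_per_tflop(400, 13)     # ~30.8 GB/s per TF

# Keeping the X's ratio, a 13TF GCN-style GPU would want roughly:
needed = xbox_one_x * 13            # ~693 GB/s
```

By that yardstick, 400GB/s of HBM2 leaves a 13TF GCN part with far less bandwidth per flop than the X, which is the objection being made.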