
Next-Gen PS5 & XSX |OT| Console tEch threaD

Honestly, it’s going both ways. As soon as the 12TF rumor for Xbox came out, suddenly “insiders” started to question whether it was RDNA, even though it was said a million times that it was Navi. Then the rumor of a 9TF PS5 gained some traction from somewhat reputable sources, and suddenly it was 13TF. Now both consoles are supposedly roughly the same, with the Xbox having a slight edge. So sure, MS may have some shills, but Sony does also. It just depends on which side of the fence you are rooting for. I’m rooting for both. If both console makers make badass hardware, my PC will actually be forced to use its muscle.


Depends on how many teams are working on it. I don’t think Microsoft would risk not having next-gen graphics on their 12TF machine. I mean, look at the anniversary editions of Halo: you can toggle the old and new graphics engines on and off, and there are very clear, noticeable differences. Ray tracing will definitely help with that as well.

The PS5 rumor of 13 TFLOPS comes from April
 

Gamernyc78

Banned
Perfect comment, I agree with you 100%.
You can see YouTubers like Colteastwood and Dealer Gaming making videos about how powerful the Xbox Series X will be over the PS5, EVEN when there are verified reports and reputable people like Colin Moriarty and Andrew Reiner stating the PS5 is more powerful. It’s really embarrassing and disgusting to see how Xbox fanboys lie and misinform people into thinking this way when reality isn’t like that. I just wanna see their reactions when the PS5 really is as powerful as the XSX or more, because I can guarantee you that this will happen; there are too many reports and too much evidence behind it.

I remember someone named Klobrille IMMEDIATELY chimed in saying the XSX is “clearly more advanced and more powerful than PS5” after that first WIRED article about the PS5. That Klobrille statement turned out to be fake; it was just what Microsoft was targeting, not what Sony had versus what Microsoft will have. And Colteastwood has been making videos for over a year talking about how the XSX will be more powerful when, in fact, all reports are saying otherwise.

It’ll be sooo funny when the final specifications come out and PS5 may end up having the more powerful parts, oh man, the reactions. 🤣🤣🤣🤣🤣

Nothing different from the Mistermediax and Tim Dog BS that went on this gen, and we know how stupid they ended up looking in the end. The thing with these people is it amazes me how they go so hard to push a narrative for their console of choice, get factually proven wrong, but then come back with the same BS again. It's amazing.

Pure insanity. It just makes me think that it's not just pure fanaticism driving this propaganda machine, but people being paid to drum up confusion and false info.
 

Mriverz

Member
Why would the boss of their studios have an impact? He's just a figurehead. Doesn't impact hardware development at all and should not impact software development either. It's not like a company stops functioning if they get a new CEO.

I never said anything about hardware or software. The impact would be how they would proceed to reveal the console.
 

Mod of War

Ω
Staff Member
Mod of War, user camefromthenearfuture was banned because he refused the verification process and continued to pose as an "insider", even after several warnings. User xcloudtimdog does the same, yet you let him be. Partiality.

Nobody has been reporting the posts. These pages can move fast in between sleep cycles.

XcloudTimdog, got anything to share with us? If not, then move along from the “insider” talk. Your internet history is well documented, down to the advert forum name.
 
All evidence so far on how Sony handles BC is by directly mimicking the old hardware as closely as possible. They did it with the PS4 Pro, they have patents suggesting it, and then there is Oberon, which does the same. PS4 to PS4 Pro was 18 to 36 CUs, with the ability to disable half for BC.

One thing to consider is that they will also be planning ahead right now for how to do PS5 BC on either a PS5 Pro or PS6. So quite possibly it's the same method: 36 CUs to 72 CUs on PS5 Pro / PS6. It's a straightforward, tidy system of doubling the CUs that achieves their goals.
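The doubling-and-halving pattern described above can be sketched in a few lines. This is a toy illustration; the PS5 and PS5 Pro / PS6 CU counts are the rumored and speculative figures from this thread, not confirmed hardware specs:

```python
# Toy sketch of the CU-doubling backward-compatibility pattern described above.
# PS5 and PS5 Pro / PS6 counts are thread speculation, not confirmed hardware.

GENERATIONS = {
    "PS4": 18,
    "PS4 Pro": 36,                      # 2x PS4; halves its CUs in PS4 BC mode
    "PS5 (rumored)": 36,                # per the Oberon/GitHub leak discussed here
    "PS5 Pro / PS6 (speculative)": 72,  # same doubling trick, one generation on
}

def bc_mode_cus(total_cus: int) -> int:
    """CUs left active when half are disabled to mimic the previous console."""
    return total_cus // 2

for name, cus in GENERATIONS.items():
    print(f"{name}: {cus} CUs total, {bc_mode_cus(cus)} CUs in BC mode")
```

Each generation's BC mode lands exactly on the previous generation's full CU count, which is what makes the scheme tidy.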
 
All evidence so far on how Sony handle BC is by directly mimicking the old hardware as close as possible. […]
He has a patent to disable CUs. It doesn't have to be half the CUs; half makes it easier, of course, but it doesn't have to be half if you read the patent.
 
Agreed. It's silly seeing some folks label others fanboys because they take the GitHub leak into consideration. The GitHub leak does have issues, but I think it still has validity in the discussion, since there would be no reason to test an old APU in the middle of 2019. Saying that it was tested just to see what limits it could be pushed to also doesn't make sense, since testing an old APU with fewer CUs than the apparent new one would be pointless. It could maybe be old results being posted that late, but testing an APU for a supposed 2019 release date in 2019 makes zero sense. The other explanation is that Sony went with a new GPU for the APU shortly after the tests and made very late changes, which is the most believable of all the possibilities if the new 11-12 TFLOP PS5 specs are true.

This is something the naysayers keep seeming to dismiss. If Oberon has no relevance, why would it still be getting tested for benchmarks? Why have even later steppings since been datamined?

That said, I don't think those tests are showing the full picture. Another poster, about two weeks ago, mentioned that PS4 Pro chip testing had a block of CUs disabled in order to mirror PS4 compatibility; however, there were of course more CUs on the chip than the tests actually showed. I think a similar thing might be the case with Oberon (and, for all we know, possibly Arden as well), to mirror the setup Sony used with the PS4 Pro.

There was a graphic estimating chip sizes and CU counts (among other things) based on the data available, and that particular graphic mentioned a possible 48 CUs on Oberon. That could be possible; it'd be about 10.75TF @ 1750MHz. Assuming by some chance they could actually push it to 2000MHz, that would be 12.28TF, but that is well past Navi's sweet spot. So at 1800MHz, it'd give 11.058TF. I'm just going with 48 CUs as that's what the graphic posted on ResetEra showed (I know, Era's garbage, but the graphic itself was well done and logical), but you can see how that does give an 11-12TF range when clocking at Navi's sweet spot.

Now the only thing that isn't explained is why Oberon has been tested at 2GHz. Some of the later Oberon steppings seem to have made some bug fixes (such as to the memory, possibly expanding the bandwidth). It could be that the version in the GitHub leak had problems with turning on an extra set of CUs. So far the post-GitHub datamines still seem to list the chip at 40 max CUs, so with Oberon looking like a persistently tested chip this close to launch, rather than some magical other chip popping up, I'm hoping a datamine of an Oberon stepping with 48-52 CUs (or hell, possibly more than that) surfaces. Maybe closer to GDC, we'll see.

Right now I'm in the camp that Oberon is still very much the PS5's chip, but there is very likely a chunk of CUs disabled that we aren't seeing, which could be due to bugs in the silicon that should be fixed with some later stepping. And it's very likely the person who did the chip graphic estimate for PS5 and XSX (posted on Era) could be correct in estimating around 48 active CUs, lining up with the other poster who mentioned that PS4 Pro's GPU had a chunk of disabled CUs for testing PS4 compatibility; given Cerny headed both that and now the PS5, he'd very likely employ a similar design choice with Oberon.

All we need is a later Oberon stepping with more active CUs on it to confirm all of this. Taking Navi's sweet spot into consideration, an 11.058TF PS5 would be a hell of a beast, and would fit within the ~10% range some insiders have claimed. (For all we know, XSX could indeed be a 56-CU chip but clocked really low at the moment, such as @1450MHz, pegging it around 10.3936TF currently, and they're just waiting to up the clocks later. Of course, 1450MHz is well below Navi's sweet spot.) So in the end, with that looking very possible, XSX could have a slight TF edge, but it'd be less than 10% if they really are aiming just for 12TF; if the maximum difference of 10% is reached, that would give XSX about 12.3648TF @1725MHz.

So even in that case, the difference wouldn't be big, and we know both systems are using the same CPU designs (possibly the L3$ differs between them, I guess), similar memory (GDDR6), and a custom SSD as a cache (with rumors of PS5's being possibly faster). But that looks like the most probable scenario for right now. I do know that PS4 Pro's GPU actually disabled half its chip for PS4 functionality; by that metric we could speculate Oberon potentially has 72 active CUs (or uses dual Oberons in a chiplet setup). However, the reason I've dismissed this option personally is that it would be overkill for the numbers insiders themselves have been pegging as any upper limit for next-gen systems, as the chip(s) would have to be severely underclocked below Navi's sweet spot to hit "just" 12TF or even "just" 13TF. A waste of silicon and BOM that doesn't effectively maximize the potential sounds very unlike Cerny.

(following is just pure radical speculation btw...)

Unless you could, say, "upgrade" the performance of such a setup by buying an optional custom cooling kit to install in the PS5, basically turning it into a 'PS5 Pro' without needing to purchase an actual PS5 Pro. That could bump the performance up to, say, 16.58TF if clocking the GPU(s) to 1800MHz. It's a super-wild possibility, and it would pretty much cut out any chance for Sony to get double-dip system buys from people purchasing a PS5 early on and then a PS5 Pro a few years later. But if they've decided that the Pro model approach wasn't ultimately profitable compared to the costs, and they still want to provide a pathway for owners to upgrade the specs, they COULD do that. They COULD eat costs on the PS5 with such a big-chip/dual-chiplet GPU setup, heavily underclocked with a "poorer" cooling solution built in to hit 10-11TF @ $399 (for example), but provide a cooling upgrade for $100-$150 that would make it a PS5 Pro by giving the system a much better cooling solution and an upgraded PSU that's super easy to install.

Can't quite wrap my head around that type of idea as a business decision (plus it wouldn't answer other potential problems, like the "upgraded" GPU(s) potentially being memory- and bandwidth-starved due to the upclock).
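For reference, the teraflop figures in the post above all come from the standard peak-FP32 formula (CUs × 64 shaders per CU × 2 FLOPs per shader per clock). Here it is as a quick calculator; the 48-CU and 56-CU counts are the thread's speculative estimates, not confirmed specs:

```python
# Peak FP32 throughput for a GCN/RDNA-style GPU:
#   TFLOPS = CUs * 64 shaders/CU * 2 FLOPs per shader per clock * MHz / 1e6
# CU counts here are the thread's estimates, not confirmed console specs.

def peak_tflops(cus: int, clock_mhz: float) -> float:
    return cus * 64 * 2 * clock_mhz / 1_000_000

print(peak_tflops(48, 1750))  # -> 10.752, the ~10.75 TF figure above
print(peak_tflops(48, 1800))  # -> 11.0592, the 11.058 TF figure
print(peak_tflops(48, 2000))  # -> 12.288, the 12.28 TF ceiling
print(peak_tflops(56, 1450))  # -> 10.3936, the low-clocked XSX scenario
```

Plugging in any CU/clock rumor from the thread reproduces its TF number directly.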
 

Reindeer

Member
You are missing one hypothesis: if the tests are only for back-compat mode (and assuming the PS5 will use the same method as the Pro), Oberon only has 36 CUs active to 'mimic' the legacy hardware.
I don't have access to all the data, and I don't know if someone here has; that would be the only way to confirm it.
Still doesn't make sense to test a supposedly old chip that late; surely they would have been testing a new one for back compat if it was ready. Which again could mean that the supposed new chip wasn't ready at the time of these tests. But then again, they would have to test the new chip for back compat again anyway. There are many unanswered questions regarding the GitHub leak.
 
This is something the naysayers keep seeming to dismiss. If Oberon has no relevance, why would it still be getting tested for benchmarks? Why have even later steppings since been datamined? […]
Because their final chip isn't ready, and they don't want to lose time when they could use the old chip from old dev kits to do BC regression tests? I mean, there are many reasons if they have more than one chip (if they do).
 

OsirisBlack

Banned
Care to point me towards where you said anything about a new Tsushima trailer or that the devkit is the final box? Because the search function doesn't return any such results.

Also, for a person with no agenda, you sure took some swings at Tim when he said something that doesn't sound great about the PS5. Not the reaction I would expect from an objective person.
That was obviously sarcasm directed at Tim, me pandering to his want/need for a gaming box to be inferior. I'm sure you're being purposely obtuse, as it was over the top and obvious.

Edited to add: Beer can affect production.
 

schaft0620

Member
[image: GMbscYV.jpg]
I'm sorry what's this?
 
The PS5 rumor of 13 TFLOPS comes from April
Which was most likely bullshit at that time too, unless Sony wants a giant, bulky console. Considering the Japanese market and how they don't like bulky electronics (part of the reason the Xbox was shunned), I really don't think they're in a place to allow that. In order to bring the size down, they'll need to bring the power down. But RDNA should help with that; 9TF will be no slouch by any means, especially considering what Sony did with 1.4TF.
 

liviopangt

Neo Member
In my opinion, the PS5 is a dual-GPU chiplet design with the ray tracing separate from the GPUs and connected to them in parallel.
Why dual GPU? To better manage the two video signals for PSVR2: the two GPUs can render the two VR2 views separately.
 

DJ12

Member
Still doesn't make sense testing a supposed old chip that late, surely they would have been testing a new one for back compat if it was ready. Which again could mean that the supposed new chip wasn't ready at the time of these tests. But then again, they would have to test the new chip for back compat again anyway. There are many unanswered questions regarding the GitHub leak.
Without the context of the tests, it could be anything.

Let's speculate that some senior engineer was given an intern for a few months, didn't want to spend any time with him, found an old chip lying around, and gave it to the intern to run wild with and not bother him for a few hours each day.
 
In my opinion, the PS5 is a dual-GPU chiplet design with the ray tracing separate from the GPUs and connected to them in parallel.
Why dual GPU? To better manage the two video signals for PSVR2: the two GPUs can render the two VR2 views separately.
Though NeoGAF is an international forum, discussion is to be held in English unless otherwise designated.
 

ANIMAL1975

Member
In my opinion, the PS5 is a dual-GPU chiplet design with the ray tracing separate from the GPUs and connected to them in parallel.
Why dual GPU? To better manage the two video signals for PSVR2: the two GPUs can render the two VR2 views separately.
Dude, you've got to translate that shit if you want to be understood... or I will start replying to you in Portuguese, got it?
 

TGO

Hype Train conductor. Works harder than it steams.
The docs from the GitHub leak were uploaded to Google Drive, and AMD recently issued a DMCA notice to make Google take the data down. It proves the authenticity of the source, but little more beyond that.
They can issue a DMCA on false information too
 

geordiemp

Member
Still doesn't make sense testing a supposed old chip that late, surely they would have been testing a new one for back compat if it was ready. Which again could mean that the supposed new chip wasn't ready at the time of these tests. But then again, they would have to test the new chip for back compat again anyway. There are many unanswered questions regarding the GitHub leak.

So Xbox has two chips, for Lockhart and SeX.

And why would Sony only look at one chip and corner themselves?

If they had any sense, they would work on both a low-end and a more expensive offering and see what transpires with yields and costs. Options are good; no options are not.
 

Gamernyc78

Banned
So Xbox has two chips, for Lockhart and SeX.

And why would Sony only look at one chip and corner themselves?

If they had any sense, they would work on both a low-end and a more expensive offering and see what transpires with yields and costs. Options are good; no options are not.

Lol, so you think they only test one chip 🤦‍♂️? Or was your question rhetorical, and you know that these companies tend to test more than one chip?
 

Captain Hero

The Spoiler Soldier
In my opinion, the PS5 is a dual-GPU chiplet design with the ray tracing separate from the GPUs and connected to them in parallel.
Why dual GPU? To better manage the two video signals for PSVR2: the two GPUs can render the two VR2 views separately.
 
This is something the naysayers keep seeming to dismiss. If Oberon has no relevance, why would it still be getting tested for benchmarks? Why have even later steppings since been datamined? […]
First of all, this:
Because their final chip isn't ready, and they don't want to lose time when they could use the old chip from old dev kits to do BC regression tests? I mean, there are many reasons if they have more than one chip (if they do).
+ If Oberon is related to the PS5, it also means they need to push it to 2GHz to emulate what 48 CUs can deliver in graphics performance @1700MHz; that's what explains the V dev kit (2GHz is a lot). The final product WILL NOT run @2GHz.
- The dual-chiplet theory is CRAZY. When I read it I went WOOOOOOOOOOOOOOO; the amount of power that would need is beyond 2GHz. Chip-to-chip communication costs extra power (picojoules, I think that's the right term) for data to travel from one point to another.
- Buying add-ons like a cooler will never happen.
- Their business is making things easier for consumers, not turning the console into a PC; they might allow an easy SSD swap, and that's it.
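The chip-to-chip power point above is easy to rough out: inter-chiplet links are usually characterized in picojoules per bit, so link power is just bandwidth times energy per bit. The numbers below (100 GB/s, 2 pJ/bit) are purely illustrative assumptions, not anything leaked or confirmed:

```python
# Back-of-envelope power cost of moving data between two GPU chiplets.
# Both example numbers are illustrative assumptions, not console specs.

def link_power_watts(bandwidth_gb_s: float, energy_pj_per_bit: float) -> float:
    # GB/s -> Gbit/s is *8; Gbit/s * pJ/bit = 1e9 * 1e-12 W = 1e-3 W
    return bandwidth_gb_s * 8 * energy_pj_per_bit / 1000

print(link_power_watts(100, 2))  # -> 1.6 (watts) just to move the data
```

Even at a couple of pJ/bit, a high-bandwidth link burns watts on data movement alone, which is the cost a dual-chiplet console would have to absorb.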
 

Gamernyc78

Banned
translated via Google:

in my opinion, the PS5 is a dual-GPU chiplet design with the ray tracing separate from the GPUs and connected to them in parallel. Why dual GPU? To better manage the two video signals for PSVR2: the two GPUs can render the two VR2 views separately.

Why do dual GPUs keep popping up? I was on the "bullshit" team, but now I'm thinking there might be fire where there's constant smoke.
 

hemo memo

Gold Member
In my opinion, the PS5 is a dual-GPU chiplet design with the ray tracing separate from the GPUs and connected to them in parallel.
Why dual GPU? To better manage the two video signals for PSVR2: the two GPUs can render the two VR2 views separately.

Why should Sony focus so much on VR2?
 

ANIMAL1975

Member
Without the context of the tests, it could be anything.

Let's speculate that some senior engineer was given an intern for a few months, didn't want to spend any time with him, found an old chip lying around, and gave it to the intern to run wild with and not bother him for a few hours each day.
And how do we even know the exact dates of the tests? The whole thing was deleted... does anyone have a saved copy of the leak?
And by the way, I remember a post in the thread (I think it was one day after the leak) by some guy saying that people, after some digging, had already seen that the docs weren't really datamined from... if I could find the little f**ker; it's a pain on mobile to search for shit.
 

Gamernyc78

Banned
I don’t know, man, adding 20 CUs is much cheaper compared to a dual GPU. We shouldn’t go all crazy yet, haha.

Nah, I'm very reserved lol, I'm just not putting it completely out of my mind. Sony does like to think outside the box when making hardware, and they did put a separate chip in the PS4 this gen to handle downloading, some social features, etc.
 

liviopangt

Neo Member
Dual GPU can be terrible in the PC space, but in closed, optimized hardware, creating interconnect buses faster than PCI Express, it's a whole different story.
 

liviopangt

Neo Member
As for the cooling system, it's not that hard to design: just build the console as a sandwich, with the heatsink occupying the space below and above the components.
It doesn't take a thermal scientist, and made of aluminum it wouldn't even cost that much, because the efficiency of an aluminum heatsink is second to none; with the fan placed in the middle, PS3-style, you solve everything.
 

DeepEnigma

Gold Member
Dual GPU can be terrible in the PC space, but in closed, optimized hardware, creating interconnect buses faster than PCI Express, it's a whole different story.

As for the cooling system, it's not that hard to design: just build the console as a sandwich, with the heatsink occupying the space below and above the components. It doesn't take a thermal scientist, and made of aluminum it wouldn't even cost that much, because the efficiency of an aluminum heatsink is second to none; with the fan placed in the middle, PS3-style, you solve everything.


liviopangt, use this site:

 

pasterpl

Member
Hi, I am new to this forum, and while I follow the conversations/speculation here, I'm wondering if anyone has commented on the XSX press release from Dec. 2019? It states:

...Its industrial design enables us to deliver four times the processing power of Xbox One X...

 
Which was most likely bullshit at that time too. Unless Sony wants a giant bulky console. Considering the Japanese market and how they don't like bulky electronics (part of the reason Xbox was shunned), I really don't think they're in a place to allow that. In order to bring the size down they'll need to bring the power down. But RDNA should help with that; 9TF will be no slouch by any means, also considering what Sony did with 1.4TF.

TBF, Sony doesn't seem to be prioritizing the Japanese market anymore. The headquarters was relocated to California a while ago, and they seem to be focusing more on 1st party content that appeals to Western markets. I wouldn't be surprised if that extends to the console design so if the system turns out to be somewhat bulky, they won't care too much if Japan dislikes that because Japan as a whole has been moving away from traditional home consoles for years.

It'd be a much bigger issue if they were developing a, say, PS Vita 2, and it was chunky and bulky as hell. Japan needs to start appreciating the thickness 👍

First of all this

+ If Oberon is related to the PS5, it also means they'd need to push it to 2GHz to emulate what a 48-CU part can deliver in graphics performance @ 1700MHz; that's what explains the V dev kit (2GHz is a lot). The final product WILL NOT run @ 2GHz.
- The dual-chiplet theory is CRAZY; when I read it I was WOOOOOOOOOOOOOOO. The amount of power that would need is beyond 2GHz: communicating chip to chip, moving data (picojoules, I think that's the right term) from one point to another costs extra power.
- Buying add-ons like a cooler will never happen.
- Their business is to make things easier for consumers, not turning the console into a PC; they might make an easy SSD swap, that's it.

A few points of contention here. A 36-active-CU chip (out of 40 on the die) wouldn't actually quite match the performance of a 48-CU chip running around sweet-spot clocks unless the 36-CU chip were pushed to roughly 2250MHz - 2400MHz, which at that point would probably just melt the chip. I agree though that whatever PS5's final specs end up being, it absolutely won't be running its GPU at 2GHz.
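For reference, here's the back-of-the-envelope math behind those numbers: a rough Python sketch, assuming the commonly cited figures of 64 shader lanes per AMD CU and 2 FP32 ops (one FMA) per lane per clock. The helper names are made up for illustration.

```python
# Rough FP32 throughput estimate for an AMD-style GPU:
# TFLOPS = CUs x 64 lanes x 2 FLOPs per cycle x clock (GHz) / 1000
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

# Clock a smaller part would need to match a bigger part's CU*clock product
def matching_clock(target_cus: int, target_ghz: float, cus: int) -> float:
    return target_cus * target_ghz / cus

print(round(tflops(36, 2.0), 2))              # 9.22 TF (36 CUs @ 2.0 GHz)
print(round(tflops(48, 1.7), 2))              # 10.44 TF (48 CUs @ 1.7 GHz)
print(round(matching_clock(48, 1.7, 36), 3))  # 2.267 GHz needed from 36 CUs
```

So 36 CUs at 2GHz only gets to about 9.2TF, and catching a 48-CU part at 1.7GHz would indeed take roughly 2.27GHz, right in the 2250 - 2400MHz range mentioned above.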

I agree the dual-chiplet idea is on the crazier side; I've discussed why it would be insane with dual Oberons (clocked waaaaay below the sweet spot, giving "only" 12 or 13TF when they could give a lot more; no insider rumors honestly trying to push something like 16TF, which is about what dual Oberons, or a 72-CU Oberon at sweet-spot frequencies, would give; etc.). It was just wild speculation on my end. It's pretty much the least likely scenario tbh.

That said, the Oberon chip having a chunk of CUs disabled is a lot more likely, so it could actually be a 48-CU chip with some units fused off while they work out bugs (maybe there's an Oberon stepping waiting to be datamined with more active CUs on it; hopefully it surfaces within the next few weeks leading up to GDC). Still, I've prepared myself for a "worst-case" scenario where it really is just 36 active CUs max (I think that's less likely than a later Oberon stepping having more CUs active, but it's possible). Because as you say, there's no way either system is clocking its retail GPU @ 2GHz; even high-end PC GPU cards don't push clocks that high, and they have pretty insane cooling.

So Xbox has 2 chips, for Lockhart and SeX.

And why would Sony only look at 1 chip and corner themselves?

If they had any sense they would work on low-end and more expensive offerings and see what transpires with yields and costs. Options are good; no options are not.

In a way the additional Oberon steppings fit the idea of them testing multiple chips, i.e. different iterations of the same chip with the same general architecture.

Given how close we're getting to substantial system reveals, it's probably best to give up on the idea of a wholly different PS5-related chip with zero relation to Oberon. Oberon is very likely the PS5's GPU. HOWEVER, we also very likely don't have the full picture on Oberon, i.e. it could very well have a block of CUs on it which are disabled.

I would much rather put stock in a later Oberon stepping surfacing with more CUs active (and maybe even showing RT blocks turned on) than a magical chip unrelated to Oberon popping up out of the wild. The latter is just very improbable.
 

darkinstinct

...lacks reading comprehension.
So Xbox has 2 chips, for Lockhart and SeX.

And why would Sony only look at 1 chip and corner themselves?

If they had any sense they would work on low-end and more expensive offerings and see what transpires with yields and costs. Options are good; no options are not.

So what you are saying is they release a 9.2 TF entry model and a 12 TF high end model at the same time? A $400 and a $500 PS5?
 

hunthunt

Banned
As for the cooling system, it's not that hard to design: just make the console a sandwich, with the heatsink occupying the space below and above the components.
It doesn't take a thermal scientist, and made of aluminum it wouldn't even cost that much, since the efficiency of an aluminum heatsink is second to none; with the fan placed in the middle, PS3-style, you solve everything.

You should really try Google Translate dude, it works great these days.
 

Reindeer

Member
So Xbox has 2 chips, for Lockhart and SeX.

And why would Sony only look at 1 chip and corner themselves?

If they had any sense they would work on low-end and more expensive offerings and see what transpires with yields and costs. Options are good; no options are not.
What?? Microsoft are rumoured to be making two chipsets because they want to make two different consoles. How is that in any way comparable to Sony using two different chips for one console, as per your assumption?
 

FERN

Member
TBF, Sony doesn't seem to be prioritizing the Japanese market anymore. The headquarters was relocated to California a while ago, and they seem to be focusing more on 1st party content that appeals to Western markets. I wouldn't be surprised if that extends to the console design so if the system turns out to be somewhat bulky, they won't care too much if Japan dislikes that because Japan as a whole has been moving away from traditional home consoles for years.

It'd be a much bigger issue if they were developing a, say, PS Vita 2, and it was chunky and bulky as hell. Japan needs to start appreciating the thickness 👍
Japan should NOT be bothered by thickness

 

B_Boss

Member
I would absolutely love it if Sony offered two consoles at launch... I hated shelling out first for the launch PS4 and then the Pro. I would definitely rather have gotten a Pro from the start and not worried about another console until next gen. Very annoying.
 

geordiemp

Member
So what you are saying is they release a 9.2 TF entry model and a 12 TF high end model at the same time? A $400 and a $500 PS5?

They don't have to release 2 consoles, but Sony would have been wise to EXPLORE two power levels unless they knew the future costs of the chips over the 2-year (or whatever) cycle. Things change.

Do you think they would just look at 1 option? Mmmm.

What if that option was not as good as they thought? Do they blissfully carry on?

Seems a hard concept to grasp by the looks of it lol
 

Neo Blaster

Member
In my opinion PS5 is dual GPU chiplets, with the ray tracing separate from the GPUs and connected to them in parallel.
Why dual GPU? To better handle the 2 video signals for PSVR2; the 2 GPUs can render separate images in VR2.
Please, even though English isn't the native language of many around here, the consensus is for everyone to use it.
 

LED Guy?

Banned
Honestly, it’s going both ways. As soon as the 12TF rumor for Xbox came out, suddenly “insiders” started to question whether it was RDNA even though it was said a million times that it was Navi. Then the rumor of a 9TF PS5 gained some traction from somewhat reputable sources, then suddenly it was 13TF. Now both consoles are roughly the same, with the Xbox having a slight edge. So sure, MS may have some shills, but Sony does also. It just depends on which side of the fence you are rooting for. I’m rooting for both. If both console makers make badass hardware, my PC will actually be forced to use its muscle.


Depends on how many teams are working on it. I don’t think Microsoft would risk not having next-gen graphics on their 12TF machine. I mean, look at the anniversary editions of Halo. You can turn the old and new graphics engines on and off and there are very clear noticeable differences. Ray tracing will definitely help with that as well.
Yeah, there are fanboys on both sides, but you've seen Xbox fanboys' videos like Colteastwood and Dealer Gaming; it's way beyond what Sony fanboys have been doing, but whatever...

About Halo Infinite, I hope 343 Industries can do something special with it on Xbox Series X, but I wouldn't put my faith in it. The game will have to work on Jaguar CPU cores, a 1.3TF GPU and an HDD, so I don't think the XSX version will differ much, maybe just the graphics a bit; they'll try ray tracing and stuff, but it'll be interesting.
 