
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.

pawel86ck

Banned
It's funny how certain people here like to think of themselves as rational and realistic... yet they believe the GitHub 2GHz leak. Sorry, but 2GHz talk is as absurd as 18TF. That was probably some stress test, and if the PS5 does end up using the SoC from that GitHub leak, it will be clocked much lower, no more than 1750MHz, so 8TF at most. An 8TF (PS5) vs 12TF (XSX) difference would be really big, and I doubt that will be the case.
 
Last edited:
If there was a 2TFLOP difference between the two consoles, would people think they were similar specs?
That's a lot closer percentage-wise than the OG One vs PS4, or the One X vs Pro.
 

R600

Banned
What has any of this got to do with what I originally wrote? You're just stating the bleeding obvious: '7nm production is much more expensive than previous iterations'. No shit; how about rebutting the points I made earlier?

You pigeon-holed yourself on 7nm wafer costs, when that isn't even in contention here. I'm sure MS and Sony know full well how expensive the tech is. My point was, and is, about yields. You stated smaller size = better yields, and I showed you why that wouldn't be the case, as there are several factors at play which dictate yields. If Sony is going for absurdly high 1.8-2.0GHz clocks at a realistic TDP, the yields would be significantly poorer than for a bigger GPU clocked lower. And no, 10% lower TDP wouldn't magically be a game-changer. A 5700XT at its peak clock draws 227W of power. A 10% reduction gets you to the ~200W mark, but that is a GPU clocked at 1905MHz; now imagine what would happen at 2.0GHz.

Also keep in mind these power curves don't apply the same way to a console chip. That 5700XT is a very high-quality piece of silicon; the same chip is used in the 5700, but the higher-quality ones are hand-picked for the 5700XT. Expecting a console-binned chip such as in the PS5 or Series X to perform on the same level as a 5700XT is lunacy, as they will never have thermals and power draw that good.



36CUs@1.8GHz is still way too high. Its equivalent would be the 5700, which runs 36CUs at 1.625GHz. How is Sony gonna squeeze an extra 175-375MHz out of it without destroying the power curve and fucking over their yields?

I'm gonna summarize once again in the hope you get it; if you continue to be deliberately obtuse about it and post more irrelevant stuff, then don't reply.

Even if I agree with you that going narrow and fast has some merit, it has to stay within some realm of practicality. Consoles are generally underclocked compared to their PC counterparts because the mantra has always been efficiency in power, performance, thermals and cost. Now let's assume Sony decided to push the envelope and asked AMD for a GPU sandwiched between the 5700 and 5700XT at 1.8GHz. The silicon required wouldn't need to be as high quality as a 5700XT's. But what would happen to the chips that have 36 CUs but can't hit 1.8GHz? Or that have 36 active CUs and hit 1.8GHz, but only by going past ~200W? What would Sony do with those binned chips? Throw them away?

This is where your entire argument breaks down: if Sony has a limited supply of 7nm wafers, then it DOESN'T make sense for them to design an APU in ~300-325mm2 territory when, to hit their targets, they have to clock the GPU at 1.8-2.0GHz while maintaining the power curve. All the precious dollars they're trying to save on wafers, to the tune of 50-70mm2, would backfire because yields would be incredibly poor, which means more precious wafers wasted, which means limited supply, and in the end it would cost them more to make their APU.

MS aren't stupid; their Series X SoC is that big because they face the same production hurdles as Sony. They could've gone 'narrow and fast' but didn't, because it's just more costly and inefficient even on the more mature N7P node. Wide and slow is what MS went with, and it is very likely the way Sony is heading too. The GitHub leak, while legit, lacks context; don't take it as gospel. My interpretation is that it was a very early dev-kit Sony made to let their developers learn the tools and have games ready for the PS5 launch. I don't know what their performance target is; it could be 9.2TF, or 10.4TF, or higher. My point is that the final retail PS5 will hit that target with a GPU that is NOT clocked at 1.8-2.0GHz. It'll be done with a bigger Navi at lower clock speeds.
How is 56CUs at 1700MHz wide and slow? You literally said 36CUs at 1.8GHz is not possible, but somehow 56CUs at 1700MHz absolutely is?

Have you paid attention to how much a 36CU 5700 at 1700MHz pulls? It's ~160W, about 30W less than the equivalent GPU of the XBX era (the RX 580), in a console that rarely crosses 180W as a whole.

[Image: average gaming power draw comparison chart]


That is even without taking into account that 2nd-gen 7nm (N7P) gives you a straight 7% performance increase at the same wattage, or 10% lower power at the same clock.
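As a rough sketch of what those N7P numbers would mean in absolute terms (the 160W/1700MHz baseline is this post's 5700 estimate; the 7%/10% figures are TSMC's advertised claims, not measured silicon):

```python
# TSMC's advertised N7P gains over N7: +7% perf at the same power,
# or -10% power at the same clock (marketing figures, not measurements).
base_power_w = 160          # ~RX 5700 at 1700MHz, per the post above
base_clock_mhz = 1700

same_clock_power_w = base_power_w * 0.90      # 10% lower power at 1700MHz
same_power_clock_mhz = base_clock_mhz * 1.07  # ~7% higher clock at 160W

print(round(same_clock_power_w))    # 144
print(round(same_power_clock_mhz))  # 1819
```

Either way the absolute delta is modest: roughly 16W saved, or a bit over 100MHz gained.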



And this is without taking into account that you are comparing mid-2019 PC chips with what will end up in a console in 2020, 1.5 years later.

Honestly, don't reply to me at all if you are going to hand-wave 36CUs at 1.8GHz on the N7P node as impossible (and then in the very next sentence argue that 56CUs at 1.7GHz is "wide and slow").

And no, there is no 5700 card that cannot hit 1.8GHz. It's a deliberate division by AMD: good chips with all 40 CUs working go to the XT, the others to the 5700. But there is absolutely no doubt that every 5700 can hit 1.8GHz, and easily at that.
 
Last edited:

Nickolaidas

Member
It's funny how certain people here like to think of themselves as rational and realistic... yet they believe the GitHub 2GHz leak. Sorry, but 2GHz talk is as absurd as 18TF. That was probably some stress test, and if the PS5 does end up using the SoC from that GitHub leak, it will be clocked much lower, no more than 1750MHz, so 8TF at most. An 8TF (PS5) vs 12TF (XSX) difference would be really big, and I doubt that will be the case.
Enjoy it while it lasts.
 
If there was a 2TFLOP difference between the two consoles, would people think they were similar specs?
That's a lot closer percentage-wise than the OG One vs PS4, or the One X vs Pro.

I would not consider it significant, but I think it would certainly create some negativity. 2TF sounds a lot worse than 10% or whatever the percentage ends up being.
 
It's funny how certain people here like to think of themselves as rational and realistic... yet they believe the GitHub 2GHz leak. Sorry, but 2GHz talk is as absurd as 18TF. That was probably some stress test, and if the PS5 does end up using the SoC from that GitHub leak, it will be clocked much lower, no more than 1750MHz, so 8TF at most. An 8TF (PS5) vs 12TF (XSX) difference would be really big, and I doubt that will be the case.
There is no such thing as a 2GHz clock; not happening, end of story.
2GHz was a stress test and everyone agrees with that.
An 8TF console is unlikely, 90% not happening.
 

Gamernyc78

Banned
You're here saying he's a liar, so yes, you're attacking him, a guy who is putting his name and face out there! Which other "insider" is doing that? None.
Not even Tom Warren, who is an MS journo, is giving specific numbers; he said it was 12TF at first and later backtracked on his info, lmao.
At least Klee is not changing his tune!



I've read all his comments multiple times; I've been in the next-gen threads for almost 1.5 years now. He was never inconsistent with his info!

But believe what you want.



It was not sudden, it was after months! He always said >10TF for both consoles and kept to that for months; only recently did he say he had spec sheets.



Klee worked in the industry for 15-20 years as a journalist at gaming magazines (GameFan and EGM). He still knows guys working in the industry, but he has a close friend who is a dev in California, the guy giving him info! Matt is (or was) a developer; he's the guy I trust the most, no doubt. No one questions him when he says something, and there's a reason for that... before going to ERA he was here on GAF leaking shit for years! He has a really solid track record.

Matt already said the github thing doesn't confirm anything but keep trusting that.



I dunno what he does for a living now, but he was a journalist. Saying otherwise is just being disingenuous.
He wasn't expecting websites to pick up on the stuff he says, and to be honest not a single credible one did, AFAIK.
Did he like the "Klee said" meme? I'm sure he did, but he was not expecting fanboys to start bickering with him after he said PS5 > Scarlett.

Also, to both of you, R600 and Rightisright: his info is in line with what Matt and Jason Schreier have been saying for months now.

I couldn't have said it better (even though I've voiced similar thoughts). These dudes literally want to disregard credible people currently in the industry because of a GitHub leak, and because they are attached to a box they feel they need to give blind loyalty to 🤷‍♂️🤦‍♂️
 

R600

Banned
Well tell that to R600 😃👍
I maintain that 36CUs at 1.8GHz is easily possible. In fact, Navi 10's "knee" only starts at 1.8GHz, and my point is that you would want to be close to the knee rather than far behind it, because otherwise you are effectively "wasting" silicon. If they go wide and very slow, you lose more performance than you win back in efficiency, hence "sweet spot".

For Navi 10 that point is closer to 1800MHz than to 1600MHz, and that's even without taking into account the improved process and 1.5 more years in the oven on a considerably higher budget.

Let's wait and see, not long now.
 

pawel86ck

Banned
I maintain that 36CUs at 1.8GHz is easily possible. In fact, Navi 10's "knee" only starts at 1.8GHz, and my point is that you would want to be close to the knee rather than far behind it, because otherwise you are effectively "wasting" silicon. If they go wide and very slow, you lose more performance than you win back in efficiency, hence "sweet spot".

For Navi 10 that point is closer to 1800MHz than to 1600MHz, and that's even without taking into account the improved process and 1.5 more years in the oven on a considerably higher budget.

Let's wait and see, not long now.
2304 SP (36 CUs) x 1800MHz x2 = 8.3TF.

8.3TF vs 12TF would still be a really big difference.
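The TFLOPS arithmetic used throughout this thread (FP32 = shader count x clock x 2 ops per cycle) can be sketched in a few lines; the CU/clock pairs below are the thread's speculation, not confirmed specs:

```python
# FP32 throughput: stream processors * clock (Hz) * 2 (FMA = 2 ops/cycle).
def tflops(cus: int, clock_mhz: float) -> float:
    shaders = cus * 64  # RDNA packs 64 stream processors per CU
    return shaders * clock_mhz * 1e6 * 2 / 1e12

print(round(tflops(36, 1800), 1))  # 8.3  -> the "8.3TF" PS5 scenario above
print(round(tflops(56, 1675), 1))  # 12.0 -> the "12TF" XSX scenario
```

The same formula reproduces any of the figures being argued over by plugging in a different CU count and clock.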
 
Last edited:

ethomaz

Banned
There is no such thing as a 2GHz clock; not happening, end of story.
2GHz was a stress test and everyone agrees with that.
An 8TF console is unlikely, 90% not happening.
I thought we had reached a consensus that 2GHz was a stress test.
But you had to say it again, so maybe not :D
 
Last edited:

ethomaz

Banned
I think the first one who reveals the amount of RAM will be upped by the other. Sony reveals 16 gigs, MS ups it to 24 gigs.
That is not how it works.

Doubling the RAM is possible because you just put 2x the amount of memory on each 32-bit bus... no PCB change, no change to the project before launch. So if you have eight 32-bit buses with 512MB each, you just increase to 1GB each, and you've doubled 4GB to 8GB.

Now, to change 16GB to 24GB, you need to redesign the PCB with a different number of 32-bit buses for the RAM.
That is not possible anymore once the specs are out, unless you want to delay the launch.
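The bus arithmetic in this post can be sketched as follows; the eight-channel, 512MB-per-chip layout is the post's own example, not any confirmed console spec:

```python
# Total pool = (number of 32-bit channels) x (density of the chip on each).
# Swapping in denser chips keeps the PCB and memory controller unchanged;
# adding channels (e.g. for 24GB) means a board and controller redesign.
def pool_gb(channels_32bit: int, gb_per_chip: float) -> float:
    return channels_32bit * gb_per_chip

print(pool_gb(8, 0.5))  # 4.0  -> eight channels of 512MB
print(pool_gb(8, 1.0))  # 8.0  -> same board, denser chips: 4GB doubled
print(pool_gb(8, 2.0))  # 16.0 -> doubled again, still eight channels
# 24GB on eight channels would need 3GB chips, which wasn't a standard
# GDDR6 density at the time, hence the PCB redesign described above.
```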
 
Last edited:

pawel86ck

Banned
Come on man, just stop it. That's 8.3TF and it doesn't match ANY rumors whatsoever. All of the credible insiders have pretty much said that the PS5 and XSX will be very similar. It's just starting to look like you want to push a narrative you're personally invested in, and it hurts the quality of this topic.
There was one PS5 leak that mentioned Sony wanted to launch an 8TF PS5 around 2019, but it also mentioned Sony built another, much stronger SoC for 2020, so we can expect more than 36CUs. The PS5 should be in double digits now.
 
Last edited:

sinnergy

Member
That is not how it works.

Doubling the RAM is possible because you just put 2x the amount of memory on each 32-bit bus... no PCB change, no change to the project before launch. So if you have eight 32-bit buses with 512MB each, you just increase to 1GB each, and you've doubled 4GB to 8GB.

Now, to change 16GB to 24GB, you need to redesign the PCB with a different number of 32-bit buses for the RAM.
That is not possible anymore once the specs are out, unless you want to delay the launch.
Didn't they have 1-gig and 2-gig chips shown on the promo PCB from E3? Maybe they will go 18 or 20 gigs after Sony's reveal?
 
Last edited:

icerock

Member
Shifting goal posts already? Why switch the discussion to TDP when the original discussion was about yields? :) Please tell me how 36CUs@2.0GHz is gonna be economically cheaper to produce than 56CUs @1.675GHz.

How is 56CUs at 1700MHz wide and slow?

Where did I say it is? Compared to 36CUs at 2.0GHz, it IS slow, and something much more achievable and realistic where Navi is concerned, which in general clocks much higher than your Polaris cards.

You literally said 36CUs at 1.8GHz is not possible

Again where did I say this? Why are you resorting to lying?

You paid attention to how much 36CU 5700 at 1700MHz pulls out? Its ~160W. +30W less then equivalent GPU that ended up on XBX (RX580). A console that rarely crosess 180W as a whole.

It is ironic that you're lecturing me about the power curve when I was the one who brought it into the discussion, specifically how it affects yields. But yes, I do know. And I also know their dev kits are clocked at 1.8GHz and 2.0GHz, which is way beyond Navi's sweet spot.

Besides, all your graphs using the 5700/5700XT power curve are useless. You can't expect a console-binned chip to perform close to a retail GPU. Why? Because of variance in silicon quality. Furthermore, the concept of undervolting doesn't exist for console manufacturers. When AMD says the 5700XT consumes a maximum of 225W, that doesn't mean ALL 5700XTs consume the same amount of power. Some of them can hit the same clocks at a lower voltage, but that would be akin to winning the silicon lottery. These charts and graphs cannot be relied upon by console manufacturers to set a baseline. But this is something I've already explained to you earlier.

DF has stated the XSX breaks traditional console barriers with regard to TDP; it's going to draw well north of 200W. A look at the form factor is a dead giveaway, as is Sony's devkit.

That is even without taking into account that 2nd gen 7nm (N7P) gives you straight 7% perf increase for same wattage or 10% lower TDP for same clock.

I already told you why 7-10% won't be a dramatic game-changer, since clock speeds of 1.8-2.0GHz are well beyond the optimum power/performance/thermal mark.

And this is without taking into account that you are comparing mid 2019 PC chips with what will end up in console in 2020, 1.5yrs from that point.

Yeah, these chips will magically start emitting much less heat and consuming much less power after an 18-month time-lapse just because the node is more mature. Great argument.

Honestly, dont reply to me at all if you are going to hand waive 36CU at 1.8GHz on N7P node as impossible

Once again, I never said that; you're again putting words in my mouth. I said why using 36CUs@1.8GHz would result in poor yields compared to a bigger GPU, and gave you an array of reasons why that would be the case. Reasons which you have once again ignored.

(and then in very next sentence make argument about 56CUs at 1.7GHz clocks being "wide and slow").

Comparatively. Comprehension is hard for some. Besides, we have no idea if that is the configuration MS is going with for the retail Series X. They could opt for an even bigger CU count and reduce the clocks further to reach their target.

And no, there is no 5700 card that cannot hit 1.8GHz. Its purposely made division by AMD. Good chips witu full 40CUs working go to XT, others to 5700. But there is absolutely no doubt every 5700 can hit 1.8GHz, easily at that.

Such a shame I read this last or I wouldn't have wasted my energy. All the things I wrote about baselines, why companies divide tiers, how clocks and power affect yields etc. are chucked sideways because 'I believe they can easily do that'.

Sigh, I give up. Carry on spreading your misinformation.
 

chilichote

Member
There was one PS5 leak that mentioned Sony wanted to launch an 8TF PS5 around 2019, but it also mentioned Sony built another, much stronger SoC for 2020, so we can expect more than 36CUs. The PS5 should be in double digits now.

What if Sony comes with 2 SKUs? I would buy the 8TF one because of the power an 11-13TF PS5 would suck from the wall. And with the same feature set and similar specs, just less GPU power, it doesn't make that much of a difference to develop games for both, I think. But I'm not a tech guy ^^
 

sinnergy

Member
I guess it was absolutely clear that next gen would have SSDs, so that was no surprise.
But they didn't know the speed; after that showing they knew it was 4.5-5 GB/s. Now, in the latest rumors, MS seems to use 7GB/s controllers in the Series X, so you gotta be careful.
 
Last edited:

ANIMAL1975

Member
Both ARE GPUs; there are still images on the net as proof.

Oh there they are the little fuc***ers!
Using NAVI 10 GPUs in your new 'next gen' system lol, typical lazy/arrogant Sony move.

No, this isn't necessarily true.

Chip production doesn't follow the linear path you described, i.e. based purely on size. When you're breaking up a wafer into chips, the resulting chips aren't born equal; they vary in quality. For example, one chip may have all CUs active but draw more power to hit the required clock than another chip that hits the same clock at lower power. Then there could be another chip where a few CUs are defective. There are too many factors at play here. There is a reason GPU manufacturers prescribe a set of criteria so they can tier their cards.

Why do you think AMD has two different products in the form of the 5700 and 5700XT when they are made from the same wafer? They created two different SKUs because of variance in silicon quality, and prescribed a few requirements according to which the chips are divided. Higher-quality chips are allocated to the 5700XT bin, while lower-quality ones go into the 5700 bin. Before a chip can be used in a 5700XT, it has to tick the requirements of a) 40/40 active CUs, b) reaching 1905MHz, and c) drawing 225W at max. The lesser chips can be used in a 5700 if they a) have 36/40 active CUs, b) reach 1725MHz, and c) draw 180W at max.

If a chip has 40/40 active CUs but draws 260W to reach 1905MHz, it goes in the 5700 bin. Similarly, if a chip has 36/40 active CUs but can reach 1905MHz while drawing 200W, it'll still go into the 5700 bin. Why? Because the baseline requirements AMD prescribed for these SKUs are final. The whole idea was conjured up by them in the first instance because going narrow and fast significantly reduces yields. Having a second SKU saves them cost and makes production of these narrow-and-fast chips somewhat feasible.

Console manufacturers, on the other hand, specifically Sony, do not have the same luxury of multiple SKUs. Before they decide on a chip that goes in the final product, they have to settle on a baseline that is low enough, i.e. a moderate clock speed that can be hit without any discrepancy between power draw and thermal targets, so the majority of manufactured chips can be used. The 36CU chip you are on about is the same chip as used in the 5700/5700XT; its requirements would be a) 36/40 active CUs, b) reaching 2000MHz, and c) drawing ~150W. Any chip that misses any of those requirements in testing has to be thrown out, and unlike AMD they do not have other SKUs where binned chips could be used.

This is the main reason people who are knowledgeable about this stuff keep repeating that going narrow and fast doesn't make sense for console manufacturers: it is much more expensive than going wide and slow. The threshold for chips making the cut is too high; you are basically throwing a ton of chips away, and in turn spending more money to make your APU.



60% of sales? I assume you're referring to the second Xbox SKU, Lockhart. First and foremost, the Series X APU is ~400mm2, which is ginormous. It makes zero sense to use binned Series X chips in a Lockhart when it is targeting a third of the compute power. The cooling, design and form factor would all be rendered futile if they had to house a chip that big. The APU in Lockhart will be significantly smaller and significantly cheaper to produce.



Take into account what I wrote and you'll realize your simplistic view of smaller chip = more chips per wafer = cheaper = better yields isn't true. A 36CU@2.0GHz chip is smaller than a 56CU@1.67GHz one, but it isn't cheaper to produce. Even though console manufacturers pay for the wafer regardless of chip size, if the threshold is too high (such as 2.0GHz while drawing ~150W), they'll end up wasting more chips, and therefore more money.
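A toy Monte Carlo illustrates the yield argument being made here: if each die's maximum stable clock at a fixed power budget is random, raising the required clock sharply cuts the fraction of usable dies. Every number below is invented for illustration; this is not real TSMC yield data:

```python
import random

# Toy model: each die's max stable clock (MHz) at a fixed power budget.
# Mean and sigma are made up; real binning distributions are proprietary.
random.seed(0)
dies = [random.gauss(1750, 100) for _ in range(100_000)]

def usable_fraction(target_mhz: float) -> float:
    """Share of dies a single-SKU console could actually ship."""
    return sum(d >= target_mhz for d in dies) / len(dies)

print(usable_fraction(1600))  # modest target: the large majority pass
print(usable_fraction(2000))  # aggressive target: only outlier dies pass
```

With one SKU and no lower bin to absorb the failures, every die below target is wasted silicon, which is the cost argument the post makes.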

It's abundantly clear that you are not as knowledgeable as you pretend to be.
Very good reading sir, ty for this!
BTW: you're talking to a Sony ninja infiltrated into GAF to lower our expectations for the meltdowns... it's useless to continue the discussion, he won't break.

Gamernyc78 arms-gate 😂😂😂😂👀👀👀

I still have not been verified 😭😭😭
Dude, you don't want to start an arms-gate warz. You can win the size race, but I will destroy you in hairyflops brute performance.

2304 SP (36 CUs) x 1800MHz x2 = 8.3TF.

8.3TF vs 12TF would be still really big difference.
R600 did it! We're back to the PS5 8TF bang-for-the-buck machine!
 
Last edited:

ethomaz

Banned
Didn't they have 1-gig and 2-gig chips shown on the promo PCB from E3? Maybe they will go 18 or 20 gigs after Sony's reveal?
That was indeed weird... probably not final.
Because if you do that, part of the memory pool will have better latency than the rest.
E.g. a 32-bit bus attached to a 1GB chip will have better latency than a 32-bit bus attached to a 2GB chip.
While the speed is in theory the same, the time to access the 1GB chip will be faster than the 2GB one.
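The more commonly cited effect of mixing chip densities is a bandwidth split rather than latency, and it can be sketched with some arithmetic. The 4x1GB + 6x2GB mix below is illustrative (it happens to be the split the Series X eventually shipped with), not a claim about the devkit in question:

```python
# With mixed densities on one bus, every chip serves the first 1GB of its
# address space in parallel (full bus width); only the 2GB chips serve the
# remainder, so that region sees a narrower effective bus.
chips_1gb, chips_2gb = 4, 6      # illustrative mix of GDDR6 chips
bus_per_chip_bits = 32
per_pin_gbps = 14                # common GDDR6 data rate

total_gb = chips_1gb * 1 + chips_2gb * 2
fast_region_gb = chips_1gb + chips_2gb          # first GB of every chip
fast_bw = (chips_1gb + chips_2gb) * bus_per_chip_bits * per_pin_gbps / 8
slow_bw = chips_2gb * bus_per_chip_bits * per_pin_gbps / 8

print(total_gb, fast_region_gb)  # 16 total, 10 at full bus width
print(fast_bw, slow_bw)          # 560.0 vs 336.0 GB/s
```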
 
Last edited:

R600

Banned
I am not putting words in your mouth; you literally said "36CU at 1800MHz is too high anyway" in your last post. The GitHub leak specified 56CUs for Arden, so I am not going to go against what it says there, even though 1700MHz for a 56CU part sounds like too much.

1800MHz is not beyond the TDP limit; in fact it's the knee of Navi 10 (meaning, go further and you will lose more in efficiency than you gain in performance).

As I said, I am on my phone and copy/pasting and multiquoting is a chore, so I cannot reply in the detailed manner I would like to.

We will agree to disagree and wait for the official reveals, which are getting closer anyway. If 36CUs at 1.8GHz sounds too unrealistic to people, then I have to wonder how the same people have speculated about 52-64CUs at 2.0-1.55GHz to reach the "rumored 13TF".
 
How is the PS5 slightly better than the xfridge (assuming it's 12.2 TFLOPS) if, per orisblack, the PS5 is 11.4 TFLOPS?!

Nah, the 11.4 comes from a supposed XSX GPU downclock IIRC.

..Which I'm still questioning myself. Yeah, they could technically do that, but the only reason would be really shitty thermals, which would be the complete opposite of the console they released directly before it. Meanwhile, Sony's cooling would have had to jump up a whole tier or two to cool a GPU pushing 1900-2000MHz (maybe more)?

Granted, IIRC the 11.4 came from TechPowerUp's own guess when they updated their GPU database. They had certain obvious things wrong, like the PS5 GPU's memory amount being too low (maybe the bandwidth too), but still, if you think about it logically, with its design and likely at least as good cooling as the X, the XSX shouldn't have an issue keeping its GPU around 1675-1750MHz.

1600MHz (11.4TF) is just them being overly conservative; alternative of disabling 8CUs (so 52CUs @ ~ 1720MHz) is also a bit overkill.

2304 SP (36 CUs) x 1800MHz x2 = 8.3TF.

8.3TF vs 12TF would be still really big difference.

Yeah, I don't think they'll go that low myself. They could land at 1.9GHz with a very good cooling solution, so a range between 8.75TF (36CUs) and 9.72TF (40CUs). The only thing to worry about then is whether the cooling would cost enough to push the price into whatever territory the XSX is in, because at that point it's going to look bad for the PS5 spec-to-price wise. Either that or they eat the additional cost to keep it priced lower.

Ideally, if they know the delta between the PS5 and XSX would be too much, they'll try keeping the PS5 at the lower price end and probably target Lockhart (assuming it releases this year), while treating the XSX like "the new XBO X", i.e. implying it's anchored by Lockhart anyway (which could be possible, but I personally think it won't be).

They wouldn't mention hard spec numbers while advertising or promoting it either, and I doubt MS will even if they have the advantage in that area, mainly because I don't know if that type of advertising resonates with the majority of gamers these days. Not like it did in the early '90s, anyway.
 
Last edited:

joe_zazen

Member
When did he say this? Ever since his GAF days, all Matt does is shit on Xbox any chance he gets. I always assumed his credibility as an insider was on the same level as Thuway's... which means he knows nothing.

I do miss the bish's bannings of social media influencers, because they tend to ruin things. I guess the good news is that GAF is popular enough again to attract them.

For those who care: Matt is 100% honest, reliable, and a grown adult. However, he left the games industry 2-3 years ago, so he is no longer an "insider". That is why he phrases things the way he does now.
 

THE:MILKMAN

Member
Damn, 4 weeks means February 13. Camefromfuture redeemed? 😂

The key part is "less than". So the 12th or the 5th are the only two realistic options for me. Also, David says it is basically the worst-kept secret in gaming (that Sony will reveal the PS5 next month). He probably hasn't directly heard anything.
 
Lol, I tell you, the dudes people think are lying and don't know shit will turn out to know, lol. I have a feeling about him and Tommy Fisher (I have a feeling they know each other or are one and the same, BTW). Bro, if the shit they say comes true, that's going to be crazyyyy.
Tommy Fisher said camefromfuture is not correct. Tommy is the cockiest, most confident one, and Mod Of War didn't ban him but did ban camefromfuture. Let's see.
I want to believe that Sony played 4D chess with MS and sent them down the 9 tf rabbit hole and came back and surprised them with a 12 tf machine . That’s my hope 😂
 

Lampiao

Member
If they are going to do an event like in 2013, they sent the invitations and gave notice 20 days beforehand. People come from all over the world to cover it, so they have to plan ahead.
If Sony unveils the PS5 in a State of Play, it's better to leave the market.
 
The key part is "less than". So the 12th or the 5th are the only two realistic options for me. Also, David says it is basically the worst-kept secret in gaming (that Sony will reveal the PS5 next month). He probably hasn't directly heard anything.
"Less than 4 weeks away"

3.5 WEEKS CONFIRMED! :messenger_grinning_smiling:
Then the Assassin's Creed Kingdom rumor was correct with the Feb 12 date.
 
If they are going to do an event like in 2013, they sent the invitations and gave notice 20 days beforehand. People come from all over the world to cover it, so they have to plan ahead.
If Sony unveils the PS5 in a State of Play, it's better to leave the market.
If the event is on the 12th, invites will be sent on Jan 22 or 23.
 

joe_zazen

Member
I honestly don't trust the insiders on this forum; there is just too much vague bullshit fuckery around the PS5, along with so many highs and lows (9.2 TFLOPS to 18, LOL) from supposed leaks. I don't trust Klee either. From journalist to wood-chopping guy in Alaska? People can do whatever they want, but why not continue being a journalist? A journalist who can articulate himself can do plenty of other things anywhere besides chopping wood in Alaska. What does all this attention get you when the truth will eventually be revealed?

Yields on 36CU chips at 1.5GHz-ish would be very good, no? A low-clocked dual-chip design would get Sony to that 12-13TF mark without throwing out all the work on Oberon. People seem to think dual GPUs are impossible, when in fact AMD has been pushing that very thing (see all their work on chiplets and Infinity Fabric, not to mention the new dual-chip entries in the Navi Linux drivers which were missing in the Vega/VII drivers). It is not CrossFire; that has been abandoned and is not what AMD has been working on.

Sure, the tech might not appear in consoles, but if it does, that would explain the contradictions between GitHub and insider scuttlebutt. The tech would also be a great way to compete with Nvidia: one card with 2x 5700XT chips that appears and operates as a single GPU.
 
Last edited: