
Next-Gen PS5 & XSX |OT| Console tEch threaD


icerock

Member
Wasn't Github said to be the work of that discord?

Then Github turned out to be mostly right?

Like what?

edit:
"Guys guys, these recent rumors are fake. They're from that one discord that got a lot of stuff right."

Like ayy lmao

The popular theory is that they found the GitHub data a long time back and assumed the Oberon iGPU in that data was based on the RDNA1 architecture (which is very inefficient), so those 2.0GHz clock speeds wouldn't be sustainable. Hence why so many of them were parroting $399, 8TF. Once the information became public in December, they started policing the Discord very tightly, so not many got a heads-up regarding the PS5 specs which Dictator let them in on. Two of the biggest bellends on that Discord, Colbert and DukeBlueBalls, wrote a prediction a day before the reveal which contained variable clocks, something which nobody save for folks over at Sony and the DF crew was privy to.

It's a clique, as simple as.

yep

[attached screenshot: V6BCLl7.png]

Sad bunch of bastards.
 

SlimySnake

Flashless at the Golden Globes
I said from the beginning that the whole boosting thing is a sign of heating issues. And I'm not a dev nor do I have any internal insight. I simply know about hardware, power and cooling.
I am sorry, but this is so ridiculous. They had heating issues, so to solve this they increased the clocks even further, from 2.0 GHz to 2.23 GHz???

In what universe would that make sense? It's like if a couple has marital issues because the guy cheated on his wife with 20 girls, and in order to reconcile, the couple decided he must sleep with 3 more girls.

Also, Cerny has said time and time again that their entire APU is bound by power. It will never go above the limit they have set. It will never overheat, by design. It will never make the fan become loud. Its only purpose is to prevent all that from happening.

An example of this would be a couple having marriage issues because the man collapses after his heart goes into overdrive during sex. Wife clearly isn't happy. She has to keep the fan on full while fucking, can't hear shit, her man starts sweating and taps out in the middle. Well, what if the doctors installed a valve whose sole purpose is to keep the heart from beating too fast? I am not a surgeon, but I know some people wear heartbeat regulators. This is basically the same thing. Now the wife is happy because she can get peak performance without the fear of having a fat man die on top of her.
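To make the power-limit idea concrete rather than metaphorical, here is a minimal sketch of how a power-capped governor behaves. This is my own toy illustration, not Sony's actual algorithm, and every number in it (the cap, the power model) is invented; the point is only that clocks react to a fixed power budget rather than to a thermometer, so the cooling can be sized for a known worst case.

```python
# Toy model of a power-capped clock governor (invented numbers, not Sony's
# real algorithm). The clock only drops when the estimated power of the
# current workload would exceed a fixed budget; temperature never enters
# into it, so the fan curve can be designed around a known worst case.

POWER_CAP_W = 200.0   # hypothetical total APU power budget
F_MAX_GHZ = 2.23      # peak GPU clock
F_MIN_GHZ = 2.00      # floor we never expect to sit below for long

def estimated_power(activity: float, freq_ghz: float) -> float:
    """Crude model: power scales with workload 'activity' (0..1) and roughly
    with the cube of frequency (since voltage rises with clock)."""
    return 215.0 * activity * (freq_ghz / F_MAX_GHZ) ** 3

def pick_clock(activity: float) -> float:
    """Run at max clock unless the estimate blows the budget, then shave
    the clock just enough to fit; a small clock drop saves a lot of power."""
    freq = F_MAX_GHZ
    while estimated_power(activity, freq) > POWER_CAP_W and freq > F_MIN_GHZ:
        freq -= 0.01
    return round(freq, 2)

for load in (0.5, 0.8, 1.0):   # light scene, typical scene, power-virus scene
    print(f"load {load:.1f} -> {pick_clock(load):.2f} GHz")
```

Under this toy model the clock stays pinned at 2.23 GHz except under the very heaviest load, where it shaves off a couple of percent, which is roughly the behaviour Cerny described.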
 

Dabaus

Banned
I'll give you a list.
Dukeblueballs
DrKeo
Colbert
c0de
Proven
Shepstal Ed
DocH1X1
Klobrille
Lukastaves
Dictator (yes, he's in that discord now)
FairyEmpire
Doncabesa
Bcatwilly
Brand Sams
And more

As much grief as I give NPC Era, I do read their next-gen stuff quite frequently, as I do here as well. With that said, that looks about right.
 

SlimySnake

Flashless at the Golden Globes
What a bunch of low lifes 😂 they should at least get payment from MS, but they don't. That is the sad part
They get invited to parties at MS headquarters. At least when they get MVP status. I think at least one of them used to post pictures of himself partying with MS execs. He's now a mod on the other site.

I'm sure MS throws some cash their way. Maybe a free Game Pass sub. They have always had a program for this stuff, going back to the Windows vs Apple days. Besides, no one does this for free.
 

icerock

Member
I am disappointed in Richard. This was supposed to be an interview with Cerny that was, by design, meant to clear things up, but it ended up causing more confusion, more FUD, and more concern trolling. He is going out of his way to refute stuff Cerny himself clarified to him, by doing random benchmark tests with different parameter values. He goes and finds dev sources who directly contradict Cerny and questions literally everything Cerny says in that interview.

I expect this kind of stuff from MS insiders and the losers who visit that discord, but Richard should know better. I appreciate him not wanting to take Sony PR at their word, but he never did this for the Series X, and for good reason: we simply don't know how RDNA 2.0 cards perform. We don't know how they scale with clocks. We don't know how the Sony power-control thing works because it's not even in the devkits yet. I'm fairly positive that Dictator is part of that discord, but Richard is a decent enough guy and needs to learn that this kind of concern trolling isn't going to do him any favors.

I don't know what Richard and DF are doing, tbh. There are quotes in the article which contain a clarification from Cerny that it IS possible for the CPU and GPU to run at max clocks simultaneously. Richard then brought in his dev sources, who suggested throttling on the CPU end, which Cerny again clarified is not a feature set in the dev kit but something the retail PS5 will do on its own to leverage variable frequency. Everything was clarified at that point, until he started benchmarking using RDNA1 GPUs?

I mean, RDNA1 GPUs not scaling well has been known since days after launch. The architecture itself is flawed, which makes the GPU do a lot of inefficient calculations and hence use up that bandwidth quickly. The bottleneck is the bandwidth; that's the core reason even significant overclocking results in only minor gains. RDNA2, which the PS5 iGPU is based on, is supposed to improve on all these fronts, so I don't know why he conflated the two to suggest the gains will be marginal even with higher clocks. As a result you have idiots saying a teraflop produced by more CUs is more powerful than a teraflop produced by fewer CUs....
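For anyone tripped up by the "a teraflop from more CUs is more powerful" line: the teraflop figure is just arithmetic, so the same number can be reached with fewer, faster CUs or with more, slower ones. A quick sketch using the standard RDNA figures (64 shaders per CU, 2 FP32 ops per clock) and the publicly quoted clocks; whether the rest of the chip feeds those CUs equally well is a separate architecture/bandwidth question:

```python
# FP32 TFLOPS is just: CUs * 64 shaders * 2 ops per clock * clock (GHz) / 1000.
# The number says nothing about how efficiently the rest of the chip feeds
# those CUs -- that's the architecture/bandwidth question.

def rdna_tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

print(rdna_tflops(36, 2.23))             # PS5-style: 36 CUs @ 2.23 GHz  -> ~10.28 TF
print(rdna_tflops(52, 1.825))            # XSX-style: 52 CUs @ 1.825 GHz -> ~12.15 TF
print(rdna_tflops(48, 2.23 * 36 / 48))   # Cerny's 48-CU example, clocked down to
                                         # match the 36-CU part -> ~10.28 TF, same flops
```

Identical flop counts, very different CU/clock mixes; that was the whole point of Cerny's 36-vs-48 example.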

I'll give you a list.
Dukeblueballs
DrKeo
Colbert
c0de
Proven
Shepstal Ed
DocH1X1
Klobrille
Lukastaves
Dictator (yes, he's in that discord now)
FairyEmpire
Doncabesa
Bcatwilly
Brand Sams
And more

Share more juicy bits, I need some entertainment.

These guys, plus some more not on the list, just got banned for concern trolling and mass reporting. Looks like their days of running around getting everyone else in trouble are over.

Are those bans permanent? I only see a few weeks on most. I don't even know what they banned the poem dude for; hopefully for cross-forum drama. That would make that place a little bit better.
 
Saw this on NPC Era, thought it was worth a quote:


Soprano said:
Just a little notice. The concern and FUD you are seeing around PS5 news lately is being done on purpose. Some of it is planned in a certain discord.

This thread heading to the dumps is a part of it, I presume. There's this sad attempt to make the PS5 look like it's going to fail and experience another PS3 generation.

That is all. Bye.
I would not make those posts.
The FUD on this forum is even worse in many cases.
 

SlimySnake

Flashless at the Golden Globes
I don't know what Richard and DF are doing, tbh. There are quotes in the article which contain a clarification from Cerny that it IS possible for the CPU and GPU to run at max clocks simultaneously. Richard then brought in his dev sources, who suggested throttling on the CPU end, which Cerny again clarified is not a feature set in the dev kit but something the retail PS5 will do on its own to leverage variable frequency. Everything was clarified at that point, until he started benchmarking using RDNA1 GPUs?

I mean, RDNA1 GPUs not scaling well has been known since days after launch. The architecture itself is flawed, which makes the GPU do a lot of inefficient calculations and hence use up that bandwidth quickly. The bottleneck is the bandwidth; that's the core reason even significant overclocking results in only minor gains. RDNA2, which the PS5 iGPU is based on, is supposed to improve on all these fronts, so I don't know why he conflated the two to suggest the gains will be marginal even with higher clocks. As a result you have idiots saying a teraflop produced by more CUs is more powerful than a teraflop produced by fewer CUs....



Share more juicy bits, I need some entertainment.
Exactly. And Richard of all people should know this. The 5700 XT and 5700 have awful performance at 4K; they just don't scale well at that resolution. I'm sure I've posted screenshots of benchmarks showing how they struggle to run some demanding current-gen games at native 4K 30 fps. It's not an issue of memory bandwidth either, since Nvidia's Turing cards seem to have no trouble with the same 448 GB/s of bandwidth.

They should know better than to make comparisons like that.

The comparison doesn't make much sense anyway. There is only about a 10% difference in compute units between those cards, while Cerny's example was 36 vs 48, i.e. 33% more CUs. A better example would've been to look at the performance of a 22 CU 5500 XT clocked super high against a 36 CU 5700 clocked really low. Even then, because of the bandwidth and it being RDNA 1.0, it wouldn't have been a perfect test.
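To put rough numbers on that test (my own back-of-envelope, assuming the usual 64 shaders per CU and the cards' known CU counts: 40 for the 5700 XT, 36 for the 5700, 22 for the 5500 XT):

```python
# Paper FP32 flops for the comparisons above (back-of-envelope only).
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

# The narrow-vs-wide test I'd rather see: a 22 CU part chasing a 36 CU part.
low_clock_36cu = tflops(36, 1.6)     # 36 CU 5700 underclocked to 1.6 GHz -> ~7.37 TF
parity_clock = 1.6 * 36 / 22         # clock a 22 CU 5500 XT would need -> ~2.62 GHz
print(low_clock_36cu, parity_clock)  # (unrealistic on RDNA1, which is the point)

# Versus the ~10% CU gap mentioned above and Cerny's own example:
print(40 / 36 - 1)                   # 5700 XT vs 5700: only ~11% more CUs
print(48 / 36 - 1)                   # Cerny's 36-vs-48 example: ~33% more CUs
```

A 22 CU card would need roughly 1.64x the clock of the 36 CU card to reach the same paper flops, which is exactly the kind of gap where clock scaling versus CU scaling would actually show up.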
 

Imtjnotu

Member
Me? Well, I registered the day before the GDC talk just to contact Osirisblack about his numbers. Turns out my source was right. Check my history and you'll see. But it's okay, I don't really care and I will not post anything more about it. I think both consoles are great and I'm looking forward to the new gen. :)
Only thing though was that you said you were guessing. Not saying you don't have an insider, but your first post was about being a long-time lurker and guessing.
 
I don't know what Richard and DF are doing, tbh. There are quotes in the article which contain a clarification from Cerny that it IS possible for the CPU and GPU to run at max clocks simultaneously. Richard then brought in his dev sources, who suggested throttling on the CPU end, which Cerny again clarified is not a feature set in the dev kit but something the retail PS5 will do on its own to leverage variable frequency. Everything was clarified at that point, until he started benchmarking using RDNA1 GPUs?

I mean, RDNA1 GPUs not scaling well has been known since days after launch. The architecture itself is flawed, which makes the GPU do a lot of inefficient calculations and hence use up that bandwidth quickly. The bottleneck is the bandwidth; that's the core reason even significant overclocking results in only minor gains. RDNA2, which the PS5 iGPU is based on, is supposed to improve on all these fronts, so I don't know why he conflated the two to suggest the gains will be marginal even with higher clocks. As a result you have idiots saying a teraflop produced by more CUs is more powerful than a teraflop produced by fewer CUs....



Share more juicy bits, I need some entertainment.



Are those bans permanent? I only see a few weeks on most. I don't even know what they banned the poem dude for; hopefully for cross-forum drama. That would make that place a little bit better.

[attached screenshot: NdE0GeF.png]
 

icerock

Member
Exactly. And Richard of all people should know this. The 5700 XT and 5700 have awful performance at 4K; they just don't scale well at that resolution. I'm sure I've posted screenshots of benchmarks showing how they struggle to run some demanding current-gen games at native 4K 30 fps. It's not an issue of memory bandwidth either, since Nvidia's Turing cards seem to have no trouble with the same 448 GB/s of bandwidth.

They should know better than to make comparisons like that.

The comparison doesn't make much sense anyway. There is only about a 10% difference in compute units between those cards, while Cerny's example was 36 vs 48, i.e. 33% more CUs. A better example would've been to look at the performance of a 22 CU 5500 XT clocked super high against a 36 CU 5700 clocked really low. Even then, because of the bandwidth and it being RDNA 1.0, it wouldn't have been a perfect test.

If it's not bandwidth, then what else could be the problem? The RTX 2080 has a higher CU (SM) count, can reach the same in-game clocks, and can do 4K along with RT, all on 448 GB/s of bandwidth. Plus that thing is on 12nm! Yet if you compare the efficiency, the RTX 2080 is light-years ahead of the 5700/5700 XT. I just think that RDNA1 as an architecture has a lot of flaws: it consumes a lot of power, doesn't offer as much performance, and struggles with 4K, and I think that's down to not utilizing the bandwidth efficiently.

Agree on the comparison; that would be more appropriate. But I can't help but think that many, including Richard, see the PS5 iGPU as just an evolution of AMD's 5700 range; hence the first comparison point is always the 5700/5700 XT. I really hope new RDNA2 cards from AMD are revealed soon so this nonsense can be put to bed.


Already rallying the troops for when comparison videos drop: "it's down to Sony enforcing parity!!!" Hurr durr, go complain. Haha.

I just hope you're well insulated over there; these receipts are a lot of fun from time to time.
 
He's trying to sell you something. I'm not.

He's not trying to sell anything, but Sony is. Mark Cerny doesn't work for Sony; he is a consultant brought in to be the lead system architect on the PS5, and whether the PS5 sells 100 units or 100 million units, Mark still gets his money. One thing is for sure, though: he is very well versed in both the software and hardware sides of things and knows more than 99.999% of the people on here about the subject. He's a pretty analytical guy who doesn't seem to spout the typical corporate marketing BS; he likes facts, so when he talks about stuff it usually has some pretty good weight behind it.
 

Ptarmiganx2

Member
I think the consoles are getting delayed. And quite frankly, you have much bigger problems to worry about if this thing continues to ravage nonstop without a break through the summer, 6 months from now. The only thing the government is doing right now is delaying this, hoping that by summer it starts to die down so we can recover and be better prepared to respond if it resurfaces. If it doesn't, we're potentially looking at a complete world collapse, or rushed vaccines that have not been properly tested, risking the population.
In terms of percentage growth in new cases, we have averaged less than 10 percent over the last 8 days, with Tuesday and likely today being the exceptions. That is not taking into account the increase in testing rates, which skews those numbers. It will peak within 10-12 days in the States. It's not going to "ravage" the country for another 6 months, so step off the media hype train. Coronaviruses are seasonal and this is no exception. The initial hydroxychloroquine results are better than expected. You can take a deep breath. I was asymptomatic other than chest pain and a suspicious chest X-ray; odds are I had it in early Feb. Remember, this is season 1; we will have antibodies going forward and symptoms won't be as severe in years 2, 3, etc., if it becomes a seasonal occurrence.
 

Disco_

Member
Exactly. And Richard of all people should know this. The 5700 XT and 5700 have awful performance at 4K; they just don't scale well at that resolution. I'm sure I've posted screenshots of benchmarks showing how they struggle to run some demanding current-gen games at native 4K 30 fps. It's not an issue of memory bandwidth either, since Nvidia's Turing cards seem to have no trouble with the same 448 GB/s of bandwidth.

They should know better than to make comparisons like that.

The comparison doesn't make much sense anyway. There is only about a 10% difference in compute units between those cards, while Cerny's example was 36 vs 48, i.e. 33% more CUs. A better example would've been to look at the performance of a 22 CU 5500 XT clocked super high against a 36 CU 5700 clocked really low. Even then, because of the bandwidth and it being RDNA 1.0, it wouldn't have been a perfect test.
I liked the part about the PS4 Pro sticking with Jaguar due to BC issues even though Zen was already on the market. What was the reason for the X1X also having Jaguar while releasing a year later?

Would you mind providing the source that stated 7nm DUV designs are compatible with 5nm while 7nm EUV designs are not?

Would love to see this as well seeing that 7nm+ is fully compatible with 5nm.
 

Ascend

Member
he is a facts guy so when he talks about stuff it usually has some pretty good weight behind it.
Oh, he knows what he's talking about. But he's got the ignorant completely conflating clock speeds and workload, for example, which is why we see so much nonsense. I'll ask the same things I've asked multiple times:

  • If there is no limit on cooling, why not increase the power limit to allow more performance?
  • If the console can run at max clocks & max load at all times, why do developers choose profiles to throttle the CPU to ensure max GPU speeds, as stated by DF?
  • If developers prefer static performance, why have variable clocks if the console can run at max clocks & max load at all times?
 

joe_zazen

Member
I am sorry, but this is so ridiculous. They had heating issues, so to solve this they increased the clocks even further, from 2.0 GHz to 2.23 GHz???

In what universe would that make sense? It's like if a couple has marital issues because the guy cheated on his wife with 20 girls, and in order to reconcile, the couple decided he must sleep with 3 more girls.

Also, Cerny has said time and time again that their entire APU is bound by power. It will never go above the limit they have set. It will never overheat, by design. It will never make the fan become loud. Its only purpose is to prevent all that from happening.

An example of this would be a couple having marriage issues because the man collapses after his heart goes into overdrive during sex. Wife clearly isn't happy. She has to keep the fan on full while fucking, can't hear shit, her man starts sweating and taps out in the middle. Well, what if the doctors installed a valve whose sole purpose is to keep the heart from beating too fast? I am not a surgeon, but I know some people wear heartbeat regulators. This is basically the same thing. Now the wife is happy because she can get peak performance without the fear of having a fat man die on top of her.

I might be the only one, but I love your batshit analogies.
 

Ascend

Member
Ok, maybe I'm crazy for thinking this. Obviously both MS and Sony have their own console warriors who back up anything their favorite company does while also taking down whatever the other company does, so why do MS console warriors seem so much more prominent?
Maybe some people realize the truth about the hardware, without being a "console warrior". Maybe they are labeled as such simply because they are not making up excuses for weaknesses.
 

kyliethicc

Member


He's not trying to sell anything, but Sony is. Mark Cerny doesn't work for Sony; he is a consultant brought in to be the lead system architect on the PS5, and whether the PS5 sells 100 units or 100 million units, Mark still gets his money. One thing is for sure, though: he is very well versed in both the software and hardware sides of things and knows more than 99.999% of the people on here about the subject. He's a pretty analytical guy who doesn't seem to spout the typical corporate marketing BS; he likes facts, so when he talks about stuff it usually has some pretty good weight behind it.
Exactly. If anything he’s a bad marketer of his work because he’s literal, honest, and precise.
He doesn't go out of his way to clarify others' misconceptions about his statements because he probably can't anticipate their misunderstandings. Some smart people suck at spin; they're honest, but not clever at selling. That's why so many people seem to misunderstand, or not get, the shit he says. And some who do get it pretend not to, to score shit-talking points and troll. Cerny seems to either not get this or not give a fuck. He just says straight up what shit is, good and bad. I'm impressed by how often he shares how shit can or has gone wrong while making the games and consoles he's helped on (which are a lot).

And he's been making games since games began. Literally everyone who speaks with him says he's very intelligent. He was the first American to ever get a PS1 dev kit. He signed Naughty Dog and Insomniac to funding deals when they were just two-person start-ups with no games yet. He helped create Sonic, Crash, Spyro, and Jak, and recently helped create Spider-Man, Horizon, The Last Guardian, Death Stranding, etc. ... and of course ... Knack. He was system architect of the 2nd-best-selling console ever. I doubt he fucked up the design of the next PlayStation, especially since it seems like it's just a PS4 2.0.

The new SSD tech, and the custom chips Sony is creating behind it, have PC gamers pissy because it's a custom, unique part that they won't be able to get or "beat" when the new GPUs or whatever other parts come out. That's why they all love flops and that shit: because hardcore PC gamers will just go spend another $500+ to be able to run a game at a higher frame rate, just to be like "see, I'm better". That's why so many people are like "no, it can't be that fast" or "no, it can't run that high a clock", etc.

Sony's dominance of console gaming, and the current elite status of their first-party games, has not earned Sony any optimistic interpretations from non-Sony fans. They want to see them fail, because gamers want drama. They want to see the big ones on top fail, crash, then rise back up, then fail again, and the cycle goes on. It makes for the most dramatic viewing experience. If Sony just keeps dominating in sales and acclaim, it gets boring for the press, for Xbox fans, etc. People are thirsty for change, drama, etc.
 
Nope. Designers are. Programmers are just worker bees. Designers make the levels, the boss fights, and everything that has to do with the final gameplay experience.

It's a huge loss, but while this isn't a good look for ND, it doesn't look like it's going to affect the game much. They are going to struggle to hire more senior devs like Jason said, so maybe in the long run this might hurt the next few ND games, but Neil is still there and the other directors too, so as long as their vision is being followed to a tee like Kojima's, this shouldn't be as big of a deal.
Naughty Dog games aren't known for out-of-this-world level design but for out-of-this-world graphics, and it is programmers who enable those graphics. There are many games with similarly good level design.
No... They didn't expect the high performance of MS, so they added boost, creating the heating issues.

You guys are trying REALLY hard. Calm down. Jesus.
Sony said the heat issues were looking problematic prior to their innovative cooling solution, which has allowed 2+ GHz to be achieved.

edit:
Oh, he knows what he's talking about. But he's got the ignorant completely conflating clock speeds and workload, for example, which is why we see so much nonsense. I'll ask the same things I've asked multiple times:

  • If there is no limit on cooling, why not increase the power limit to allow more performance?
  • If the console can run at max clocks & max load at all times, why do developers choose profiles to throttle the CPU to ensure max GPU speeds, as stated by DF?
  • If developers prefer static performance, why have variable clocks if the console can run at max clocks & max load at all times?
The PSU might be limited by budget, and even if the cooler could handle more, there might not be additional power available to allow it.
 