
RUMOR: PS4 CPU to run at 2GHz

RoboPlato

I'd be in the dick
Something I posted in the Durango CPU thread about how MS could try to mandate parity.



Wouldn't the logical conclusion be that MS can moan all they like, but wouldn't be able to do anything about it anyway?

How would they even test for parity? Would devs have to submit the PS4 version to MS for them to test? There's also the question of whether everything has to be identical or if it just has to be close enough.
 

i-Lo

Member
Something I posted in the Durango CPU thread about how MS could try to mandate parity.



Wouldn't the logical conclusion be that MS can moan all they like, but wouldn't be able to do anything about it anyway?

I wonder: did Sony go around asking developers to mandate parity this gen? The answer is no. Otherwise we would not have seen the tremendous number of games that were slightly, and in some cases mostly, better on 360.
 

Oblivion

Fetishing muscular manly men in skintight hosery
The 8-core CPU is based on the upcoming “Jaguar” design, while the GPU is based on the Radeon 7000-series and is capable of 1.84 TFLOPS of processing power. That’s about 4.5x more powerful than the PS3 GPU.

Huh, never heard about this before. Guess that means the GPU's gonna gimp the system?
 

RoboPlato

I'd be in the dick
Huh, never heard about this before. Guess that means the GPU's gonna gimp the system?

It's several times as powerful as the PS3's GPU in terms of raw power and is on an architecture that's dramatically more efficient. It's a huge jump.
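For anyone sanity-checking the multipliers being argued over in this thread, the arithmetic is simple. This sketch uses the rumored 1.84 TFLOPS PS4 figure and the two contested RSX figures that come up below (400 vs ~250 gigaflops); all inputs are rumors, not confirmed specs:

```python
# Quick sanity check on the "4.5x" vs "over eight times" claims.
# All inputs are rumored/contested figures from this thread.
ps4_tflops = 1.84       # rumored PS4 GPU peak
rsx_wiki_tflops = 0.40  # RSX figure Wikipedia gives
rsx_low_tflops = 0.25   # RSX figure TheD cites instead

print(round(ps4_tflops / rsx_wiki_tflops, 1))  # 4.6
print(round(ps4_tflops / rsx_low_tflops, 1))   # 7.4
```

So both camps can be "right" depending on which RSX number you believe.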
 

USC-fan

Banned
The PS4 has a custom video card with a LOT more RAM available to it. You can't use PC benchmarks to gauge it. Most dev comments have it around 10x as powerful as a PS3. You're way off.

Way off? You say 10x and the numbers here say 12-13x. Yeah, wayyy off...
 

James Sawyer Ford

Gold Member
I've heard some posters claim that Microsoft mandates parity, and that's absolutely not true, and certainly not something they can force.

Few as they are, there are instances of third party PS3 games looking superior to 360 games.
 

way off.

1.) The PS3 GPU doesn't appear anywhere on that chart of PC cards, so the benchmarks there aren't a useful way of gauging how powerful it is. It's a custom card with a LOT of very fast RAM available to it.

2.) On paper, if we look at raw theoretical specs, the new GPU is 4.6 times as powerful as the RSX (wiki gives the RSX as 400 gigaflops). This totally ignores ease of use, though: devs weren't getting anywhere near the theoretical power of the RSX. Part of this was the split RAM solution, part of it was the PS3 just being hard to program. If a card is theoretically capable of 400 but bottlenecks and RAM limitations mean you only get 200 out of it, it's a 200 gigaflop card.

That's why EA, Crytek, and several others have been on record as saying the PS4 (and Durango) are around 8 to 10 times as powerful as current gen, not 4. Given that the CPU is nothing to write home about, and in some areas weaker than the Cell at certain functions, it's not the CPU that has them saying this. It's the RAM and GPU.

edit: ah, my bad. I had you confused with the guy saying 4.6.
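The theoretical-vs-effective argument in point 2 boils down to one line of arithmetic: effective throughput is peak throughput times the fraction you can actually sustain. A minimal sketch (the 50% utilization figure is the post's own illustrative example, not a measured number):

```python
def effective_gflops(peak_gflops, utilization):
    """Throughput actually available once bottlenecks (split RAM,
    hard-to-program architecture, etc.) eat into the peak."""
    return peak_gflops * utilization

# The post's example: a card theoretically capable of 400 gigaflops
# that bottlenecks down to half its peak behaves like a 200 gigaflop card.
print(effective_gflops(400, 0.5))  # 200.0
```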
 
I've heard some posters claim that Microsoft mandates parity, and that's absolutely not true, and certainly not something they can force.

Few as they are, there are instances of third party PS3 games looking superior to 360 games.

Microsoft's parity requirement is built on the premise that no significant "features difference" can be on a competing platform unless MS also gets it. It's a real requirement and is more about game features & DLC.

I doubt it extends to 'framerate' however as that would be unmanageable.
 

James Sawyer Ford

Gold Member
Microsoft's parity requirement is built on the premise that no significant "features difference" can be on a competing platform unless MS also gets it. It's a real requirement and is more about game features & DLC.

I doubt it extends to 'framerate' however as that would be unmanageable.

Right, and that may be true with regards to features and content (there's still exclusive DLC on the PS3).

Has no bearing on graphics
 
The 360 has a 240 GFLOP GPU that was easily better than the GPU in the PS3, which was based on a very inefficient and dated architecture.

The PS4 GPU is 9x greater than the 360 GPU. If that's true, then it's automatically at least 9x greater than the PS3 GPU, because 360 GPU > PS3 GPU.
 

James Sawyer Ford

Gold Member
The 360 has a 240 GFLOP GPU that was easily better than the GPU in the PS3, which was based on a very inefficient and dated architecture.

The PS4 GPU is 9x greater than the 360 GPU. If that's true, then it's automatically at least 9x greater than the PS3 GPU, because 360 GPU > PS3 GPU.

Yeah, not to mention that the GCN 2.0 architecture of upcoming AMD cards is a lot more efficient than the 360 GPU.

That, coupled with Sony's decision to go not only HIGH BANDWIDTH but also HIGH QUANTITY of unified RAM, makes me think that PS4's generational leap will be just as large, if not LARGER, than last gen's.

Sony has made some great engineering choices, and on paper it's at least 10X more powerful.
 
Microsoft's parity requirement is built on the premise that no significant "features difference" can be on a competing platform unless MS also gets it. It's a real requirement and is more about game features & DLC.

I doubt it extends to 'framerate' however as that would be unmanageable.

Thank you, I've been wondering what exactly the parity thing meant for a while.
 

Kaako

Felium Defensor
Yeah, not to mention that the GCN 2.0 architecture of upcoming AMD cards is a lot more efficient than the 360 GPU.

That, coupled with Sony's decision to go not only HIGH BANDWIDTH but also HIGH QUANTITY of unified RAM, makes me think that PS4's generational leap will be just as large, if not LARGER, than last gen's.

Sony has made some great engineering choices, and on paper it's at least 10X more powerful.
Correct. Pardon my French, but people will get properly mind-fucked by some of the software Sony first-party devs pump out next-gen.
 

tapedeck

Do I win a prize for talking about my penis on the Internet???
General kind of random question for anyone... assuming the CPU is 2GHz, the custom GPU is using GCN 2.0 architecture, the usable RAM for games is at least 7GB, the added efficiency of using an APU, and of course 'coding to the metal', how do you expect PS4 to stack up performance-wise to, say, a GeForce Titan or any of the top-end cards out right now?
 

TheD

The Detective
way off.

1.) The PS3 GPU doesn't appear anywhere on that chart of PC cards, so the benchmarks there aren't a useful way of gauging how powerful it is. It's a custom card with a LOT of very fast RAM available to it.

2.) On paper, if we look at raw theoretical specs, the new GPU is 4.6 times as powerful as the RSX (wiki gives the RSX as 400 gigaflops). This totally ignores ease of use, though: devs weren't getting anywhere near the theoretical power of the RSX. Part of this was the split RAM solution, part of it was the PS3 just being hard to program. If a card is theoretically capable of 400 but bottlenecks and RAM limitations mean you only get 200 out of it, it's a 200 gigaflop card.

That's why EA, Crytek, and several others have been on record as saying the PS4 (and Durango) are around 8 to 10 times as powerful as current gen, not 4. Given that the CPU is nothing to write home about, and in some areas weaker than the Cell at certain functions, it's not the CPU that has them saying this. It's the RAM and GPU.

edit: ah, my bad. I had you confused with the guy saying 4.6.

RSX was not 400 GFLOPs; 250 GFLOPs is about the real number.

The RSX did not have faster RAM than the 7900GT or GTO!
 

msdstc

Incredibly Naive
Yeah, I posted this in one of the numerous ps4 CPU threads a couple days ago.

I have never personally heard of ps4daily, which is why I shied away from making a thread.

More power = better, but man, I worry what all these "upgrades" do to the price...

Ehhh if it was underclocked, shouldn't change the price.
 

aaaaa0

Member
General kind of random question for anyone... assuming the CPU is 2GHz, the custom GPU is using GCN 2.0 architecture, the usable RAM for games is at least 7GB, the added efficiency of using an APU, and of course 'coding to the metal', how do you expect PS4 to stack up performance-wise to, say, a GeForce Titan or any of the top-end cards out right now?

The GeForce Titan would generally smack it around, as long as everything fit into the 6GB onboard and the PCI Express bus connecting it to the system didn't bottleneck the rendering. (And presuming you had a fast enough CPU and rendering code that is clever or lucky enough to avoid the other problems with the PC, like API or OS overhead.)

You're talking about comparing a video card that alone has 288 GB/s bandwidth, 4.5 teraflops, and a 250 watt power budget against the entire PS4 system. The PlayStation is gonna get creamed.
 
The GeForce Titan would generally smack it around, as long as everything fit into the 6GB onboard and the PCI Express bus connecting it to the system didn't bottleneck the rendering. (And presuming you had a fast enough CPU and rendering code that is clever or lucky enough to avoid the other problems with the PC, like API or OS overhead.)

You're talking about comparing a video card that alone has 288 GB/s bandwidth, 4.5 teraflops, and a 250 watt power budget against the entire PS4 system. The PlayStation is gonna get creamed.

Oh and it costs $1000 for the card alone.
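Putting aaaaa0's numbers side by side: the Titan figure is the one given in the post, the PS4 figure is the rumored 1.84 TFLOPS from this thread, and on raw FLOPS alone the gap works out to roughly 2.4x:

```python
titan_tflops = 4.5  # figure quoted in the post above
ps4_tflops = 1.84   # rumored PS4 GPU figure from this thread

# Raw-FLOPS advantage of the $1000 card over the whole-console GPU:
print(round(titan_tflops / ps4_tflops, 2))  # 2.45
```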
 

Afrikan

Member
What other sacrifices will one have to make to maximize their Titan video card to the fullest? Like, would you need a huge PC case? Special-looking cooling things? Spend extra time monitoring and taking care of the thing?
 

Jtrizzy

Member
Dat Kittonywy!

And wollan lol. Great memories
 

mrklaw

MrArseFace
Also creates a lot more heat and (typically) requires more power.

Aren't the Jaguar cores tiny? Like 3mm^2 per unit, so 6mm^2 for PS4? That's a tiny fraction of the overall die, so the increase in heat and power requirements would be relatively small.
 

StevieP

Banned
I'm just guessing but I know several devs on B3D have said that a 60-80% performance increase over a PC equivalent card is going to be feasible for most teams, possibly even more than that if they're good and doing specialized PS4 development. I think that final output could be similar to a 680 if you capped it at 30fps/1080p without AA.

Is this the new version of "console optimization brings 2x the power"?
 
I'm just guessing but I know several devs on B3D have said that a 60-80% performance increase over a PC equivalent card is going to be feasible for most teams, possibly even more than that if they're good and doing specialized PS4 development. I think that final output could be similar to a 680 if you capped it at 30fps/1080p without MSAA.

"Final output could be similar to a 680 if final output is capped to something significantly less than what a 680 could output"
 

RoboPlato

I'd be in the dick
Is this the new version of "console optimization brings 2x the power"?

I was trying to phrase it so I didn't seem like I was saying that exactly. I am referring to optimization and output improvements in consoles, but was just trying to give some kind of context and comparison that people could visualize. I know it's not going to hit 680 levels of pure performance, but seeing as that's the card everyone compares to, I was trying to put things in perspective. Couldn't think of a better way to phrase it. Now that I reread it, it is really stupid and I shouldn't have tried to make a comparison like that when I'm getting up for work at 4am :lol
 

DocSeuss

Member
What has Sony running so scared that they're ramping up the RAM and boosting the CPU for?

The costs involved in this mean it isn't something you do out of benevolence, or at the drop of a hat. They're definitely not what you do if you're comfortably ahead of the competition, which previous spec things seemed to indicate.

What are they so afraid of?
 

Reiko

Banned
What has Sony running so scared that they're ramping up the RAM and boosting the CPU for?

The costs involved in this aren't something you do out of benevolence.

They're what you do because you're afraid of something.

That's what I'm wondering.

PS4 already had an advantage with 4GB GDDR5
They already have an advantage with current specs.
 
What has Sony running so scared that they're ramping up the RAM and boosting the CPU for?

Rumblings of Steambox and its specifications? There must be a sea of rumours floating around about the Steambox in developer circles; Sony might have caught wind of those rumours and panicked.
 

RoboPlato

I'd be in the dick
What has Sony running so scared that they're ramping up the RAM and boosting the CPU for?

The costs involved in this mean it isn't something you do out of benevolence, or at the drop of a hat. They're definitely not what you do if you're comfortably ahead of the competition, which previous spec things seemed to indicate.

What are they so afraid of?

They're a hardware company and this is the first time they haven't developed a unique architecture for their platform. They probably just wanted to make sure their platform was attractive from a hardware perspective so they went with the best specs they could afford. The RAM spec was the one that was the huge boost, the CPU one (if happening) has likely been in the cards based on TDP/yields and would help a lot given that it's a low end CPU.
 

Nachtmaer

Member
2.) On paper, if we look at raw theoretical specs, the new GPU is 4.6 times as powerful as the RSX (wiki gives the RSX as 400 gigaflops). This totally ignores ease of use, though: devs weren't getting anywhere near the theoretical power of the RSX. Part of this was the split RAM solution, part of it was the PS3 just being hard to program. If a card is theoretically capable of 400 but bottlenecks and RAM limitations mean you only get 200 out of it, it's a 200 gigaflop card.

Wikipedia is wrong. I've read numerous times that RSX is actually clocked at 500MHz.

If you want to compare RSX to a desktop card, which I know doesn't work, it's a gimped 7800 with performance close to a 7600.

The "overclocking" doesn't seem too drastic. They didn't specify the clock speed of the processor in their documents, so I can only assume that they're still seeing what AMD and GloFo (or is it TSMC?) can provide them.

I think they'll be produced at TSMC since that's where Temash and Kabini will be produced as well. Their GPUs are also fabbed at TSMC but that's a different process.
 
What has Sony running so scared that they're ramping up the RAM and boosting the CPU for?

The costs involved in this mean it isn't something you do out of benevolence, or at the drop of a hat. They're definitely not what you do if you're comfortably ahead of the competition, which previous spec things seemed to indicate.

What are they so afraid of?

The RAM makes sense because if the Xbox 720 is running 8GB, then 4GB can obviously be a huge problem when you take into account OS features and games that use more than 4GB at once.

The "overclocking" doesn't seem too drastic. They didn't specify the clock speed of the processor in their documents, so I can only assume that they're still seeing what AMD and GloFo (or is it TSMC?) can provide them. 400MHz isn't actually that much of a difference... for instance, a lot of Intel processors pretty much do that on stock voltage and stock cooling.
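For scale, if the bump is from a 1.6GHz base clock (an assumption; the base clock isn't stated in this thread) up to the 2GHz in the thread title, that 400MHz works out to a 25% increase:

```python
base_ghz = 1.6    # assumed original Jaguar clock (not stated in the thread)
bumped_ghz = 2.0  # the clock this thread's rumor claims

print(round((bumped_ghz - base_ghz) / base_ghz * 100))  # 25 (% increase)
```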
 

Razgreez

Member
They're a hardware company and this is the first time they haven't developed a unique architecture for their platform. They probably just wanted to make sure their platform was attractive from a hardware perspective so they went with the best specs they could afford. The RAM spec was the one that was the huge boost, the CPU one (if happening) has likely been in the cards based on TDP/yields and would help a lot given that it's a low end CPU.

Don't be silly. They're obviously scared of the big bad performance-eating bogeyman.

In reality, neither of these companies is scared; they're just tweaking what they already have so as to get the best performance/efficiency possible.
 