
Are the console GPUs this cycle really underpowered?

I don't think the PS4 is underpowered at all... A 7850+ is a great GPU. It's more that PC GPUs are perhaps overpowered, i.e. there isn't really anything out there that adequately harnesses their power, and they use an incredible amount of wattage. The power comes with a price.
 

Eideka

Banned
I don't think the PS4 is underpowered at all... A 7850+ is a great GPU. It's more that PC GPUs are perhaps overpowered, i.e. there isn't really anything out there that adequately harnesses their power, and they use an incredible amount of wattage. The power comes with a price.

That's debatable; downsampling demands quite a lot of horsepower, for example. In this case 4 GB of GDDR5 comes in handy.
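As a rough illustration of why downsampling is expensive, here is a minimal sketch (my own numbers, not from the post above) assuming one RGBA8 colour buffer and one 32-bit depth buffer: rendering at 3840x2160 and scaling down to 1080p means 4x the pixels to shade and fill, and 4x the render-target memory, before any extra buffers a real engine would add.

```python
# Back-of-envelope cost of 2x2 downsampling vs. native 1080p.
# Assumes one RGBA8 colour buffer (4 B/px) plus one 32-bit depth buffer (4 B/px);
# real engines use more render targets, so treat this as a lower bound.

def framebuffer_mb(width, height, bytes_per_pixel=8):
    return width * height * bytes_per_pixel / (1024 ** 2)

native = framebuffer_mb(1920, 1080)        # ~15.8 MB
supersampled = framebuffer_mb(3840, 2160)  # ~63.3 MB, and 4x the pixels to shade

print(f"1080p buffers:         {native:.1f} MB")
print(f"2x2 downsample source: {supersampled:.1f} MB ({supersampled / native:.0f}x)")
```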
 

curlycare

Member
The PS3 had a GPU comparable to something like the Nvidia 7800 GT, and we got Killzone 2 anyway.

Yeah, but then again Killzone and others utilise the SPEs in addition to the GPU, and that's not something to dismiss when comparing these machines.
 
I'm not even going to bother looking at the charts because the answer is NO, at least for the PS4. I'm not commenting on Durango because those rumors are all over the place.

The PS4 GPU is pretty much a chip that sits between the power levels of the 7870 and 7850, with some customization to improve compute efficiency when it's simultaneously doing graphics calculations. This customization could be stuff from GCN 2.0, but we don't know, because we don't have the info on GCN 2.0 yet.

It's a strong card for 1080p gaming. Anything higher and you'd want something in the enthusiast range, which is GTX 680 / HD 7970 based chips and above.

The GPU in the PS4 absolutely wipes the floor with Xenos, the current GPU champ for this gen. Xenos was AMD/ATI's first edition of a unified architecture. The PS4 GPU is its 6th edition of a unified architecture (Xenos -> HD 2000/3000 series -> 4000 series -> 5000 -> 6000 -> GCN).

That's a lot of time to change shit up and apply what you learn to the next iteration. GCN is very efficient and very good at GPU compute, even better than Kepler, which had its compute performance gimped on all but the Titan series.

Because it has been so long, and devs are still developing shit based on tech from 2004-05, we easily forget how many advancements have been made since then; we have nothing to show for them because devs can't fully utilize the beasts we already have. It's like when BF3 was getting ready to come out and everyone was saying we'd need the next Nvidia/AMD chips just to run it, and it turned out what we had at the time ran the game perfectly fine. We forgot that the 2010 GPUs were beasts compared to the crap we have in current-gen consoles.

The PS4 GPU is well powered. It's not going to be at the level of a 680/7900-based chip, but it's strong enough for 1080p gaming, and in a closed box it will push the bar up considerably from the low bar devs currently work to because of DX9-level 2004 junk.
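For anyone who wants rough numbers behind the "between a 7850 and a 7870" placement, here is a minimal back-of-envelope sketch (mine, not from the post above) of theoretical peak FP32 throughput, computed as shader ALUs x 2 FLOPs per clock (one fused multiply-add) x clock, using the commonly quoted shader counts and reference clocks. Real-game performance also depends on bandwidth, ROPs and drivers, so treat these as ballpark figures only.

```python
# Theoretical peak FP32 throughput: ALUs * 2 FLOPs (one FMA per clock) * clock.
# Shader counts and clocks are the commonly quoted reference figures.

def tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000

gpus = {
    "HD 7850 (Pitcairn)": (1024, 0.86),
    "PS4 GPU (18 CUs)":   (1152, 0.80),
    "HD 7870 (Pitcairn)": (1280, 1.00),
}

for name, (shaders, clock) in gpus.items():
    print(f"{name:20s} ~{tflops(shaders, clock):.2f} TFLOPS")
# -> roughly 1.76, 1.84 and 2.56 TFLOPS: the PS4 part lands just above a 7850 on paper.
```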
 

Orayn

Member
I don't think the PS4 is underpowered at all... A 7850+ is a great GPU. It's more that PC GPUs are perhaps overpowered, i.e. there isn't really anything out there that adequately harnesses their power, and they use an incredible amount of wattage. The power comes with a price.

True, at least to an extent. It's not that you can never bring a high-end video card to its knees, but there are relatively few standouts that use PC hardware with any degree of efficiency, due to so many games being optimized for the 360/PS3.
 

rezuth

Member
The problem this gen was that they mostly put the money into the GPU, which turned out not to be a good move if you want to do a long cycle. Sony is the only company thus far that seems to have realized this and is trying to fix it for next gen.
 

Korezo

Member
With all these comments and comparisons I don't even know what to believe. Will I be able to run PS4 ports on my GTX 580 / i7-2600K at the same or higher settings? I was thinking I could, and if not, why?
 
I never expected it to be the case again. Microsoft paid the price last generation by designing hardware that was too difficult to cool. Sony designed hardware that could not be sold for profit. Systems with lower power requirements were to be expected this generation.

This, basically. Both companies did a big retrospective of the current gen's lifetime run, and when the results came in (and got analysed), they said "Frak it. We're not going through that again."
 

Portugeezer

Gold Member
I'm not even going to bother looking at the charts because the answer is NO, at least for the PS4. I'm not commenting on Durango because those rumors are all over the place.

The PS4 GPU is pretty much a chip that sits between the power levels of the 7870 and 7850, with some customization to improve compute efficiency when it's simultaneously doing graphics calculations. This customization could be stuff from GCN 2.0, but we don't know, because we don't have the info on GCN 2.0 yet.

It's a strong card for 1080p gaming. Anything higher and you'd want something in the enthusiast range, which is GTX 680 / HD 7970 based chips and above.

The GPU in the PS4 absolutely wipes the floor with Xenos, the current GPU champ for this gen. Xenos was AMD/ATI's first edition of a unified architecture. The PS4 GPU is its 6th edition of a unified architecture (Xenos -> HD 2000/3000 series -> 4000 series -> 5000 -> 6000 -> GCN).

That's a lot of time to change shit up and apply what you learn to the next iteration. GCN is very efficient and very good at GPU compute, even better than Kepler, which had its compute performance gimped on all but the Titan series.

Because it has been so long, and devs are still developing shit based on tech from 2004-05, we easily forget how many advancements have been made since then; we have nothing to show for them because devs can't fully utilize the beasts we already have. It's like when BF3 was getting ready to come out and everyone was saying we'd need the next Nvidia/AMD chips just to run it, and it turned out what we had at the time ran the game perfectly fine. We forgot that the 2010 GPUs were beasts compared to the crap we have in current-gen consoles.

The PS4 GPU is well powered. It's not going to be at the level of a 680/7900-based chip, but it's strong enough for 1080p gaming, and in a closed box it will push the bar up considerably from the low bar devs currently work to because of DX9-level 2004 junk.

Appreciate the post, and I hope you're right too... but could someone explain to me what GCN is? I keep reading that, but all I can think of is GameCube.
 
Is anyone surprised that a PC that is 2-5 times the cost, size, and wattage of a console can kick said console's ass on a bare benchmark level?

It may be interesting, in a boring nonsensical way.

The specific comparison being made here is between the power advantage of PC GPUs at the start of last generation and the power advantage of PC GPUs this generation, and whether that advantage is significantly bigger this time. It's not simply stating that top-end PCs are more powerful than less expensive consoles. It's certainly something to discuss because it can give an idea of how things will play out over the next few years between the two platforms.
 
With all these comments and comparisons I don't even know what to believe. Will I be able to run PS4 ports on my GTX 580 / i7-2600K at the same or higher settings? I was thinking I could, and if not, why?

I don't think so, tbh. Not because of the power, but probably because of the VRAM limitation.
 

artist

Banned
My point still stands. People like you love to twist facts to suit your agenda, but the tangible facts say otherwise, and the meltdowns are entertaining. I'm more upset by this situation than anything else; I really wanted next-gen consoles to top high-end PCs.

That said, it does not change the fact that this hardware will produce better results than what is found in the PC space.
This is what excites me the most. :)

As I said, I'm eager to see where the GTX 770/780 and the new 89XX fit into all of this, because that's going to be a major talking point for PC folks given how demanding next-gen multiplats are likely to be.
So you wanted a Titan+ in PS4/Durango? Gotcha.

I don't have any; I'm a console and PC gamer. I like consoles; they introduced me to gaming, after all.
I don't have a horse in the race; I'm here for the fun. :D I'm already sure I will buy a PS4/720 down the line, once enough exclusives have been released to make the purchase worth my while.


I don't feel uncomfortable with the idea of calling people fanboys if their ultimate goal is just to berate a platform, which of course is not my case. I'm too reasonable for that.
Really? No one is berating the PC as underpowered; the only person who got defensive and started throwing insults in this thread is... YOU.

If I might ask, where did you get all that technical info about the fill rate, etc., for the PS4's GPU?
Google, various sources.
 

artist

Banned
Still, the FLOPS metric clearly shows that while this generation's consoles got GPUs that were on the same level as, or even faster in FLOPS than, the best PC GPUs of the time, next generation this isn't so: the PS4 gets a GPU with half the FLOPS of even Tahiti.

You may do whatever perf/watt comparisons you want, but the 680/7970 cost exactly the same now as the X1800/7800 GTX did back then (actually, their prices back then were even higher than the 7970 and 680 cost now), and that's the only metric that matters when you compare two products at retail. Current PC GPUs draw much more power than the R520/G70 did, sure, but who cares? It's the results per dollar - the FLOPS, the fps, resolutions, shader complexity - that matter.

As for the Titan not being 2.5x as fast as Pitcairn (which is actually faster than the PS4 GPU), have a look at this, for example:

[benchmark chart: 53362.png]


I'm not saying that this is an indication of how they compare going just from their FLOPS numbers, but it proves that Titan can indeed be 2.5x faster than Pitcairn/7870. And that's something that simply wasn't the case between G70 and RSX, or R520 and Xenos.
Sure, the Titan is faster at higher resolutions and with higher IQ.

Overall, the 2.5x factor won't hold up; http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan/27.html
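Extending the same shaders x 2 x clock back-of-envelope from earlier (reference clocks, no boost; my own sketch, not anything from the linked review) to the cards being argued about here: on paper a Titan is well under 2.5x a 7870, which is why the gap only opens up at the resolutions and IQ settings where bandwidth and VRAM start to dominate.

```python
# Rough theoretical FP32 peaks for the cards under discussion (reference clocks, no boost).
def tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000

hd7870  = tflops(1280, 1.000)  # ~2.56 TFLOPS (Pitcairn)
hd7970  = tflops(2048, 0.925)  # ~3.79 TFLOPS (Tahiti)
titan   = tflops(2688, 0.837)  # ~4.50 TFLOPS (GK110, base clock)
ps4_gpu = tflops(1152, 0.800)  # ~1.84 TFLOPS

print(f"Titan vs 7870: {titan / hd7870:.2f}x on paper")    # ~1.76x
print(f"7970 vs PS4:   {hd7970 / ps4_gpu:.2f}x on paper")  # ~2.06x, the 'half the FLOPS' claim
```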
 

2San

Member
So what is getting compared exactly? Xenos getting compared to GPUs? The PS4 compared to GPUs? This is pretty worthless. What GPUs are you comparing them to?

Since Tahiti has the same amount of VRAM as Kepler afaik, I have no idea what's going on in that graph.
 
So what is getting compared exactly? Xenos getting compared to GPUs? The PS4 compared to GPUs? This is pretty worthless. What GPUs are you comparing them to?

I think it's supposed to represent the flagship/most powerful. So the 7800 GTX 512 for G70 in the first graph, Titan for Kepler, and the 7970 GE for Tahiti.
 

Eideka

Banned
So you wanted a Titan+ in PS4/Durango? Gotcha.
No, that's unrealistic. But I will confess I was expecting a GPU equivalent to a GTX 680.
Is that too much to ask? Given that next-gen stuff was demoed using this GPU, I deemed that realistic; I was wrong (and StevieP was right).

Really? No one is berating the PC as underpowered; the only person who got defensive and started throwing insults in this thread is... YOU.
You seem strangely defensive, but that's okay.
You are free to accept my apologies for having unwillingly offended you. I never implied you were anti-PC or anything, just that the classic frame of mind that the next-gen consoles will completely obliterate high-end PCs has been proven wrong...

And that's kind of sad to be honest.
 

artist

Banned
Since the Tahiti has the same amount of VRAM as the Kepler I have no idea what's going on in that graph.
Uhh ...
And that is? The HD7970 has 6GB VRAM so it isn't that.
Uhh .. no. 3GB.

No, that's unrealistic. But I will confess I was expecting a GPU equivalent to a GTX 680.
Is that too much to ask?
Yes, it is.

You seem strangely defensive, but that's okay.
You are free to accept my apologies for having unwillingly offended you.
How? Am I acting upset, calling people fanboys? No need for an apology, as long as you contribute something of value to the discussion.
 
I know we don't know the cost of the GPU part of the PS4's SoC, but it would be interesting to see a performance per £ or $ graph.
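A minimal sketch of what such a graph could plot, using the theoretical FP32 peaks from the back-of-envelope above and the rough street prices quoted elsewhere in this thread (roughly $250 for a 7950, $300 for a 7970, $450 for a 680; all assumptions). The console is left out precisely because, as noted, we don't know what the GPU slice of the APU costs.

```python
# Rough GFLOPS-per-dollar using theoretical FP32 peaks and the street prices
# mentioned in this thread. Illustrative only: ignores frame-time quality,
# bandwidth, drivers, and the fact that console silicon isn't priced like a retail card.

cards = {
    # name: (shader ALUs, clock in GHz, approx street price in USD)
    "HD 7950": (1792, 0.800, 250),
    "HD 7970": (2048, 0.925, 300),
    "GTX 680": (1536, 1.006, 450),
}

for name, (shaders, clock, price) in cards.items():
    gflops = shaders * 2 * clock
    print(f"{name}: {gflops:.0f} GFLOPS, {gflops / price:.1f} GFLOPS per dollar")
```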
 

spwolf

Member
And here I thought PC gamers should be very happy with the PS4 so far... for PC gamers, the PS4 is far more important than Titan.
 

Portugeezer

Gold Member
Too bad, but that's not necessary to produce wonderful graphics anyway.
And most importantly, we can kiss 720p goodbye this generation. :)

1080p for the win.

Come on, 680 is like $400-500, probably the price of the whole PS4.

Edit: More like goodbye to sub-720p.
 

Eideka

Banned
Come on, 680 is like $400-500, probably the price of the whole PS4.

Edit: More like goodbye to sub-720p.

I was under the delusion that Sony could come up with a GPU at least as powerful as that. I'm not talking about off-the-shelf parts; that's not how you build a console.
A highly customized GPU equal or slightly superior in raw power to the GTX 680? I could believe that, and I was sorely disappointed when the specs leaked.

In the end, it won't matter much once Santa Monica/Naughty Dog reveal what they have in store for next-gen.

Edit: More like goodbye to sub-720p
Maybe at the very end of the gen, when devs try to push those machines too hard... But for the time being, KZ Shadow Fall and Battlefield 4 will run at 1920x1080 on consoles.
I hope everyone follows suit.

I wouldn't be so quick to dismiss 720p, or resolutions that fall in between 720p and 1080p.
Hmm... Yeah, that makes sense; some games will be more demanding than others, but it's still nice to see that the IQ will likely be decent. SMAA will replace FXAA, hopefully.
 

StevieP

Banned
If someone could design a GPU with the raw performance of a high-end PC GPU while drawing half or less of its TDP, it would have already been done by the folks currently designing GPUs, and we'd have it in our PCs already.
 

sangreal

Member
Doesn't this demonstrate that, while not a leap, the current generation was right up there with the top-of-the-line GPUs? Xenos loses out in fillrate and bandwidth, but that's discounting the EDRAM, which was put in there for precisely that reason. Here is what Microsoft said about the EDRAM in their alpha -> final silicon comparison*:
The increase in bandwidth for fill due to EDRAM will speed up bandwidth-bound operations significantly.

While the R420 technically can do twice the number of quads per cycle, it quickly becomes frame-buffer bandwidth-bound in even the simplest case. For example, a simple color write with Z-testing requires 300% of the available frame-buffer bandwidth on the alpha GPU. The final GPU never becomes frame-buffer bandwidth-bound.

We expect that the available bandwidth due to EDRAM will significantly increase realizable fill speed, especially when Z-testing, alpha-blending, and MSAA are used.

Compare that to the PS4, which is behind to varying degrees in every category other than memory capacity (and Durango fares even worse). I understand why the concessions were made (the TDP issues you mentioned), but that doesn't change the overall power comparison. It just shows why it's underpowered compared to top-end PC GPUs.


*Of course, I have no idea how this actually panned out. It's also comparing to the R420, not the R520.
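To put a rough number on why the EDRAM mattered for fill, here is a sketch (the overdraw and MSAA assumptions are mine, not Microsoft's) of the bandwidth a plain colour write with Z-testing consumes at 720p, versus the 360's 22.4 GB/s of shared GDDR3. The point is not the exact figure but that keeping this traffic inside the EDRAM frees the main bus for textures and the CPU.

```python
# Rough fill-bandwidth estimate for colour write + Z test/write at 720p.
# Assumptions (mine): 4x MSAA, 4 layers of overdraw, 60 fps,
# 4 B colour write + 4 B Z read + 4 B Z write per sample, no alpha blending.

samples = 1280 * 720 * 4        # samples per full-screen layer at 4x MSAA
bytes_per_sample = 4 + 4 + 4    # colour write, Z read, Z write
overdraw, fps = 4, 60

gb_per_s = samples * bytes_per_sample * overdraw * fps / 1e9
print(f"~{gb_per_s:.1f} GB/s of pure fill traffic")  # ~10.6 GB/s
print("vs. 22.4 GB/s of total shared GDDR3 on the 360: roughly half the bus,")
print("which is the traffic the EDRAM keeps off main memory.")
```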
 

2San

Member
Come on, 680 is like $400-500, probably the price of the whole PS4.
Ignoring the whole frame-time thing, the HD 7970 starts at $300 (albeit the underclocked version). Get the somewhat weaker, but still great, 7950 and you're in the $250 range.

There's still the whole wattage discussion going on though.

I think the PS4 is a nicely designed thing and makes perfect sense; I'm just pointing out that AMD offers good stuff at a nice price point.
 

joesiv

Member
Come on, 680 is like $400-500, probably the price of the whole PS4.
High-end GPUs are sold at a premium because they can be, not because they have to be. The price of high-end cards drops greatly over a short period of time, and on top of that, lower-end cards are often based on the exact same chip, perhaps with lower clocks or with units disabled (in other words, the actual bill of materials for these cards isn't $400-500). The PS4, however, will likely have a bill of materials higher than $400-500; they won't be making a huge profit (or any) at launch. What does it cost Nvidia to actually produce a 680? Probably $100-$150 (my guess).
 

King_Moc

Banned
I was under the delusion that Sony could come up with a GPU at least as powerful as that. I'm not talking about off-the-shelf parts; that's not how you build a console.
A highly customized GPU equal or slightly superior in raw power to the GTX 680? I could believe that, and I was sorely disappointed when the specs leaked.

Power draw is the problem. The 680 under load uses more power than the PS4 probably does in total. It's too hard to keep cool in a console-sized box.
 
It is not surprising that power has diverged so much, though, because there is no limit on energy consumption for high-end desktop GPUs... What GPU in 2005 was consuming anything close to a Titan? I think the energy consumption on those things is obscene... part of the reason why I would never get one.
 

mrklaw

MrArseFace
In terms of sheer absolute performance relative to high-end PCs? This generation of consoles is a far cry from the previous generation. And you're not even comparing to the top end here; comparisons to Titan would be even more absurd. Of course, PCs don't really have much of an upper TDP limit, so the gap is a given.


Perhaps we could compare with whatever the average PC GPU is, perhaps using Steam's hardware survey data for that?
 

Biggzy

Member
Power draw is the problem. The 680 under load uses more power than the PS4 probably does in total. It's too hard to keep cool in a console-sized box.

Both Sony and Microsoft pushed what was feasible to put into a console this gen.
 

Eideka

Banned
Power draw is the problem. The 680 under load uses more power than the PS4 probably does in total. It's too hard to keep cool in a console-sized box.

Yeah, it was obvious that it could not fit in.

Perhaps we could compare with whatever the average PC GPU is, perhaps using Steam's hardware survey data for that?
I have my doubts that the "average" PC gamer's GPU is even close to a GTX 470.

While next-gen consoles can't rival high-end rigs, it's a certainty that they are still far above the average gaming PC in every way, which is why it's a good thing that consoles remain the common denominator.

As a layman, other threads paint a different picture of the PS4 and PCs; lots of defensive posts. Maybe I just need to learn more about the technicalities of FLOPS and such.
In terms of ROPs I think the PS4 GPU is up there with the best, but there is apparently a significant gap in raw power between it and a GTX 680/Radeon 7970.
 
I genuinely expected the PS4 to rival high-end PCs back in 2010... That's nowhere near the case.
As a layman, other threads paint a different picture of the PS4 and PCs; lots of defensive posts. Maybe I just need to learn more about the technicalities of FLOPS and such.
 
The PS4 basically uses a high-end GPU of 2010 (roughly a GTX 570), which is good enough.

Remember, a year after the Xbox 360 launched, PC users got a card that completely crushed the current-gen consoles' GPUs... the 8800 GTX.

In turn, the 8800 GTX is nowhere near the GTX 570 in power.

So yes, there is still a pretty big difference between this gen's GPUs and next gen's.
 

scently

Member
Compared to contemporary high-end PC parts? They look underpowered. Compared to the PS3/360? You are looking at at least 10x in raw power, and probably more when you account for architectural improvements and efficiency.

Developers seem to be more concerned with memory than FLOPS, and they are getting 8 GB on both systems, so I think we will be impressed with next gen, as this gen is holding back a lot on the design and development of titles.
 

acm2000

Member
Why even try to compare a console GPU with a PC GPU?

Closed hardware, near 100% efficiency, coded to the metal, games coded to one specific hardware config

Vs

Open, modular, starved hardware, high overheads, ~50% efficiency, games coded for compatibility with multiple hardware configs
 
Interesting post, but if anything it seems to underscore to me that there are big differences. Also, the graphs don't take into account that the 360 GPU's unified shader architecture was pretty far ahead of its time. Consumer cards didn't adopt it until 2007, which is also when PC GPUs really leaped ahead of console GPUs.

Serious question, though: how much of a difference did the architecture make? That part I don't totally understand.

Another question: how fill-rate limited are we at 1080p?

I'm skeptical here, but I'd love to learn more.
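On the fill-rate question, here is a back-of-envelope using the PS4's commonly cited 32 ROPs at 800 MHz (my own sketch). Pure pixel fill is rarely the limiter at 1080p; bandwidth, blending and MSAA tend to bite first, so this is just the theoretical upper bound on untextured, unblended writes.

```python
# Theoretical upper bound on pixel fill for a 32-ROP, 800 MHz GPU (PS4-class).
# Ignores blending, MSAA and bandwidth limits, which matter long before this.

rops, clock_hz = 32, 800e6
peak_fill = rops * clock_hz          # ~25.6 Gpixels/s

pixels_per_frame = 1920 * 1080
fps = 60
layers = peak_fill / (pixels_per_frame * fps)

print(f"Peak fill: {peak_fill / 1e9:.1f} Gpix/s")
print(f"~{layers:.0f} full-screen 1080p layers per frame at 60 fps, in theory")
```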
 