
WSJ: Sony Plans New PlayStation for Graphics-Heavy Games

Benefits due to a closed-box design exist, I won't deny that, but they don't result in a 60% to 100% increase in output.

The benefits of a console design aren't that large.

Just because one was correct in the past doesn't mean that I need to swallow everything (S)he says as the truth.

You don't have to swallow everything he/she says, but as a developer their credibility is better than mine or yours. But again, I just found your post a bit extreme, as what he/she said wasn't too crazy all things considered. It just seemed a tad too reactionary, that's all. But I can appreciate exercising caution; I can respect that.
 
PC charts shouldn't be compared with consoles; the FPS you get from a PC game doesn't mean you get the same result on a console.

This gen? It's close enough, all things considered. Unlike the old days.

EdLin said:
I think his figures are higher than they should be, but Carmack did say that console performance for the same hardware is "twice as fast as PC" a while back....

With all due respect to Carmack, that's taken entirely out of context. That's not the case.
 

EdLin

Neo Member
+20 fps - meaning basically doubling its performance just by putting one of those GPUs in a PS4 - is utter nonsense.

I think his figures are higher than they should be, but Carmack did say that console performance for the same hardware is "twice as fast as PC" a while back....
 

Caayn

Member
You're doubting a verified insider who is most likely a dev? What credibility do you have?
Common sense and the laws of physics.
He meant +20 FPS on the chart! Not sure what the 80%-100% increase is in relation to?
A 60% to 100% increase in output, aka framerate. +20fps on a card that does roughly 20fps is a 100% increase.
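
For anyone following the arithmetic, here is a quick sketch of that framing (the ~20fps baseline is just the figure from the chart under discussion, not a measurement):

```python
# Relative gain as a percentage of the baseline; the 20 fps figure is just
# the chart number being argued about, not a benchmark.
def percent_increase(baseline_fps, gain_fps):
    return 100.0 * gain_fps / baseline_fps

print(percent_increase(20, 20))  # +20 fps on a ~20 fps card -> 100.0 (%)
print(percent_increase(30, 20))  # the same +20 fps on a 30 fps baseline -> ~66.7 (%)
```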
You don't have to swallow everything he/she says, but as a developer their credibility is better than mine or yours. But again, I just found your post a bit extreme, as what he/she said wasn't too crazy all things considered. It just seemed a tad too reactionary, that's all. But I can appreciate exercising caution; I can respect that.
My apologies if I came across as rude, it's not meant that way.
 
The console 4K version just won't run with "very high" settings, in the same way they aren't running with max details when running at 1080p.
 
+20 fps - meaning basically doubling its performance just by putting one of those GPUs in a PS4 - is utter nonsense.

Anyone willing to call it "utter nonsense" against a verified insider should at least show some credentials: be willing to be verified by a mod and show you're actually in the industry.

I think his figures are higher than they should be, but Carmack did say that console performance for the same hardware is "twice as fast as PC" a while back....

At least Carmack actually has the resume to get specific about the subject.

Common sense and the laws of physics.

Everyone would take the "common sense and the laws of physics" from someone who is verified to be in the industry over someone who isn't.
 

EdLin

Neo Member
The console 4K version just won't run with "very high" settings, in the same way they aren't running with max details when running at 1080p.

You're onto something there. I bet one could hit 4K 30fps or so with the same hardware that's getting 20fps in that chart if it's at "console settings".
 

EdLin

Neo Member
Anyone willing to call it "utter nonsense" against a verified insider should at least show some credentials: be willing to be verified by a mod and show you're actually in the industry.



At least Carmack actually has the resume to get specific about the subject.

True. It should be noted, though, that Carmack was comparing the Xbox 360 (and probably PS3) against similar PC hardware. PS4/XB1 and PC are more of an apples-to-apples comparison than that.
 

televator

Member
On one hand, this is kinda cool. Sony keeping up with new tech to implement in the console space.

On the other hand, this (along with 3DS) teaches me to never again buy a new console within the first 3 years of its life span.
 

Lister

Banned
Anyone willing to call it "utter nonsense" against a verified insider should at least show some credentials: be willing to be verified by a mod and show you're actually in the industry.



At least Carmack actually has the resume to get specific about the subject.



Everyone would take the "common sense and the laws of physics" from someone who is verified to be in the industry over someone who isn't.

Facts and the laws of physics don't change because of who's talking.

He's wrong. I don't know who he is, but unless he happens to be a software engineer with intricate knowledge of the 3D rendering pipeline and with experience on both *current* PC and console hardware (and not say, a writer, or a producer, or someone writing scripts for Unity), then I'll continue to dismiss the preposterous idea that somehow a PS4 would be 100% more efficient in terms of GPU use than a PC with a modern API.
 

THE:MILKMAN

Member
Common sense and the laws of physics. A 60% to 100% increase in output, aka framerate. +20fps on a card that does roughly 20fps is a 100% increase.

Granted, I messed that right up, sorry. But the gist of what Zoetis said is true to a large extent when you factor in the very high settings, as PdotMichael pointed out, and that I doubt any optimization was done for that res at all. Pure brute force.
 

Lister

Banned
True. It should be noted, though, that Carmack was comparing the Xbox 360 (and probably PS3) against similar PC hardware. PS4/XB1 and PC are more of an apples-to-apples comparison than that.

More specifically, he was comparing consoles' very lightweight impact on the CPU vs a PC utilizing DX9.

That's just not relevant in a DX11.3 world, never mind a DX12/Vulkan PC world. And he never intended it to be.
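
To make the draw-call point concrete, a toy model of where the old "console advantage" came from (the per-call costs below are illustrative guesses, not benchmarks):

```python
# Toy model: CPU time spent purely on draw-call submission per frame.
# The per-call overheads are illustrative guesses, not measured numbers.
def cpu_submit_time_ms(draw_calls, per_call_us):
    return draw_calls * per_call_us / 1000.0

scene_draw_calls = 3000  # hypothetical scene
print(cpu_submit_time_ms(scene_draw_calls, per_call_us=20.0))  # thick DX9-era driver path: 60.0 ms -> hopelessly CPU-bound
print(cpu_submit_time_ms(scene_draw_calls, per_call_us=2.0))   # thin DX12/Vulkan-style path: 6.0 ms -> no longer the bottleneck
```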
 
Anyone willing to call it "utter nonsense" against a verified insider should at least show some credentials: be willing to be verified by a mod and show you're actually in the industry.



At least Carmack actually has the resume to get specific about the subject.



Everyone would take the "common sense and the laws of physics" from someone who is verified to be in the industry over someone who isn't.

Carmack's quote was about CPU draw calls with DX9.

We have mountains of evidence from this gen and last showing that console GPUs don't benefit that much from a closed system. Also, GPU FLOPS are an estimate of power based on 100% efficiency. Unless you are suggesting that the GPU in a PS4K would be 100% efficient and Titans/980 Tis are running at 50% efficiency, the +20fps comment is utter nonsense.
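
For reference, the TFLOP numbers people quote are theoretical peaks (shader count x clock x 2 FLOPs per FMA); no game on any platform sustains 100% of that. A quick sketch using the commonly cited PS4 specs:

```python
# Theoretical single-precision peak in TFLOPS: cores * clock (GHz) * 2 FLOPs per FMA.
def peak_tflops(shader_cores, clock_ghz):
    return shader_cores * clock_ghz * 2 / 1000.0

print(peak_tflops(1152, 0.8))  # PS4 GPU (commonly cited 1152 shaders @ 800MHz) -> ~1.84 TFLOPS
# The "+20 fps on the same card" claim amounts to assuming one box sustains
# roughly twice the useful work per theoretical FLOP of the other.
```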
 

joecanada

Member
On one hand, this is kinda cool. Sony keeping up with new tech to implement in the console space.

On the other hand, this (along with 3DS) teaches me to never again buy a new console within the first 3 years of its life span.


Why not? I enjoyed the hell out of my PS4 for 2 years. If I upgrade next year, that will be three years of solid enjoyment; even if I got 100 bucks for my PS4 and bought a new one at $499, it would be worth it. That's still only 100 bucks a year on hardware. Games are where I wait to save the money.
 

bitbydeath

Gold Member
+20 fps - meaning basically doubling its performance just by putting one of those GPUs in a PS4 - is utter nonsense.

Not really, just look at the RAM differences.
PS3 with its split 512MB of RAM gave us this -

[The Last of Us screenshot (PS3)]

No way a PC could achieve that on 512MB.
Likely a minimum of 2GB would be required.

PS2 gets even sillier, being able to achieve this -

[God of War 2 screenshot]

on a measly 32MB of RAM.

The power differences are real.
 
Not really, just look at the RAM differences.
PS3 with its split 512MB of RAM gave us this -

[The Last of Us screenshot (PS3)]

No way a PC could achieve that on 512MB.
Likely a minimum of 2GB would be required.

PS2 gets even sillier, being able to achieve this -

[God of War 2 screenshot]

on a measly 32MB of RAM.

The power differences are real.

RAM has nothing to do with GPU processing power, especially in the context of the 20fps comment.
 
Not really, just look at the RAM differences.
PS3 with its split 512MB of RAM gave us this -

[The Last of Us screenshot (PS3)]

No way a PC could achieve that on 512MB.
Likely a minimum of 2GB would be required.

PS2 gets even sillier, being able to achieve this -

[God of War 2 screenshot]

on a measly 32MB of RAM.

The power differences are real.

Thanks for the lovely bullshots of consoles whose silicon was actually quite different from the PCs of the time (and which were, at the time, being compared against their contemporary DX9 equivalents).
 

Lister

Banned
Not really, just look at the RAM differences.
PS3 with its split 512MB of RAM gave us this -

[The Last of Us screenshot (PS3)]

No way a PC could achieve that on 512MB.
Likely a minimum of 2GB would be required.

PS2 gets even sillier, being able to achieve this -

[God of War 2 screenshot]

on a measly 32MB of RAM.

The power differences are real.

Another person who doesn't know what he's talking about. Fantastic.

First of all, we're talking about GPU rendering performance, not GPU buffer. Secondly, a PC with a 512MB GPU could absolutely do the same thing. But PC and console use system and GPU buffers differently, so you can't really compare the two, except at the GPU buffer level.

Love the bullshots, by the way. Didn't know TLOU was using 16x AF and what appears to be a fully modern MSAA + post-processing anti-aliasing with a temporal filter? Nice.


This is the actual TLOU:

[actual in-game The Last of Us screenshot]


Don't get me wrong, it's amazing what money and time and talent could do on that hardware, but none of that is out of reach of, say, an 8800GT (though you would need a higher-end CPU, especially if you're limited to DX9).
 

televator

Member
Why not? I enjoyed the hell out of my PS4 for 2 years. If I upgrade next year, that will be three years of solid enjoyment; even if I got 100 bucks for my PS4 and bought a new one at $499, it would be worth it. That's still only 100 bucks a year on hardware. Games are where I wait to save the money.

Because I'm poor?
 

bitbydeath

Gold Member
RAM has nothing to do with GPU processing power, especially in the context of the 20fps comment.

I know. GPU and CPU come into play with those examples as well but are harder to compare as they aren't straight 1:1 figures given how customised the hardware was.
 
Another person who doesn't know what he's talking about. Fantastic.

First of all, we're talking about GPU rendering performance, not GPU buffer. Secondly, a PC with a 512MB GPU could absolutely do the same thing. But PC and console use system and GPU buffers differently, so you can't really compare the two, except at the GPU buffer level.

I ran BioShock Infinite on my 8800GT and even downclocked it by 50% to match the Xbox 360's GFLOPS. It had 512MB and matched 360 performance exactly.

I know. GPU and CPU come into play with those examples as well but are harder to compare as they aren't straight 1:1 figures given how customised the hardware was.

Then why bring it up? We are currently dealing with custom APUs in the consoles that use off-the-shelf PC tech. The unified memory does offer some increase in efficiency, just not 100% or even 50%.
 

Lister

Banned
OK... let's get back on track.

What is the minimum upgrade requirement to achieve reasonable 4K game performance for PS4K?

If you mean games with current PS4 graphics settings, I'd say 980 levels will net you 30 FPS at 4K, more or less. That card is way out of reach for a $400 console though, since you'd probably also want to increase CPU power.

But current PS4 levels aren't what you'd want at 4K, probably. You'd want something closer to PC high settings at 4K, or things like crappy shadows and poor LOD are going to be even MORE obvious.

So it's more like a GTX 980 Ti+.

NOT feasible for a console, in terms of TDP or in terms of price.
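
The raw pixel math behind that estimate (resolution scaling only, ignoring settings, CPU and bandwidth): 4K is exactly four times 1080p.

```python
# Pure resolution scaling; ignores settings, CPU limits and memory bandwidth.
pixels_1080p = 1920 * 1080            # 2,073,600
pixels_4k    = 3840 * 2160            # 8,294,400
print(pixels_4k / pixels_1080p)       # 4.0
print(30 * pixels_1080p / pixels_4k)  # brute-forcing a 1080p/30fps workload to 4K -> 7.5 fps, all else equal
```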
 

EdLin

Neo Member
OK... let's get back on track.

What is the minimum upgrade requirement to achieve reasonable 4K game performance for PS4K?

For a PC, with some AAA games, 30fps at console settings (with AA also dialed back, because you don't need it as much at 4K)?

A $550-$650 card like a 980 Ti or R9 Fury.

For a PS4? Well, something similar; it certainly wouldn't be something cheap enough for a $399 console, even if it only needs to be a tier or two lower in graphics. ($300-$400 GPU cards, not counting the cost of the CPU, RAM, motherboard, etc.)
 

AmFreak

Member
Anyone willing to call it "utter nonsense" against a verified insider should at least show some credentials: be willing to be verified by a mod and show you're actually in the industry.



At least Carmack actually has the resume to get specific about the subject.



Everyone would take the "common sense and the laws of physics" from someone who is verified to be in the industry over someone who isn't.

No credentials needed when everyone can see this.
If it were true that putting GPU XY in a console doubles its performance, where are all the games hiding that show this?
I don't know of a single one proving that point.
In fact, most games of the last 2 years have shown comparable performance on comparable GPUs.
 

DeepEnigma

Gold Member
What size tv?

49"

Just went with the IPS 49XBR830c. $650 for it was hard to pass up. Especially since I am used to IPS panels for PC monitors.

I am glad I got the 49/50. I was thinking 55" in store, since next to all the screens it is hard to gauge size in your home. It is perfect for where I have it, and I sit about 6 feet away, so it looks huge to me (I was used to a 27" monitor for a while before deciding to get a new TV again. Traveled a lot with work.)

Well that's badass. Congrats on the buy!

Thanks, lol. I actually went in, and was like, fuck it, I am just going to get the 1080p W800c for $700, and use my double points coupon that expires in 2 days. I needed a new set in time for MLB, R&C, and UC4.

That was when the salesperson saw me staring at the X830c and was like, "we just got one 'open box', never even taken out of the box. Just screws that were sitting on top in a bag. We can let you have it with the $100 sale, plus an additional $230 off, and it's a 120Hz 4K set."

I loved the tight pixel density, read good things about the X1 scaler, and played the PS4 (Ratchet and Clank demo) on it, and the scaling was just as sharp and native-looking as 1080p, no softness at all, and even better pixel density (tinier stair-stepping).

So I thought about this thread, and was like, "fuck it, let's do it. 4K ready, got great reviews for the PC hookups as well, and I am paying less for both the TV and warranty than the 1080p one? And an XBR?" Easy decision, lol.
 

RoboPlato

I'd be in the dick
The console 4K version just won't run with "very high" settings, in the same way they aren't running with max details when running at 1080p.

Yeah, this is something that keeps getting forgotten in these discussions. The more we hear about this machine, the less I think it's going to have tangible benefits for non-VR games outside of resolution. Maybe just improved texture filtering and shadows, since those things can stand out more at higher resolutions.

I'm also still not expecting native, just high quality reprojection/reconstruction.
 

mrklaw

MrArseFace
On one hand, this is kinda cool. Sony keeping up with new tech to implement in the console space.

On the other hand, this (along with 3DS) teaches me to never again buy a new console within the first 3 years of its life span.

Why? Because a new one will be out three years later? But then you'd never buy anything because something new is always out three years later (more likely even sooner than that)

Surely the important thing is - do you get guaranteed support for console X for 5-6 years like a normal console lifetime? If you do, then your purchase isn't impacted unless you feel insulted by the idea that others might be playing games at a higher resolution than you.
 
I haven't done any extensive comparisons, but Tomb Raider runs better on the PS4 than on my PC with a Radeon 7870, and that's supposed to be a comparable GPU.

Tomb Raider on PS4 had a newer version of TressFX, so you can't really compare it. Without TressFX I ran it on a GTX 460 at 1080p, and that's well below an Xbox One even.
In every DF Face-Off they compare the 750 Ti; it's about a 1.3 TF card and it matches the PS4 in most cases.
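
A rough sanity check on that comparison, using the 1.3 TF figure from the post above and the commonly cited 1.84 TF PS4 peak: the paper gap is about 40%, which is roughly what the Face-Offs show, not the 100% a "+20 fps on the same silicon" claim would require.

```python
# Paper-spec ratio only; says nothing about drivers, settings or bandwidth.
gtx_750_ti_tflops = 1.3    # figure quoted in the post above
ps4_gpu_tflops    = 1.84   # commonly cited PS4 GPU peak
print(ps4_gpu_tflops / gtx_750_ti_tflops)  # ~1.42 -> about a 40% paper advantage
```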
 
RAM has nothing to do with GPU processing power, especially in the context of the 20fps comment.

Or, a more reasonable answer: the reason people sometimes talk about "balanced" systems as the key to consoles is that you can't have one component that is significantly more powerful than the others. You can add 32GB of RAM to a PC, but if your GPU and CPU are shit, then that RAM is only good enough for audio and the OS.

When consoles are being made, the key is making sure everything is balanced. You can't slap 8GB of RAM in a PS3 and all of a sudden you're doing PS4-level graphics. You need the GPU and CPU to complement it. Same thing with these PS4Ks: you can't add 16GB of RAM and call it quits. You need to make sure the processor and graphics card get a nice little bump as well to utilize those extra 8GB of RAM, or else you're just going to see them dedicate that extra RAM to the OS.
 

THE:MILKMAN

Member
It won't be that every title is 4K native anyway. They'll upscale to 4K from the highest stable resolution.

I don't get it: if it is upscaling, then why the talk of more power? It makes zero damn sense.

This just sounds like a worse version of Xbox One's famous exec quote claiming all games output 1080P.
 
Or, a more reasonable answer: the reason people sometimes talk about "balanced" systems as the key to consoles is that you can't have one component that is significantly more powerful than the others. You can add 32GB of RAM to a PC, but if your GPU and CPU are shit, then that RAM is only good enough for audio and the OS.

When consoles are being made, the key is making sure everything is balanced. You can't slap 8GB of RAM in a PS3 and all of a sudden you're doing PS4-level graphics. You need the GPU and CPU to complement it. Same thing with these PS4Ks: you can't add 16GB of RAM and call it quits. You need to make sure the processor and graphics card get a nice little bump as well to utilize those extra 8GB of RAM, or else you're just going to see them dedicate that extra RAM to the OS.

Trust me, I know. I just upgraded from an i5 2400 (3.1GHz)/8GB DDR3 (1300MHz) to a 6600K (4.4GHz)/16GB DDR (3000MHz). I think I increased my minimum frame rate here and there, but my 7870 XT still can't get me a locked 60fps in current games. Still waiting on more concrete news on Pascal/Polaris.

I don't get it: if it is upscaling, then why the talk of more power? It makes zero damn sense.

This just sounds like a worse version of Xbox One's famous exec quote claiming all games output 1080P.

It really depends on the power increase, but a decent overclock might allow them to up the details so that an upscaled 1080p image will still look better than the PS4's 1080p output. They could run at an in-between res like 1440p and upscale that. Don't know if you saw it, but NXgamer did a video about how Quantum Break uses four 720p images to upscale to a 1080p output. He also talked about Killzone Shadow Fall's interlaced method. They might not have a 4K internal render, but they might be able to output a decent 4K image in the end.
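
For context on why reconstruction is attractive, the sample counts work out like this (raw resolutions only; the actual Quantum Break and Killzone techniques also rely on motion vectors and history rejection):

```python
# Sample-count arithmetic only; real temporal reconstruction is more involved.
def pixels(width, height):
    return width * height

print(4 * pixels(1280, 720) / pixels(1920, 1080))  # four 720p frames vs one 1080p frame -> ~1.78x the samples
print(pixels(3840, 2160) / pixels(1920, 1080))     # native 4K vs 1080p -> 4.0x the pixels per frame
print(pixels(1920, 2160) / pixels(3840, 2160))     # rendering half the columns each frame -> 0.5x the native 4K cost
```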
 

Hubble

Member
People are being ridiculous. This is innovative: a more powerful console that is forward compatible. It is innovation, and it's not often done. A new PS4.5 that can play PS4 games and continue the evolving ecosystem while pushing new graphics boundaries.

Consoles have been stymieing PC graphics for years now, since the 360 era. This will speed that process up and bring better graphics advancements more quickly across all platforms. Please be true!
 

The God

Member
People are being ridiculous. This is innovative: a more powerful console that is forward compatible. It is innovation, and it's not often done. A new PS4.5 that can play PS4 games and continue the evolving ecosystem while pushing new graphics boundaries.

Consoles have been stymieing PC graphics for years now, since the 360 era. This will speed that process up and bring better graphics advancements more quickly across all platforms. Please be true!

Cross-gen games have always been a thing.
 

JazFiction

Neo Member
Without all the techno mumbo jumbo.

There is one thread from the past that this reminds me of.

The PS4 rumour thread, which was cluttered with "8GB GDDR5 is ridiculous. Never gonna happen. Too expensive. Never in a mainstream console." Etc.

I can't wait for E3 if this thing is real.

GAF meltdowns are the best, and they work even better with 6 months of naysayers.
 

televator

Member
Why? Because a new one will be out three years later? But then you'd never buy anything because something new is always out three years later (more likely even sooner than that)

Surely the important thing is - do you get guaranteed support for console X for 5-6 years like a normal console lifetime? If you do, then your purchase isn't impacted unless you feel insulted by the idea that others might be playing games at a higher resolution than you.

I like making console purchases that are a "one and done" deal. It sounds superficial, sure, but I have to consider the resources I have to work with to satisfy my gaming habits. Plus, I have lots of old consoles from previous gens to keep me occupied. So I'm not exactly hurting to play on a new console from day one.
 

Kathian

Banned
People are being ridiculous. This is innovative: a more powerful console that is forward compatible. It is innovation, and it's not often done. A new PS4.5 that can play PS4 games and continue the evolving ecosystem while pushing new graphics boundaries.

Consoles have been stymieing PC graphics for years now, since the 360 era. This will speed that process up and bring better graphics advancements more quickly across all platforms. Please be true!

It's not innovative. It's been done before and failed before.
 

Lister

Banned
Without all the techno mumbo jumbo.

There is one thread from the past that this reminds me of.

The PS4 rumour thread, which was cluttered with "8GB GDDR5 is ridiculous. Never gonna happen. Too expensive. Never in a mainstream console." Etc.

I can't wait for E3 if this thing is real.

GAF meltdowns are the best, and they work even better with 6 months of naysayers.

I remember it very differently, with fanboys saying that the new PS4 would beat a top-of-the-line PC; no, it would be better than anything you could buy for 6 months!

And yet a 750 Ti ended up besting it in a lot of games, matching it in most, and only falling behind to Xbox One levels in a few.

I do recall some people not buying the 8 gigs of GDDR5, mostly because we were hearing from Nvidia that the memory modules were constrained. Probably because Sony was buying them up!
 