
vg247-PS4: new kits shipping now, AMD A10 used as base, final version next summer

If most next-gen games come out at native 1080p I don't even have a reason to play PC anymore. The resolution difference was always my biggest problem with consoles. I play all my games 1080p so this would be awesome.
 
- modified PC doesn't mean that A10 is the final CPU/APU in the PS4

- Based on A10 means the potential for lots of customisation, so don't read too much into that specifically

- 256GB suggests SSD, but that's a big SSD and would be too expensive.



APU probably wouldn't have enough oomph for next gen graphics. So you want a dedicated GPU still (hopefully).

The GPU part of the APU can be used for other things, much like CELL was this gen: either PhysX-style stuff, or possibly post-processing/AA etc. Basically offloading work from the CPU.


right, makes a lot of sense! But what if those stacking rumors are true? A stacked CPU+GPU and an APU? Is that good?
 

Raptor

Member
That sounds real good to me. Blu-ray still in, 1080p/60fps up front for games, really good, can't wait to see it.
 
Sounds like the launch price will be $399/€399. Maybe 35k yen in Japan because of the strong yen (like Nintendo will do with the Wii U).
 
A mid-range PC plays current gen games at 1080p and 60fps =)

My "mid range" PC from 3 years ago plays Dishonored (a game that just came out) at 1080p 60FPS with 4xAA and 8x AF

(Phenom 2 3 core unlocked to 4 core, AMD HD5770 graphics card).

So I'd be content with a current mid-level PC as far as graphics capabilities are concerned.
 
I don't even care about 30/60 fps, it's the shit 720p resolution that's the problem. Low-res textures, poor image quality, blurriness, this is all part of the res. 1080p standard is so important for these next-gen consoles, I really hope they come through in that regard. I want to get back into consoles, haven't since PS2, but I can't imagine playing 720p again after glorious 1080p on my monitor.
 

GashPrex

NeoGaf-Gold™ Member
so maybe I'm misunderstanding, but basically we are talking a "super" integrated chip here, with built in GPU? Sounds to me like it might be more affordable right?
 

herod

Member
My "mid range" PC from 3 years ago plays Dishonored (a game that just came out) at 1080p 60FPS with 4xAA and 8x AF

(Phenom 2 3 core unlocked to 4 core, AMD HD5770 graphics card).

So I'd be content with a current mid-level PC as far as graphics capabilities are concerned.

Can expectations for this console get any lower?

Yeah, and 99% of the games are basically 360/PS3 ports with better IQ.

So that's a no then.
 

Pranay

Member
My "mid range" PC from 3 years ago plays Dishonored (a game that just came out) at 1080p 60FPS with 4xAA and 8x AF

(Phenom 2 3 core unlocked to 4 core, AMD HD5770 graphics card).

So I'd be content with a current mid-level PC as far as graphics capabilities are concerned.

What I was saying is that the specs next-gen consoles have will easily play current gen games at 60fps and 1080p.

They are aiming for 60fps and 1080p for next-gen games.
 
x86 cpu

16GB ram in devkits

YES!

Devs know x86, no more weird (but powerful) PowerPC stuff. Awesome.
Look at how powerful the xbox 1 was compared to the competition.

The Pentium 3/Celeron processor in the Xbox 1 actually created more bottlenecks than strengths. The real power of the original Xbox came from the graphics card, which incorporated programmable shaders, and the large amount of RAM.

x86, while well-known and documented, is an extremely bloated instruction set, much of which is unnecessary for gaming related functions. Jack of all trades, master of none sort of thing. Still, having the CPU and GPU on the same chip from the get-go would allow for an incredible amount of bandwidth between the two.
 

McHuj

Member
1080p/60fps is utterly meaningless because it's important to know what is being displayed at 1080p/60fps: Uncharted or Mahjong.
 

mrklaw

MrArseFace
I don't even care about 30/60 fps, it's the shit 720p resolution that's the problem. Low-res textures, poor image quality, blurriness, this is all part of the res. 1080p standard is so important for these next-gen consoles, I really hope they come through in that regard. I want to get back into consoles, haven't since PS2, but I can't imagine playing 720p again after glorious 1080p on my monitor.

1080p is no good without suitable textures. I'd rather have 720p with high res textures, than 1080p with low res ones.
 
So that's a no then.
While it's not near 99% (he was exaggerating to make a point), the majority of the best and AAA games are indeed console up-ports. There are some great exclusives like D3, the entire moba genre, MMOs like GW2, but the majority of the AAA games are ports. I love my PC but this is a fact, and it's also exactly why we should all be hoping PS4 and the next Xbox are very powerful.
 

Xdrive05

Member
So a quad core AMD CPU?

That bodes well for current mainstream gaming PCs getting good next-gen ports. Now watch Durango be some 8-core, 16-threaded Atom-esque CPU to fuck that up.
 
Dev kits tend to have more RAM than the retail versions. I doubt we'll see more than 4GB of RAM on the PS4

If the dev kits are 8GB, then 4GB on console would make sense... There would be no reason, however, for a dev kit to have 4x the RAM of the final console UNLESS Sony still hasn't locked down the amount of RAM available for the final console... but if that's the case, it still seems odd to possibly be offering way more RAM than they might have available.
 

Eideka

Banned
A bit, yeah, but nothing groundbreaking, and definitely nothing that I would call next gen.

Not groundbreaking for you perhaps. It's not next-gen but it's far more impressive than console graphics.

It's up to you to deny that fact.
But as a console fanboy you probably don't know what PC graphics look like. ;)
 

Talamius

Member
1080p60 in 3D is not happening on an A10 by itself. An A10 by itself will struggle in 1080p period.

If there's another GPU involved it will make a ton more sense.

Edit: And yes, I know it's not final specs.
 

gofreak

GAF's Bob Woodward
so maybe I'm misunderstanding, but basically we are talking a "super" integrated chip here, with built in GPU? Sounds to me like it might be more affordable right?

Basically and yes.

APU is CPU + GPU on one die.

APUs available today tend to have small GPU resources though. If PS4 is to be in line with the prior rumour of 18 AMD Compute Units, it will be quite a bit larger than a vanilla APU (which sport 4-6 IIRC).


1080p60 in 3D is not happening on an A10 by itself. An A10 by itself will struggle in 1080p period.

If there's another GPU involved it will make a ton more sense.

Edit: And yes, I know it's not final specs.

It's a chipset 'derived from' A10 base. Compute Unit config will vary from the A10 SKUs AMD is currently marketing I'm sure...
 
8Gb = 1GB

There are 8 bits in one byte.

If these specs are true, this is around WiiU level.

That's not what it says... *rereads* Hmm... they did very clearly write Gb and not GB... that does APPEAR to imply bits, but I'd still take that with a grain of salt just because it's such a common misconception.
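To make the Gb vs GB arithmetic above concrete, here's a quick sketch (plain Python; the function name is just illustrative). Since there are 8 bits in a byte, a capacity quoted in gigabits divides by 8 to give gigabytes:

```python
# 8 bits = 1 byte, so gigabits (Gb) -> gigabytes (GB) is a divide-by-8.
def gigabits_to_gigabytes(gigabits: float) -> float:
    """Convert a capacity quoted in gigabits to gigabytes."""
    return gigabits / 8

print(gigabits_to_gigabytes(8))    # 8 Gb  -> 1.0 GB
print(gigabits_to_gigabytes(128))  # 128 Gb -> 16.0 GB
```

So if the rumour really did mean 8 Gb, that's only 1 GB of memory; the 16GB devkit figure elsewhere in the thread would be 128 Gb in the same notation, which is why the lowercase "b" is worth a grain of salt.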
 
1080p is no good without suitable textures. I'd rather have 720p with high res textures, than 1080p with low res ones.

How can textures be good at 720p? Even the most detailed ones will still just be 720p and blurry and garbage unless you're sitting 10 feet away, which I guess a lot of people do if they play on a TV, but still.

The other thing is that devs create assets at much higher res than 1080p so they'll automatically be better.
 

Sandfox

Member
If this costs more than $400 (knowing Sony, they are going to charge us a premium for their products) I'm not buying it day one. The chance of it not having BC is annoying because it means I have to keep my PS3 hooked up if I want to play PS3/PSN games while I'm waiting for the PS4 lineup to grow, but it's manageable.
 

tapedeck

Do I win a prize for talking about my penis on the Internet???
Did nobody tell Sony that only 0.000002% of the world use Ethernet?!
Well, I'm part of that percentage, and I think a lot of people who play online prefer a wired connection over wifi if at all possible.
 

1-D_FTW

Member
I would assume it has a dedicated GPU if the goal of 1080P/60fps/3D bit is even remotely achievable.

If not, unimpressed. But considering it's going the always on Wii/WiiU route, it's a nice CPU hybrid system that'd be low power under non-gaming conditions. Which makes it seem like a thoughtful approach. Low power unless you need the oomph.
 
Strange to put 3D in there. Didn't Sony recently say that there is "no interest" in 3D from the consumer market?

I imagine that they have to keep it in there since they sell 3D TV's and all, but that particular comment in Enslaved was pretty apt. So I would imagine it not to have any particular focus, unless they want to do something different with it.
 