
[Eurogamer/DF] Orbis Unmasked: what to expect from the next-gen PlayStation.

slider

Member
It's interesting watching the ongoing discussions, but I have to admit it means very little to me until I see some game footage. Only then will I have the goalposts for the discussion: closed system versus PC, etc.
 

gaming_noob

Member
Going for DDR3 allows them to include more RAM.

I know going for DDR3 means more RAM, but it automatically gives them a bottleneck that forces them to think of ways around it. Wouldn't it be easier to go with 4 GB of GDDR5 instead of 8 GB of DDR3 if both configs sort of balance each other out?
 

Krabardaf

Member
Slightly, and it's for Cell. Isn't the bandwidth and memory amount responsible for why many multiplat games have better framerates or better AA or whatever than they do on PS3?
It often is. But the XDR can fortunately be accessed by the GPU; it's just slower than accessing the VRAM.
 

mrgreen

Banned
It will be a big leap for those who have been exclusively console gaming for the last 5-7 years. For those who have had significant exposure to PC gaming, it's a different story.
That's the thing. I don't even think it is a big leap from the PS3 to that laptop and mobile GPU combo. Certainly it's not nearly a big enough leap for me.

How would the comparable laptop in that article run the Witcher 2, which the 360 ran well?
 

Krabardaf

Member
I know going for DDR3 means more RAM, but it automatically gives them a bottleneck that forces them to think of ways around it. Wouldn't it be easier to go with 4 GB of GDDR5 instead of 8 GB of DDR3 if both configs sort of balance each other out?

They probably have specific needs for more memory that Sony does not.
3 GB reserved for the OS is an indication of that. Plus, I think the 720 will ship with Kinect.

That's the thing. I don't even think it is a big leap from the PS3 to that laptop and mobile GPU combo. Certainly it's not nearly a big enough leap for me.
Wait for the games. Next gen will raise the bar even for PCs.
 
That's the thing. I don't even think it is a big leap from the PS3 to that laptop and mobile GPU combo. Certainly it's not nearly a big enough leap for me.

How would the comparable laptop in that article run the Witcher 2, which the 360 ran well?

The 360 didn't run The Witcher 2 well, man; it's like two different gens when compared to the PC.
 

Boss Man

Member
That's the thing. I don't even think it is a big leap from the PS3 to that laptop and mobile GPU combo. Certainly it's not nearly a big enough leap for me.

How would the comparable laptop in that article run the Witcher 2, which the 360 ran well?
The thing that you're missing (and a lot of people don't get this) is that you can't look to a "comparable laptop" to get a grasp on how the tech will be used in consoles.
 

test_account

XP-39C²
-Trophies
-In-game XMB
-Background downloading
-Motion controls (Move)
-Triggers on the DS3
-In-game music

That's off the top of my head, and those are the big ones. I'll write another list with some other ones as I think of them.

That's not including things that the PS3 just outright cannot do because of hardware, i.e. cross-game chat; it also has beacons and the ability to launch games from your current game, plus more shit I really don't feel like outlining.
Sony was actually working on the Move concept as early as 2004 (maybe earlier too): http://www.youtube.com/watch?v=JbSzmRt7HhQ. I do think Sony decided to go further with the idea because of the Wii's success, but Sony already had the basic Move concept before the Wii was released.

But personally I don't see why it matters much. It's like saying that Microsoft stole the PS Eye idea with Kinect, or that Microsoft ripped off the dual analog sticks from the PS1/PS2. Someone has to be first; everyone can't come up with new ideas all the time. And stuff like the in-game OS, background downloading and in-game music are all taken from the PC environment anyway (even though the inspiration to include them later on in the PS3 might have come from the Xbox 360).
 

ZaCH3000

Member
That's the thing. I don't even think it is a big leap from the PS3 to that laptop and mobile GPU combo. Certainly it's not nearly a big enough leap for me.

How would the comparable laptop in that article run the Witcher 2, which the 360 ran well?

PC gamers way overhype the advantages of PC gaming. Crysis 2, The Witcher 2 and other fabled PC games fail to impress me to the extent of their hype.

What does impress me are the tech demos for Luminous, Unreal Engine 4, and another next-gen engine that escapes me. The target render for Cyberpunk sparks my imagination about the graphical and artistic leap in visuals possible by mid-generation as well.

Everything else that is currently touted as mind-melting on PC looks like current-gen assets with a shiny, fresh coat of paint. Admittedly, yes, the graphics are far superior to the PS360's, but not as far ahead as the fables tell.

My understanding is that this is true because PC developers have to build their games with the consoles' minimum specs in mind for porting purposes. I think I speak for mostly everyone when I say this gen has overstayed its welcome.

Eideka makes a great point. The PC is unquestionably a level ahead of the consoles but suggesting a generational gap is hyperbole.
 
What does impress me are the tech demos for Luminous, Unreal Engine 4, and another next-gen engine that escapes me. The target render for Cyberpunk sparks my imagination about the graphical and artistic leap in visuals possible by mid-generation as well.

Somehow, I think something got lost in translation, and what they meant by "target" was the overall feel and look of the game.

No way next gen systems can do that.
 

bobbytkc

ADD New Gen Gamer
Slightly, and it's for Cell. Isn't the bandwidth and memory amount responsible for why many multiplat games have better framerates or better AA or whatever than they do on PS3?

The Cell can access only the XDR, but the GPU can access both pools. That's my understanding, though I'm not a games developer by any means, so someone may correct me on that.
 

Bombadil

Banned
I think Microsoft may have an initial reserve of 3 gigs for the OS, and then they may reduce the OS footprint over the course of the generation to free up more memory for games.

4 gigs of GDDR5 is impressive, but 8 gigs of DDR3 may have more utility to it.

I think, however, that Sony's first-party studios will once again trump Microsoft's first-party studios, and not just because of talent. And I think that, once again, Microsoft's console will be a much better media hub than the PlayStation 4 because of the amount of RAM.
 

i-Lo

Member
Somehow, I think something got lost in translation, and what they meant by "target" was the overall feel and look of the game.

No way next gen systems can do that.

Unless you can prove it (and I know you can't, just as I know there's no way to disprove it), this bit of tirade should be dropped. It's getting annoying now.

Personally, I can easily see them doing all those things at 720p/30fps, to say the least. But that's just a gut feeling.
 

mrgreen

Banned
The 360 didn't run The Witcher 2 well, man; it's like two different gens when compared to the PC.

I wanted someone to properly answer my question about how well the laptop in that article could run The Witcher 2, not some lie about the 360 version (of a game that no PC of similar power to the 360 could run at all).

My point is that if that's how powerful the PS4 is going to be, then it's not good enough at all, imo.
 

Durante

Member
Did I miss something? I haven't visited GAF in ~24 hours, but this seems to be a 50 page thread about... nothing new at all.
 
Eideka makes a great point. The PC is unquestionably a level ahead of the consoles but suggesting a generational gap is hyperbole.

I take back that statement, man.

However, the difference between Gears of War 3 and Uncharted 3 is pretty soft when compared to the difference between Battlefield 3 on PC and Killzone 3 on PS3. I was coming off a line of thought where the difference wasn't small, so I guess my argument lost its balance.

But I take it back, man!

Unless you can prove it (and I know you can't, just as I know there's no way to disprove it), this bit of tirade should be dropped. It's getting annoying now.

Personally, I can easily see them doing all those things at 720p/30fps, to say the least. But that's just a gut feeling.

All I'm saying is that that offline-rendered video was probably using shit like ray tracing, all the AA needed for it to look perfectly clean, and whatever else. The character models might look close, though.
 

scently

Member
Alpha blending is something that can be done in the eDRAM, which makes it nearly free as far as bandwidth goes, since the GPU has fast access to the eDRAM. Textures and game assets are generally accessed from main memory (since they are too large to fit into the eDRAM, which is only 10 MB). To make it simpler, the 360 gets free bandwidth (i.e. it does not consume bandwidth from main memory) for specific effects, which the PS3 does not get, although the XDR has a slightly higher bandwidth than the 360's memory.

The eDRAM on the 360 was its saving grace and one of the reasons its GPU was considered better. The RSX had access to two pools of RAM, each with their own dedicated bandwidth: 25.6 GB/s to the XDR and 22.4 GB/s to the GDDR3, so effectively the RSX could access 48 GB/s of bandwidth, though the 25.6 GB/s is shared with the Cell.

The 360, on the other hand, had just 22.4 GB/s of bandwidth to its GDDR3 RAM, and the CPU also needs to access this RAM through the GPU using that same 22.4 GB/s, so effectively the 360 was at a disadvantage, having less than half the total bandwidth of the PS3 for all its operations. This is where the eDRAM comes in. Most of the operations that consume bandwidth (back buffer, pixel ops, some post-processing, etc.) were moved to the eDRAM module. These operations eat a lot of bandwidth, and as the connection from the eDRAM to the ROPs provided that bandwidth (256 GB/s), the majority of the 22.4 GB/s was saved for other GPU/CPU needs. Of course, once these ops were completed, the results were transferred back to the GPU via the 32 GB/s connection from the eDRAM module.

The end point is that, had MS not included the eDRAM on the 360, it really wouldn't have been able to compete with the PS3, as it would have been bandwidth-starved.

I can speculate that this is part of their reason for going with the 32 MB of eSRAM, which, if true, is a better embedded RAM tech than eDRAM. It allows them to include a large amount of cheap RAM (in this case the 8 GB of DDR3) with a decent amount of bandwidth (68 GB/s) and avoid any bandwidth bottleneck by moving the majority of the bandwidth-consuming operations to the eSRAM module. So it could be a win-win situation for them. Once the rest of the Durango's details come out, we will get a better picture of what they are trying to achieve.


Btw, I don't believe the 3 GB and 2 CPU cores reserved for the OS; that's just wrong.
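
To put rough numbers on the above, here's a quick back-of-the-envelope in Python using the figures cited in this post. Treat it as a sketch only; real contention, latency and sharing make the picture messier:

# Rough bandwidth budgets for PS3 vs 360, using the figures cited above.
PS3_XDR = 25.6           # GB/s, Cell's pool (shared with the RSX)
PS3_GDDR3 = 22.4         # GB/s, the RSX's dedicated VRAM pool
X360_GDDR3 = 22.4        # GB/s, unified pool shared by CPU and GPU
X360_EDRAM_ROPS = 256.0  # GB/s, internal eDRAM-to-ROPs path (framebuffer ops)
X360_EDRAM_GPU = 32.0    # GB/s, resolve path from the eDRAM module to the GPU

print("PS3 total main-memory bandwidth: %.1f GB/s" % (PS3_XDR + PS3_GDDR3))
print("360 main-memory bandwidth: %.1f GB/s" % X360_GDDR3)
print("360 eDRAM-to-ROPs path: %.1f GB/s (framebuffer traffic)" % X360_EDRAM_ROPS)
print("360 eDRAM-to-GPU resolve path: %.1f GB/s" % X360_EDRAM_GPU)
# The point of the eDRAM: framebuffer, alpha and AA traffic rides the
# 256 GB/s internal path instead of eating into the 22.4 GB/s shared pool.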
 

mrklaw

MrArseFace
If MS puts full Win 8, or Win 8 RT, on it, could they offer access to all Metro apps?

They'd need to deal with controller fragmentation: it's not touch, and you don't have a mouse and keyboard, so they'd have to support the controller somehow.
 

ZaCH3000

Member
Somehow, I think something got lost in translation, and what they meant by "target" was the overall feel and look of the game.

No way next gen systems can do that.

I'm treating their statement with the same respect as GG did with KZ2's target render. A target is a target.

Falling short of a bullseye is still an impressive feat. If the target render is a 10, they will still strike a respectable 8.

Get lost in the strong riptide of hype!
 

PG2G

Member
If MS puts full Win 8, or Win 8 RT, on it, could they offer access to all Metro apps?

They'd need to deal with controller fragmentation: it's not touch, and you don't have a mouse and keyboard, so they'd have to support the controller somehow.

Kinect should be capable of doing anything touch can do.
 

i-Lo

Member
Did I miss something? I haven't visited GAF in ~24 hours, but this seems to be a 50 page thread about... nothing new at all.

No, it's pretty much newly sprouted internet pundits arguing the ramifications of non-existent, rumoured hardware that may or may not change in design.

Ignore list is your friend.
 

mrklaw

MrArseFace
I'm rather happy with my 5v, completely silent Raspberry Pi.
It's good enough for me :p

Still, 3 GB for the OS doesn't sound realistic.
Full Windows 8 doesn't need half that.
I can maybe understand MS wanting a system that can run several apps in the background while gaming, but I don't need them, and they will probably affect gaming performance anyway (unless not only part of the RAM is reserved for the OS, but CPU cores etc. too).


Plex or XBMC are nice for stored content. But I'd like something that fully integrates my live cable TV, my stored media and on-demand online streaming, with common search across everything.

GoogleTV had the right idea but too much focus on the Internet and not enough on media.

A games console with HDMI pass through that is always on, that can overlay game notifications on my live TV, can search and browse all my media whether live, offline or online, and switch between them seamlessly? Yes please.
 

spats

Member
I dreamt last night that the PS4 had a system which allowed really intensive tasks to be rendered online on a server while everything else was done locally, sort of like deferred cloud computing. It doesn't make any sense, but I thought it was a cool dream.
 

mrklaw

MrArseFace
Do you honestly believe that most of these responses would even exist if XB3 had the same amount of RAM available for game development (3.5 vs 5)? Because people have yet to see the proof of what makes GDDR5 a good choice over DDR3, the contention is based on "amount" alone, pushing the "type" into irrelevancy. It's more about people hoping that PS4 doesn't get short end of the stick when it comes to multiplat titles like current gen.

Flip the question around. What makes more better? Why the assumption that it is better? If you can't feed the GPU quickly enough, and the GPU can't process that much anyway, then all you have is a big cache.


I'd like feedback from the developers on here as to how a typical streaming engine might break down the memory: e.g. how much for the immediate view, how much for caching your surroundings, how much buffer from the HDD, etc.
 
Underwhelming specs? 4 GB of RAM isn't really something I think is future-proof; open-world games are going to have problems a la PS3, no?
Durango is using 8 GB minus the OS (which just can't be 3 GB; that's just too much).
And after reading that the PS4 version of PlanetSide 2 would need some adjustments and the removal of some effects... bleh.
Can somebody put my concerns to rest?

There is 7.5 times more RAM dedicated to games. And it's much faster.
 

Ramblin

Banned
I wanted someone to properly answer my question about how well the laptop in that article could run The Witcher 2, not some lie about the 360 version (of a game that no PC of similar power to the 360 could run at all).

My point is that if that's how powerful the PS4 is going to be, then it's not good enough at all, imo.

I had the same thought, not being enthused about the difference, if that was what was to be expected. But then I remembered all the crazy promises of past generations, how impressed I was with what was delivered in the launch games, and how impressed I was at the end of each generation by how far they had come. I'm going to wait until I see the reviews for the launch games before I decide to buy.
 

Bombadil

Banned
Flip the question around. What makes more better? Why the assumption that it is better? If you can't feed the GPU quickly enough, and the GPU can't process that much anyway, then all you have is a big cache.


I'd like feedback from the developers on here as to how a typical streaming engine might break down the memory: e.g. how much for the immediate view, how much for caching your surroundings, how much buffer from the HDD, etc.

Don't the 8 cores take care of that concern?
 

CrunchinJelly

formerly cjelly
Did I miss something? I haven't visited GAF in ~24 hours, but this seems to be a 50 page thread about... nothing new at all.

You missed lots of this:

[image: fiQFw.gif]
 

sTeLioSco

Banned
Underwhelming specs? 4 GB of RAM isn't really something I think is future-proof; open-world games are going to have problems a la PS3, no?
Durango is using 8 GB minus the OS (which just can't be 3 GB; that's just too much).
And after reading that the PS4 version of PlanetSide 2 would need some adjustments and the removal of some effects... bleh.
Can somebody put my concerns to rest?

4 GB of GDDR5 unified memory.
The problem with the PS3 was the split architecture.
 

open_mouth_

insert_foot_
Can there be tech like Gaikai where a user could download an "app" on the PS4 or Xbox or whatever that has, say, 1 GB of generic textures and sounds, and then play streamed games where most or part of the graphics load is handled locally?
 
Did I miss something? I haven't visited GAF in ~24 hours, but this seems to be a 50 page thread about... nothing new at all.

Well, at least some of the ongoing posts are about the RAM. After all that noise about the Wii U's RAM, comparing the PS4's raw high-speed GDDR5 approach to Durango's eSRAM/DDR3 combination would inevitably be heated :)
 
That's the thing. I don't even think it is a big leap from the PS3 to that laptop and mobile GPU combo. Certainly it's not nearly a big enough leap for me.

How would the comparable laptop in that article run the Witcher 2, which the 360 ran well?

Erm, what?

I can run TW2 at 1920x1200 (no AA) at 25-30 fps using a GTX 470, and that laptop GPU is at least 40-50% faster.
 
So how much can an 8-core Jaguar and, let's say, a 7xxx or 8xxx series GPU process per second?
Maybe we can then guess who made the better choice. If, say, those 8 Jaguar cores and the GPU can only consume around 70 GB/s, then Microsoft is right, but if they can consume around 200 GB/s, then Sony has the better design.

For 30 fps games that means something like:
70/30 = 2.33 GB per frame
200/30 = 6.66 GB per frame

For 60 fps games:
70/60 = 1.16 GB per frame
200/60 = 3.33 GB per frame

So to make complete use of the processors and bandwidth, Sony can't fit all the data into their memory, so they probably have to stream stuff in and lose some processing time doing it.

Microsoft, on the other hand, can keep three frames' worth of data in memory and probably don't have to spend as much time streaming in data.

Probably a bad example, but my point is that there are a lot of balancing trade-offs to weigh.
And you can do that with a console, where the hardware is fixed for 5-8 years.
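
Here's that per-frame arithmetic as a small Python sketch (the 70 and 200 GB/s figures are just my hypothetical consumption rates from above, not real specs):

def gb_per_frame(bandwidth_gb_s, fps):
    # How much data a given bandwidth can move in a single frame.
    return bandwidth_gb_s / fps

for bw in (70.0, 200.0):   # hypothetical GB/s the CPU + GPU could consume
    for fps in (30, 60):
        print("%5.1f GB/s at %d fps -> %.2f GB per frame"
              % (bw, fps, gb_per_frame(bw, fps)))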
 

gofreak

GAF's Bob Woodward
Can there be tech like Gaikai where a user could download an "app" on the PS4 or Xbox or whatever that has, say, 1 GB of generic textures and sounds, and then play streamed games where most or part of the graphics load is handled locally?

I'm not sure this is what you're getting at, probably not, but Gaikai was showing a thing where you could start playing a game before it finished downloading. It would download the bits you need to get started, then download the rest in the background while you play, so you can start playing faster.

http://www.youtube.com/watch?feature=player_embedded&v=SRyx8dFooV0
 

scsa

Member
Over time the OS's RAM usage shrinks, doesn't it?
The PS3's XMB used a lot of RAM initially, but the footprint got smaller with new firmware updates.

Being the software experts that MS are, isn't it possible for them to free up RAM as the gen progresses, thus making more than 5 GB available to developers?

Same with Sony.

Methinks pretty much even systems.
 

Bombadil

Banned
Over time the OS's RAM usage shrinks, doesn't it?
The PS3's XMB used a lot of RAM initially, but the footprint got smaller with new firmware updates.

Being the software experts that MS are, isn't it possible for them to free up RAM as the gen progresses, thus making more than 5 GB available to developers?

Same with Sony.

Methinks pretty much even systems.

Yeah, that's what I thought, too.

But ultimately this discussion is going to center on the type of RAM each console has and whether it's speed or amount that will make the difference.
 

B.O.O.M

Member
I'm not sure this is what you're getting at, probably not, but Gaikai was showing a thing where you could start playing a game before it finished downloading. It would download the bits you need to get started, then download the rest in the background while you play, so you can start playing faster.

http://www.youtube.com/watch?feature=player_embedded&v=SRyx8dFooV0

Whoa, I have not seen that before. So I guess this is one potential answer to the lag faced in streaming games, right? Very interesting.
 

mrklaw

MrArseFace
Don't the 8 cores take care of that concern?

No, it's about bus bandwidth and GPU capacity, complicated by the eDRAM and special sauce.

On a simple level, 68 GB/s should be able to transfer about 1 GB every 1/60th of a second. But that won't mean 1 GB of unique data, and it won't reach the theoretical limit anyway; you have latency etc., and data will need to be written back to memory many times.

I'm mainly curious, if you take an "industry average" streaming engine and give it 4 or 8 GB (or 3.5 and 5), how much would be allocated to the different parts? Assuming a mandatory install on the HDD for better transfer speeds.

I'd expect the HDD speed to be the real limiting factor. There's no point drawing 1 GB of unique data per frame if you can only transfer 200 MB/s in from the HDD.
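
A minimal Python sketch of that budget math (the 200 MB/s HDD figure is deliberately optimistic; real 2.5-inch drives are slower):

# Per-frame data budgets: the HDD, not the RAM bus, sets the streaming ceiling.
RAM_BW_GB_S = 68.0   # Durango's rumoured DDR3 bandwidth
HDD_MB_S = 200.0     # optimistic sequential HDD read speed
FPS = 60

ram_per_frame_gb = RAM_BW_GB_S / FPS   # ~1.13 GB touchable per frame
hdd_per_frame_mb = HDD_MB_S / FPS      # ~3.3 MB of fresh data per frame

print("RAM traffic per frame: %.2f GB" % ram_per_frame_gb)
print("HDD streaming per frame: %.2f MB" % hdd_per_frame_mb)
# So even with 8 GB of RAM, new data trickles in at HDD speed; the big
# pool mostly acts as a cache for what you've already streamed.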
 

gofreak

GAF's Bob Woodward
Whoa, I have not seen that before. So I guess this is one potential answer to the lag faced in streaming games, right? Very interesting.

In this case it's not a streaming game; it's a game executing locally, but you can start it very quickly after downloading. So: buy a big multi-gigabyte game -> start playing in a few minutes. As they mention, you could start playing instantly on a streamed session, then migrate to the locally executing copy when it's ready a few minutes later.

It's sort of something they were doing so that Gaikai could be useful in rural areas and areas with poor connections, not just for instant game streaming on good connections. And it would obviously be of interest to people who want to download and play local copies of games rather than streaming, if it can in fact let you start playing much sooner.
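
For what it's worth, the "play before the download finishes" idea is basically chunk prioritisation. A toy Python sketch of the logic (all names hypothetical; this is my guess at the concept, not Gaikai's actual implementation):

from collections import deque

def progressive_download(chunks, needed_to_start):
    # Toy model: fetch the chunks needed to boot first, then pull the
    # rest in the background while the game is already running.
    critical = [c for c in chunks if c in needed_to_start]
    rest = deque(c for c in chunks if c not in needed_to_start)

    for chunk in critical:
        print("downloading (blocking): " + chunk)
    print(">>> game starts here <<<")
    while rest:
        print("downloading (background): " + rest.popleft())

progressive_download(
    chunks=["engine", "level1", "level2", "cutscenes"],
    needed_to_start={"engine", "level1"},
)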
 

alcide

Banned
Any word on any form of actual hardware-supported anti-aliasing? I want whatever the guys over in the AA thread believe is best to be hardware-forced on every game.
 