
VGLeaks: PlayStation 4 "Orbis" Roadmap

RoboPlato

I'd be in the dick
My question is: do they want to launch with two first-party shooters, assuming Planetside 2 is a launch title too?

If you have the choice of Planetside 2 or KZ4, which one is going to be a bigger draw, considering that Planetside 2 is F2P and solely based around multiplayer, whereas KZ4 will almost certainly have an SP element and MP trying to be as interesting as possible?
I have a feeling one of the new IPs will be at launch, with Planetside 2 available as well. I think KZ4 would be a good fit for the March timeframe Sony likes to get titles out during. GG would have a few extra months to work on final hardware, and they would still be able to show it at the unveiling since it won't be too far out.

EDIT: I'm curious to hear what kind of customizations Sony will be doing on their GPU. If what we hear now is true, it obviously won't be as huge an undertaking as Microsoft's, but it won't be off the shelf either. I'd hope they focus on lighting and tessellation. Those two elements are going to be incredibly important next generation, both for optimization and visuals.
 
An HD 7850-type GPU in a closed system with 4GB of GDDR5 RAM is going to give you some really nice-looking games.

If VGLeaks is correct and Sony really feels they can achieve 10x the performance of the PS3 with these specs (even before the upgrade to 4GB of RAM), I don't see why anyone is complaining.
 

RamzaIsCool

The Amiga Brotherhood
If VGLeaks is correct and Sony really feels they can achieve 10x the performance of the PS3 with these specs (even before the upgrade to 4GB of RAM), I don't see why anyone is complaining.

I don't think anyone at this point doubts that it's going to be a pretty powerful machine. The main question now is how it matches up against the next Xbox.

I think Planetside 2 will be preinstalled on every PS4. This way it will reach a larger audience, and maybe more importantly, Sony can put on the box that the PS4 is bundled with a free game! Which is kind of a cheap trick, as it's free already, but if I were Sony I wouldn't hesitate.
 

RoboPlato

I'd be in the dick
I don't think anyone at this point doubts that it's going to be a pretty powerful machine. The main question now is how it matches up against the next Xbox.

I think Planetside 2 will be preinstalled on every PS4. This way it will reach a larger audience, and maybe more importantly, Sony can put on the box that the PS4 is bundled with a free game! Which is kind of a cheap trick, as it's free already, but if I were Sony I wouldn't hesitate.
Some PS3 bundles are doing that with Dust 514 already.
 
Here you go people. I've been saying it for pages and no one wants to acknowledge the near certainty of two GPUs in the PS4. Here is the patent filed by Sony from a month ago: http://appft.uspto.gov/netacgi/nph-...68".PGNR.&OS=DN/20120320068&RS=DN/20120320068

That is from USPTO.gov. The patent describes "Dynamic Context Switching Between Architecturally Distinct GPUs." That is in the fucking patent filed by SCE (Playstation) so it isn't the TV or laptop division using this chip architecture. If you can locate the image in the patent, you can envision how the chip looks and works. The two GPUs also pull from two distinct memory pools (RAM).
 

PaulLFC

Member
Here you go people. I've been saying it for pages and no one wants to acknowledge the near certainty of two GPUs in the PS4. Here is the patent filed by Sony from a month ago: http://appft.uspto.gov/netacgi/nph-...68".PGNR.&OS=DN/20120320068&RS=DN/20120320068

That is from USPTO.gov. The patent describes "Dynamic Context Switching Between Architecturally Distinct GPUs." That is in the fucking patent filed by SCE (Playstation) so it isn't the TV or laptop division using this chip architecture. If you can locate the image in the patent, you can envision how the chip looks and works. The two GPUs also pull from two distinct memory pools (RAM).
Patents don't always lead to real products.
 
Patents don't always lead to real products.

Bravo. But it's the best piece of information that we have. An "insider source" from VGLeaks is barely credible. What's interesting, though, is that these "insiders" are releasing information that fits the mold of this patent.

Sony has hundreds of gaming related patents that they've never used. Just because it exists, doesn't mean it's going to be in the PS4.

I'm not disagreeing with you. My point is that a patent describing a chip architecture with various functions split between two GPUs, filed in July and recently approved, is probably a better indication of what will be under the hood than these "insider sources" everyone is citing to assert their claims. Also, I can't be the only one who finds it oddly suspicious that we are being teased by multiple sources, Sony included, and receiving a shitload of conveniently timed leaks just weeks after this patent was approved. Can I dare you to a challenge? Find me two more patents filed by Sony that resemble this patent. Only then can you claim its existence is meaningless.
 

Ashes

Banned
Here you go people. I've been saying it for pages and no one wants to acknowledge the near certainty of two GPUs in the PS4. Here is the patent filed by Sony from a month ago: http://appft.uspto.gov/netacgi/nph-...68".PGNR.&OS=DN/20120320068&RS=DN/20120320068

That is from USPTO.gov. The patent describes "Dynamic Context Switching Between Architecturally Distinct GPUs." That is in the fucking patent filed by SCE (Playstation) so it isn't the TV or laptop division using this chip architecture. If you can locate the image in the patent, you can envision how the chip looks and works. The two GPUs also pull from two distinct memory pools (RAM).


I think the last time somebody brought that patent up, I learnt that the author might be working for Sony Bend. ha ha.

So no SSD for PS4 by default?

No idea. But we would have heard already if that were the case.
 

i-Lo

Member
Too expensive compared to an HDD. Too pointless. I'd rather get a nice hefty 500GB HDD than a 128GB SSD.

My biggest request: USB 3.0.

How much more expensive would a 128GB SSD be today than a 60GB 2.5" SATA drive was in 2006?

Also, when buying in bulk, one would assume price negotiations take place between corporations.
 

mrklaw

MrArseFace
Too expensive compared to an HDD. Too pointless. I'd rather get a nice hefty 500GB HDD than a 128GB SSD.

My biggest request: USB 3.0.

Just give us proper SATA connections: a decent size by default but cheap enough for Sony, and then let us swap in a nice fast SSD.
 

GopherD

Member
Here you go people. I've been saying it for pages and no one wants to acknowledge the near certainty of two GPUs in the PS4. Here is the patent filed by Sony from a month ago: http://appft.uspto.gov/netacgi/nph-...68".PGNR.&OS=DN/20120320068&RS=DN/20120320068

That is from USPTO.gov. The patent describes "Dynamic Context Switching Between Architecturally Distinct GPUs." That is in the fucking patent filed by SCE (Playstation) so it isn't the TV or laptop division using this chip architecture. If you can locate the image in the patent, you can envision how the chip looks and works. The two GPUs also pull from two distinct memory pools (RAM).

Deleted by request.
 
Too expensive compared to an HDD. Too pointless. I'd rather get a nice hefty 500GB HDD than a 128GB SSD.

My biggest request: USB 3.0.

Would it not be possible to get an SSD + cloud storage? It would be expensive at the moment, but come 2014 it would be cheaper. Also, with faster Blu-ray drives, game installs may not be required.

It could be possible to have one SSD with a spare slot to insert a second one at a later date. Only on PS4 though; I am sure MS would not allow this.
 

DBT85

Member
Too expensive compared to an HDD. Too pointless. I'd rather get a nice hefty 500GB HDD than a 128GB SSD.

My biggest request: USB 3.0.

USB 3 would be welcome indeed. AND PUT ONE ON THE FUCKING BACK!

How much more expensive would a 128GB SSD be today than a 60GB 2.5" SATA drive was in 2006?

Also, when buying in bulk, one would assume price negotiations take place between corporations.

My question about the included HDD size for the new consoles is based partly around costs but also partly around installs. Now that the consoles will be more powerful, and third-party games for the entire generation will no longer be limited to DVD-sized discs, I'm wondering how large those installs are going to be and how many might be mandatory. I'm also hoping a full install is an option on all games.

If those installs are indeed larger by default, and with the digital presence being far more important to MS and Sony than at the start of the last generation, I don't think they would want to settle for a 120GB SSD. This is not an opinion shared by some others, however.

Consider that if they follow the PS Vita path of all games being out day and date digitally, some people are going to be downloading 25+GB games that they have either bought or got on PS+ or whatever.

Also, consumers have been used to seeing HDD numbers go up for the last 6 years; suddenly going from a 500GB PS3 to a 120GB PS4 might make unknowing consumers wonder what the fuck happened.

They also might not need to rely on fast HDDs or installs at all. If every PS4 has, say, the 16GB of flash we are expecting, it can constantly stage data from the slow HDD or the Blu-ray drive to that flash in the background. I don't know enough about how they would go about doing that, though.
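To make that staging idea concrete, here's a minimal sketch in Python of a small flash pool used as an LRU cache in front of slow HDD/Blu-ray storage. The FlashCache class, the block names, and the sizes are all hypothetical, purely to illustrate the idea, not how Sony would actually implement it.

```python
# Minimal sketch: a small flash pool used as an LRU staging cache in
# front of slow HDD/Blu-ray storage. All names/sizes are hypothetical.
from collections import OrderedDict

class FlashCache:
    def __init__(self, capacity_bytes):
        self.capacity = capacity_bytes
        self.used = 0
        self.blocks = OrderedDict()  # block_id -> size, oldest first

    def stage(self, block_id, size, read_from_slow_media):
        if block_id in self.blocks:          # already staged: fast path
            self.blocks.move_to_end(block_id)
            return
        # Evict least-recently-used blocks until the new block fits.
        while self.used + size > self.capacity and self.blocks:
            _, evicted_size = self.blocks.popitem(last=False)
            self.used -= evicted_size
        read_from_slow_media(block_id)       # slow read, done in background
        self.blocks[block_id] = size
        self.used += size

# Stage a hypothetical 2GB asset block into a 16GB flash pool.
cache = FlashCache(16 * 1024**3)
cache.stage("level2_assets", 2 * 1024**3, lambda b: print(f"reading {b} from HDD/BD"))
```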
 

i-Lo

Member
Would it not be possible to get an SSD + cloud storage? It would be expensive at the moment, but come 2014 it would be cheaper. Also, with faster Blu-ray drives, game installs may not be required.

I highly doubt BD drive speed next gen will equal the read speed of the SATA drive present in a PS3 today. Assets are only going to get bigger next gen, so the HDD will remain the preferred choice to stream data from.
 

Binabik15

Member
Question, because I just thought of it: with high bandwidth being needed for particles and transparency effects (right?), could the PS4 do good fire effects? I'm asking because I showed my brother a video of Metro 2033's flamethrower section when he asked why people say it's such a GPU killer, and in the vid, as it did for me, the FPS tanked hard.

I really want a flamethrower capable system!
 

PaulLFC

Member
Bravo. But it's the best piece of information that we have. An "insider source" from VGLeaks is barely credible. What's interesting, though, is that these "insiders" are releasing information that fits the mold of this patent.
No need for the condescending tone. It's no more reliable than VGLeaks - it's a patent that may or may not get used, just like any other patent. It may happen, it may not. We just have to wait and see.
 

omonimo

Banned
Question, because I just thought of it: with high bandwidth being needed for particles and transparency effects (right?), could the PS4 do good fire effects? I'm asking because I showed my brother a video of Metro 2033's flamethrower section when he asked why people say it's such a GPU killer, and in the vid, as it did for me, the FPS tanked hard.

I really want a flamethrower capable system!

Of course, high bandwidth is all we need for those things. In any case, it will be a very long wait; I can't believe we have a whole year, at best, before we see a PS4. I'm so impatient; it's just torture.
 

i-Lo

Member
Consoles won't go under-utilised. It's about the last thing about these consoles I'd put a question mark on. They'll wring every last drop out of these.

If the RAM became the weak link in the chain, it would inhibit the GPU's functionality, or rather its full potential. That's what I meant. With 4GB of GDDR5 instead of 2, that concern can be subdued.
 
Here you go people. I've been saying it for pages and no one wants to acknowledge the near certainty of two GPUs in the PS4. Here is the patent filed by Sony from a month ago: http://appft.uspto.gov/netacgi/nph-...68".PGNR.&OS=DN/20120320068&RS=DN/20120320068

That is from USPTO.gov. The patent describes "Dynamic Context Switching Between Architecturally Distinct GPUs." That is in the fucking patent filed by SCE (Playstation) so it isn't the TV or laptop division using this chip architecture. If you can locate the image in the patent, you can envision how the chip looks and works. The two GPUs also pull from two distinct memory pools (RAM).
If such a configuration came to fruition (say, the HD 7850 plus the PS3's RSX), could the RSX potentially be utilised to provide functionality such as 3D without taxing the main GPU?
 

Binabik15

Member
Of course, high bandwidth is all we need for those things. In any case, it will be a very long wait; I can't believe we have a whole year, at best, before we see a PS4. I'm so impatient; it's just torture.

Hopefully you're right!

I want that x5 at least. More sparks and patches of burning fuel, realtime lighting, good smoke. Hnnngh.

I'm a little pyromaniac, best thing about RtCW was toasting people trying to get into MY bunker (besides getting lots of points for being a medic, I couldn't shoot for shit, but still helped my team!).
 

gofreak

GAF's Bob Woodward
Question, because I just thought of it: with high bandwidth being needed for particles and transparency effects (right?), could the PS4 do good fire effects? I'm asking because I showed my brother a video of Metro 2033's flamethrower section when he asked why people say it's such a GPU killer, and in the vid, as it did for me, the FPS tanked hard.

I really want a flamethrower capable system!


The good news for PS4 relative to PS3 is that resolution won't be increasing so much while 'buffer bandwidth' will go up significantly, if the GDDR5 rumours are correct.

The GPU will have up to the full whack of memory bandwidth to play with on buffer-related tasks (192GB/s, minus CPU usage of up to 12GB/s, minus GPU input bandwidth). PS3's GPU had up to 22GB/s, minus GPU input bandwidth (although in some games, some buffer-related work was taken off the GPU and done by Cell - for example DoF and motion blur effects - using XDR or local SRAM bandwidth).

Durango, with an eDRAM approach again, should have plenty of buffer bandwidth, more than Orbis.
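As a rough back-of-envelope illustration of those figures in Python: only the 192GB/s, 12GB/s, and 22GB/s numbers come from the rumours and specs quoted above; the GPU input-traffic values are invented purely to make the subtraction concrete.

```python
# Bandwidth left over for buffer (render-target) traffic, in GB/s.
# 192 and 12 are the rumoured Orbis totals; 22 is RSX's GDDR3 pool.
def buffer_bandwidth(total, cpu_share, gpu_input):
    return total - cpu_share - gpu_input

orbis = buffer_bandwidth(192, 12, 60)  # 60 GB/s of input traffic: a guess
ps3   = buffer_bandwidth(22, 0, 10)    # 10 GB/s of input traffic: a guess

print(f"Orbis: ~{orbis} GB/s, PS3 (RSX): ~{ps3} GB/s")  # ~120 vs ~12
```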
 
Why do people still insist on talking about SSD?

If anything, AT MOST we'll see a hybrid hard drive, but in all honesty I just expect the flash to be directly on the mobo so every unit can have that standard speed for devs to work with.
 
Deleted by request.

I hope it stays. So far, there hasn't been a single counterargument to what I'm saying. Maybe there is one out there, but I'm not going to actively search for it. If you have information disproving my claim that this patent plus the current rumors = PS4 architecture, please, by all means, post it so I stop spreading it. Otherwise, it's the best piece of information to reference, along with select rumors, to base our speculation on.
 
Durango, with an eDRAM approach again, should have plenty of buffer bandwidth, more than Orbis.

I don't have working knowledge of the topic, but the impression I got from reading is that eDRAM, while fast, is expensive, hot, and suited to very specific functions. It can't entirely mitigate the lower-bandwidth situation.
 

gofreak

GAF's Bob Woodward
I don't have working knowledge of the topic, but the impression I got from reading is that eDRAM, while fast, is expensive, hot, and suited to very specific functions. It can't entirely mitigate the lower-bandwidth situation.

For buffer tasks, if there's enough of it, it can absorb a lot of the required bandwidth usage.

You're right in that buffer tasks are only one source of bandwidth consumption - albeit a significant one - and depending on the flexibility of the eDRAM you can't trade an excess of its bandwidth against other tasks like you can with main memory.

But for alpha effects etc. there'll be lots of bandwidth there in Durango.

PS4's situation vs Durango on this front should probably be quite a bit better than PS3's vs 360, though. It should have plenty of bandwidth for those tasks, even if not as much, if the GDDR5 rumours are correct.
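To see why alpha effects are so bandwidth-hungry, here's a rough worked example; the resolution, overdraw, and format numbers are assumptions for illustration, not figures from the leaks.

```python
# Each alpha-blended layer reads and writes the framebuffer once,
# so bandwidth scales with resolution x overdraw x frame rate.
WIDTH, HEIGHT = 1920, 1080  # assumed 1080p framebuffer
BYTES_PER_PIXEL = 4         # assumed 8-bit RGBA target
LAYERS = 8                  # assumed full-screen overdraw in a particle scene
FPS = 60

bytes_per_frame = WIDTH * HEIGHT * BYTES_PER_PIXEL * 2 * LAYERS  # read + write
print(f"~{bytes_per_frame * FPS / 1e9:.1f} GB/s just for blending")  # ~8.0
```

Around 8GB/s under these assumptions: over a third of RSX's 22GB/s pool, but a small slice of a rumoured 192GB/s.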
 

i-Lo

Member
The good news for PS4 relative to PS3 is that resolution won't be increasing so much while 'buffer bandwidth' will go up significantly, if the GDDR5 rumours are correct.

The GPU will have up to the full whack of memory bandwidth to play with on buffer-related tasks (192GB/s, minus CPU usage of up to 12GB/s, minus GPU input bandwidth). PS3's GPU had up to 22GB/s, minus GPU input bandwidth (although in some games, some buffer-related work was taken off the GPU and done by Cell - for example DoF and motion blur effects - using XDR or local SRAM bandwidth).

Durango, with an eDRAM approach again, should have plenty of buffer bandwidth, more than Orbis.

Then why didn't Sony opt for the same memory setup?
 

gofreak

GAF's Bob Woodward
Then why didn't Sony opt for the same memory setup?

There are trade-offs.

eDRAM is in Durango, by the sound of things, to compensate for relatively low bandwidth in other parts of the system, so that bottom-of-the-pipeline tasks won't suffer as a consequence of that decision.

But for other tasks, and for general GPU input bandwidth, there's less than with Sony's approach.

Sony's approach should also be simpler to manage (for render programmers at least) and more flexible. The eDRAM approach kind of dictates constraints on how you use the system's bandwidth. Those constraints might align with typical use-cases, of course, but it gives you a little less freedom.
 
Durango HAS to have something to compensate for BW and TFLOPS in a sense. Or insiders wouldn't be saying that these consoles are within 5-10% of each other, or even stuff like Durango might be more powerful. (Which I doubt comes from their own understanding of the tech; it's more likely info given to them by people who have access to the kits, like devs.)

But right now we have nothing to go on but them. Something's gotta give!

Is it GCN vs GCN 2? Is MS using enough ESRAM? etc

As thuway has said, there's almost nothing that makes it probable that the GPU has some sort of magic dust fueling it, but the design will have to be pretty intelligent for it to look slower on paper (considering the similarities in tech) yet somehow perform at parity or even faster.

I'm dying to know.
 
I don't care what storage system (HDD or SSD) comes with the PS4. Regardless of the installed hardware, I'm going to swap it out for the best large-capacity SSD available at the time.
 

Ardenyal

Member
How many render targets do deferred renderers usually need? In my limited understanding, deferred rendering engines might struggle with a limited frame buffer size.
 

Nachtmaer

Member
Is it GCN vs GCN 2? Is MS using enough ESRAM? etc

As far as we know, GCN 2 won't be a big overhaul. They might be able to up the efficiency a bit, since 28nm is well understood by now, and add some HSA features, but besides that I don't think there will be any major differences.

According to those slides from CES, AMD made it clear that they use the term GCN 2.0 just to indicate that their entire (new) line-up will be using GCN.

It'd be nice if AMD could pull off a Barts (6870/6850) for the consoles. Barts used the same VLIW5 architecture as their previous GPUs but was better balanced and more efficient on the same process. It had noticeably fewer shaders and still got close to a 5870.
 

gofreak

GAF's Bob Woodward
How many render targets deferred renderers usually need? In my limited understanding deferred renderering engines might have some struggles on limited frame buffer size

4 or so, I think. 5 including the final framebuffer.

The thing is, though, to produce the final framebuffer you need to sample from the preceding 4. So in an eDRAM situation, if it's like 360's eDRAM, you need to copy those buffers back to main memory and then sample them from there when rendering the final frame.

So eDRAM size wouldn't actually be that big of a deal in this case. You'd render the buffers one at a time, only one would need to be in eDRAM at a time.

The slightly annoying thing with 360 eDRAM - I am told - was the copy-back to main memory for sampling in the final pass.

If the eDRAM is so big that it can hold all the buffers together, and the GPU can sample from the eDRAM directly, it would get around that issue.

There's talk in Durango's case that there'll be a hardware blitting unit that will let an application copy data from eDRAM back to main memory without needing the CPU's attention (edit - see below for correction). That's reportedly one of the '3 blocks' of custom hardware. Having dedicated hardware to do that means you wouldn't have to wait for a CPU thread to be ready, so it should be faster than with 360. This would be useful if the eDRAM is of limited capacity and you're doing multi-pass rendering of any kind... and a lot of newer lighting and other techniques are quite heavy on multiple passes.

edit - apparently the GPU handled the copy-back on 360, not the CPU. But the point still stands that there's a stall while you wait for that copy-back to complete if the next pass depends on data generated by the previous one. I'm presuming that if a hardware blitter is present in Durango, it works out faster than the copy-back system in 360.
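Here's a schematic, runnable Python sketch of the flow described above, where the GPU can't sample from a small eDRAM directly; every class and method name is a hypothetical stand-in, not a real console API.

```python
# 360-style deferred rendering: each G-buffer target is rendered into
# eDRAM, resolved (copied back) to main memory, then sampled from main
# memory in the lighting pass. Names are hypothetical stand-ins.
G_BUFFER = ["albedo", "normals", "depth", "material"]  # ~4 targets

class EDRAM:
    def render(self, name):
        print(f"render {name} into eDRAM")
        self.current = name

    def copy_back(self):
        # The resolve described above: a stall if the next pass needs it.
        print(f"resolve {self.current} to main memory")
        return self.current

def deferred_frame(edram):
    main_memory = []
    for target in G_BUFFER:        # geometry passes, one target at a time
        edram.render(target)
        main_memory.append(edram.copy_back())
    print(f"lighting pass samples {main_memory} from main memory")
    edram.render("final_frame")    # lit frame composited in eDRAM
    return edram.copy_back()       # final resolve for scan-out

deferred_frame(EDRAM())
```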
 
I hope it stays. So far, there hasn't been a single counter argument to what I'm saying. Maybe there is one out there but I'm not going to actively search for it. If you have information disproving my claim of this patent mixed with current rumors = PS4 architecture, please by all means, post it so I stop spreading it. Otherwise, its the best piece of information to reference along with select rumors to base our speculation off of.

Patents are created not because a product exists or will exist, but in case they decide to make it, or in case they get sued because their architecture/idea is similar to something else. It's also done to push the competition out. It's the same with cancelled projects; companies renew trademarks and whatnot. It's a way of future-proofing.

Example:

[image: Sony patent illustration]


This was originally filed a while ago, but recently renewed. It doesn't mean Sony is going to make us do butterfly kicks.
 

mrklaw

MrArseFace
4 or so, I think. 5 including the final framebuffer.

The thing is, though, to produce the final framebuffer you need to sample from the preceding 4. So in an eDRAM situation, if it's like 360's eDRAM, you need to copy those buffers back to main memory and then sample them from there when rendering the final frame.

So eDRAM size wouldn't actually be that big of a deal in this case. You'd render the buffers one at a time, only one would need to be in eDRAM at a time.

The slightly annoying thing with 360 eDRAM - I am told - was the copy-back to main memory for sampling in the final pass.

If the eDRAM is so big that it can hold all the buffers together, and the GPU can sample from the eDRAM directly, it would get around that issue.

There's talk in Durango's case that there'll be a hardware blitting unit that will let an application copy data from eDRAM back to main memory without needing the CPU's attention (edit - see below for correction). That's reportedly one of the '3 blocks' of custom hardware. Having dedicated hardware to do that means you wouldn't have to wait for a CPU thread to be ready, so it should be faster than with 360. This would be useful if the eDRAM is of limited capacity and you're doing multi-pass rendering of any kind... and a lot of newer lighting and other techniques are quite heavy on multiple passes.

edit - apparently the GPU handled the copy-back on 360, not the CPU. But the point still stands that there's a stall while you wait for that copy-back to complete if the next pass depends on data generated by the previous one. I'm presuming that if a hardware blitter is present in Durango, it works out faster than the copy-back system in 360.

There will still potentially be some stalling if the GPU is doing heavy access to main memory, but probably not as bad as needing the GPU to actually do the work for you.
 

gofreak

GAF's Bob Woodward
There will still potentially be some stalling if the GPU is doing heavy access to main memory, but probably not as bad as needing the GPU to actually do the work for you.

Maybe the blitter has some dedicated lanes to main memory, if that's possible. Optimised for writes only.

=> No contention with other CPU/GPU traffic when doing these write-backs.

Of course the GPU might still be busy doing some other things and not be ready to start the next pass, but the hold-up then wouldn't be the memory system's fault :) There'll still, of course, be a delay, but it would be minimised vs the 360 setup.

It would add some board complexity though, aside from just the hardware itself.
 