
VGLeaks: PlayStation 4 "Orbis" Roadmap

Mikey Jr.

Member
I think Sony definitely needs an FPS "Halo killer" type of game near the launch of PS4, but I don't think the Killzone franchise has the reputation to be that. It already has a certain stigma of "yeah, this is a solid series, but it isn't essential," whereas Halo is truly a sensation for gamers. If it were up to me I would assign Guerrilla to a new FPS IP, give them plenty of resources, and hype the shit out of it upon release. I don't even like FPS games personally, but that's what I would do in Sony's shoes.

The thing is that KZ MP has to do something new. If it's just like Halo or COD, then what's the point?
 
R

Rösti

Unconfirmed Member
Hmm, I apparently overlooked this though: ftp://superdae.com.au/

Probably nothing, but who knows. It seems like a rather insecure way to share information anyhow, if he had shared the login details, that is. The fact remains that the main site is gone.
 

CLEEK

Member
It doesn't matter what the game is called; what matters is marketing presence, and I don't see Sony ever getting near what Microsoft accomplishes with Halo.

Killzone just doesn't have the critical clout that helped Halo.

Halo CE was pretty much perfect. A technical and playable marvel. Its popularity was based on its excellence. Halo 2, for all of its faults in single player, brought true online gaming to the masses. The hype for Halo 3, while well handled by MS and Bungie, was huge anyway.

A closer comparison for Killzone would be Gears. A well-marketed, high-profile game that scores well enough and has its fans, but will never reach the critical highs of Halo or Mario or other top-tier games.
 

erpg

GAF parliamentarian
Killzone gets (NEEDS) a reboot.
Ratchet and Clank gets a much-needed transition to a better developer than the creatively void Insomniac.
SOCOM and Resistance get the boot.
Dark Cloud gets another chance.
Demon's Souls 2 becomes Japan Studio's reason to exist.
Santa Monica gets free rein to do whatever.

That's my hit list. Yoshida has to trim the fat and get bloody.
 

Lord Error

Insane For Sony
Cool - you made a post showing I'm right. For the record - dude didn't reference shooters, specifically
But you're not right. 116ms is very good for a 30FPS game. At the time the game was made, 100ms was believed to be the theoretical minimum for a 30FPS game, because only one or two games had accomplished it. It's pretty clear that they took the issue seriously, worked hard to improve latency, and managed to improve it a lot.
 

Jack_AG

Banned
The whole thing was completely overblown. KZ2 didn't have much more controller lag than other games tested back then, like GTAIV etc.
Which were also blasted for their video latency (peripheral lag), although they weren't Sony exclusives and therefore didn't get as much heat for it.

I'm just looking forward to less video latency, which means less feeling of peripheral lag. Some TVs are nailing it with 8 and 16ms of video latency (not the printed "response time", which should really be called "pixel state change" but isn't, and is therefore misleading). Here's hoping we can get PC-like experiences where the next frame refresh is the only frame to wait for... unless you run deferred processes that compound rendering times (cough FROSTBITE 2 cough), in which case it's just shit.
 
I wouldn't mind some genre changes for these IPs. Killzone - 3rd person action adventure. Ratchet & Clank - 2D platformer. Syphon Filter - open world badassness.
 

i-Lo

Member
It'll be interesting to see what kind of FPS SSM has been developing...

Which were also blasted for their video latency (peripheral lag), although they weren't Sony exclusives and therefore didn't get as much heat for it.

I'm just looking forward to less video latency, which means less feeling of peripheral lag. Some TVs are nailing it with 8 and 16ms of video latency (not the printed "response time", which should really be called "pixel state change" but isn't, and is therefore misleading). Here's hoping we can get PC-like experiences where the next frame refresh is the only frame to wait for... unless you run deferred processes that compound rendering times (cough FROSTBITE 2 cough), in which case it's just shit.

How's the latency on LED displays these days (I remember Samsung LCDs were among the weaker-performing ones)? Our family bought a plasma at my request just so that I could sidestep this issue.
 

Jack_AG

Banned
116ms is the same as the vast majority of 30fps games.

But you're not right. 116ms is very good for a 30FPS game. At the time the game was made, 100ms was believed to be the theoretical minimum for a 30FPS game, because only one or two games had accomplished it. It's pretty clear that they took the issue seriously, worked hard to improve latency, and managed to improve it a lot.

116 @ 30 - the math does not work. For 60, sure, it does; for 30, no. A frame at 30fps lasts ~33ms, so whole-frame latencies land at 99ms (3 frames) or 132ms (4 frames) for 30fps, not 116.

A lot more games using a forward renderer, no vsync, and no triple buffering hit 99ms - which is a majority.

Uncharted 1, for example, nailed the 3@30 rule, then jumped to 4@30 for Uncharted 2 once vsync and an extra buffer were implemented.

Deferred rendering, AA, etc. all add those precious milliseconds to rendering a single frame.

Then we must consider framerate drops: frames carried over and displayed from the buffer to keep from tearing or dropping (holdovers) increase latency even further.

Since most games don't use deferred rendering, vsync, or double/triple buffering (as seen by countless torn frames, framerate drops, etc. in 90% of games), the majority hit the mark.

I'm not arguing that KZ3 wasn't an improvement over the "swim through molasses" KZ2 - but to have a latency of 4+ frames isn't "good".
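
To spell out the arithmetic I keep leaning on, here's a quick sketch (my own illustration, assuming vsync'd output where latency lands on whole-frame boundaries, and 60Hz capture gear; none of these numbers are measurements):

```python
# Whole-frame latency math: with vsync, input-to-display latency can
# only land on whole-frame boundaries of the refresh rate.

def frame_time_ms(fps):
    """Duration of one frame in milliseconds."""
    return 1000.0 / fps

def latency_ms(frames, fps):
    """Latency of a whole number of frames at a given framerate."""
    return frames * frame_time_ms(fps)

for fps in (30, 60):
    for frames in (3, 4, 5, 7):
        print(f"{frames} frames @ {fps}fps = {latency_ms(frames, fps):.0f}ms")
```

3 frames at 30fps is ~100ms and 4 frames is ~133ms, so 116ms isn't a whole-frame value at 30fps - but it is ~7 refreshes of a 60Hz display (7 x 16.7 = 117ms), which may be how a 30fps game still shows up as 116ms on 60Hz capture equipment.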

It's a monumental step in a direction the industry has never been in. From my Atari 2600 through my PS2, I've never experienced it quite like this gen. Of course, you say "BUT YAR TEVEEEE!" and, of course, you'd be correct to suspect it, but the numbers are measured with a CRT ;)

I still have my old gaming CRT from my old PC that I use to test latency between monitors, TVs I'm interested in, etc.

I ballpark my laptop by comparing it to my CRT, then drag the laptop with me to look at TVs: hook up over HDMI, mirror the screen, run my ms timer, take a few pics with a fast shutter, compare notes, walk out happy.
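
The "ms timer" doesn't have to be anything fancy - something like this throwaway counter does the job (my own quick version, not any particular tool):

```python
# On-screen millisecond counter for the mirrored-display photo test:
# photograph the laptop screen and the TV side by side, and the
# difference between the two displayed values is the TV's added latency.
# (Terminal redraw rate limits precision, but it's fine for ballparking.)
import time

start = time.perf_counter()
try:
    while True:
        elapsed_ms = (time.perf_counter() - start) * 1000.0
        print(f"\r{elapsed_ms:10.1f} ms", end="", flush=True)
except KeyboardInterrupt:
    print()
```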

Anyhow - my point is - no more fuggin video latency. I am hoping the machines will be capable enough to produce frames fast enough to limit the number of holdovers - even 3 frames is unacceptable, even though it is the theoretical "floor".

Crossing my fingers that next-gen will change.

How's the latency on LED displays these days (I remember Samsung LCDs were among the weaker-performing ones)? Our family bought a plasma at my request just so that I could sidestep this issue.

I haven't tested LED displays since I've read they aren't the best, so I've stuck with known LCD TVs. I'm not as big a videophile as I am an audiophile, so I have a crappy LG. The picture isn't amazing - it's great, not amazing - but I measured 16ms at 60fps and 33ms at 30fps. Other TVs hit the same numbers, but not my wallet a few years ago.

My next purchase will be about $170 for a 23" monitor with a measured 8ms of video latency - used in fighting tourneys for obvious reasons... although those were 2012 models. I'll be keeping up with changes this year, as I already have a dead pixel on my LG - waiting for a few more to go and get annoying before scrapping it.
 

CLEEK

Member
Not necessarily.

Remember how IBM piggybacked off CELL for the XENON processor in the X360?

What IBM did was legal, and due to an oversight in the contractual terms between Sony and IBM. If either MS or Sony hasn't written binding clauses into their contracts with AMD, the same thing could occur again.

I'd imagine that the CELL technology being used by MS in the 360 is second only to PS3 haemorrhaging money as Sony's biggest mistake.
 
Call of Duty didn't really break through like Halo did until its fourth numbered title; Killzone has a chance. Killzone 2 is fantastic, one of my favorite multiplayer games ever, and while Killzone 3 was disappointing, it wasn't BAD; just average. I think that there's enough goodwill left for the series for it to deserve another shot on the PlayStation 4. I think Killzone games always serve as great tech demos (although they're always upset later that year by Naughty Dog), and I don't think that Guerrilla has made the best Killzone they can yet.

I love InFamous, but I'd rather leave the fantastic ending(s) of 2 alone than have an unnecessary sequel. The endings are too different to create a sequel with a common starting point.

Ratchet and Clank should slow down, LittleBigPlanet should be a once-a-generation thing or only produce really significant sequels and spin-offs (which they've been good about so far), and Resistance should just be handled a little more carefully.

I love Sony's franchises, but I trust their developers more. If they can deliver on the same quality level next generation as they have this generation, I can continue to pledge my allegiance to PlayStation.
 

HooYaH

Member
I expect a port of Planetside 2, an updated version of Dust 514, and Free Realms (is this game successful?) during the launch period. I have a gut feeling there will be an F2P Wipeout.
 
Not necessarily.

Remember how IBM piggybacked off CELL for the XENON processor in the X360?

CELL was a 3-company investment (Sony/Toshiba/IBM - it wasn't just meant for games). IBM probably owned that part of the design.

Situation here is that MS hired AMD to design for them, and MS will own the "rights" to it, from what little I understand.
 

Shayan

Banned
CELL was a 3-company investment (Sony/Toshiba/IBM - it wasn't just meant for games). IBM probably owned that part of the design.

Situation here is that MS hired AMD to design for them, and MS will own the "rights" to it, from what little I understand.

Cell was meant for games, not just for general purpose computing. UC3 and GOW3 are possible because of Cell, since RSX is weaker compared to Xenos.
 

Swifty

Member
Cell was meant for games, not just for general purpose computing. UC3 and GOW3 are possible because of Cell, since RSX is weaker compared to Xenos.
umm, no. Cell is a processor that's great for bandwidth but not latency. It's a powerful processor, but games need speed, not throughput. Sure, it can crunch a lot of numbers, but what's the point if it's difficult to get your result in real time compared to other solutions?
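
A rough way to picture the throughput-vs-latency distinction (toy numbers of my own, nothing to do with Cell's actual pipeline depths):

```python
# A deep pipeline can finish lots of work per second (throughput)
# while each individual result still takes a long time to come back
# (latency) - fine for streaming media, painful for game logic.
stage_ms = 10                   # assumed cost of one pipeline stage
stages = 8                      # assumed pipeline depth
latency = stage_ms * stages     # first result arrives after 80ms
throughput = 1000 / stage_ms    # but 100 results/sec once the pipe is full
print(f"latency {latency}ms, throughput {throughput:.0f}/s")
```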
 
Speaking of launch titles and GT6, we all know Polyphony likes to take their time with the series. Could it be possible for them to release GT6: Prologue when the PS4 launches, given how many copies of GT5: Prologue sold?
 

Servbot24

Banned
The thing is that KZ MP has to do something new. If it's just like Halo or COD, then what's the point?

Naturally. I don't think it will ever be possible to match those series unless you introduce a few completely new concepts to the genre.

Chan: Took them 3 years or so to make Prologue, so I'd say it's possible... highly doubt it will happen though. Maybe late 2014.
 
Speaking of launch titles and GT6, we all know Polyphony likes to take their time with the series. Could it be possible for them to release GT6: Prologue when the PS4 launches, given how many copies of GT5: Prologue sold?

They might, or they might just do what they did with the PS2: a GT early on that has a fair amount of content.
They do have all the cars and tracks from GT5, plus what they've been working on for the past few years.
Then 2 or 3 years later GT7, which IMO is the best way to go, since they can always add DLC to GT6 until GT7 is ready.
Getting a GT game out early, like in the first year or half-year, is a must for Sony and would really help in the EU.
 

NBtoaster

Member
116 @ 30 - the math does not work. For 60, sure, it does; for 30, no. A frame at 30fps lasts ~33ms, so whole-frame latencies land at 99ms (3 frames) or 132ms (4 frames) for 30fps, not 116.

A lot more games using a forward renderer, no vsync, and no triple buffering hit 99ms - which is a majority.

Uncharted 1, for example, nailed the 3@30 rule, then jumped to 4@30 for Uncharted 2 once vsync and an extra buffer were implemented.

Deferred rendering, AA, etc. all add those precious milliseconds to rendering a single frame.

Then we must consider framerate drops: frames carried over and displayed from the buffer to keep from tearing or dropping (holdovers) increase latency even further.

Since most games don't use deferred rendering, vsync, or double/triple buffering (as seen by countless torn frames, framerate drops, etc. in 90% of games), the majority hit the mark.

You're right that 116ms isn't the average - but the average is higher:

http://www.eurogamer.net/articles/digitalfoundry-lag-factor-article?page=3

The average videogame runs at 30FPS, and appears to have an average lag in the region of 133ms.

And 116ms or even 83ms is possible in 30fps games:

http://www.eurogamer.net/articles/digitalfoundry-nfs-hot-pursuit-face-off?page=2

multiple measurements with Xbox 360 Need for Speed: Hot Pursuit confirm a five frame delay - so 83ms in total.
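
Those figures are consistent with straight frame arithmetic (my own check, assuming the measurements count 60Hz frames even for 30fps games, which is how Digital Foundry's capture gear samples):

```python
# Eurogamer/DF lag figures expressed as counts of 60Hz frames (~16.7ms each).
frame_60hz = 1000 / 60    # ~16.7ms per 60Hz refresh
print(5 * frame_60hz)     # five-frame delay -> ~83ms (NFS: Hot Pursuit)
print(8 * frame_60hz)     # eight refreshes  -> ~133ms (the 30fps average)
```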
 
CELL was a 3 company investment(Sony/Toshiba/IBM - it wasn't just meant for games). IBM probably owned that part of the design.

Situation here is MS hired AMD to design for them, and MS will own the "rights" to it. From what little I understand.

MS will own the rights to one specific design. They will not own the rights to the philosophy behind components and processors that accomplish a certain function in general.

It's why I think the whole hype surrounding the special chips (especially the claims that they will turn a 1.2TF chip into a 3TF chip) is largely horseshit.

In the case of Xenos, unified shaders were a known quantity and AMD was able to advance their technology so it was included first in the Xbox 360.

If these magical chips were so great, we'd see them on an AMD or nVidia roadmap for chips coming out in the future. But we don't. So that tells me that these chips aren't magic, but just something that may be tailored to Microsoft's goals for the project as a whole rather than something that's the next big step in graphics tech.
 
MS will own the rights to one specific design. They will not own the rights to the philosophy behind components and processors that accomplish a certain function in general.

It's why I think the whole hype surrounding the special chips (especially the claims that they will turn a 1.2TF chip into a 3TF chip) is largely horseshit.

In the case of Xenos, unified shaders were a known quantity and AMD was able to advance their technology so it was included first in the Xbox 360.

If these magical chips were so great, we'd see them on an AMD or nVidia roadmap for chips coming out in the future. But we don't. So that tells me that these chips aren't magic, but just something that may be tailored to Microsoft's goals for the project as a whole rather than something that's the next big step in graphics tech.

It could all be just a bunch of hot air.
 

Gorillaz

Member
Haha, I like the "What?" when asked if he has an Orbis dev kit, followed by the smile. All but confirmation.



I'm actually not so sure. Not everyone likes shooters, and having one preinstalled might put the "shooters console" stigma onto the PS4, something I think the 360 has suffered from.

It would be a good FPS and a nice way to begin an F2P model on PS4. I mean, UC3 is basically an F2P MP now, so it would work. Maybe not preinstalled, but like I said, it should be a PS Plus type of thing at least.
 

thuway

Member
MS will own the rights to one specific design. They will not own the rights to the philosophy behind components and processors that accomplish a certain function in general.

It's why I think the whole hype surrounding the special chips (especially the claims that they will turn a 1.2TF chip into a 3TF chip) is largely horseshit.

In the case of Xenos, unified shaders were a known quantity and AMD was able to advance their technology so it was included first in the Xbox 360.

If these magical chips were so great, we'd see them on an AMD or nVidia roadmap for chips coming out in the future. But we don't. So that tells me that these chips aren't magic, but just something that may be tailored to Microsoft's goals for the project as a whole rather than something that's the next big step in graphics tech.

Prepare to be proven wrong.
 
And KZ2 did that.....
KZ3 tried to be a bit more COD, which was disappointing.
I really miss KZ2 back in its prime. Man, the beta was soooo fuckin good.

Killzone 2 was so fucking good. That first week or 2 when it first came out and everything was new and crazy, fuck, it was like we were somehow playing a PS4 game 5 years early.
 

thuway

Member
So explain why this isn't in any of the upcoming GPUs. If it were so great, you'd think AMD would want a competitive advantage against nVidia.

To be quite honest, the custom silicon is just that - CUSTOM. You have to understand, peak FLOPS are not the only determinant of performance. GAF has been fixated on that number since the beginning. I don't know about this bitter thing, but I do know it will have DSPs, large caches, and eDRAM.

The machine will be 1.28 TF with 8 GB DDR3/4 alongside eDRAM, like we've mentioned previously over and over and over again.

It won't be long till someone leaks it all.

PS4 is 1.84 TF, 4 GB of RAM at 192 GB/s, alongside 4 Steamroller cores. These are your next-gen systems. Derive what you can and will.
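
For what it's worth, those headline numbers fall out of simple arithmetic (a back-of-the-envelope of my own; the shader count, clock, and bus width below are typical GCN-style assumptions, not figures from the leak):

```python
# FLOPS: a GCN-style ALU does 2 ops (fused multiply-add) per cycle.
shaders  = 1152                        # assumed ALU count
clock_hz = 800e6                       # assumed 800MHz clock
print(shaders * 2 * clock_hz / 1e12)   # -> 1.84 TFLOPS, matching the rumor

# Bandwidth: a 256-bit GDDR5 bus at 6 Gbps per pin (again, assumed).
bus_bits = 256
pin_bps  = 6e9
print(bus_bits / 8 * pin_bps / 1e9)    # -> 192.0 GB/s
```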
 
So explain why this isn't in any of the upcoming GPUs. If it were so great, you'd think AMD would want a competitive advantage against nVidia.

Well, GCN2 will be on their Sea Islands GPUs. That's one thing that is on the roadmap.

Another thing could be that this is an APU, and the whole AMD "make it yourself" approach to the future of the microchip industry maybe wouldn't work if AMD were just bringing people on board and stealing shit. It could also be that this wouldn't work in the discrete GPU market, and you would have to design a whole system (in this case an APU) from top to bottom to work the way MS intends it to.

There could be a lot of factors, with the most obvious one being that the AMD team working with MS (according to rumors, AMD's priority) has a different level of resources, had different schedules to work on the design, and cannot share it with the teams that work/worked with Sony and Nintendo.

Hence why, reportedly, Sony is going with a more off-the-shelf design than MS, and why the rumored specs in the PS4 seem much more straightforward. This is not to say that Sony won't customize its own APU according to their own philosophies. But we speculate based on what we have.
 
To be quite honest, the custom silicon is just that - CUSTOM. You have to understand, peak FLOPS are not the only determinant of performance. GAF has been fixated on that number since the beginning. I don't know about this bitter thing, but I do know it will have DSPs, large caches, and eDRAM.

The machine will be 1.28 TF with 8 GB DDR3/4 alongside eDRAM, like we've mentioned previously over and over and over again.

It won't be long till someone leaks it all.

It's custom, worked on by very smart AMD engineers who also have an interest in progressing GPU technology forward.

It's not like AMD is just going to develop some groundbreaking new tech for Microsoft exclusively, knowing full well of its own potential internally when they pitched the idea to Microsoft.

Like I said before, the processor arrangement in Durango is probably custom-made for Microsoft's overall goals for the project, but it does nothing magical to increase performance efficiency per watt in a general sense for overall graphics output. It may be tailor-made for certain processing functions that Microsoft wants to emphasize, but in the end Microsoft is limited to the same power & heat constraints that Sony is for a closed box.
 

Shayan

Banned
The thing is that KZ MP has to do something new. If it's just like Halo or COD, then what's the point?

I think Sony wanted to make KZ3 COD-ish and it didn't work. KZ2 was perhaps the best FPS this gen in my opinion, and sticking to that formula will do fine.

Now every FPS wants to be another COD or BF. Those are the 2 big names in terms of sales.
 
It's custom, worked on by very smart AMD engineers who also have an interest in progressing GPU technology forward.

It's not like AMD is just going to develop some groundbreaking new tech for Microsoft exclusively, knowing full well of its own potential internally when they pitched the idea to Microsoft.

Like I said before, the processor arrangement in Durango is probably custom-made for Microsoft's overall goals for the project, but it does nothing magical to increase performance efficiency per watt in a general sense for overall graphics output. It may be tailor-made for certain processing functions that Microsoft wants to emphasize, but in the end Microsoft is limited to the same power & heat constraints that Sony is for a closed box.

Which doesn't mean shit. With that same amount of limited power and heat, you can come up with a piece of garbage if you really want to.
 

iamvin22

Industry Verified
To be quite honest, the custom silicon is just that - CUSTOM. You have to understand, peak FLOPS are not the only determinant of performance. GAF has been fixated on that number since the beginning. I don't know about this bitter thing, but I do know it will have DSPs, large caches, and eDRAM.

The machine will be 1.28 TF with 8 GB DDR3/4 alongside eDRAM, like we've mentioned previously over and over and over again.

It won't be long till someone leaks it all.

PS4 is 1.84 TF, 4 GB of RAM at 192 GB/s, alongside 4 Steamroller cores. These are your next-gen systems. Derive what you can and will.

You and canchruser seem to be the only posters making sense, but I have a question: these are closed, dedicated GAMING consoles. With 1.28-1.84TF and the types of RAM they both house, what can we expect performance-wise if we were to guess/compare it to one of today's PC games?

edit: let's add sawyer to this as well ;)
 
To be quite honest, the custom silicon is just that - CUSTOM. You have to understand, peak FLOPS are not the only determinant of performance. GAF has been fixated on that number since the beginning. I don't know about this bitter thing, but I do know it will have DSPs, large caches, and eDRAM.

The machine will be 1.28 TF with 8 GB DDR3/4 alongside eDRAM, like we've mentioned previously over and over and over again.

It won't be long till someone leaks it all.

PS4 is 1.84 TF, 4 GB of RAM at 192 GB/s, alongside 4 Steamroller cores. These are your next-gen systems. Derive what you can and will.

I thought PS4 was going with Jaguar cores too, unless I'm remembering wrong.
 
You and canchruser seem to be the only posters making sense, but I have a question: these are closed, dedicated GAMING consoles. With 1.28-1.84TF and the types of RAM they both house, what can we expect performance-wise if we were to guess/compare it to one of today's PC games?

edit: let's add sawyer to this as well ;)

If we wanna buy into the hype, we are talking about a "what if games were designed from the ground up to take advantage of a 680?".

All we have right now are games that were designed either with consoles in mind or, if they are PC-only, weren't really designed for a 680.

Crysis 3 would be the starting point, I guess, if you want to compare. But you can already take a look at Star Wars 1313 and Watch Dogs and go from there.

Again, they were shown last year, and the consoles are releasing in late 2013. For comparison's sake, Dark Sector was shown in early 2005; one year later, Gears of War was knocking at the door.

Crysis 1 was released in November 2007, for comparison's sake.

I thought PS4 was going with Jaguar cores too, unless I'm remembering wrong.

Recent rumors have been pointing to Steamroller: fewer cores, but clocked much higher. Another design difference where we'll have to wait and see what devs officially have to say later this year.
 
So what's up with the dev complaints? And devs being happier with x3?

MS brought key 3rd parties on board way earlier. That might mean better dev support, better tools, a better relationship. Maybe they let developers in on certain decisions (like RAM amount)? That would explain 8 GB.
 