
VGLeaks: Details multiple devkits evolution of Orbis

I love and hate these times. Every single bit of info gets glamorized and blown up, and the anticipation of the new consoles gets everyone excited and hyped. But man, I want more concrete information; E3 or the unveils cannot come soon enough... ugh
 

CorrisD

badchoiceboobies
Not really surprising that there is a new camera; it wouldn't make sense to continue with exactly the same hardware when MS is going to whip out a more advanced Kinect. But I do wonder if it will be in the box.

I'm kind of hoping that Sony goes with the split controller they have a patent for, with possible Move add-on balls for the top like we saw in something else. I could imagine that being able to hold either side of the controller wherever you want, instead of the typical two-handed grip of normal controllers, could be quite comfortable.


No, they will design for the lowest common denominator. Sony's third-party efforts weren't a nightmare because of limited specs, but because of an alien work environment compared to the 360.
It's more likely that third parties will stick to the lower bandwidth of Durango and lower RAM of Orbis. Third party development has always been about the lowest common denominator.

Well, that is the hope, but it didn't pan out exactly that way this generation for some titles. Skyrim, for instance, was a complete mess because they evidently planned for the 360's memory limitations rather than the PS3's.

I'm really just worried about another generation of some iffy ports.
 

LiquidMetal14

hide your water-based mammals
Going by the replies I've seen over the last few days I'm surprised nobody's complaining about Orbis requiring an install of Windows.

I guess if you want a sensationalist thread with nonsense discussion then yeah, it should be noted.

Otherwise, it has no bearing on the tone of the discussion.

In short, who cares?
 
The DualShock news is the best "new" news; everything else is more confirmation of what's already known. I'm a bit meh about the dual camera though.
 

Ravage

Member
Any chance the dual cameras will be used for eye tracking?

I suspect only the initial launch units will have GDDR5. As soon as 3D/2.5D/Wide IO becomes sustainable at 192 GB/s, I expect Sony to shift gears and change the unit altogether.

I'm not even aware that this is possible.
 

McHuj

Member
Has it ever been speculated/rumored when the first dev kit (just the R10 board one) showed up?

Given that the architecture is fairly straightforward, it seems to me that Sony (and probably MS) is poised to have a fairly strong launch lineup. Devkits will have been out 18+ months by the time the PS4 ships.
 

i-Lo

Member
Basically what I want to say is that the 192 figure has been there since mid-2011, not early 2012 (so it's older). I won't say if this has changed or not over time.

But the amount did. When developing with less than 2GB in mind and then being allotted an extra 1.5-2GB, how does one go about utilizing this surplus?

Any vague response would do. Thank you.
 
Well, that is the hope, but it didn't pan out exactly that way this generation for some titles. Skyrim, for instance, was a complete mess because they evidently planned for the 360's memory limitations rather than the PS3's.

I'm really just worried about another generation of some lame ports.
It's very unlikely to be the same situation. Much different architecture this time around for Sony, and similar stuff across both consoles.
 

gaming_noob

Member
Basically what I want to say is that the 192 figure has been there since mid-2011, not early 2012 (so it's older). I won't say if this has changed or not over time.

Wow this is really exciting.

If it's higher than 192 GB/s, is it better tech than GDDR5?
 

thuway

Member
Sony could overclock the PS4 with firmware updates like they did with the PSP.

Until millions of PS4s come back in because of overheating. These machines are designed to a tee; 1.84 TF is beastly, and if you think it's anything short of that, go buy yourself a nice PC gaming rig. In a closed environment this should perform quite similarly to a GTX 680.
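For what it's worth, that 1.84 TF number does fall straight out of the leaked GPU specs. A quick sanity check, assuming the rumored 18 GCN compute units and 800 MHz clock (both from leaks, not confirmed):

```python
# Sanity check of the rumored 1.84 TFLOPS figure. Assumes the leaked
# specs: 18 GCN compute units with 64 ALUs each, an 800 MHz GPU clock,
# and 2 FP32 ops per ALU per cycle (a fused multiply-add).
compute_units = 18
alus_per_cu = 64
gpu_clock_hz = 800e6
ops_per_alu_per_cycle = 2

peak_flops = compute_units * alus_per_cu * gpu_clock_hz * ops_per_alu_per_cycle
print(f"{peak_flops / 1e12:.2f} TFLOPS")  # 1.84 TFLOPS
```

So the figure isn't a marketing round-up; it's the theoretical peak of that exact configuration.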
 

Nirolak

Mrgrgr
But the amount did. When developing with less than 2GB in mind and then being allotted an extra 1.5-2GB, how does one go about utilizing this surplus?

Any vague response would do. Thank you.

Well, everything a game does takes memory.

AI, shadows, anti-aliasing, textures, level size, the number of enemies, the number of environmental props, the amount of animation, and more.

Really you can fill it up with a lot of things. If it's a cross generation title though, pretty much all of that filling has to be graphics related since you can't break your gameplay on the older consoles unless it's simply something like increasing player count in a Battlefield title.
 
So, in the console the special compute module is integrated with the GPU?

Heh heh. Good try. Although I think he implicitly confirmed the special compute module ;). So thuway, you will have your 2 teraflops...

About developers basing on the lowest common denominator, there is one thing:
-If they develop based on the console with less RAM: there will be the same number of NPCs and objects in both games.
-If they develop based on the console's bandwidth: will the PS4 always have much greater frame rates, or are they going to make the PS4's version slower on purpose?
 
Has it ever been speculated/rumored when the first dev kit (just the R10 board one) showed up?

Given that the architecture is fairly straightforward, it seems to me that Sony (and probably MS) is poised to have a fairly strong launch lineup. Devkits will have been out 18+ months by the time the PS4 ships.
From what lherre said above, target specs at least have been out since mid-2011, so that might provide a clue.

That could mean in theory games have been in planning and development for Orbis for like 18 months already.
 

Jellzy

Neo Member
I think Thuway's theory on the OS is spot on and will follow a similar route to the Vita's OS (PlayStation OS).

The memory is fast enough to offload data at such a high rate that when a game is paused (suspended; again, think Vita), only the vital memory required for the game to stay alive in the background is kept, freeing up RAM for the OS to open and close various apps, other games, etc.
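Rough numbers back this up: at the rumored 192 GB/s, even multi-gigabyte working sets take only milliseconds to move around, at least within RAM itself. (The working-set sizes below are illustrative guesses; anything paged out to a hard drive would be bound by the far slower drive, not memory bandwidth.)

```python
# Back-of-the-envelope timings for the suspend idea: how long it takes
# to read or copy a game's working set at the rumored 192 GB/s memory
# bandwidth. Sizes are illustrative guesses, and this only covers moves
# within RAM; writing out to storage would be much slower.
bandwidth_gb_per_s = 192
for working_set_gb in (1, 2, 3.5):
    ms = working_set_gb / bandwidth_gb_per_s * 1000
    print(f"{working_set_gb} GB -> {ms:.1f} ms")
```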

The double camera will no doubt be for detecting depth when using the new controller... Possibly allowing 3D photos and video to be taken also?

Think of the new controller as a hybrid dualshock and move controller.

My biggest gripe with the move controller is a lack of analogue sticks, and also the massive sphere which is understandably a requirement.

Imagine this new controller can be disconnected into two halves, each with its own analogue stick, and instead of the sphere, possibly an LED strip around the back of each half, or at least a few LEDs under a translucent back (think of how infrared works in a remote control).

The camera, along with various sensors, would detect the controller's position in space and, with two analogue sticks, make character movement and camera movement more precise.

Moving the camera around in games that used Move was always its biggest problem.

Sorry for the quick rambling and mess of this post, just got a few minutes and thought I'd chuck in my 2 cents.
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
Wow this is really exciting.

If it's higher than 192 GB/s, is it better tech than GDDR5?

You can get higher with GDDR5; the Tesla K20X used 384-bit GDDR5 to achieve 250 GB/s. Some 79xx AMD cards do something similar (384-bit, >260 GB/s).
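All of those figures fall out of the same arithmetic: peak GDDR5 bandwidth is the bus width in bytes times the effective data rate. A quick check (the 256-bit/6 GT/s split shown for Orbis is just one plausible combination that yields 192 GB/s, not a confirmed spec):

```python
# Peak GDDR5 bandwidth = bus width (in bytes) x effective data rate.
def gddr5_bandwidth_gb_s(bus_width_bits, data_rate_gt_s):
    """Peak bandwidth in GB/s for a given bus width and effective data rate."""
    return bus_width_bits / 8 * data_rate_gt_s

print(f"{gddr5_bandwidth_gb_s(384, 5.2):.1f} GB/s")  # 249.6 GB/s -- the K20X's ~250
print(f"{gddr5_bandwidth_gb_s(256, 6.0):.1f} GB/s")  # 192.0 GB/s -- the rumored Orbis figure
```

So "better tech than GDDR5" isn't really the question; a wider bus or faster chips on plain GDDR5 already gets you past 192 GB/s.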
 

B.O.O.M

Member
Damn, I got excited when it was rumored they were dropping the DualShock, and then this happens. Let's hope it's only DualShock in name.

This is a tough situation for Sony :D Fans in general are opposed to major change. Best thing they could do is either find a controller that is a middle ground or make the DS3 compatible with the PS4.

The PS4 is never coming out soon. :(

why?
 

i-Lo

Member
Well, everything a game does takes memory.

AI, shadows, anti-aliasing, textures, level size, the number of enemies, the number of environmental props, the amount of animation, and more.

Really you can fill it up with a lot of things. If it's a cross generation title though, pretty much all of that filling has to be graphics related since you can't break your gameplay on the older consoles unless it's simply something like increasing player count in a Battlefield title.

I see. However, is that a substitute for a project that began development knowing the exact final specs, rather than employing additive measures to fill the void? Ergo, are the launch titles going to be indicative of the PS4's true capabilities (the twilight years' showcases notwithstanding), or will the second wave be what the launch titles should have been? The answer comes down to an educated guess at best, I suppose.
 

thuway

Member
Basically what I want to say is that the 192 figure has been there since mid-2011, not early 2012 (so it's older). I won't say if this has changed or not over time.

Megaton right here, folks. Time to throw the GDDR5 Kool-Aid out the window. I'm always pessimistic about these figures. It's much easier to go down, and there is very little benefit to going up.
 

Norml

Member
Dual camera?

Maybe it has to do with this recent patent?

http://patft.uspto.gov/netacgi/nph-...ALL&S1=08355529&OS=PN/08355529&RS=PN/08355529

 

deanos

Banned
Until millions of PS4s come back in because of overheating. These machines are designed to a tee; 1.84 TF is beastly, and if you think it's anything short of that, go buy yourself a nice PC gaming rig. In a closed environment this should perform quite similarly to a GTX 680.
The chip in the PSP was designed to run faster than it does out of the box.
By default, the CPU runs at 222 MHz, but the MIPS R4000 processor was designed to run at 333 MHz with a 166 MHz bus.
Sony allowed developers to change the clock speed from within the game.
 
The chip in the PSP was designed to run faster than it does out of the box.
By default, the CPU runs at 222 MHz, but the MIPS R4000 processor was designed to run at 333 MHz with a 166 MHz bus.
Sony allowed developers to change the clock speed from within the game.

SoCs are hot because the GPU and CPU are on the same chip. Upping the clock rate post-launch is a bad idea.
 
I am hearing things about GDC (March 25-29), so who knows.

And there were talks that Sony could show something within weeks.

I get the feeling we will get a slow rollout of reveals.

1st reveal - PlayStation event in February - show a trailer of in-game footage and at the end show the PS4 logo

2nd reveal - GDC in March - this is where the console gets unveiled and we get confirmation of specs, controller, and system design, but perhaps not much more other than some more target demos (rubber ducky tech demo?) and perhaps another game trailer.

3rd reveal - E3 - a deep dive into the OS and features of the PS4 as well as numerous games, also pricing and availability.
 

Ashes

Banned
Until millions of PS4s come back in because of overheating. These machines are designed to a tee; 1.84 TF is beastly, and if you think it's anything short of that, go buy yourself a nice PC gaming rig. In a closed environment this should perform quite similarly to a GTX 680.

You ought not to say stuff like that. But whatever...
 
It sounds like a stereoscopic camera, yeah.

It would explain why they might be able to do Move-style tracking with just an LED strip on the controller, which is rumoured. The sphere was there, and was so big, to enable reasonable depth calculation. If there's a depth camera, something a lot more subtle should do.

Also, camera res should be improved, which helps too. Not buying the splitting-controller idea despite the patent. You can always use a Move for that, and it sounds like they're still compatible.

Two cameras might also let you mirror the room in 3D for games like EyePet and Wonderbook; not sure how well that works.
 

Kagari

Crystal Bearer
Until millions of PS4s come back in because of overheating. These machines are designed to a tee; 1.84 TF is beastly, and if you think it's anything short of that, go buy yourself a nice PC gaming rig. In a closed environment this should perform quite similarly to a GTX 680.

Really?
 

Nirolak

Mrgrgr
I see. However, is that a substitute for a project that began development knowing the exact final specs, rather than employing additive measures to fill the void? Ergo, are the launch titles going to be indicative of the PS4's true capabilities (the twilight years' showcases notwithstanding), or will the second wave be what the launch titles should have been? The answer comes down to an educated guess at best, I suppose.

Well, they've only had the actual dev kits since January, so it'd be really hard to get anywhere near maxing out the system in about 10-11 months.

I would expect we will see near-peak performance around year 3, or about the same time we saw games like Uncharted 2 coming out, with relatively marginal upgrades after that.

Though, given that developers have a lot more control over the rendering pipeline this time, they might be able to make high impact (in terms of consumer perception) changes later on, but I suspect anything in terms of large scale brute force improvements will largely stop by then.
 

MaulerX

Member
So that 192 GB/s bandwidth figure is from 2011? Sounds like one of those situations where they shoot for the stars and land on the moon. I'm prepared for the final figure to be less, maybe 140 GB/s to 160 GB/s. But that's just a realistic hunch.
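Running that hunch backwards: if you hold the bus width fixed at the commonly rumored 256 bits (an assumption, not a confirmed spec), each bandwidth target maps to a specific effective GDDR5 data rate, so a downgrade would just mean slower memory chips rather than a redesigned bus:

```python
# Effective GDDR5 data rate needed for each bandwidth target on a
# 256-bit bus (the bus width is an assumption, not a confirmed spec).
bus_width_bits = 256
for target_gb_s in (140, 160, 192):
    data_rate = target_gb_s * 8 / bus_width_bits
    print(f"{target_gb_s} GB/s needs {data_rate:.3f} GT/s effective")
```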
 

Jellzy

Neo Member
Also, camera res should be improved, which helps too. Not buying the splitting-controller idea despite the patent. You can always use a Move for that, and it sounds like they're still compatible.

Possibly, plus it also negates the rumour of a touchpad (realistically, anyway).

But why not go the full way and release a hybrid controller, so that Move elements can be added to any game while still having two analogue sticks to fully control character movement and also control the camera more realistically?
 
Do these guys actually know what a real-time OS is?

I'm not exactly sure what they mean either. For all intents and purposes, "real-time OS" = "embedded OS", and I've never seen an embedded OS provide the rich type of environment you see on modern desktops or the PS3/Xbox.

Besides, it sounds like he's talking about gains that can be made when coding in assembly anyway.
 