
Kotaku (Jan 23): Have official Orbis docs; Devkit: PS4 - 8GB RAM, 2.2GB VRAM, Controller

Not even joking around when I say it, I trust in Kaz.

[Image: Kazunori Yamauchi, creator of the Gran Turismo game series]


[Image: Kaz rollin']


I believe in both of them.

OT, I am useless at tech stuff, but does this give any idea about BC on the PS4?
 

Hydrargyrus

Member
I think that the rear touch pad on the controller is a great idea, but it needs to be well implemented in games, not like they did with the Sixaxis.
 
If these specs are true, Sony has not been able to hit the mark, or they are reconsidering their stance on "final hardware". The last I heard, in September, the RAM ceiling was raised to 4 GB UMA at 192 GB/s. If this holds weight, then we have to modify our expectations. Final silicon should ship this month, so by that time we should have a good idea what Sony is gunning for. Once again, it's all about the RAM.
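For what it's worth, that 192 GB/s figure is at least self-consistent with a 256-bit GDDR5 bus running at 6 GT/s. The bus width and transfer rate here are my own assumptions, not anything confirmed, so treat this as a sanity check on the rumored number rather than a spec:

```python
# Sanity check on the rumored 192 GB/s bandwidth figure. The bus width and
# transfer rate are assumptions (256-bit GDDR5 at 6 GT/s), not confirmed specs.
bus_width_bits = 256      # assumed memory bus width
transfer_rate_gts = 6.0   # assumed giga-transfers per second per pin

# bytes per transfer across the whole bus, times transfers per second
bandwidth_gb_s = (bus_width_bits / 8) * transfer_rate_gts
print(bandwidth_gb_s)  # 192.0
```

Other bus/clock combinations could hit the same number, so this doesn't pin down the actual configuration.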

Well, I don't think you are wrong, but I don't see any grave changes so far. The 10.2 GB could easily result in a 4 GB GDDR5 + 256 MB (legacy RAM) configuration. The devkit having 8 GB could be because, e.g., Sony's ICE Team released new debug software to test, or because certain developers requested more during the development process.

Where do you see a big change?
 
Love a touchpad on the back. Such a great idea. Flip the controller over for menus in RPGs, navigating the OS, gesture gameplay. From the front, slide actions, camera control.

It's unusual, which is why they should try it.
 

jaosobno

Member
For what it's worth, I was just talking to a supposed Foxconn employee:


Interesting if true because the employee supposedly works at the Chongqing LCD TV and module plant:
http://evertiq.com/news/23149

I seriously doubt that Foxconn will have anything to do with PS4. It's either GloFo or TSMC.

lol @ 2X Ethernet. definitely going to ship with that. so useful!

It's for development purposes (internal development network data sharing), it's likely to be removed in the final box.
 

ohohdave

Member

This.

The Start/Select buttons could be emulated on the controller's pad surface when playing PS1/PS2/PS3 games
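Purely as a hypothetical sketch of how that could work (none of these names, values, or APIs come from any real SDK), the pad surface could be divided into regions that map to the legacy buttons:

```python
# Hypothetical sketch: map the left/right halves of a touch pad surface to the
# legacy Select/Start buttons for backwards-compatible titles. The pad
# resolution and all names here are invented for illustration.
PAD_WIDTH = 1920  # assumed touch pad horizontal resolution, in touch units

def legacy_button_for_touch(touch_x: int) -> str:
    """Return the emulated legacy button for a touch at horizontal position touch_x."""
    return "SELECT" if touch_x < PAD_WIDTH / 2 else "START"

print(legacy_button_for_touch(300))   # SELECT (left half of the pad)
print(legacy_button_for_touch(1500))  # START (right half of the pad)
```

A real implementation would presumably live in the system's BC layer rather than in individual games, but the idea is the same: translate touch regions into legacy button events.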

I'm up for change, but don't know if I could quit the feel of a PS controller at this point.
Remember, we could have been playing with this for years now.
[Image: boomerang controller prototype]
 
Uh, nobody but Sony knows the final specs. All these rumors are is trying to figure it out from old dev kits, and this is a new dev kit... which means those rumors are old and this is new... and that jibes well with Sony upgrading the PS4.

uhh, this kit is the old one.
 

NinjaBoiX

Member
Yeah, I don't know how it will work. It's one of the reasons I have trouble holding a Vita, i.e. trying not to touch the back pad. Let's see how this works.
Most games allow you to turn it off. But yeah, I tried playing MGS3, and I couldn't walk more than three paces without whipping the knife out.

No thanks.

FIFA uses it well, it acts as the goal, so you just tap where you want to shoot on the back pad. Optional of course.
 
If all of the new consoles are using AMD architecture and console to pc ports are naturally more multi-threaded and rely on the GPU more, does AMD have a chance of competing with the i5's and i7's of the world again?

At the high end? No, that ship has long since sailed.
 
An SPE module would be totally awesome, especially if they plan to keep it in there and if it's in all the models. Because then it could be used by PS4 games as well, and its performance is absolutely non-negligible for stuff it's good at.

I don't think it's very likely though.

Yeah, a single SPE can provide around 25 Gflops, so adding a module with 7 SPEs would mean 175 Gflops of additional power that could be used for audio/video processing, physics, and helping the CPU with vector-intensive code, in addition to providing hardware backwards compatibility. It would make the PS4 a monster.
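As a back-of-the-envelope check on those numbers: ~25 Gflops per SPE follows if each SPE issues one 4-wide single-precision fused multiply-add per cycle at 3.2 GHz, as on the PS3's Cell. The clock and SIMD width are assumptions carried over from the PS3, not anything confirmed for this hypothetical module:

```python
# Back-of-the-envelope SPE throughput, assuming PS3-era Cell figures:
# 3.2 GHz clock, 4-wide single-precision SIMD, fused multiply-add (2 flops/lane).
clock_ghz = 3.2
simd_lanes = 4
flops_per_lane_per_cycle = 2  # one multiply + one add via FMA

gflops_per_spe = clock_ghz * simd_lanes * flops_per_lane_per_cycle
total_gflops = 7 * gflops_per_spe

print(gflops_per_spe)            # 25.6, roughly the 25 quoted above
print(round(total_gflops, 1))    # 179.2 for a 7-SPE module, in line with ~175
```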
 

tipoo

Banned
if it was originally 4GB of GDDR5 and they decided to upgrade to 4GB DDR3 + 4GB GDDR5, would devs be pissed at them all over again for splitting the pool?

Maybe towards the end of the cycle, but I would think that initially the added bandwidth would be a bigger help than the split pool was a burden. 8GB split is still better than 4GB unified at any rate. Yes the processor would get lower bandwidth RAM, but processors aren't as bandwidth demanding as GPUs plus it wouldn't have to share the bandwidth. But I think the 8GB is just for the dev kit anyways.
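The trade-off described above can be sketched as a toy model. All of the bandwidth figures below are illustrative assumptions (192 GB/s for the GDDR5 pool, 10 GB/s of CPU traffic), not leaked specs; the point is just that a dedicated GPU pool avoids contention with the CPU:

```python
# Toy model of unified vs split memory pools. All bandwidth numbers are
# illustrative assumptions, not actual console specs.
unified = {"capacity_gb": 4, "gpu_bw_gb_s": 192, "shared_with_cpu": True}
split = {"capacity_gb": 8, "gpu_bw_gb_s": 192, "shared_with_cpu": False}

def effective_gpu_bandwidth(pool, cpu_traffic_gb_s=10):
    """GPU bandwidth left over after CPU traffic, when the pool is shared."""
    if pool["shared_with_cpu"]:
        return pool["gpu_bw_gb_s"] - cpu_traffic_gb_s
    return pool["gpu_bw_gb_s"]

print(effective_gpu_bandwidth(unified))  # 182: GPU contends with CPU traffic
print(effective_gpu_bandwidth(split))    # 192: GPU keeps the GDDR5 to itself
```

Against that, the split layout costs developers the convenience of a single address space, which is the complaint the post is weighing it against.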

I wouldn't want AMD period. Shit is garbage compared to what's on the market.

I have no idea why they go AMD and not try to work something out with Intel.

Intel doesn't license out their chips like AMD and IBM do; they keep control of them. They burned Microsoft on the original Xbox with that. What this means is that whoever is buying their chips is subject to Intel's schedule for die shrinks and whatnot, and can't customize parts of the architecture themselves.

Plus, AMD would be willing to sell at lower profit margins; considering their situation, they would probably be very flexible in their deals.
 

ekim

Member
Really nice insight into what more can be exposed on the same hardware in a closed box. Excite.

I think he's maybe misinformed in mentioning 720's GPU will be 'pre-GCN' though - if anything it should be the latest GCN refresh.

I also thought that Durango's GPU will get the GCN architecture.
 
I wonder what the AMD R10xx is capable of. What range does that fall in next to say a GTX 660-ti.

I also wonder how the CPU stacks up against a standard Intel i7.
 
I wonder what the AMD R10xx is capable of. What range does that fall in next to say a GTX 660-ti.

I also wonder how the CPU stacks up against a standard Intel i7.

R1000 is Southern Islands, I think, which could mean anything from 73xx to 79xx. If the rumours are true and it's a 7970M-based GPU, then they are about on par performance-wise.
 

test_account

XP-39C²
Little Big Planet supports it as well. Most games don't though. Presumably it will just work (or be forced to work) with every game now like on the 360.
Yeah, Little Big Planet is another game that supports it as well :) I wonder why more games don't support it, especially Call of Duty Black Ops 1, Modern Warfare 3 and Black Ops 2: three games that are really popular and support split-screen online.

EDIT: Seems that Black Ops 2 supports it.
 
Timing just seems odd considering these devkits have just gone out.

And last-minute changes don't sound promising. You don't make an efficient console (in both performance and cost) by changing things at the last moment. Nor is it good for devs trying to gauge what they can achieve for initial projects.
They may be targeting a certain bandwidth and a certain amount without having finalized how to get there; i.e., in the other thread it was discussed that either 4GB of GDDR5 or 4GB of stacked RAM could achieve the same bandwidth, but with pros and potential drawbacks for either. The most crucial question for the latter being whether it will actually be ready for a console shipping late this year.
 
I think you read the wrong article :)

Yeah, there are nice comments on Orbis, and I thought it was a bit critical of the DDR3 nature of Durango, in spite of the eSRAM.

Edit: Engadget picked up this story from Kotaku; I'm assuming they did some cross-checking... I wonder how 'old' this devkit is. I assume an earlier version...
 

Mario007

Member
Yeah, there are nice comments on Orbis, and I thought it was a bit critical of the DDR3 nature of Durango, in spite of the eSRAM.

Edit: Engadget picked up this story from Kotaku; I'm assuming they did some cross-checking... I wonder how 'old' this devkit is. I assume an earlier version...

I wouldn't really trust engadget that much. They probably just saw the Kotaku story and went with it.
 

Boss Man

Member
I don't mind them experimenting with something new, but I really hope that there will be a DualShock 4 alternative.

Also, I just thought about how ridiculously cool the OS could look if it were to focus on that rear touch pad. Imagine a background that looks like a curtain and your fingers move through it like an etch-a-sketch. So cool. To be honest though, it would probably be pretty aggravating to navigate like that.

Either way, I'm expecting the OS in these new systems to be one of the most exciting things about them. I feel like that will separate them most from this gen, even though I expect the graphics to be no less than what you'd expect from a generational leap.


My main wish from Sony with Orbis (outside of games), is a more cohesive platform. Being more open about requirements and stuff has its advantages, but I don't think it's the right philosophy for a console. It's mostly little things. For instance, next time you get bored take a look at the different trophy icons and names between games. I'm not saying that games should not be able to choose their own icons for trophies, but they should have to meet certain quality requirements. Demon's Souls is actually an offender here, even though it may have been my favorite game this generation.

There are a lot of other little things, and I don't think any of it is really that offensive, so it's hard to point things out off the top of my head. But there is definitely an overall sense that 'PSN' or maybe even 'PS3' does not really mean anything.
 

sholvaco

Neo Member
I think he's maybe misinformed in mentioning 720's GPU will be 'pre-GCN' though - if anything it should be the latest GCN refresh.

Not out of the realm of possibility. The other day, when proelite mentioned that MS refers to the compute portion of the GPU as shader cores instead of compute units, the possibility of a VLIW-based architecture sprang to mind. It's not necessarily a disadvantage, because when it comes to graphics workloads VLIW is just as (if not more) efficient as GCN, which was introduced mainly so AMD could compete in the high-performance computing market.

Durango's architectural details must have been finalized a while back. Older rumors (Oban) suggest it was in the pipeline for a long time.
 

nib95

Banned
This is worrying if true. 2.2gb vram? 8GB system ram? What about that GDDR5? I just hope it doesn't translate to less GDDR5 and more slow ram. Not sure a non unified direction is a good one.

Also, nay to the touch screen. Eurgh...
 

i-Lo

Member
This is worrying if true. 2.2gb vram? 8GB system ram? What about that GDDR5? I just hope it doesn't translate to less GDDR5 and more slow ram.

Also, nay to the touch screen. Eurgh...

It is the old dev kit, used to simulate a total of 2GB GDDR5, according to Thuway. So the dev kits would have changed by now to reflect the 4GB (3.5 usable) GDDR5 UMA.

Each day I want to see Guerrilla's new fantasy IP more and more. :)

And each day, I wonder what it would be like if one of Sony's studios made a non-linear sci-fi RPG.
 