
VGLeaks Durango specs: x64 8-core CPU @1.6GHz, 8GB DDR3 + 32MB ESRAM, 50GB 6x BD...

We haven't seen a game yet, so it's too soon to say deferred rendering won't work. MS are the creators of the DirectX API. I doubt this system lacks the ability to do deferred rendering when a lot of their in-house engines are built around it.

Any framebuffer uses a specific amount of memory depending on resolution. If there are 32 MB of eDRAM, we know for sure you can't use more than that, which limits deferred rendering.
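
Just to put rough numbers on that (my own back-of-envelope figures, not from the leak): a fairly typical four-target deferred G-buffer at 1080p already brushes right up against 32 MB.

```cpp
// Back-of-envelope sketch: G-buffer footprint at 1080p vs 32 MB of ESRAM/eDRAM.
// The layout below is hypothetical (4 render targets at 32 bits per pixel);
// real engines use fatter or thinner formats, so treat this as illustration only.
#include <cstdio>

int main() {
    const int width = 1920, height = 1080;
    const int pixels = width * height;
    const int targets = 4;          // e.g. albedo, normals, material params, depth
    const int bytes_per_pixel = 4;  // 32-bit formats assumed

    double mb = double(pixels) * targets * bytes_per_pixel / (1024.0 * 1024.0);
    std::printf("G-buffer: %.1f MB vs 32 MB ESRAM\n", mb);  // ~31.6 MB: a very tight fit
    return 0;
}
```

Fatter render targets or MSAA push it past 32 MB, which is where the worry about deferred rendering comes from.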
 

Ashes

Banned
I didn't see that, I thought you meant "fix'd" :p

From raw peak performance we have:

720:
CPU: 102 GFLOPS
GPU: 1.22 TFLOPS

360:
CPU: ~50 GFLOPS (can't find source)
GPU: 240 GFLOPS


(someone correct me if I'm wrong)

On the CPU: we talked about this in the last thread. Peak performance is only the ceiling; nobody gets that. I think the 360 CPU's peak was 110 GFLOPS too, but in reality it benchmarked around 20 GFLOPS. [Someone will correct me if I'm wrong.]
Bobcat was a much, much more efficient processor, and Jaguar is even better than that. We don't have 8-core Jaguar benchmarks, but it will get a lot closer to 100 GFLOPS than the 360 CPU did. Theoretically.
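
For reference, here's roughly where that 102 GFLOPS ceiling comes from. The 8 FLOPs per core per cycle figure is my assumption for Jaguar's 128-bit SIMD units, not something from the leak.

```cpp
// Peak (theoretical) single-precision throughput: cores * clock * FLOPs/cycle.
#include <cstdio>

int main() {
    const double cores = 8.0;
    const double clock_ghz = 1.6;
    const double flops_per_cycle = 8.0;  // assumed: 4-wide SP add + 4-wide SP mul per core
    std::printf("Peak: %.1f GFLOPS\n", cores * clock_ghz * flops_per_cycle);  // 102.4
    return 0;
}
```

Sustained throughput is whatever fraction of that the code and memory system actually let you hit, which is the whole point about the 360 CPU benchmarking so far below its peak.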
 

iSnakeTk

Should be put to work in a coal mine.
I feel like RAM will be the main reason the Wii U won't get the third-party stuff. It was really short-sighted of Nintendo to only include 1 GB.


Also, yay, first post.
 
What do you base this on? Things like physics and networking will see massive gains when designed to run on multiple threads. It's not nonsense. Multithreaded programming is relatively new, so people are still learning ways to take advantage of it. But it isn't nonsense. Figuring out ways to constantly have hardware working isn't nonsense.
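
For what it's worth, here's a toy sketch of the kind of thing being described: chunk a physics update across worker threads so every core has something to do. Everything here (the Body struct, the numbers) is made up for illustration.

```cpp
// Split one physics step across N worker threads, one contiguous chunk each.
#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

struct Body { float x, vx; };

// Trivial stand-in for a real integrator: advance positions for [begin, end).
void integrate(std::vector<Body>& bodies, size_t begin, size_t end, float dt) {
    for (size_t i = begin; i < end; ++i)
        bodies[i].x += bodies[i].vx * dt;
}

int main() {
    std::vector<Body> bodies(100000, Body{0.0f, 1.0f});
    const unsigned workers = 8;          // e.g. one per Jaguar core
    const float dt = 1.0f / 60.0f;
    const size_t chunk = bodies.size() / workers;

    std::vector<std::thread> pool;
    for (unsigned w = 0; w < workers; ++w) {
        size_t begin = w * chunk;
        size_t end = (w + 1 == workers) ? bodies.size() : begin + chunk;
        pool.emplace_back(integrate, std::ref(bodies), begin, end, dt);
    }
    for (auto& t : pool) t.join();       // wait for the frame's physics to finish

    std::printf("first body x = %f\n", bodies[0].x);
    return 0;
}
```

The hard part in practice is dependencies between objects (collisions, constraints), which is exactly the "still learning ways to take advantage of it" bit.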

Sony also has a Jaguar CPU and a dedicated compute unit. I don't see any significant processing units here (besides the CPU/GPU) which help with physics/graphics. And even if there are any, this doesn't make Sony's hardware "inefficient", jeez.
 
I guess with all those extra video/audio processors, the Durango's big thing will be the ability to have both a game and a TV channel on screen at the same time. Nice for grinding or such, I guess.
 
The Kinect port doesn't surprise me. We know Kinect on the 360 was gimped due to USB bandwidth. The port doesn't confirm Kinect will be packed in to every 720 (although I suspect it will be).


The general consensus is all over the place.

2 camps:

1. Orbis > Durango >>> Wii U

2. Orbis = Durango

I think more people are in camp 1 than camp 2


That's armchair GAF analysis. Dev leaks I've seen say they're about equal.
 

PaulLFC

Member
What do you base this on? Things like physics and networking will see massive gains when designed to run on multiple threads. It's not nonsense. Multithreaded programming is relatively new, so people are still learning ways to take advantage of it. But it isn't nonsense. Figuring out ways to constantly have hardware working isn't nonsense.
I was under the impression AMD CPUs aren't multithreaded?
 

onQ123

Member
I am just surprised at how much closer to the Wii U this is than what I was expecting.

4x the RAM with 5x the bandwidth

3x the GPU floating-point operations per second

Same amount of eDRAM

2x the disc capacity with about the same read speeds


(Can't think of the CPU vs CPU specs off the top of my head, but it's 8 cores vs 3.)
 

THE:MILKMAN

Member
Interesting that it'll have 16 threads. Now let's see if devs actually start making use of this, and if it transitions over to PC gaming as well. I suspect that those with 8-core AMD CPUs in their gaming PCs may well be sitting pretty watching all of this.

Is that a Durango exclusive or will Orbis have that too?
 

Jadedx

Banned
I'm a little disappointed in the Durango. I think the ESRAM should be 64 MB, and we still don't know how much of the main RAM the OS is gonna eat up.

I still don't know why people think the OS will use up a whole lot of RAM. Just think of all the 360 can do with just 32 MB for the OS.
 

pixlexic

Banned
Nobody is "ignoring" the RAM, just most people aren't looking at it as simply as "8>4 so Durango is better!"

This is true. While I believe more is better for video games, since lower-RAM setups have to page memory instead of just keeping everything in RAM, I also think no game will come close to having 3 GB of assets for one area of a game. The load times would be too horrendous. Faster RAM would mean faster texture fetches, though, which would mean more render passes IF the GPU could keep up.
 
What? How did you read what wasn't there? The hell man.

I'm sure you aren't versed in this, but I'll bite.

12 CUs (however many ALUs that is) for the 720 vs 18 CUs (which would be more ALUs than the 720).

He wants to render a shadow. On both platforms. As he said, he is ROP-limited. So if his shader for the shadow is texture-heavy, the ALUs would be idle, negating that ALU advantage for the PS4 and giving the advantage to the more ROP-heavy system.

While this is true, it only paints a specific part of the picture and would only really apply to a port rendering the same scene. A game built from the ground up for the PS4 and its immense ALU advantage would not run into this.

All he's saying is that just because the PS4 has more FLOP power than the 720 doesn't mean the PS4 would automatically be better in every scenario.
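
A toy way to see the point (all numbers invented except the leaked 1.22 TFLOPS; the 1.84 TFLOPS Orbis number is the rumoured 18-CU figure, and I'm assuming the same fill rate on both machines just for the illustration): when the pass is fill-rate bound, both finish in the same time regardless of the ALU gap.

```cpp
// A pass takes as long as its slowest stage: max(fill time, ALU time).
#include <algorithm>
#include <cstdio>

int main() {
    const double pixels = 1920.0 * 1080.0;       // size of the pass, say
    const double fillrate_px_per_s = 12.8e9;     // assumed ROP fill rate (same on both here)
    const double alu_work = 50.0 * pixels;       // assumed shader FLOPs per pixel
    const double alu_rate_720 = 1.22e12;         // leaked 720 figure
    const double alu_rate_orbis = 1.84e12;       // rumoured Orbis figure

    const double t_fill = pixels / fillrate_px_per_s;
    std::printf("720:   %.3f ms\n", std::max(t_fill, alu_work / alu_rate_720) * 1000.0);
    std::printf("Orbis: %.3f ms\n", std::max(t_fill, alu_work / alu_rate_orbis) * 1000.0);
    return 0;  // both print the same time: the fill stage is the bottleneck
}
```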
 

Spongebob

Banned
Pathetic specs, disappointing "secret sauce".

Hell, it even looks as though Orbis will have a better secret sauce with its compute module.

Durango won't be much more than a cable box with Kinect packed in. Not a game console, but a TV tuner.

Respect to stevie for predicting all this.
 

EVIL

Member
High-fidelity Natural User Interface (NUI) sensor is always present
kinect_fap_security.jpg
 
There will be nobody left to make the games by 2018...


Ohh, no doubt we're about to hit a wall. At some point graphical fidelity will stop making strides due to costs; that, or we'll be looking at a maximum of 4-5 AAA titles a year. We're definitely closing in on a critical point that'll either tear the industry apart or force devs to cut back.
 

Limanima

Member
This is why I hate the fact that the PS3 wasn't a monster success. Companies now more than ever play it safe, which is a shame.
No more strange architectures from Sony...
 

clav

Member
I feel like RAM will be the main reason the Wii U won't get the third-party stuff. It was really short-sighted of Nintendo to only include 1 GB.


Also, yay, first post.

Not really. Nintendo just needs to develop their first-party games, and that's it.

It's been like that for so many generations. Third-party games just don't sell on Nintendo platforms like they used to.

The only thing you need is console market share.
 

charsace

Member
Any framebuffer uses a specific amount of memory depending on resolution. If there are 32 MB of eDRAM, we know for sure you can't use more than that, which limits deferred rendering.

Developers had input into the design of the system. Do you think all the engines shown so far would be built around deferred rendering if they knew the next Xbox couldn't handle it? Devs have had some form of devkit since 2010. The system most likely has a way to do deferred rendering at 1080p. The design is different from what we have seen in the last decade, seeing as the GPU can attain 170 GB/s through some method.
 

dbztrk

Member
I still don't know why people think the OS will use up a whole lot of RAM. Just think of all the 360 can do with just 32 MB for the OS.

I never stated how much. However, if Kinect is to be included in every system (speculation on my part), plus all of these super cool features that everyone else is speculating on, it will take up a sizable amount of RAM. I'm thinking about 2 GB minimum.
 

nib95

Banned
The general consensus is all over the place.

2 camps:

1. Orbis > Durango >>> Wii U

2. Orbis = Durango

I think more people are in camp 1 than camp 2

Based on the leaks alone, the advantage does seem to be in Sony's camp at the moment. Someone clarify if I am wrong on the details below.

CPU

Durango/Orbis: the same

GPU

Orbis: 78xx
Durango: 77xx

RAM

Orbis: 4 GB GDDR5 @ est. 192 GB/s (?)
Durango: 8 GB DDR3 @ 70 GB/s + 32 MB ESRAM @ 102 GB/s (combined 170 GB/s using the 32 MB ESRAM).
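
For anyone wondering where those bandwidth numbers come from, the usual arithmetic is bandwidth = (bus width in bytes) x (data rate per pin). The bus widths and data rates below are my assumptions based on the rumours, not confirmed specs.

```cpp
// Rough bandwidth arithmetic behind the rumoured figures.
#include <cstdio>

int main() {
    const double orbis_gddr5   = 256.0 / 8.0 * 6.0;    // 256-bit bus @ 6.0 Gbps  -> 192 GB/s
    const double durango_ddr3  = 256.0 / 8.0 * 2.133;  // 256-bit bus @ 2133 MT/s -> ~68 GB/s
    const double durango_esram = 102.4;                // quoted ESRAM figure

    std::printf("Orbis GDDR5:   %.0f GB/s\n", orbis_gddr5);
    std::printf("Durango DDR3:  %.0f GB/s\n", durango_ddr3);
    std::printf("Durango total: %.0f GB/s (DDR3 + ESRAM, if a workload overlaps both)\n",
                durango_ddr3 + durango_esram);
    return 0;
}
```

The "combined 170" only applies when the game is actually reading from both pools at once.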
 
Exactly, which has nothing to do with ports. But hey, interpret whatever you want from the post he wrote, it's cool.
His example was tailored towards ports; just because you don't understand it, don't get mad lol. For the record, it doesn't mean "stop looking at flops" as you implied.
 

big_z

Member
wiisad.gif


The general consensus is all over the place.

2 camps:

1. Orbis > Durango >>> Wii U

2. Orbis = Durango

I think more people are in camp 1 than camp 2


The better system is the one that has better development tools (Microsoft has been better in this respect). That will become the lead platform for development and the ideal choice for multiplatform games. Simple as that.
 

Audioboxer

Member
It doesn't really matter whether it's easy or not. They won't do that (on purpose) because it will affect software sales when one version is noticeably better than the other.

3rd party will definitely scale down to the lesser specs, whichever it ends up being. And for this reason I seriously doubt there is going to be more than a 5% difference in performance between the 2 systems, because the more powerful one would have a lot of "wasted" power (= wasted money) for anything that isn't a 1st party title.

On a very basic level of improvement, frame rate is bound to go up, and devs won't exactly be 'doing that on purpose' in some sort of mischievous way to devalue the other version. Even a little part of that bolded sentence is false for this generation, to be honest. Devs have rushed out PS3 copies of multiplatform titles in a right state, not caring about sales being affected, only to correct it later - Skyrim probably the highest-profile title, followed by the likes of Bayonetta. Developers have already done things 'on purpose', to put it frankly.

Of course going to the extremes of re-texturing whole areas might be chalked up as "unlikely to happen", but an increased frame rate is bound to happen. In saying that though, contrary to the above, some titles this generation have been sold on looking better - I recall Oblivion, Sigma 2, and even BioShock touting PS3 texture improvements. Most likely as part of a PR campaign, but it still undercuts your claim that developers worrying about one version looking better will force them to aim for complete parity.

I just think that if the hardware really is comparable, and easily scalable, as a developer it's not really a question of "am I doing something on purpose" as much as "is this easy to scale naturally for the end product?". If it is, you're not going to go out of your way to stop such improvements happening in the porting or development process simply because fanboys will have a hissy fit over it.
 

quest

Not Banned from OT
Exactly, which has nothing to do with ports. But hey, interpret whatever you want from the post he wrote, it's cool.

If they scaled back the CUs that much, it only makes sense that they scaled back the ROPs and texture units too, like AMD graphics cards do in the PC market. So it looks like Sony has a CU, texture unit, and ROP advantage. Then add in the separate compute unit and the PS4 will have a large advantage on paper. We all knew that if they wanted to throw Kinect in every box, cuts had to be made elsewhere. Well, we found out where the cuts were made.
 