
Rumored Chinese forum Xbox 720 specs: 8-core CPU, 8GB RAM, HD 8800 GPU, W8, 640GB HDD

pixlexic

Banned
WinRT is technically the API for "Windows Store" apps. Not to be confused with Windows RT, which is the ARM version of the OS.



This isn't true. XNA is not supported by "Windows Store" apps or even in Windows Phone 8 apps (though WP8 does support XNA in a backward-compatibility mode).

Didn't say it was XNA. I said it is exactly like XNA. It's like using .NET; there is not much difference in the WinRT runtime.
 
Us people. Normal folk care a lot less.

We've been through this, but graphics are the most important thing in core gaming, and there's not a close second.

If you can do like the Wii and go outside the core, then you can escape that paradigm (although even for the Wii, it's worth noting more people ended up owning a powerful console, either a PS3 or a 360, and also worth noting the core market is far more reliable imo, e.g. how the Wii fell off a cliff around year 3 while the PS3/360 did not). The Wii U clearly looks to have not escaped it.
 
So does Leadbetter actually have any of his own leads?

A birdie told me he claimed two non-NeoGAF sources.

But I guess that doesn't mean non-B3D sources.

It's still by FAR the most likely explanation that these commonly accepted spec sets for PS4/720 are right rather than wrong. For one, lherre has specifically said they are right except for minor details, so he would have to be a liar otherwise.

I just think the odds that PS4/720 differ from the common spec sets get longer every day. At this point it'd be downright amazing if say, one actually had an IBM CPU. It'd almost be like a hail mary succeeding in football or something.

The thing is, if we assume PS4/720 are due in Nov, then something SHOULD be leaking right now. So there's nothing odd about us probably knowing a lot about them. It would be odd if we didn't. A fully detailed 360 schematic leaked almost two years before its release.
 

CrunchinJelly

formerly cjelly
Who the hell is Leadbetter?

Richard Leadbetter. Digital Foundry. Ex-Sega Saturn magazine.


A birdie told me he claimed two non-NeoGAF sources.

But I guess that doesn't mean non-B3D sources.

It's still by FAR the most likely explanation that these commonly accepted spec sets for PS4/720 are right rather than wrong. For one, lherre has specifically said they are right except for minor details, so he would have to be a liar otherwise.

I just think the odds that PS4/720 differ from the common spec sets get longer every day. At this point it'd be downright amazing if say, one actually had an IBM CPU. It'd almost be like a hail mary succeeding in football or something.

The thing is, if we assume PS4/720 are due in Nov, then something SHOULD be leaking right now. So there's nothing odd about us probably knowing a lot about them. It would be odd if we didn't. A fully detailed 360 schematic leaked almost two years before its release.
You know, it is just so hard to tell where people are getting their info from these days. Hard to separate genuine insiders, copy and pasters, Chinese whisperers etc.
 
Lol, doubtful..

Ugh, no it isn't. The quickness with which ESRAM and DDR3 volume were positioned as second best was incredible. There are already theories about how 16GB of flash is going to be used as this magic little thing that will make the RAM amount advantage moot. It's already decided that the Orbis GPU is simply better. Armchair analysis is already theorizing the reasons why MS is going with a decidedly less powerful system, which include the worst possible accusations when it comes to gamers' interests.

It's pretty much already law that Orbis > Durango. That doesn't tell you anything about the real capabilities of these consoles or how they compare, but it tells you a whole lot about the power of wishful thinking.
 

Proelite

Member
Ugh, no it isn't. The quickness with which ESRAM and DDR3 volume were positioned as second best was incredible. There are already theories about how 16GB of flash is going to be used as this magic little thing that will make the RAM amount advantage moot. It's already decided that the Orbis GPU is simply better. Armchair analysis is already theorizing the reasons why MS is going with a decidedly less powerful system, which include the worst possible accusations when it comes to gamers' interests.

It's pretty much already law that Orbis > Durango. That doesn't tell you anything about the real capabilities of these consoles or how they compare, but it tells you a whole lot about the power of wishful thinking.

3GB for the OS is already final. MS is focusing on Kinect, multimedia, the OS, etc.

None of which are absolute truth.

It's foolhardy to determine a performance winner based on one Eurogamer article that is undoubtedly suspect in some of its numbers; even lherre said as much.
 
That could work too, but they'd also have to redesign the cooling/packaging as a clock increase of this magnitude would have a profound effect on TDP.

If they're smart they've seriously over-engineered the cooling for the first batches anyway. They can always scale it back over time like Sony did.
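For a rough sense of why a late clock bump is scary for cooling: dynamic power scales roughly with frequency and with the square of voltage (P ~ C * V^2 * f), and higher clocks often need a voltage bump too. A minimal sketch with purely illustrative numbers (nothing here is a real Durango or Orbis figure):

def scaled_power(base_watts, freq_ratio, voltage_ratio=1.0):
    # Dynamic CMOS power scales roughly linearly with frequency and with voltage squared.
    return base_watts * freq_ratio * voltage_ratio ** 2

print(scaled_power(100.0, 1.125))        # +12.5% clock at the same voltage: ~112 W
print(scaled_power(100.0, 1.125, 1.05))  # same bump needing +5% more voltage: ~124 W

So even a modest clock hike can eat a surprising chunk of the thermal budget if a voltage increase comes with it, which is why over-engineered launch cooling is the safe bet.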
 

nib95

Banned
3GB for the OS is already final. MS is focusing on Kinect, multimedia, the OS, etc.

None of which are absolute truth.

It's foolhardy to determine a performance winner based on one Eurogamer article that is undoubtedly suspect in some of its numbers; even lherre said as much.

Those opinions people voiced were on the basis of 3GB OS usage, some half that. Point is, I think everyone realises nothing is final. People are merely theorising on the rumoured details.
 

i-Lo

Member
3GB for the OS is already final. MS is focusing on Kinect, multimedia, the OS, etc.

None of which are absolute truth.

It's foolhardy to determine a performance winner based on one Eurogamer article that is undoubtedly suspect in some of its numbers; even lherre said as much.

So is PS4 a weaker console once the wizard's jizz is finally activated in XB3? Or are they both going to run toe to toe? I was hoping to buy XB3 after PS4, but if it's going to be the same shit situation with ports, where UE4 and C3 are tailored to XB3's strengths and most games adopt UE and C3 as the main middleware, then I have to consider getting XB3 first.
 
So is PS4 a weaker console once the wizard's jizz is finally activated in XB3? Or are they both going to run toe to toe? I was hoping to buy XB3 after PS4, but if it's going to be the same shit situation with ports, where UE4 and C3 are tailored to XB3's strengths and most games adopt UE and C3 as the main middleware, then I have to consider getting XB3 first.

You can't make your decision now man.
 

i-Lo

Member
You can't make your decision now man.

I was hoping to get PS4 on launch day, and if XB3 comes out first it wouldn't matter. But if the ports chug like Betashed's pride and joy, Skyrim on PS3, then buying a large paperweight just to play exclusives would still be worthwhile, just not at launch. Then I'd get the XB3 on launch day instead.
 

Gorillaz

Member
Buying either console at launch is a terrible idea. Both consoles were rough as fuck at launch and the WiiU isn't that far off either.
 
Same difference. It depends more on throughput than the overall total amount of RAM. 1.5GB more for Durango isn't going to do anything.

So what you believe is that there's no way Durango's design can give it enough bandwidth to compete with Orbis, but Orbis's design manages to beat it in speed and nullify the capacity advantage.

Of course. That makes so much sense we don't even need to know anything more about these systems.

Doesn't really make a difference. Obviously certain devs and games might prefer this rumoured 720 setup, but for the more boundary-pushing graphical tech, I'd imagine 3.5GB of GDDR5 would be far preferable.

I understand man, those Crytek guys are just so lazy.
 

Agent Icebeezy

Welcome beautful toddler, Madison Elizabeth, to the horde!
I'm reading this thread with great joy and I'm assuming most of you guys that do know something are getting your rocks off in this thread. Reminds me of the Eddie Murphy skit where he said the ice cream truck would always stop an extra block away, just to make the kids run. We all keep coming in for any morsel of information and here we are, always back to square one at the end of the day. There is nothing but conjecture here for the most part. Amazing show.
 

nib95

Banned
What if it's 3.5GB GDDR5 vs 6.5GB DDR3 + 32MB ESRAM?

Doesn't really make a difference. Obviously certain devs and games might prefer this rumoured 720 setup, but for the more boundary-pushing graphical tech, I'd imagine 3.5GB of GDDR5 would be far preferable.

I guess we'll have to wait and see. There's still lots we don't know. Add to that, perhaps the benefits of GDDR5 might not be felt till some time into the gen? It's not like developers have really been pushing things on that front in the PC world. The bandwidth advantage of GDDR5 thus far hasn't really been made much use of.
 

Spongebob

Banned
Doesn't really make a difference. Obviously certain devs and games might prefer this rumoured 720 setup, but for the more boundary-pushing graphical tech, I'd imagine 3.5GB of GDDR5 would be far preferable.

Yep, MS's memory setup was designed to allow for a large amount of RAM for OS functions/ads/Kinect/bloatware. Sony's setup is designed for pushing graphics.
 
Lol, doubtful. There's a reason every new high-end GPU on the market comes with GDDR5 instead of far cheaper DDR3. DDR3 is typically used for system RAM, not video RAM. Having said that, I'm sure many devs will get more use out of more DDR3 compared to less GDDR5. But I'd imagine as graphical features intensify and devs start pushing the boat out, GDDR5 will start seeing serious advantages.

Again true, and you might be right, but PCs are different from consoles. PC games often benchmark at 100 and even 200+ FPS, with at least 60 desirable. And the resolutions can be 4K or higher with Eyefinity setups. The last thing they want is their high-end GPU failing in those situations, or losing a benchmark graph 180 fps to 120 fps because they ran out of bandwidth.

Bandwidth is a lot more important there than on a console targeting 1080p and 30 FPS.
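To put rough numbers on that (a back-of-the-envelope sketch: the function, the assumption that each pixel gets touched about ten times per frame for G-buffer, lighting and post, and the resolutions are all made up for illustration; real workloads add texture and geometry traffic on top):

def framebuffer_traffic_gb_s(width, height, fps, bytes_per_pixel=4, passes=10):
    # Approximate read/write traffic if each pixel is touched `passes` times per frame.
    bytes_per_frame = width * height * bytes_per_pixel * passes
    return bytes_per_frame * fps / 1e9

print(framebuffer_traffic_gb_s(1920, 1080, 30))   # ~2.5 GB/s at 1080p30
print(framebuffer_traffic_gb_s(3840, 2160, 120))  # ~40 GB/s at 4K 120fps, high-end PC territory

The exact figures don't matter; the point is that resolution and frame rate multiply, so the PC high end needs far more raw bandwidth headroom than a 1080p30 console target.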
 
So what you believe is that there's no way Durango's design can give it enough bandwidth to compete with Orbis, but Orbis's design manages to beat it in speed and nullify the capacity advantage.

Of course. That makes so much sense we don't even need to know anything more about these systems.



I understand man, those Crytek guys are just so lazy.

You are being overly defensive; nobody called Crytek lazy. His point has merit.
 

THE:MILKMAN

Member
Buying either console at launch is a terrible idea. Both consoles were rough as fuck at launch and the WiiU isn't that far off either.

That should be different this time. Both have had their devs making games for them for 2+ years. Go back to E3 2005: the 360 was quite rushed, and Sony's first parties were mostly in the pre-production/design phase (ND and GG, at least).
 

Proelite

Member
Same difference. It depends more on throughput than the overall total amount of RAM. 1.5GB more for Durango isn't going to do anything.

Doesn't really make a difference. Obviously certain devs and games might prefer this rumoured 720 setup, but for the more boundary-pushing graphical tech, I'd imagine 3.5GB of GDDR5 would be far preferable.

I guess we'll have to wait and see. There's still lots we don't know. Add to that, perhaps the benefits of GDDR5 might not be felt till some time into the gen? It's not like developers have really been pushing things on that front in the PC world. The bandwidth advantage of GDDR5 thus far hasn't really been made much use of.

Are you guys devs?

Seriously, if you guys are, my ears are open.

I've been waiting for devs to chip in on the advantages of one platform over the other.
 

Iacobellis

Junior Member
Buying either console at launch is a terrible idea. Both consoles were rough as fuck at launch and the WiiU isn't that far off either.

360 was weak at launch?

 
You are being overly defensive; nobody called Crytek lazy. His point has merit.

I'm not being defensive at all; we are basically making assumptions based on what we think devs want. We're probably not developers, we certainly don't know the final specs, and we certainly haven't talked to or heard from developers, aside from Crytek.

It's one thing to believe that GDDR5 will allow X to be done better, but when the same people look back at the other side and only come up with "nah, that won't do anything better, and devs certainly want those 3.5GB of GDDR5 to really push the graphical limits", it becomes a boring discussion.

Instead of the talk being about what Sony can do to nullify the RAM advantage and what MS can do to nullify the bandwidth advantage, it always seems to turn into how there's no way for MS and every way for Sony.

I'm only buying a PS4 this year, and will only buy an Xbox if it shows me something wonderful I didn't expect, and that will be next year at best. So I don't feel the need to defend my choice; I'm just genuinely excited about what these machines can be and do.
 

Apath

Member
I always smile when I read this (very common) comment. People always say on one hand that the 360 pad is near perfect, then go on to mention how some fundamental aspects of it are appalling! How can a pad be 'top notch' when the d-pad is practically unusable for games and the chunky face buttons are uncomfortable? It's like saying your car is great, apart from the engine and chassis.

The great thing about the 360 pad is how comfortable it is for games that just use sticks and triggers. For anything else, it's flawed. The DS3 is far more versatile, albeit with its own problems.

I hope both MS and Sony revise their controllers, as their individual flaws are pretty obvious and simple to fix.
Because the d-pad, which is rarely used, is as integral as the chassis and engine of a car, right? The DS3 is flawed too; those flaws are usually just as minor compared to the overall package.
 

Karma

Banned
Lol, doubtful. There's a reason every new high-end GPU on the market comes with GDDR5 instead of far cheaper DDR3. DDR3 is typically used for system RAM, not video RAM. Having said that, I'm sure many devs will get more use out of more DDR3 compared to less GDDR5. But I'd imagine as graphical features intensify and devs start pushing the boat out, GDDR5 will start seeing serious advantages.

Did anyone even say what the bus speeds were? What if the bus speed on the GDDR5 is half of the DDR3?
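For what it's worth, that question matters because peak bandwidth is basically bus width times per-pin data rate. A minimal sketch with typical per-pin rates for each memory type (illustrative figures only, not confirmed specs for either console):

def peak_bandwidth_gb_s(bus_width_bits, data_rate_gbps_per_pin):
    # Peak bandwidth = (bus width in bytes) x (per-pin data rate in Gb/s).
    return bus_width_bits / 8 * data_rate_gbps_per_pin

print(peak_bandwidth_gb_s(256, 5.5))    # 256-bit GDDR5-class bus: ~176 GB/s
print(peak_bandwidth_gb_s(256, 2.133))  # 256-bit DDR3-class bus:  ~68 GB/s
print(peak_bandwidth_gb_s(128, 5.5))    # GDDR5 on a narrower 128-bit bus: ~88 GB/s

So yes, a narrow enough bus or a low enough clock could shrink the GDDR5 advantage a lot; the rumoured spec sets just don't suggest that's the case.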
 

nib95

Banned
Are you guys devs?

Seriously, if you guys are, my ears are open.

I've been waiting for devs to chip in on the advantages of one platform over the other.

Don't know about him, but I'm not. I do try to keep informed on this sort of stuff on all manner of different forums (B3D, Anand, OcUK, etc.).

I'm not a Microsoft employee either mind.

Did anyone even say what the bus speeds were? What if the bus speed on the GDDR5 is half of the DDR3?

Lol, do you honestly believe Sony would spend all that money on expensive GDDR5 to spunk it like that? Very plausible...


Anyway, debate and conversation are entertaining and all, but I do agree with Proelite that opinions from devs would be welcome.
 
Gemüsepizza;46683267 said:
So you want to tell us that (going by the order of your post above)... 8x250 GFLOPS = 2 TFLOPS GPU? ~20 compute units?

PS: I will now go back and edit my post, just so that you guys know.

Haha, I knew the info on the GPU was wrong. Microsoft has always made smart decisions on that end. There's no way the '720' will have a weak GPU (or one weaker than the PS4's).

Sony just can't afford to lose too much money nowadays, and for that reason alone I believe the PS4's hardware will be simpler. I bet Durango will end up being the more powerful console.
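As an aside, the "~20 compute units" in that quote is just arithmetic from the FLOPS figure under a GCN-style assumption: 64 lanes per CU, 2 FLOPs per lane per cycle, and a clock around 800 MHz. None of those numbers is a confirmed spec.

def gcn_gflops(compute_units, clock_ghz):
    # GFLOPS = CUs x 64 lanes x 2 FLOPs per cycle x clock in GHz
    return compute_units * 64 * 2 * clock_ghz

print(gcn_gflops(20, 0.8))  # ~2048 GFLOPS, i.e. roughly the 2 TFLOPS in the quote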
 

CLEEK

Member
Because the d-pad, which is rarely used, is as integral as the chassis and engine of a car, right? The DS3 is flawed too; those flaws are usually just as minor compared to the overall package.

I use the d-pad on my PS3, Wii/WiiU, Vita and 3DS all the time. Exclusively, for some games. So yeah, it is a big deal, you know? As I said, the 360 pad is great for games that just use sticks and triggers. If they're the sorts of games you play, you might only need the d-pad for selecting weapons or something. If you play a variety of genres, the d-pad can be the primary input.

For me, the single biggest improvement the 3DS XL offered over the original was that the changed ergonomics put the d-pad in a comfortable position. I barely used the d-pad on the 3DS, but I use it all the time on the XL now.
 

Proelite

Member
Lol, do you honestly believe that would be the case? Sony spending all that money on expensive GDDR5 to spunk it like that. Very plausible...

32MB of ESRAM isn't cheap either, especially when you have to embed it within the GPU itself.

Both companies went their own ways on RAM implementation.

I think it's premature to determine which one is the right choice.
 
Would love to hear John Carmack's thoughts on the RAM differences. I hope we get some lengthy interviews with him once these things are officially unveiled.

You got that, games press?
 
It's almost always preferable to have a larger pool of memory as long as the bandwidth is sufficient, as far as programming is concerned. Anyway, the difference in bandwidth would be hardly noticeable considering what they are going to do with that bandwidth and the difference that's likely to be there.

The ESRAM will eliminate the extreme waste of putting in a large amount of fast memory when you don't need it. Whether it's enough to offset the difference? Only MS knows. It will deliver the speed needed to play games. Fixed-function hardware close to the ESRAM could do much, much more, much, much faster than a traditional PC memory setup.

We will have to see how it pans out, but it's way too early to judge whether the PS4 or the nextbox will have a better memory system. And we don't even have the wizard jizz yet!
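One concrete way to think about the 32MB figure: it's sized around render targets rather than general data. A rough sketch (the render-target mix below is assumed for illustration, not a known Durango layout):

def rt_size_mb(width, height, bytes_per_pixel):
    # Size of one full-screen render target in MB.
    return width * height * bytes_per_pixel / (1024 * 1024)

color_1080p = rt_size_mb(1920, 1080, 4)  # ~7.9 MB (RGBA8)
depth_1080p = rt_size_mb(1920, 1080, 4)  # ~7.9 MB (D24S8)
hdr_1080p   = rt_size_mb(1920, 1080, 8)  # ~15.8 MB (FP16 RGBA)

print(color_1080p + depth_1080p)              # ~15.8 MB: fits with room to spare
print(color_1080p + depth_1080p + hdr_1080p)  # ~31.6 MB: right at the 32 MB limit

Whether that's enough to offset the raw DDR3 vs GDDR5 gap depends on how much of a frame's traffic can stay inside that scratchpad, which is exactly the part nobody outside MS knows yet.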
 

nib95

Banned
You're a Sony fan, which is the same but you don't get paid.

Money and employment would dictate my loyalties far more than anything else lol.


On a side note, I think stepping back from the discussion might be good for you lol. You're getting pretty emotionally charged and obtuse.
 

Gorillaz

Member
Both respective makers will wait until the install base gets higher before they go in guns blazing. Like always. We ain't seeing an Uncharted 4 at launch, and Killzone 4 will probably be in the spring.
MS might show some good games at launch, but they will save the best for later.


360 was weak at launch?

300-400 for what they released for launch?
 
Are you guys devs?

Seriously, if you guys are, my ears are open.

I've been waiting for devs to chip in on the advantages of one platform over the other.

Yet you had no issue pointing out that Orbis will have the same problems with ports that the ps3 did this gen.

Are you a dev?
 

oldergamer

Member
Yep, MS's memory setup was designed to allow for a large amount of RAM for OS functions/ads/Kinect/bloatware. Sony's setup is designed for pushing graphics.

This statement is pretty inaccurate. People here keep saying that MS will make a bloatware OS on consoles, but the reality is they have always had an extremely lean version of the OS on their consoles. As I recall hearing a while ago, the OS on the 360 takes up "less" space than the OS on the PS3.

Also, all this talk about both consoles using x86 CPUs simply doesn't make sense. MS is designing a more customized solution and Sony is using off-the-shelf parts. We already heard a few years ago about MS researching and potentially designing a custom CPU.

We also heard a long time ago that MS is planning to subsidize the console price if you sign up for X years of Xbox Live (like a cell phone). Remember the articles detailing that they were testing this out in certain areas. So in theory they could have a more expensive console price-wise, but you would be on a contract like a cell phone.

Also, AMD cannot sub-license x86 technology to anyone; only Intel "can" and essentially won't. They will fab chips for you, but not grant you an x86 license. AMD can only fab x86 chips for customers, and those chips can only be fabbed at specific foundries.

There are a lot of rumors floating around that don't make any logical sense.
 
I'm not being defensive at all; we are basically making assumptions based on what we think devs want. We're probably not developers, we certainly don't know the final specs, and we certainly haven't talked to or heard from developers, aside from Crytek.

It's one thing to believe that GDDR5 will allow X to be done better, but when the same people look back at the other side and only come up with "nah, that won't do anything better, and devs certainly want those 3.5GB of GDDR5 to really push the graphical limits", it becomes a boring discussion.

Instead of the talk being about what Sony can do to nullify the RAM advantage and what MS can do to nullify the bandwidth advantage, it always seems to turn into how there's no way for MS and every way for Sony.

I'm only buying a PS4 this year, and will only buy an Xbox if it shows me something wonderful I didn't expect, and that will be next year at best. So I don't feel the need to defend my choice; I'm just genuinely excited about what these machines can be and do.

Sure, there are options available; however, just because alternatives exist, it doesn't mean they'll be put in these systems. That goes for Orbis and Durango. Maybe Durango's secret sauce will improve the bandwidth of the DDR3 memory, or maybe there's a group of flash chips that Orbis can use for fast data transfers to feed its RAM pool. However, we (at least I) have no idea what they are, or whether they even exist in a shape or form that can be put in these consoles prior to launch without RROD or YLOD issues. We cannot have meaningful discussions if the counter-argument is going to be secret sauce/wizard jizz and unicorns. We just need to wait until more details are out in the open and evaluate the systems again.

So nothing is set in stone, but I do understand why we see some conclusions being made on insufficient data.
 
Money and employment would dictate my loyalties far more than anything else lol.

The point being that he has his reasons and you have yours. The difference being that although he's getting paid, he actually takes a pretty conservative view on the power balance between these consoles. And although you don't get paid, your opinion is always one-sided.

You could say, well, at least he gets paid to have his biased opinion, while you do it for free. In order to escape it, you need to be unbiased. Wishing for a strong product is one thing, but wishing for a worse product from the competition? Well, I hope you are getting paid, man.

Sure, there are options available; however, just because alternatives exist, it doesn't mean they'll be put in these systems. That goes for Orbis and Durango. Maybe Durango's secret sauce will improve the bandwidth of the DDR3 memory, or maybe there's a group of flash chips that Orbis can use for fast data transfers to feed its RAM pool. However, we (at least I) have no idea what they are, or whether they even exist in a shape or form that can be put in these consoles prior to launch without RROD or YLOD issues. We cannot have meaningful discussions if the counter-argument is going to be secret sauce/wizard jizz and unicorns. We just need to wait until more details are out in the open and evaluate the systems again.

So nothing is set in stone, but I do understand why we see some conclusions being made on insufficient data.

Of course, but this should spark curiosity instead of pushing us into premature conclusions.

Money and employment would dictate my loyalties far more than anything else lol.


On a side note, I think stepping back from the discussion might be good for you lol. You're getting pretty emotionally charged and obtuse.

I understand that on the internet, when someone takes the time to put in check the points we are trying to make, it always seems aggressive to us. But this isn't the case at all, and it's not very nice to call me obtuse when I haven't given you reason to.

We can always just agree to disagree instead of trying other ways for the conversation to end ;)
 