
DF/Eurogamer: First Xbox 3 Devkit leaks, 8 Core Intel CPU, nvidia GPU, 8-12GB RAM

DopeyFish

Not bitter, just unsweetened
Has Intel ever done a console CPU?

Xbox 1.

Was a Pentium III 733 with half the cache.

So sorta like a Celeron, but not a Celeron. Don't think they'd do one again because Intel is very protective of their IP... while MS and Sony tend to want to alter the products and take them to their own fabs (royalty).
 

antonz

Member
So how reliable is DF with rumors?

I'd wager it's pretty accurate. The word is MS is going with an AMD-based processor, so as the kits near finalization they will start getting those components, though I imagine they still have another processor jump to make from the FX line to the rumored Jaguar cores.
 

Mindlog

Member
It is, but it would indicate a pretty large shift in design. I mean, quad-core i5 Sandy Bridges were outperforming AMD's 8-core chips in many cases.

Sounds like cost-cutting measures while maintaining an adequate level of performance.
It sounds like the standard shift from guesstimated to real hardware.
 

Durante

Member
It's funny that everyone appears to be going with AMD CPUs, given that at the power envelope consoles are going for, Intel beats the crap out of them by a massive margin.
 

Panajev2001a

GAF's Pleasant Genius
It's funny that everyone appears to be going with AMD CPUs, given that at the power envelope consoles are going for, Intel beats the crap out of them by a massive margin.

Yes, if you go x86 right now you should still go Intel...
Unless, for what console makers want to achieve, AMD's new Fusion APUs are really onto something Intel is not providing yet, and integrating an Intel CPU with a GPU from nVIDIA the same way AMD integrates their CPU and GPU cores might be more challenging (****-blocked by nVIDIA, for example, in this case).

Rumors were that Intel could have landed at least one design with LRB, maybe even both makers, but Intel did not deliver on that front.
 

Zarx

Member
It's funny that everyone appears to be going with AMD CPUs, given that at the power envelope consoles are going for, Intel beats the crap out of them by a massive margin.

I can only imagine that AMD gave them a sweetheart deal on the licensing.
 
No kidding. That's one hell of a performance drop.

Don't know, but I reckon the previous devkit was also using a PCI channel to put data on the GPU.
More like a traditional PC; maybe current kits have a fully working HSA board inside.

So Direct Compute/C++ AMP and HSA with a unified memory architecture could actually be a big performance boost. Do wonder if it's an APU (FX + weaker GPU) + dedicated GPU or an FX + dedicated GPU.

But I could be wrong.
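To make the unified-memory point concrete, here is a minimal sketch of the C++ AMP programming model the post refers to. It assumes a stock MSVC C++ AMP toolchain, not any console SDK, and the saxpy function and variable names are purely illustrative.

```cpp
#include <amp.h>      // Microsoft C++ AMP
#include <vector>

// y-scaled add: x[i] = a*x[i] + y[i], with the loop body dispatched to the GPU.
void saxpy(float a, std::vector<float>& x, const std::vector<float>& y)
{
    using namespace concurrency;
    array_view<float, 1>       xv(static_cast<int>(x.size()), x);  // wraps host memory
    array_view<const float, 1> yv(static_cast<int>(y.size()), y);

    parallel_for_each(xv.extent, [=](index<1> i) restrict(amp) {
        xv[i] = a * xv[i] + yv[i];                                  // runs on the accelerator
    });
    xv.synchronize();  // make the results visible to host code again
}
```

On a discrete GPU sitting behind PCIe, those array_views are copied across the bus in both directions, which is the overhead the post is speculating about; on an HSA/unified-memory part the same code could in principle share the allocation instead of copying it, which is where the hoped-for performance win would come from.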
 

Zarx

Member
Dont know but i reckon the previous devkit was also using pci channel to put data on the gpu.
More like a traditional pc, maybe current kits have a fully working HSA board inside.

So with Direct Compute/C++ AMP and HSA with unified memory architecture could actually be a big performance boost. Do wonder if its a APU(fx + weaker gpu) + dedicated gpu or a FX + dedicated gpu.

But i could be wrong.

We also don't know the clocks they were using in the kits. It could be a big boost in power for all we know.
 

mrklaw

MrArseFace
Going from an 8-core Sandy Bridge-E to an FX processor... that's a very large drop.

Having an 8-core Sandy Bridge doesn't mean that's the power they were going for. More likely they just wanted something up and running quickly, and Intel had 8-core chips available. The key was probably the requirement to have 8 hardware threads available for tools, etc.

The GTX 570 is interesting - there's a range of GPUs available, so it would be easier to pick one that is closer to your intended target performance.

And why not use an AMD chipset? Nvidia seems an odd choice. Is Sea Islands rumoured to be making architectural changes that would have anything in common with Fermi?
 

Dr. Ecco

Neo Member
so how close to Meet The Robinsons are we talking here?

MeetTheRobinsons2.jpg

For a moment, I thought you were making reference to the toaster, not the graphics
 

Oblivion

Fetishing muscular manly men in skintight hosery
Wait, they're going with an Nvidia GPU? I thought MS had some bad blood with them after the first Xbox? Hence why they went with ATi with the 360?


The RAM count seems absolutely disgusting if true. That's an ~16-23X upgrade. I call shenanigans. Won't be any more than 4 gigs. 6 at the very most.
 

El_Chino

Member
Wait, they're going with an Nvidia GPU? I thought MS had some bad blood with them after the first Xbox? Hence why they went with ATi with the 360?


The RAM count seems absolutely disgusting if true. That's an ~16-23X upgrade. I call shenanigans. Won't be any more than 4 gigs. 6 at the very most.

You have to take into account the OS and the Kinect sensor.
 

Router

Hopsiah the Kanga-Jew
The RAM count seems absolutely disgusting if true. That's an ~16-23X upgrade. I call shenanigans. Won't be any more than 4 gigs. 6 at the very most.

You need to account for the OS (1 GB minimum) and whatever the next Kinect will hog up.
 

JaseC

gave away the keys to the kingdom.
Wait, they're going with an Nvidia GPU? I thought MS had some bad blood with them after the first Xbox? Hence why they went with ATi with the 360?

More recent rumours indicate an AMD Sea Islands derivative.

The RAM count seems absolutely disgusting if true. That's an ~16-23X upgrade. I call shenanigans. Won't be any more than 4 gigs. 6 at the very most.

Which would be in line with the RAM figures mentioned in the thread title, as the specs are not referring to consumer SKUs: "DF/Eurogamer: First Xbox 3 Devkit leaks, 8 Core Intel CPU, nvidia GPU, 8-12GB RAM"
 

Oblivion

Fetishing muscular manly men in skintight hosery
You have to take into account the OS and the Kinect sensor.

Oh right, forgot about the OS. Of course, no one gives a shit about that, it's what the games will end up utilizing that's actually important, after all.
 

thuway

Member
Oh right, forgot about the OS. Of course, no one gives a shit about that, it's what the games will end up utilizing that's actually important, after all.

I beg to differ. The new OS will probably use Windows RT applications, and let them run persistently in the background. Imagine playing a game with Skype in the background, DVRing your favorite TV show, and giving you live updates on your email/FB/twitter in real time.
 

SmokyDave

Member
I beg to differ. The new OS will probably use Windows RT applications, and let them run persistently in the background. Imagine playing a game with Skype in the background, DVRing your favorite TV show, and giving you live updates on your email/FB/twitter in real time.

I imagined and then I shuddered.
 

mrklaw

MrArseFace
I beg to differ. The new OS will probably use Windows RT applications, and let them run persistently in the background. Imagine playing a game with Skype in the background, DVRing your favorite TV show, and giving you live updates on your email/FB/twitter in real time.

PS3 can already DVR things for you while you play, and social feeds can be integrated into games easily. You can do that with a streamlined, efficient OS running critical features in the background with limited memory footprint and CPU usage.

I agree they might go Windows RT, which will require a fairly large chunk of memory, but I don't think it's particularly needed for having critical features running in the background - MS will just want to do it for 'synergy' shit with Windows 8 etc.

So we'll probably lose 1-2 GB to the OS.
 

Bear

Member
The RAM count seems absolutely disgusting if true. That's an ~16-23X upgrade. I call shenanigans. Won't be any more than 4 gigs. 6 at the very most.

Dev kits typically have twice the amount of RAM as the final units, for debug purposes.

Half of 8-12 GB is exactly the amount you suggested.
 
Don't know, but I reckon the previous devkit was also using a PCI channel to put data on the GPU.
More like a traditional PC; maybe current kits have a fully working HSA board inside.

So Direct Compute/C++ AMP and HSA with a unified memory architecture could actually be a big performance boost. Do wonder if it's an APU (FX + weaker GPU) + dedicated GPU or an FX + dedicated GPU.

But I could be wrong.
Allegedly having an FX processor (CPU cores only) and a separate GPU means there is NO HSA or any other type of efficiency seen in Sandy Bridge (it's a Fusion-style design) or AMD APUs (Fusion design). The article has to be wrong on that point.
 

ekim

Member
Allegedly having an FX processor (CPU cores only) and a separate GPU means there is NO HSA or any other type of efficiency seen in Sandy Bridge (it's a Fusion-style design) or AMD APUs (Fusion design). The article has to be wrong on that point.

What if there is just no HSA/SOC at all? :p
 
Allegedly having an FX processor (CPU cores only) and a separate GPU means there is NO HSA or any other type of efficiency seen in Sandy Bridge (it's a Fusion-style design) or AMD APUs (Fusion design). The article has to be wrong on that point.

It wouldn't be the first time devs had to build games on Dev kits that didn't actually expose the most impactful hardware feature for a new console. In any case I wouldn't expect HSA to make a huge difference in the first generation of software. For now an FX and discrete GPU are plenty to build a game around. You'll still have lots more performance and memory to work with, and advanced DX11 features to implement. Even if you can't rely on the HSA benefits from the outset, those things will come along.
 

Perkel

Banned
Why? How does not using PowerPC make them "bad"?

Same reason why the Xbox 360 CPU was above and beyond what was available from Intel/AMD in 2006.
I don't see Intel or AMD putting their best CPUs in next-gen consoles. Their best CPUs carry too high a price for the performance and have high power consumption, the opposite of what consoles need. Consoles need a cheap CPU with low power consumption and real power behind it.

I don't follow what IBM is doing these days, but I don't think they stopped developing CPUs in 2009-2010, and their CPU catalog should have a vastly better CPU for console use.
 

Dabanton

Member
Oh man, even a 100% pure gaming company is doing it as well. It's quite serious. I think the end is much closer than we think.

Huh? A very overdramatic assessment.

It's the way things will be as we go forward; a 'pure' games machine would risk signing its own sales death warrant. Even Nintendo knows this.

Look at the PS3 and the 360: take away their abilities to go on the internet, have media apps, stream movies, look at pictures, etc., and you would have a hell of a lot less attractive systems.
 
Same reason why the Xbox 360 CPU was above and beyond what was available from Intel/AMD in 2006.
I don't see Intel or AMD putting their best CPUs in next-gen consoles. Their best CPUs carry too high a price for the performance and have high power consumption, the opposite of what consoles need. Consoles need a cheap CPU with low power consumption and real power behind it.

I don't follow what IBM is doing these days, but I don't think they stopped developing CPUs in 2009-2010, and their CPU catalog should have a vastly better CPU for console use.

Unless you want a behemoth server CPU, or want to fund development of a brand-new chip, they really don't. And the 360's CPU was worse than its contemporary x86 processors in basically every single metric except raw vector math (where it was still well behind the Cell!). And if MS or Sony want fat vector performance on the CPU, AMD can graft on a monster vector unit the same way IBM did, though the current theory is those tasks will leverage HSA/GPGPU to alleviate the burden on the CPU as much as possible.
 

KageMaru

Member
It is, but it would indicate a pretty large shift in design. I mean, quad-core i5 Sandy Bridges were outperforming AMD's 8-core chips in many cases.

Sounds like cost-cutting measures while maintaining an adequate level of performance.

I agree with the earlier poster that they probably went with Intel so they could have something up and running faster for devs. This wouldn't be the first time early dev kits featured a better CPU for an Xbox platform. I just wonder if they are keeping devs informed of what they should really expect when sending out these kits.

so how close to Meet The Robinsons are we talking here?

MeetTheRobinsons2.jpg

Not even close for any next-gen system.

Huh? A very overdramatic assessment.

It's the way things will be as we go forward; a 'pure' games machine would risk signing its own sales death warrant. Even Nintendo knows this.

Look at the PS3 and the 360: take away their abilities to go on the internet, have media apps, stream movies, look at pictures, etc., and you would have a hell of a lot less attractive systems.

Yeah people have this weird assumption that multi-media features have to come at the expense of gaming.

If anything, I would think these multi-media functions allow for a larger hardware budget since you're essentially expanding your potential market.
 
It wouldn't be the first time devs had to build games on Dev kits that didn't actually expose the most impactful hardware feature for a new console. In any case I wouldn't expect HSA to make a huge difference in the first generation of software. For now an FX and discrete GPU are plenty to build a game around. You'll still have lots more performance and memory to work with, and advanced DX11 features to implement. Even if you can't rely on the HSA benefits from the outset, those things will come along.
True, but why go with more expensive hardware for the same bang? I'm expecting 1st-generation games on next-generation consoles to be GPU-bound, not CPU-bound, and 2nd-generation games to be CPU-bound (judging from next-gen engine demos). Am I wrong here? Why the 8-core FX, a desktop CPU with high power/heat? Is this a guess by the author, who just picked the most powerful discrete CPUs offered by AMD and doesn't understand that GPU compute is more powerful, and that for those few cases/threads that are more efficient on a CPU, more is better there too?

Edit, you know this:
Brad Grenz said:
And if MS or Sony want fat vector performance on the CPU, AMD can graft on a monster vector unit the same way IBM did, though the current theory is those tasks will leverage HSA/GPGPU to alleviate the burden on the CPU as much as possible.
It's also a disconnect to go from programming a Sandy Bridge Fusion-style design to a non-Fusion design; timing would be thrown off, rules thrown out the window... you see the point. It makes sense if you are moving closer to the final design, but no sense if you are moving away from it. Also, to be forward compatible you have to follow HSA rules. Those could be emulated by the OS, I guess.
 
What if there is just no HSA/SOC at all? :p
That went through my head also, but I dismissed it. What if the PS4 is still Cell plus a discrete GPU? That makes as much sense. Cell is dead because APUs replaced it. GPGPU discrete GPUs can do some of the CPU duties, but without a coherent and fast bus between CPU and GPU, along with a bunch of HSA rules in how you use them, it makes more sense to use an advanced Cell processor over an x86 CPU. This is an old argument from the PS4 threads, when we didn't know (at least I was ignorant) about Fusion designs and APUs and couldn't believe Sony went with an x86 CPU.

I think we are discussing the first developer hardware, which was replaced by an AMD APU and discrete GPU (just like the PS4 developer kits), and following that is the beta with the final hardware design, which according to rumors is being shipped to select developers now.
 

Durante

Member
Imagine playing a game with Skype in the background, DVRing your favorite TV show, and giving you live updates on your email/FB/twitter in real time.
You mean, imagine a PC? Can I also compile stuff and run a webserver in the background? ;)
 
From the PS4 thread:

Advanced Micro Devices, Inc, or AMD, a semiconductor company, has announced its collaboration with Microsoft Corp for more than 125 Windows 8-based PC designs from OEMs including ASUS, Dell, Fujitsu, HP, Lenovo, Samsung, Sony, Toshiba and more.

Mainstream and ultrathin notebooks, tablets, all-in-one and traditional desktops, home theater PCs and embedded designs powered by the second generation AMD A-Series APUs and AMD Z-Series APUs

-- AMD Start Now Technology: AMD-powered Windows 8-based notebooks boot, resume and respond faster than competing x86 solutions(1);
-- AMD Catalyst(TM) drivers compatible with Windows 8 featuring support for DirectX 11, DirectX 11.1 and Windows Display Driver Model 1.2;
-- AMD AllDay(TM) Power enables consumers everywhere to experience unmatched mobility with more than 12 hours of resting battery life on their AMD-based device(2);
-- AMD Eyefinity Technology: a feature unique to AMD-powered PCs, consumers can now span their Windows 8 desktop, user interface, games and apps seamlessly across three or more monitors for a truly immersive experience(3);
-- AMD App Acceleration: AMD Radeon GPUs with AMD App Acceleration let you run multiple applications at the same time with remarkable speed and reliability that provide enhanced performance beyond traditional graphics and video processing. Customers running AMD-powered Windows 8 PCs can run desktop apps as well as new apps available from the Windows Store and from AMD AppZone for a fast and fluid experience.
The above features should be in game consoles too, regardless of the OS.

AMD Start Now is probably the 16 GB of NAND flash mentioned as being in the PS4, and it should also be in the Xbox 720 or Durango or Xbox 8, whatever it ends up being called.

Eyefinity:

1) It's a DisplayPort interface that supports multiple monitors, including the two LCD screens in a head-mounted display as well as the TV, all at the same time.
2) Or three, possibly more, monitors at the same time in any configuration.
3) Or a TV and the patented Xbox 720 peripheral room view at the same time.

AllDay tech is multiple power-saving modes built into the APU.
 

M3d10n

Member
Why does a devkit have twice the RAM of retail? I don't understand.

Debug code is much larger than release code due to added stuff like extra checks and records of which source file and line each instruction block comes from. Also, in order to properly debug memory allocation problems, special markers need to be written around allocated blocks, which bloats memory usage. The extra RAM is also used for debugging tools.
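As a rough illustration of those "markers around allocated blocks", here is a minimal sketch of a guard-band debug allocator; the block layout, sentinel value, and every name here are made up for illustration and aren't from any real console SDK.

```cpp
#include <cstdint>
#include <cstdio>
#include <cstdlib>
#include <cstring>

// Header written in front of every user block; a second guard word sits after it.
struct DebugHeader {
    std::uint32_t guard;   // sentinel to catch buffer underruns
    std::size_t   size;    // requested size, needed to locate the tail guard
    const char*   file;    // source file of the allocation
    int           line;    // source line of the allocation
};

static const std::uint32_t GUARD = 0xFDFDFDFDu;  // arbitrary fill pattern

void* debug_alloc(std::size_t size, const char* file, int line)
{
    auto* raw = static_cast<unsigned char*>(
        std::malloc(sizeof(DebugHeader) + size + sizeof(GUARD)));
    auto* hdr = reinterpret_cast<DebugHeader*>(raw);
    hdr->guard = GUARD; hdr->size = size; hdr->file = file; hdr->line = line;
    std::memcpy(raw + sizeof(DebugHeader) + size, &GUARD, sizeof(GUARD));  // tail guard
    return raw + sizeof(DebugHeader);              // caller gets the region after the header
}

void debug_free(void* p)
{
    auto* raw = static_cast<unsigned char*>(p) - sizeof(DebugHeader);
    auto* hdr = reinterpret_cast<DebugHeader*>(raw);
    std::uint32_t tail;
    std::memcpy(&tail, raw + sizeof(DebugHeader) + hdr->size, sizeof(tail));
    if (hdr->guard != GUARD || tail != GUARD)      // a trashed guard means an over/underrun
        std::fprintf(stderr, "heap corruption in block allocated at %s:%d\n",
                     hdr->file, hdr->line);
    std::free(raw);
}

// Typical wrapper so every allocation records where it came from.
#define DEBUG_MALLOC(sz) debug_alloc((sz), __FILE__, __LINE__)
```

The per-allocation header and guard words, on top of whatever the debugging tools themselves keep resident, are part of why a debug build's heap footprint balloons and why dev kits carry roughly double the RAM of retail units.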
 