
Rumor: Xbox 3 = 6-core CPU, 2GB of DDR3 Main RAM, 2 AMD GPUs w/ Unknown VRAM, At CES

Proelite said:
But PCs have up to 8 gigs of RAM, and it's dirt cheap nowadays. Why can't they pack in at least four gigs? Otherwise the console will be outdated in two years.

You should compare the RAM on that spec sheet to video card RAM.

edit -

iamshadowlark said:
How credible is this site supposed to be?

The site doesn't look credible at all.
 

Proelite

Member
DopeyFish said:
That memory bandwidth is so insane that it's more than PCIe 3.0 x16 bandwidth and DDR3 memory bandwidth added together and then multiplied by six.

Just to give you an idea

Well that flew over your heads.

Hynix Semiconductor introduced the industry's first 1 Gbit GDDR5 memory. It supports a bandwidth of 20 GB/s on a 32-bit bus, which enables memory configurations of 1 GiB at 160 GB/s with only 8 circuits on a 256-bit bus. Hynix's 2 Gbit GDDR5 boasts a 7 GHz effective clock speed. The newly developed GDDR5 is the fastest and highest-density graphics memory available in the market. It operates at a 7 GHz effective clock speed and processes up to 28 GB/s over a 32-bit I/O. 2 Gbit GDDR5 memory chips will enable graphics cards with 2 GiB or more of onboard memory with 224 GB/s or higher peak bandwidth. The memory maker claims that the new chip will be in demand in the second half of 2010.

Hmm. Maybe 256 GB/s isn't so out there after all.
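To sanity-check the numbers being thrown around, here's a rough back-of-the-envelope sketch (purely illustrative; `peak_bandwidth_gbs` is just a made-up helper, and the per-pin rates and bus widths are only the figures quoted above, not confirmed console specs):

```python
# Peak GDDR5 bandwidth estimate:
# bandwidth (GB/s) = per-pin data rate (Gbit/s) * bus width (bits) / 8

def peak_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Theoretical peak bandwidth in GB/s for a given per-pin rate and bus width."""
    return data_rate_gbps * bus_width_bits / 8

print(peak_bandwidth_gbs(7, 32))    # 28.0  -> one 7 Gbit/s chip on a 32-bit I/O (the Hynix figure)
print(peak_bandwidth_gbs(7, 256))   # 224.0 -> eight such chips on a 256-bit bus
print(peak_bandwidth_gbs(4, 512))   # 256.0 -> or a wider 512-bit bus with slower 4 Gbit/s chips
```

So on paper the rumored 256 GB/s is reachable; the real questions are cost and whether chips at those speeds actually ship in volume.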
 

guek

Banned
Proelite said:
But PCs have up to 8 gigs of RAM, and it's dirt cheap nowadays. Why can't they pack in at least four gigs? Otherwise the console will be outdated in two years.

256 GB/s GDDR5, LOL.
Because PC RAM and console RAM are not the same RAM.
 

Kevin

Member
Yeah, these specs are quite disappointing. The leap between last gen and current gen was significantly bigger. I don't think the advances in graphics on these systems will be that much more substantial than current stuff. I'm no expert on this, though. I will say I highly doubt AMD's previous statement of "Avatar-like" visuals.

Also, this would continue to hold back PC gaming as well, since, as we know, most companies focus on consoles first and PCs second, which is why PC gaming hasn't progressed much over the years. My opinion, of course. I own a GTX 590 and play Battlefield 3, The Witcher 2, Skyrim, Crysis 2 in DX11, etc. While they look great, they aren't that big of a leap considering how many years have passed.
 
claviertekky said:
POWER7 CPUs have to be made on the 45nm process.

That said, I don't know how credible the site that was just posted is.

Just seems like speculative conclusions made from the info posted yesterday.

I know. My point is a 6-core POWER7 is just the 8-core with two cores disabled. Too much wasted die space for a console.
 

BurntPork

Banned
100MB eDRAM?

[image: TLje1.gif]


Give that to the person who made this rumor and force them to look at it for 24 hours.
 

DieH@rd

Banned
So I just visited that GDDR5 wiki page.

Eight 2 Gbit chips [for a total of 2 GB] running on a 512-bit bus could get "peak" performance of 250 GB/s. Impressive, if true.

/still not convinced
 
DopeyFish said:
It's plausible. However, GDDR5 doesn't exist at that speed AFAIK.

That's operating at the same bandwidth as the eDRAM in the Xbox 360 (IIRC), which would mean... this thing would absolutely slaughter PCs without blinking.

That's really interesting; maybe the rumored 2 GB of GDDR3 is just leaked early dev-kit specs. How would the POWER7 CPU stack up against Xenon? Sounds like they went for another "Bruce Lee"-style console, which, after Kinect, I wasn't sure they were going to do.

Any chance Epic could complain and we get 3 GB of GDDR5? Or is that just too costly?
 
Kevin said:
Yeah, these specs are quite disappointing. The leap between last gen and current gen was significantly bigger. I don't think the advances in graphics on these systems will be that much more substantial than current stuff. I'm no expert on this, though. I will say I highly doubt AMD's previous statement of "Avatar-like" visuals.

Also, this would continue to hold back PC gaming as well, since, as we know, most companies focus on consoles first and PCs second, which is why PC gaming hasn't progressed much over the years. My opinion, of course. I own a GTX 590 and play Battlefield 3, The Witcher 2, Skyrim, Crysis 2 in DX11, etc. While they look great, they aren't that big of a leap considering how many years have passed.


This.
 

DopeyFish

Not bitter, just unsweetened
Kevin said:
Yeah, these specs are quite disappointing. The leap between last gen and current gen was significantly bigger. I don't think the advances in graphics on these systems will be that much more substantial than current stuff. I'm no expert on this, though. I will say I highly doubt AMD's previous statement of "Avatar-like" visuals.

Also, this would continue to hold back PC gaming as well, since, as we know, most companies focus on consoles first and PCs second, which is why PC gaming hasn't progressed much over the years. My opinion, of course. I own a GTX 590 and play Battlefield 3, The Witcher 2, Skyrim, Crysis 2 in DX11, etc. While they look great, they aren't that big of a leap considering how many years have passed.

Uhhhh

A console at these specs (depending on the GPUs), if segaleaks is accurate (they are not), but just to give it an ounce of thought:

This would be a far larger jump than Xbox->360

IMO
 

Log4Girlz

Member
Honestly, 2 GB of RAM would make me perfectly happy. If it's a unified pool, then you can easily allocate more than 1 GB to texture memory alone; that's really not too bad.
 
DopeyFish said:
Uhhhh

A console at these specs (depending on the GPUs), if segaleaks is accurate (they are not), but just to give it an ounce of thought:

This would be a far larger jump than Xbox->360

IMO

Don't know about the jump comparison, but it would definitely be an awesome console.
 

SkylineRKR

Member
Kevin said:
Yeah, these specs are quite disappointing. The leap between last gen and current gen was significantly bigger. I don't think the advances in graphics on these systems will be that much more substantial than current stuff. I'm no expert on this, though. I will say I highly doubt AMD's previous statement of "Avatar-like" visuals.

Also, this would continue to hold back PC gaming as well, since, as we know, most companies focus on consoles first and PCs second, which is why PC gaming hasn't progressed much over the years. My opinion, of course. I own a GTX 590 and play Battlefield 3, The Witcher 2, Skyrim, Crysis 2 in DX11, etc. While they look great, they aren't that big of a leap considering how many years have passed.

I think that, given the current state of the economy, it's to be expected that this leap won't be as big as the ones before it. With the rise of casual gaming, the demand for graphics isn't what it used to be either. It's more about what kind of innovations the consoles actually offer. It's suicide to go all out on specs and then fall flat on your face, either by selling at a premium price or by taking a major loss on each console.

I'm not really looking for better graphics myself. All I want is better performance. No sub-HD, sub-30fps, lack-of-AA and other shit; just Crysis 2 and BF3 at a locked 60fps and native 1080p resolution. And 64 guys online. If such games are launch material, I'm fine... probably.
 

DieH@rd

Banned
-ImaginaryInsider said:
Any chance Epic could complain and we get 3 GB of GDDR5? Or is that just too costly?

At these speeds, I reckon that RAM is not cheap. And they need to find room for four additional chips on the board. If they want to make that change, they need to do it NOW if they wish to hit a late 2012 window.
 
Kevin said:
Yeah, these specs are quite disappointing. The leap between last gen and current gen was significantly bigger. I don't think the advances in graphics on these systems will be that much more substantial than current stuff. I'm no expert on this, though. I will say I highly doubt AMD's previous statement of "Avatar-like" visuals.

Also, this would continue to hold back PC gaming as well, since, as we know, most companies focus on consoles first and PCs second, which is why PC gaming hasn't progressed much over the years. My opinion, of course. I own a GTX 590 and play Battlefield 3, The Witcher 2, Skyrim, Crysis 2 in DX11, etc. While they look great, they aren't that big of a leap considering how many years have passed.

I do not think this rumor is true, but honestly I don't think you can ask for much more.


edit -

SkylineRKR said:
I'm not really looking for better graphics myself. All I want is better performance. No sub-HD, sub-30fps, lack-of-AA and other shit; just Crysis 2 and BF3 at a locked 60fps and native 1080p resolution. And 64 guys online. If such games are launch material, I'm fine... probably.

I agree with this. I also want better physics and less pop-in. I don't need a huge upgrade.
 

DopeyFish

Not bitter, just unsweetened
SkylineRKR said:
I think that, given the current state of the economy, it's to be expected that this leap won't be as big as the ones before it. With the rise of casual gaming, the demand for graphics isn't what it used to be either. It's more about what kind of innovations the consoles actually offer. It's suicide to go all out on specs and then fall flat on your face, either by selling at a premium price or by taking a major loss on each console.

I'm not really looking for better graphics myself. All I want is better performance. No sub-HD, sub-30fps, lack-of-AA and other shit; just Crysis 2 and BF3 at a locked 60fps and native 1080p resolution. And 64 guys online. If such games are launch material, I'm fine... probably.

100 MB of eDRAM would ensure 1080p w/ 4xAA in basically every game.

That memory bandwidth would ensure PCs wouldn't catch up for a long, long time.

What makes this smell like bullshit is that the eDRAM would -not- be necessary with a pipe that fast... it just doesn't make sense to me.
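To put a rough number on the 1080p w/ 4xAA claim, here's a quick framebuffer-size sketch (a simplifying assumption of 4 bytes color + 4 bytes depth/stencil per sample and no compression, so treat it as a ceiling rather than real hardware behavior; `framebuffer_mb` is just an illustrative helper):

```python
# Approximate render-target footprint: does it fit in a given amount of eDRAM?
# Assumes 8 bytes per sample (4 bytes color + 4 bytes depth/stencil), no compression.

def framebuffer_mb(width: int, height: int, msaa_samples: int,
                   bytes_per_sample: int = 8) -> float:
    """Approximate render-target size in MiB for the given resolution and MSAA level."""
    return width * height * msaa_samples * bytes_per_sample / (1024 ** 2)

print(framebuffer_mb(1280, 720, 4))   # ~28.1 -> why 720p w/ 4xAA didn't fit the 360's 10 MB in one pass
print(framebuffer_mb(1920, 1080, 4))  # ~63.3 -> fits comfortably inside a rumored 100 MB
print(framebuffer_mb(1920, 1080, 1))  # ~15.8 -> 1080p with no MSAA
```

Which is also part of why the figure smells off: 100 MB is more than a straight 1080p/4xAA target needs, and with 256 GB/s of main memory bandwidth the eDRAM looks redundant anyway.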
 

Gaborn

Member
DopeyFish said:
Uhhhh

A console at these specs (depending on the GPUs), if segaleaks is accurate (they are not), but just to give it an ounce of thought:

This would be a far larger jump than Xbox->360

IMO

Frankly, it's madness. I mean, six cores isn't even the PC standard yet. Traditionally there is a lag between consoles and PC gaming. I really, really, really, really, really doubt this. It's nice wishful thinking, though.
 

Faddy

Banned
guek said:
Because PC RAM and console RAM are not the same RAM.

So much this although...

If Microsoft finally has the impetus to put Windows 8 on it, then maybe 2 gigs of system RAM for running the OS, a web browser (the area where a large amount of RAM is recommended), email clients and messaging systems might be a reasonable estimate.

The whole Windows 8 synergy across PCs, tablets and phones might extend to consoles, which raises the question of whether PowerPC architecture will still be used, or whether they'll opt for x86 or ARM so there is compatibility across all their devices.
 

clav

Member
Gaborn said:
Frankly, it's madness. I mean, six cores isn't even the PC standard yet. Traditionally there is a lag between consoles and PC gaming. I really, really, really, really, really doubt this. It's nice wishful thinking, though.
We're talking about PPC architecture here though, and PPC isn't used in consumer PCs, although Apple formerly used it before the switch to Intel chips.
 

Kevin

Member
I think at this point there are far too many unknowns to determine just how powerful these consoles will actually be. As we have learned, consoles operate a lot differently than PCs, and game companies get a lot more out of console hardware than they do out of PCs with equivalent hardware. So these specs, if true, may not be entirely a bad thing. I really do hope we get some factual information at CES in January, but I have my doubts.

I also still wonder if AMD is sticking with the "Avatar-quality visuals" comment they made not that long ago. If anyone would know how powerful the next-gen consoles are going to be, it would likely be the guys that make the graphics chips. :)
 

SkylineRKR

Member
DopeyFish said:
100 MB of eDRAM would ensure 1080p w/ 4xAA in basically every game.

That memory bandwidth would ensure PCs wouldn't catch up for a long, long time.

What makes this smell like bullshit is that the eDRAM would -not- be necessary with a pipe that fast... it just doesn't make sense to me.

The eDRAM thing is the one thing I absolutely don't believe.

For the rest, an i7 and perhaps a Cayman GPU could be a reality. They would be two years old by late 2012.
 

Gaborn

Member
claviertekky said:
We're talking about PPC architecture here though, and PPC isn't used in consumer PCs, although Apple formerly used it before the switch to x86 Intel chips.

Even so, I'd be very concerned about the price point of a hex-core system (much less one with dual GPUs). I mean, if we're talking about 2012, NO WAY do I believe it. 2013? Maybe.
 
DopeyFish said:
100 MB of eDRAM would ensure 1080p w/ 4xAA in basically every game.

That memory bandwidth would ensure PCs wouldn't catch up for a long, long time.

What makes this smell like bullshit is that the eDRAM would -not- be necessary with a pipe that fast... it just doesn't make sense to me.

Future-proof, 4K resolution, baby... *smiles*
 

guek

Banned
SkylineRKR said:
The eDRAM thing is the one thing I absolutely don't believe.

In comparison, the 360 had 10 MB of eDRAM in its GPU. And before any numbnuts yells "10x leap is standard!!", the GameCube had 3 MB of eDRAM in its GPU.
 

regs

Member
Weren't there rumors a while back saying that certain games on the new Xbox would be playable on the old one, but you'd get the better graphics/features on the newer console?

i.e., Halo 4 would play on the 360 and the new Xbox, but you would get the full experience on the new machine.
 

DopeyFish

Not bitter, just unsweetened
guek said:
In comparison, the 360 had 10 MB of eDRAM in its GPU. And before any numbnuts yells "10x leap is standard!!", the GameCube had 3 MB of eDRAM in its GPU.

The GameCube's memory bandwidth for its eDRAM was like 2.5 GB/s... the Xbox 360's was 256 GB/s.

This would be 384 GB/s.

Yikes.
 
DieH@rd said:
At these speeds, I reckon that RAM is not cheap. And they need to find room for four additional chips on the board. If they want to make that change, they need to do it NOW if they wish to hit a late 2012 window.

It was just wishful thinking on my part. If the specs are true, it sounds like it will be a nice jump from what we have. Hopefully it will be enough to keep the system in the general performance ballpark of whatever Sony releases.
 
For those curious about the prospects of an ARM-powered next-gen console, consider that Qualcomm just announced ARM-powered PCs for 2012:

Qualcomm expects the first Snapdragon-powered Windows 8 PC to arrive a year from now, marking the entry into a lucrative new business for the wireless chip company.

Qualcomm is already working with Microsoft to ensure that computers running the next-generation operating system will be able to run on its chips based on ARM's technology, which sacrifice processing power for more energy efficiency and the ability to always remain connected. Qualcomm CEO Paul Jacobs said he sees a majority of the Windows 8 products coming after the end of fiscal 2012, which ends in September.

http://news.cnet.com/8301-1035_3-57326167-94/qualcomm-sees-snapdragon-powered-pcs-by-late-2012/
 

Proelite

Member
Kevin said:
Yeah, these specs are quite disappointing. The leap between last gen and current gen was significantly bigger. I don't think the advances in graphics on these systems will be that much more substantial than current stuff. I'm no expert on this, though. I will say I highly doubt AMD's previous statement of "Avatar-like" visuals.

Also, this would continue to hold back PC gaming as well, since, as we know, most companies focus on consoles first and PCs second, which is why PC gaming hasn't progressed much over the years. My opinion, of course. I own a GTX 590 and play Battlefield 3, The Witcher 2, Skyrim, Crysis 2 in DX11, etc. While they look great, they aren't that big of a leap considering how many years have passed.

Yeah, instead of being capable of running laps around the best PCs, these specs would only allow the next Xbox to leave them in the dust.

They are bullshit wishful fanwank.
 

Raistlin

Post Count: 9999
guek said:
In comparison, the 360 had 10 MB of eDRAM in its GPU. And before any numbnuts yells "10x leap is standard!!", the GameCube had 3 MB of eDRAM in its GPU.
Though many would argue MS fucked up and should have gone with 16 MB or so.

10 MB simply wasn't enough to support 720p with enough effects in a single pass.
 

DopeyFish

Not bitter, just unsweetened
Raistlin said:
Though many would argue MS fucked up and should have gone with 16 MB or so.

10 MB simply wasn't enough to support 720p with enough effects in a single pass.

About 40 MB is the sweet spot, IIRC.

100 MB is just plain ridiculous... And once again... with 256 GB/s the eDRAM would be pointless, lol.
 

Proelite

Member
DopeyFish said:
About 40 MB is the sweet spot, IIRC.

100 MB is just plain ridiculous... And once again... with 256 GB/s the eDRAM would be pointless, lol.

I think GAF should leak the real Loop specs on our homepage. We have enough credibility to cause some chaos.
 

Slayer-33

Liverpool-2
DopeyFish said:
It's plausible. However, GDDR5 doesn't exist at that speed AFAIK.

That's operating at the same bandwidth as the eDRAM in the Xbox 360 (IIRC), which would mean... this thing would absolutely slaughter PCs without blinking.


I admit I got wood.
 

Raistlin

Post Count: 9999
DieH@rd said:
Second to last paragraph @ http://en.wikipedia.org/wiki/GDDR5, and those are 256 Mbit numbers.
Let me explain.


First off, I'm aware of Hynix's tech. However, who is using it at this point? Is it in large-scale production at reasonable prices? (serious question ... I thought it wasn't, but will be happy if wrong)


Regardless, my issue isn't necessarily just regarding the GDDR5. The issue is the overall architecture. If you read my previous posts, my main criticism leveled against the original 'leaked specs' wasn't just about the individual parts - it was the architecture as a whole.

In the case of the original specs, the DDR3 made little sense in the context of a system with a hex-core processor. You don't include 6 cores just for multitasking and OS functions. The assumption is that it would also contribute to actual in-game processing - stuff like AI, procedural animation, etc. Using slow-ass RAM would unnecessarily gimp performance, making the spec list suspect.

Here we see the opposite. Let's for a minute assume they will use Hynix tech. If so, explain why there is any eDRAM, let alone the insane 100MB being cited? The supposed GDDR5 actually has the same bandwidth as the eDRAM in 360's Xenos.


So think about it for a second. Why would MS use what is likely quite expensive RAM, and then at significant expense include a crazy amount of eDRAM when they already have serious bandwidth? Taken as a whole, it looks suspect. It seems redundant, and gets even crazier when you consider costs.
 

Log4Girlz

Member
Raistlin said:
Let me explain.


First off, I'm aware of Hynix's tech. However, who is using it at this point? Is it in large-scale production at reasonable prices? (serious question ... I thought it wasn't, but will be happy if wrong)


Regardless, my issue isn't necessarily just regarding the GDDR5. The issue is the overall architecture. If you read my previous posts, my main criticism leveled against the original 'leaked specs' wasn't just about the individual parts - it was the architecture as a whole.

In the case of the original specs, the DDR3 made little sense in the context of a system with a hex-core processor. You don't include 6 cores just for multitasking and OS functions. The assumption is that it would also contribute to actual in-game processing - stuff like AI, procedural animation, etc. Using slow-ass RAM would unnecessarily gimp performance, making the spec list suspect.

Here we see the opposite. Let's for a minute assume they will use Hynix tech. If so, explain why there is any eDRAM, let alone the insane 100MB being cited? The supposed GDDR5 actually has the same bandwidth as the eDRAM in 360's Xenos.


So think about it for a second. Why would MS use what is likely quite expensive RAM, and then at significant expense include a crazy amount of eDRAM when they already have serious bandwidth? Taken as a whole, it looks suspect. It seems redundant, and gets even crazier when you consider costs.

Latency?
 