
TRUTHFACT: MS having eSRAM yield problems on Xbox One

Status
Not open for further replies.
if it is just yields, is there a chance MS brute forces it and just orders fuck loads of chips to get enough good ones, and sucks up the cost - assuming yields will improve within 12 months?

I doubt fab capacity is elastic enough for that approach, especially at leading-edge nodes.
 

Flo_Evans

Member
This shit reminds me of that Michael Crichton novel Disclosure. Have they checked the air handlers in the fab plant? Have they?!
 
MS has promised "practically silent operation" for the Xbox One to fit with the living room theme. The heat issues could be a result of subpar cooling brought on by low noise requirements.
 

gofreak

GAF's Bob Woodward
Sorry if already answered, but how would dropping clock speed improve yields?

Cutting the number of compute units would seem more obvious, à la the redundant 8th SPE in the PS3's Cell.
 

Spongebob

Banned
Wow, I didn't think of that. This is why I'm having a hard time believing this. If it ends up being true, the differences in games would be pretty noticeable. I mean, what are the numbers for the new bandwidth of the eSRAM with the rumoured downgrade, fellow GAFers?
A slower GPU wouldn't need as much memory bandwidth.
 

GribbleGrunger

Dreams in Digital
That's what I can't figure out. Who is the PR team that is handling this fiasco? Whoever it is needs to bring this under control and start to really hammer home some counterpoints before the narrative is so set in stone it would take a chisel to undo it.

At this point even a jack hammer would struggle to put a dent in it. I just can't believe what I've been seeing in the last few weeks. This is an unprecedented meltdown that will already have spread beyond the confines of forums.
 

cchum

Member
if it is just yields, is there a chance MS brute forces it and just orders fuck loads of chips to get enough good ones, and sucks up the cost - assuming yields will improve within 12 months?

May have already tried. Disclaimer: Charlie can be inaccurate at times.

"The chip, which is still referred to as ‘Oban’, is being run through multiple fabs in very high quantities, too high by more than an order of magnitude to simply be for dev kits. "

http://semiaccurate.com/2012/09/04/microsoft-xbox-next-delay-rumors-abound/
 

onQ123

Member
So after all the Wii U bashing, do you think the Xbox's biggest fans will be able to accept that the PS4 GPU could have a bigger advantage over the Xbox One than the Xbox One has over the Wii U?
 

artist

Banned
Sorry if already answered, but how would dropping clock speed improve yields?

Cutting the number of compute units would seem more obvious, à la the redundant 8th SPE in the PS3's Cell.
Since we don't know what kind of yield issues there could be (functional units vs. functional units at target clock), my take:

The eSRAM has a clock of its own too. So if the issue is indeed concerning the eSRAM, then:

1. Downclock the eSRAM (bandwidth is affected)
2. Downclock the GPU (raw graphics power is affected)
3. Fuse off CUs in the GPU and maintain the same clock (raw graphics power is affected)

None of the above is promising.

Every 100MHz drop in the eSRAM clock cuts the eSRAM bandwidth by ~13GB/s. 10 CUs at 800MHz ~ 12 CUs at 667MHz (~1 TFLOP)
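The figures above check out as back-of-envelope math, assuming (as rumored at the time, not confirmed) a 128-byte-per-cycle eSRAM interface and GCN-style CUs with 64 ALUs each doing 2 FLOPs per cycle via FMA:

```python
# Rough sanity check of the bandwidth and TFLOP figures above.
# Assumptions (not official specs): 128 B/cycle eSRAM interface,
# 64 ALUs per CU, 2 FLOPs/cycle (fused multiply-add).

def esram_bw_gbs(clock_mhz, bytes_per_cycle=128):
    """Peak eSRAM bandwidth in GB/s at a given clock."""
    return clock_mhz * 1e6 * bytes_per_cycle / 1e9

def gpu_tflops(cus, clock_mhz, alus_per_cu=64, flops_per_cycle=2):
    """Peak single-precision throughput in TFLOPS."""
    return cus * alus_per_cu * flops_per_cycle * clock_mhz * 1e6 / 1e12

print(esram_bw_gbs(800))                       # 102.4 GB/s at 800 MHz
print(esram_bw_gbs(800) - esram_bw_gbs(700))   # 12.8 GB/s lost per 100 MHz
print(gpu_tflops(10, 800))                     # 1.024 TFLOPS
print(gpu_tflops(12, 667))                     # ~1.025 TFLOPS
```

So a 100 MHz eSRAM drop does cost roughly 13 GB/s under these assumptions, and 10 CUs at 800MHz lands almost exactly where 12 CUs at 667MHz does.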
 
Good God, has this been a nightmare of a reveal for the Xbox One. So much controversy and unanswered questions going into E3, not to mention all the new games. This year's E3 is definitely gonna be one of the most interesting in a very long time.
 
Sorry if already answered, but how would dropping clock speed improve yields?

Cutting the number of compute units would seem more obvious, à la the redundant 8th SPE in the PS3's Cell.

Would cutting the number of compute units have an even bigger impact in terms of performance than a downclock?
 

Pistolero

Member

Matt's intervention was about heat issues and inadequate yields, which have been rumored for a while now, so there is no reason to doubt him (especially someone with his post history). But he didn't hint at any downclock...

Regardless of these details, the whole picture baffles the mind. Microsoft launched first and enjoyed great success owing to the longest generational period yet. They had all the time and the money in the world, and they still give the impression of being late to the game. Incomprehensible in any language...
 

KidBeta

Junior Member
Hopefully they'll fix all their issues by the time the console launches. Lower clockrate would be pretty bad.

They need to fix it well before the console launches; production of the launch units should be ramping up soon.
 

3 to 4x weaker than a 780 is weak.

Remember, this is before the console has even come out. Usually it takes a year or two for PC tech to leapfrog new console power.

This time, however, right off the bat PCs melt consoles somehow. Disappointing all the way around, especially as a PC gamer. It will hold PC gaming down in a few years, because developers always develop for the lowest common denominator.
 

AOC83

Banned
If meaning all PC's? Then no of course not. You'll still have those DX8 alikes using Integrated cards like me.

If talking High end? Yeah, woefully underpowered in comparison.

Yes, I was talking about high-end PCs, and no, it's not woefully underpowered compared to those, unless you're talking about a NASA supercomputer.
 

Flo_Evans

Member
Sorry if already answered, but how would dropping clock speed improve yields?

Cutting the number of compute units would seem more obvious, à la the redundant 8th SPE in the PS3's Cell.

I'm no expert in silicon manufacturing, but from what I understand there are always tiny flaws in a chip. Running at a higher clock speed makes these flaws "break" and causes errors.

Basically, all Intel i7 chips are supposed to be 3.8GHz, but they have some way of testing them, and the 'rejects' get sold as 3.4, 3.2, etc.
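That process is usually called speed binning. A toy sketch of the idea (all clocks and yield numbers here are made up for illustration, not real Intel or AMD figures):

```python
# Toy model of speed binning: each die comes out of the fab with a random
# maximum stable clock due to process variation, and is sold at the highest
# speed grade it passes (or scrapped if it passes none).

import random

BINS_MHZ = [3800, 3400, 3200]  # hypothetical SKUs: 3.8, 3.4, 3.2 GHz

def bin_chip(max_stable_mhz):
    """Return the highest bin this die qualifies for, or None (scrap)."""
    for b in BINS_MHZ:
        if max_stable_mhz >= b:
            return b
    return None

random.seed(0)
# Hypothetical process variation: mean 3600 MHz, sigma 300 MHz.
dies = [random.gauss(3600, 300) for _ in range(1000)]

counts = {}
for d in dies:
    counts[bin_chip(d)] = counts.get(bin_chip(d), 0) + 1
print(counts)  # how many dies landed in each bin (None = scrap)
```

Downclocking a console SKU amounts to moving the bar down so more dies pass, which is exactly why a lower clock would help yields.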
 
A slower GPU wouldn't need as much memory bandwidth.
If we were talking about GDDR5 bandwidths I'd agree, but that isn't the case. The eSRAM was there to address that issue, and it didn't even have the bandwidth of slow GDDR5. I mean, there are GPUs on the PC market with raw power equal to the X1's that already have more bandwidth, don't they?
 
At this point even a jack hammer would struggle to put a dent in it. I just can't believe what I've been seeing in the last few weeks. This is an unprecedented meltdown that will already have spread beyond the confines of forums.


Microsoft worked hard over the last few years to gain the trust of gamers... only to shit on everything they had earned with the force of a volcanic explosion with the XBONE.
 
Has anyone considered that maybe it's not only yields but also durability they're after? After all the RROD fiasco, I'm pretty sure Microsoft wants to make sure the Xbox One is durable and reliable as hell.
 

XenoJim

Banned
They had all the time and the money in the world, and they still give the impression of being late to the game. Incomprehensible in any language...

Hubris maybe? Or they grew content with resting on their laurels. Either way something has seemed off since the system's announcement. Nothing about this feels polished or thought out as much as it should be. I can't put my finger on it completely but something internally at Microsoft is responsible for this laziness. I wonder if they thought they had more time perhaps? Maybe they figured Sony wouldn't push a potential launch for another two or so years and they felt that gave them enough leeway to not really care.
 

Biker19

Banned
So after all the Wii U bashing do you think Xbox biggest fans will be able to accept that the PS4 GPU could have a bigger advantage over the Xbox One as the Xbox One has over the Wii U?

Considering that the OG Xbox & the Xbox 360 had mostly better multiplats than both PS2/PS3, it might be a tough pill for them to swallow.
 

Proelite

Member
So which Nvidia/AMD graphics card would the new gen consoles compare to if this rumor holds up?

Sony PS4 GTX670(?)
XBone ??
WiiU ??

On paper:
Sony: 7850
Xbone: 7750 (the overclocked one)

In the real world, due to the closed nature of consoles, comparable to PCs with:
Sony: 7950 Boost
Xbone: 7850
 

ironcreed

Banned
Dat secret sauce.

 

szaromir

Banned
They need to fix it well before the console launches, the units for the launch should be starting to ramp up for production soon.
That makes sense.

3 to 4x weaker than a 780 is weak.

Remember, this is before the console has even come out. Usually it takes a year or two for PC tech to leapfrog new console power.

This time, however, right off the bat PCs melt consoles somehow. Disappointing all the way around, especially as a PC gamer. It will hold PC gaming down in a few years, because developers always develop for the lowest common denominator.
The most limiting factor is going to be game budgets, not Xbone's processing power. Still, the experience on the console will suffer greatly if they scale the specs down.
 
3 to 4x weaker than a 780 is weak.

Remember, this is before the console has even come out. Usually it takes a year or two for PC tech to leapfrog new console power.

This time, however, right off the bat PCs melt consoles somehow. Disappointing all the way around, especially as a PC gamer. It will hold PC gaming down in a few years, because developers always develop for the lowest common denominator.

Isn't that a $400-500 graphics card? Essentially the price of the consoles themselves?
 