
TRUTHFACT: MS having eSRAM yield problems on Xbox One

Status
Not open for further replies.

Shadders

Member
This looked quite fixed:

The manufacturing issues are especially a problem if you don't have enough money (or don't want to spend it). A downclock would kill the system, so I highly doubt they'll do it; they'll wait until the process matures.

They might be thinking that it just doesn't matter. History teaches us that the most powerful console has no divine right to being the most successful, plus third party games will probably be similar on both machines, at least in terms of features and functionality.

"The power of the cloud" is pretty emotive too, even if it is total horseshit.

I don't think they'll be too scared of releasing an underpowered machine anyway.
 

Elrina

Neo Member
I can't believe you guys are even entertaining this rumor, considering they're already confirming specs on games.

If this rumor were true, why would they come out yesterday and confirm this kind of stuff?

It just doesn't logically fit.


I mean, really, do you guys honestly think the in-engine trailer we saw is possible at 1080p and 60fps on the hardware you're entertaining?

Sure it does. They want to do their best to impress people as much as possible. They can "officially confirm" whatever they want right now. As it happens, they haven't confirmed the clock speeds (and even if they had, it would still mean very little at this point). You're basing your entire argument on the fact that the in-engine graphics of a single game looked pretty, and that they're claiming it will run at a native 1080p/60fps.

All they have to do later, when they "officially confirm" something else, is claim that it was the truth at the time. There are other methods they can use to still hit the 1080p/60fps mark, and it will still look better than anything that's been released this generation.

Official confirmations of game specs right now don't mean a lot when they're launch titles for new hardware. All they are is official confirmation of the targeted specs that they believe they can deliver based on the dev kit hardware they're currently working with. Until the dev kit has been finalized (which it hasn't yet), things can change.

Ultimately, this specific issue (whether the clock speed might get reduced slightly or not) is not even really all that important, and I doubt many people see it as anything more than an overall indication of a continuing trend of bad news. Either way the games will look great, and either way multi-platform titles will still look better on the PS4 (and even better on the PC).
 

artist

Banned
While I doubt that they will downclock the GPU, 1080p@60 does not imply they didn't do it. There are other things that can be compromised in order to reach a framerate/resolution target.
I know, just look at this;
1080p60fps native means nothing? What is that a joke?
[image: 2nbbnuq_zps8b759819e1jsu.jpg]

^ needs to be updated with choice quotes like FordGTGuy's.

In 5 days, we'll know the real sources from the sheep.
Do you really expect Microsoft to admit issues?
Not happening

I don't know if you're being sarcastic or not, but downclocking the GPU would have nothing to do with frame rate or resolution. That's a choice by the devs.

RR7 ran at 1080p and 60fps on the PS3 seven years ago.
I was being sarcastic.
 

R-User!

Member
If you were a publisher, why wouldn't you take MS's money to do this?

I mean, getting a sack full of cash for a timed exclusive. A timed exclusive of only 3 days, for games that aren't even finished yet, on consoles that haven't been released. Not to mention, the E3 media is probably pre-rendered/bullshots anyway.

It's not like Ubi/Activision/EA etc. won't do a proper marketing push for the PS4 versions once the games are ready for launch.

Morals
 
Hardware prototypes don't exist? ok
It sucks to type on the phone.

But yeah, not only is it a prototype model, but if I'm not mistaken, underclocking won't change the physical qualities of the chip; it's a way to get better yields from a die that would become faulty running at higher temperatures than it can handle.
 
Xbone Confirmation Bias (XCB)

Relating to being emotionally invested in a games console: people display this bias when they gather or remember information selectively, or when they interpret it in a biased way. The effect is stronger for emotionally charged issues and for deeply entrenched beliefs.



"Power of the cloud" is their attack and defence until Digital Foundry and a teardown reveal the truth.

Brand fanboys of all kinds are massively prone to it. We're seeing it endlessly at the moment, on all sides.
 

Rourkey

Member
Can't see them hamstringing themselves for the whole generation by dropping the GPU clock just to overcome some fabrication issues at the beginning. They'll ship with next to no units, which will create demand in itself.

From what I've read on here, it seems it's the eSRAM part of the APU that is the main issue, so dropping the GPU clock to get a better yield seems like it would have only a minimal impact.
 

Risette

A Good Citizen
It sucks to type on the phone.

But yeah, not only is it a prototype model, but if I'm not mistaken, underclocking won't change the physical qualities of the chip; it's a way to get better yields from a die that would become faulty running at higher temperatures than it can handle.
Also, anything designed to run at a hypothetically higher "final" clock can be underclocked to accommodate flaky yields.
 

Perkel

Banned
In 5 days, we'll know the real sources from the sheep.

For hardware it won't happen. For games, absolutely.

I am pretty sure the Forza 5 trailer was "enhanced" (see the fire from the exhaust), and there's no way in hell the Quantum Break bridge destruction was real-time on console.
 

twobear

sputum-flecked apoplexy
I have a question; when you fabricate semiconductors, does every single transistor etc. have to come out properly or can you lose a certain number, or what?
 

chubigans

y'all should be ashamed
Just to put this into perspective for people, I did a little quick calculation. (If anybody sees any glaring errors lemme know)

I tried to do a little measurement in Photoshop of what I thought the boundaries of the die under the spreader looked to be, and compared the results to some GPU die sizes. I'm estimating the APU at 20.5mm x 20.5mm. I'm also assuming they're using a 300mm wafer.

With a kinda average setup (4mm edge exclusion, 0.08mm clearance all around), you would get 133 chips at 100% yield. Let's say "less than 30%" is 29%.

If MS is paying $5000 a wafer, that's just over $130 per APU. Just for the silicon.

Ouch. And that's with me putting the best possible construction on everything; the numbers get worse really quickly just by dropping that percentage a few points.
I've had a nagging feeling that the XBone will be very expensive...interesting numbers.
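chubigans' back-of-the-envelope math above can be sketched in a few lines of Python. The gross-die formula is a common textbook approximation (usable area divided by die area, minus an edge-loss term); the 20.5mm die, $5,000 wafer, and 29% yield are the thread's assumptions, not confirmed numbers.

```python
import math

def dies_per_wafer(die_mm, wafer_mm=300, edge_excl_mm=4.0, scribe_mm=0.08):
    """Approximate gross dies per wafer for a square die.

    Classic approximation: usable wafer area divided by die area,
    minus a correction for partial dies lost at the wafer's rim.
    """
    r = wafer_mm / 2 - edge_excl_mm          # usable radius after edge exclusion
    a = (die_mm + scribe_mm) ** 2            # die area incl. scribe clearance
    return int(math.pi * r ** 2 / a - math.pi * 2 * r / math.sqrt(2 * a))

def cost_per_good_die(wafer_cost, gross_dies, yield_frac):
    # Only the working dies share the wafer's cost
    return wafer_cost / (gross_dies * yield_frac)

print(dies_per_wafer(20.5))                # ~126 gross dies with this approximation
print(cost_per_good_die(5000, 133, 0.29))  # ~$130 per good APU, matching the post
```

With the thread's 133 gross dies and 29% yield, the $5,000 wafer works out to roughly $130 of silicon per good APU; at a 50% yield the same wafer yields chips at about $75 each, which is why yield dominates the BOM argument.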
 
1. Downclock: Loss by suicide.

2. Supply Constraint: Loss of early adopters, but Sony has shown that you CAN recover from that.

3. Launch Delay: See #2 but to a greater degree.

4. Rushed launch, with hardware instability and UI/Kinect performing below the reveal presentation: this is where it could go REALLY bad, especially if they DO subsidize this box. Imagine being locked into an expensive Live subscription with no way out, on a box that does not perform as expected. I foresee some sad Christmases.
 
I have a question; when you fabricate semiconductors, does every single transistor etc. have to come out properly or can you lose a certain number, or what?

You design hardware by taking this into account. AMD has done this for a very long time with their processors. More often than not, their lower-end chips are just higher-end chips with disabled cores. GPU manufacturers do basically the same thing and/or downclock. For instance, the HD7950 is probably just an HD7970 that is too defective to actually run as an HD7970.
 

artist

Banned
Just to put this into perspective for people, I did a little quick calculation. (If anybody sees any glaring errors lemme know)

I tried to do a little measurement in Photoshop of what I thought the boundaries of the die under the spreader looked to be, and compared the results to some GPU die sizes. I'm estimating the APU at 20.5mm x 20.5mm. I'm also assuming they're using a 300mm wafer.

With a kinda average setup (4mm edge exclusion, 0.08mm clearance all around), you would get 133 chips at 100% yield. Let's say "less than 30%" is 29%.

If MS is paying $5000 a wafer, that's just over $130 per APU. Just for the silicon.

Ouch. And that's with me putting the best possible construction on everything; the numbers get worse really quickly just by dropping that percentage a few points.
My estimate from that pic is that the Xbone APU is ~350mm², and I highly doubt the yields are going as low as 30%. That is beyond GF100 territory.
 

Risette

A Good Citizen
I have a question; when you fabricate semiconductors, does every single transistor etc. have to come out properly or can you lose a certain number, or what?
They're designed to account for that to a degree (I imagine some end up completely fucked and unsellable as anything). That's also how separate models come to exist: stuff that came out less than ideal gets rated for a different clock/number of cores/etc. and sold under a different label.

Probably a bigger issue for fixed console setups, where there is one spec that must be hit, than for the PC market, where uneven yields can be binned under a cornucopia of lower-spec models. MS is probably seeing this crop up because of their relatively complex, non-standard APU design.

edit: poo
 

Rourkey

Member
Sony and MS have both stomached billions in losses at the beginning of a generation's life cycle. Neither seems to have spent that much on development this time round (by buying off-the-shelf PC parts), so perhaps they'll take the cost hit on APU production.
 

Drek

Member
Considering how they've been trying to avoid giving hard facts about their hardware, we probably never will.

If they come in lower than the rumored 1.2 TFLOPS, bandwidth, etc. we'll have an idea something went wrong.

It seems as if the Xbox One specs are a race to the bottom...

Hello WiiU!
It's not like they're replacing the GPU with a block of wood, man. It'll never get THAT bad.
 

twobear

sputum-flecked apoplexy
They're designed to account for that to a degree (I imagine some end up completely fucked and unsellable as anything). That's also how separate models come to exist: stuff that came out less than ideal gets rated for a different clock/number of cores/etc. and sold under a different label.

Probably a bigger issue for fixed console setups, where there is one spec that must be hit, than for the PC market, where uneven yields can be binned under a cornucopia of lower-spec models. MS is probably seeing this crop up because of their relatively complex, non-standard APU design.

edit: poo

But, say for the sake of argument, if the Xbone CPU is designed with exactly 5,000,000,000 transistors, would they have to bin it if it had 4,999,999,999?
 

Goldmund

Member
For hardware it won't happen. For games absolutely.

I am pretty sure Forza 5 trailer was "enchanced" (vide fire from exhaust) and no way in hell quantum break bridge destruction was real time on console.
It looked real-time on console to me. If that was on PC, it should have looked leagues better.
 

Raydeen

Member
Xbone Confirmation Bias (XCB)

Relating to being emotionally invested in a games console: people display this bias when they gather or remember information selectively, or when they interpret it in a biased way. The effect is stronger for emotionally charged issues and for deeply entrenched beliefs.

"Power of the cloud" is their attack and defence until Digital Foundry and a teardown reveals the truth.

Well let's be honest, Digital Foundry is the most irrelevant site on the internet.

Endless comparison videos that only serve to make gamers dissatisfied. Oh no! I'm getting 3fps less in combat scenes than the 360/PS3 version. Sell console! Even though I can't tell from the videos... but *I now know it's 3fps less*. For a site that claims to be the bastion of the hardcore gamer, NeoGAF certainly seems to be full of people more interested in megagigatessilationrender pixels than in the actual content of the games. Not saying MS haven't screwed up, but at least they had some ambition beyond just another box that plays games. What has Sony given you with the PS4? Exactly the same thing they gave you with the PS3. Then again, I backed the loser with the Dreamcast and its then-all-new online console gaming, and had to say goodbye because all you Sony fuckers bought the PS2 instead.
 
In-engine means it was still run on that hardware, and I'll go out on a limb and say that what we saw in that trailer wouldn't be possible at all, even in-engine, on the specs you guys are entertaining.

No it doesn't. In-engine means it was built in the engine, and not with other, outside tools.

For example, there was that 60fps footage of Uncharted 3. Blew everyone's mind. In-engine, but the PS3 would probably melt trying to do that.
 

Risette

A Good Citizen
Sony and MS have both stomached billions of losses at the beginning of the generation life cycle, neither seem to have spent that much on development this time round (by buying off the shelf PC stuff) so perhaps they'll take the cost hit in the APU production
That defeats one of the points of using APUs. One of the reasons they switched, aside from a more standard/mature architecture for ease of development, is that APUs are designed with economies of scale in mind: they use a standard PC architecture that fabs are already set up to make and will continue to make for years (even once the architecture becomes legacy, the fabs that run it will just keep running). MS bungled this by adding a custom complication to the APU. Not to mention that the SoC nature of an APU is also meant to make things cheaper by fitting everything onto one die.

But, say for the sake of argument, if the Xbone CPU is designed with exactly 5,000,000,000 transistors, would they have to bin it if it had 4,999,999,999?
I understand what you're trying to get at, but they wouldn't do that. They'd design it to hit a certain spec with room for error.

A more apt question would be: what if they designed it with 6 billion transistors, of which 5 billion need to be usable? In that case, if they had 4.5bn usable, they'd probably check how it functioned with the defective parts disabled and ship it out if it worked (maybe it just ran too hot). Warranty will take care of it if anything happens. This is why you have systems that die out of the box or after a day of use.
 

Orayn

Member
No it doesn't. In engine means it was built on the engine, and not via other, outside tools.

For example, there was that 60fps footage of Uncharted 3. Blew everyone's mind. In engine, but the PS3 would probably melt trying to do that.

Well, wasn't there speculation that it did run on a PS3, "in real-time," but ran at half-speed and was later played back twice as fast? Something like the timescale settings in a fighting game's training mode. Not really making an argument about anything in particular, just recalling an interesting tidbit.
 

ciridesu

Member
Well let's be honest, Digital Foundry is the most irrelevant site on the internet.

Endless comparison videos that only serve to make gamers dissatisfied. Oh no! I'm getting 3fps less in combat scenes than the 360/PS3 version. Sell console! Even though I can't tell from the videos... but *I now know it's 3fps less*. For a site that claims to be the bastion of the hardcore gamer, NeoGAF certainly seems to be full of people more interested in megagigatessilationrender pixels than in the actual content of the games. Not saying MS haven't screwed up, but at least they had some ambition beyond just another box that plays games. What has Sony given you with the PS4? Exactly the same thing they gave you with the PS3. Then again, I backed the loser with the Dreamcast and its then-all-new online console gaming, and had to say goodbye because all you Sony fuckers bought the PS2 instead.

Us god damn Sony fuckers.

How the fuck is DF irrelevant; just because this time around their articles are against your 'preferred' view?

Why are so many so up in arms damn.
 

Perkel

Banned
I have a question; when you fabricate semiconductors, does every single transistor etc. have to come out properly or can you lose a certain number, or what?

AMD sells 3-core CPUs, but there's no such thing as a native 3-core die. AMD essentially takes a 4-core CPU and switches one core off because there's a good chance it doesn't work.

Same with how we buy PC CPUs at different clock speeds. The lowest-cost CPUs come from dies that can't clock as high as the better chips. This is why the E8400 and E7200 were so legendary: Intel was mostly selling good silicon, and people could overclock them like hell.

I personally had an E7200 in my old PC and it went from 2.5GHz to 3.9GHz on air. An E8400 could go as high as 4.1 on air. The upper limits varied: some E7200s couldn't go above 3.7, and the same went for E8400 revisions.
 

twobear

sputum-flecked apoplexy
AMD sells 3-core CPUs, but there's no such thing as a native 3-core die. AMD essentially takes a 4-core CPU and switches one core off because there's a good chance it doesn't work.

Same with how we buy PC CPUs at different clock speeds. The lowest-cost CPUs come from dies that can't clock as high as the better chips. This is why the E8400 and E7200 were so legendary: Intel was mostly selling good silicon, and people could overclock them like hell.

I personally had an E7200 in my old PC and it went from 2.5GHz to 3.9GHz on air. An E8400 could go as high as 4.1 on air. The upper limits varied: some E7200s couldn't go above 3.7, and the same went for E8400 revisions.

Huh, my old PC packed away in storage has an E7200 in it. Maybe I'll try overclocking it.
 

Raydeen

Member
Us god damn Sony fuckers.

How the fuck is DF irrelevant; just because this time around their articles are against your 'preferred' view?

Because it's practically impossible to spot any differences in 99% of the videos. The Bayonetta PS3 disaster aside, it's a site for nitpicking for nitpicking's sake. And oh, the PS3 version has a slightly lighter shade of textures? Better buy a 360 then...

Edit: Now if Digital Foundry had been around in the ZX Spectrum vs Commodore 64 days... now that would be a different matter...
 
Well, wasn't there speculation that it did run on a PS3, "in real-time," but ran at half-speed and was later played back twice as fast? Something like the timescale settings in a fighting game's training mode. Not really making an argument about anything in particular, just recalling an interesting tidbit.

Could be possible. Still, in-engine doesn't mean run on the hardware.

We see in-engine cutscenes all the time that are pre-rendered and then shipped as movie files in the game.
 

PJV3

Member
Well let's be honest, Digital Foundry is the most irrelevant site on the internet.

Endless comparison videos that only serve to make gamers dissatisfied. Oh no! I'm getting 3fps less in combat scenes than the 360/PS3 version. Sell console! Even though I can't tell from the videos... but *I now know it's 3fps less*. For a site that claims to be the bastion of the hardcore gamer, NeoGAF certainly seems to be full of people more interested in megagigatessilationrender pixels than in the actual content of the games. Not saying MS haven't screwed up, but at least they had some ambition beyond just another box that plays games. What has Sony given you with the PS4? Exactly the same thing they gave you with the PS3. Then again, I backed the loser with the Dreamcast and its then-all-new online console gaming, and had to say goodbye because all you Sony fuckers bought the PS2 instead.

Maybe gamers just want a great gaming machine, I'm doing ok in the TV department.
 

dancmc

Member
So...I think I am getting a little confused on what the actual rumor on downclocking is.

Is it:

a) the silicon yields have been poor making them consider a decision to downclock the GPU

or

b) the silicon yields have been poor and they have already decided to downclock the GPU

sorry, have skimmed through the thread and trying to pull out rumor vs. hyperbole vs. fanboyism vs. truthfact.
 

Risette

A Good Citizen
Well let's be honest, Digital Foundry is the most irrelevant site on the internet.

Endless comparison videos that only serve to make gamers dissatisfied. Oh no! I'm getting 3fps less in combat scenes than the 360/PS3 version. Sell console! Even though I can't tell from the videos... but *I now know it's 3fps less*. For a site that claims to be the bastion of the hardcore gamer, NeoGAF certainly seems to be full of people more interested in megagigatessilationrender pixels than in the actual content of the games. Not saying MS haven't screwed up, but at least they had some ambition beyond just another box that plays games. What has Sony given you with the PS4? Exactly the same thing they gave you with the PS3. Then again, I backed the loser with the Dreamcast and its then-all-new online console gaming, and had to say goodbye because all you Sony fuckers bought the PS2 instead.
Here's a shocker:

Visuals are part of a game's content. They should be the best possible, not "good enough if you're not a nitpicker."
So...I think I am getting a little confused on what the actual rumor on downclocking is.

Is it:

a) the silicon yields have been poor making them consider a decision to downclock the GPU

or

b) the silicon yields have been poor and they have already decided to downclock the GPU

sorry, have skimmed through the thread and trying to pull out rumor vs. hyperbole vs. fanboyism vs. truthfact.
Underclocking is still a rumor. eSRAM yield issues are confirmed, but not confirmed to be directly related to the GPU rumors.
 

Perkel

Banned
That defeats one of the points of using APUs. One of the reasons they switched, aside from a more standard/mature architecture for ease of development, is that APUs are designed with economies of scale in mind: they use a standard PC architecture that fabs are already set up to make and will continue to make for years (even once the architecture becomes legacy, the fabs that run it will just keep running). MS bungled this by adding a custom complication to the APU. Not to mention that the SoC nature of an APU is also meant to make things cheaper by fitting everything onto one die.


I understand what you're trying to get at, but they wouldn't do that. They'd design it to hit a certain spec with room for error.

A more apt question would be: what if they designed it with 6 billion transistors, of which 5 billion need to be usable? In that case, if they had 4.5bn usable, they'd probably check how it functioned with the defective parts disabled and ship it out. Warranty will take care of it if anything happens. This is why you have systems that die out of the box or after a day of use.

Also, this choice means they will be able to lower the BOM quickly as their APUs shrink to newer processes.

If Sony or MS choose to sell their consoles below cost, as is tradition in the console market, they won't lose as much money as they would with non-APU hardware.
 

Waaghals

Member

So "real gamers" should not care about technical differences in games, but caring about TV that has nothing to do with games is completely legitimate?

That is just silly. If you like TV that's fine, but it has nothing to do with gaming.
Sure, sometimes the differences are small and meaningless, but there are people with multiple consoles and an eye for framerate and IQ.

The fact that you seem to think that you have to sell your existing console to buy a new one indicates that you are a single console owner, and always have been.

I have no comment for the rest of your post.
 

Nachtmaer

Member
I have a question; when you fabricate semiconductors, does every single transistor etc. have to come out properly or can you lose a certain number, or what?

From what I've read, architectures always have some level of redundancy built in; it'd be silly to waste an entire chip on one dead transistor. Also, like some people mentioned before, if that redundancy can't cover certain faults, those parts (shaders, cores, cache) might get fused off for a lower-end SKU. If the chip can't run at a certain frequency within a certain TDP/voltage range, it goes into a lower bin.
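The binning process described throughout this thread boils down to a simple rule: each tested die is assigned to the best product tier whose requirements it still meets. A minimal sketch follows; the tier names, core counts, and clock thresholds are made-up illustrations, not real SKUs.

```python
# Hypothetical product tiers, best first: (label, min working cores, min stable GHz)
BINS = [
    ("flagship", 4, 3.6),
    ("midrange", 4, 3.0),
    ("tri-core", 3, 3.0),   # one dead core fused off, sold under a lower label
]

def bin_die(working_cores, max_stable_ghz):
    """Assign a tested die to the highest tier whose requirements it meets."""
    for label, min_cores, min_ghz in BINS:
        if working_cores >= min_cores and max_stable_ghz >= min_ghz:
            return label
    return "scrap"          # fails every tier: unsellable

print(bin_die(4, 3.8))  # flagship
print(bin_die(3, 3.4))  # tri-core: the "4-core with one core switched off" case
print(bin_die(2, 3.9))  # scrap
```

A console effectively has one bin, which is why a yield problem that a PC vendor would absorb by selling more lower-tier parts instead forces a choice between eating the cost, cutting the clock, or constraining supply.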
 