fluffydelusions
Member
Dat sweep to the leg... Right out of karate kid
No matter if the Xbone ends up being a 900GF or 1200GF console, multiplatform games will be designed around it and then up-ported to PS4. The only question is just how much better it's going to end up on the PS4. If MS downclocks to something like 900GF, that could make an impact on design and possibly limit the scope and size of the games, not just IQ and FPS. And then Sony devs are going to destroy everything else.
I read the OP but I can't find the explanation. Why would an esram issue cause a gpu downgrade?
wtf where is this from?
I read the OP but I can't find the explanation. Why would an esram issue cause a gpu downgrade?
Production yield targeting and heat mapping happen at the same stage of console development. While they don't actually relate to each other, they can occur at the same time. This may explain the mixed messages from a few posters.
I read the OP but I can't find the explanation. Why would an esram issue cause a gpu downgrade?
Sony's APU is 3 billion transistors. MS's is 5 billion, and more complex because of the added parts.
Because it's all one part, one part being fucked up means the whole APU is affected. And since 25% of the APU's die is dedicated to ESRAM (in my opinion this is not smart at all; with this many transistors MS should have just outright put in more GPU power), the heat issue can be quite real.
And since ESRAM bandwidth is directly correlated with GPU speed (lower the GPU clock and you lower the ESRAM bandwidth, and therefore the thermal load), this seems plausible. But I still don't think MS can be THAT stupid not to figure this out at the blueprint stage.
If they did, then this is a colossal FUCK UP.
Come to think of it, the yield for this chip must be horrible.
I don't think that takes developer pride/reputation into the equation at all. No way would all third parties do this, only to then be shat on from a very great height by first-party developers. You're only as good/marketable as your last game, and if that is too far behind the curve (because targeting the lowest denominator surely wouldn't be industry-unanimous)... well, we're already seeing devs drop like flies of late. Reputation is a massive deal for survival in the current climate. If the power difference is as large as this thread suggests it could be, I would hazard that it will be a case of downward rather than upward ports, particularly if the bulk of core (i.e. frequent game-purchasing) gamers gravitate to the more powerful system.
Actually if Apple are anything to go by, nothing makes money like producing great products that people want to buy and then selling them with a healthy profit margin.
I'm not sure why loss leadership has become de rigueur in the console space, but it's always seemed batty to me. It's as though the companies (except Nintendo) have come to believe that the aim in console manufacture is to 'win' by selling the most units rather than turning a profit, because that's what the fans think.
Just this morning got this email from Amazon (UK) regarding my pre-order that was placed 3 days after it went up:
Yield problems? Certainly seems like it now!
So Sony didn't max out their silicon budget then. Interesting.
For everyone asking- this information is all pretty recent. Around the PlayStation Meeting the Xbox One was way behind (OS + hardware). Engineers were scrambling to get things sorted out.
It turns out, they didn't sort it out. The OS you saw was a complete and total lie. The current plan is to get the yields up, lower the clock rate, and to have enough units out for a sell out in the Fall.
For those asking how this affects performance - to be perfectly frank, it is nothing turning down features won't solve. The mass market will never notice a difference between 1080p and 900p; neither will they care about dynamic shadows, global illumination, or tessellation. Go to your PC, turn shadows from Ultra to medium, disable tessellation, and lower the resolution to 900p, and you'll find games run totally fine.
Microsoft is purely behind and it's now time to make drastic decisions. I don't think anyone is happy about the lower clocks, but no one is depressed about it either. The Xbox One is an all-in-one device, and that's how it will be marketed.
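For what it's worth, the 1080p-vs-900p claim can be quantified in raw pixel counts, a rough proxy for fill-rate and shading work (ignoring everything else in the frame):

```python
# How much rendering work (in raw pixel count) does the 1080p -> 900p
# resolution drop actually save?

def pixels(width: int, height: int) -> int:
    return width * height

full_hd = pixels(1920, 1080)      # 2,073,600 pixels
nine_hundred = pixels(1600, 900)  # 1,440,000 pixels

print(nine_hundred / full_hd)  # ~0.694 -> roughly 30% fewer pixels per frame
```

So rendering at 900p shades roughly 30% fewer pixels per frame, which is why it's a popular lever for hitting frame-rate targets on weaker hardware.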
I read the OP but I can't find the explanation. Why would an esram issue cause a gpu downgrade?
I think because eSRAM bandwidth and GPU frequency are linked if I read it correctly.
So if they are having problems making eSRAM that works at 102 GB/s, but it works fine at, say, 95 GB/s, then they would drop the GPU frequency as well.
But I have no idea what I'm talking about.
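The bandwidth/clock linkage described above can be sketched numerically. As an assumption (the interface width isn't stated in this thread), take a 1024-bit (128-byte) read path clocked at the GPU frequency, which reproduces the oft-quoted 800 MHz → 102.4 GB/s figure:

```python
# Rough sketch of the eSRAM bandwidth / GPU clock relationship.
# Assumption (not a confirmed spec): a 1024-bit (128-byte) interface
# clocked at the GPU frequency.

BYTES_PER_CYCLE = 128  # assumed 1024-bit interface

def esram_bandwidth_gbps(clock_mhz: float) -> float:
    """Peak bandwidth in GB/s for a given clock in MHz."""
    return clock_mhz * 1e6 * BYTES_PER_CYCLE / 1e9

print(esram_bandwidth_gbps(800))  # 102.4 GB/s at the rumored original clock
print(esram_bandwidth_gbps(750))  # 96.0 GB/s after a hypothetical downclock
```

Under this assumption, bandwidth scales linearly with the clock, which is why an eSRAM that only validates at a lower bandwidth would drag the GPU frequency down with it.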
They really should have anticipated these problems during the design process.
It's the Xbox One, not the first Xbox. I understand the name can lead to confusion.
I read the OP but I can't find the explanation. Why would an esram issue cause a gpu downgrade?
It's been suggested on Beyond3D that the SRAM array may literally be too large for signals to travel the physical distance in time to be considered valid within a single clock cycle. No one has ever made a pool of SRAM this large before. IBM uses eDRAM for their large CPU caches; maybe that's in part because they were not sure SRAM could scale effectively due to such issues. If correct, that means the normal measures you'd take to improve yields (additional cooling, deactivating defective regions, even increasing production runs) would not be effective. But lowering the clock would give a signal more time to travel through a wire.
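That wire-delay theory can be sanity-checked with back-of-the-envelope arithmetic. All the numbers below (path length, effective signal speed) are illustrative assumptions, not actual die measurements; the point is just that a lower clock means a longer period and therefore more timing slack for a signal crossing the array:

```python
# Minimal sketch of the timing argument: a signal must traverse the
# SRAM array within one clock period, so slack = period - wire delay.
# path_mm and speed_mm_per_ns are made-up illustrative values; real
# on-die RC-dominated wires are far slower than the speed of light.

def cycle_margin_ps(clock_mhz: float, path_mm: float, speed_mm_per_ns: float) -> float:
    """Slack (in picoseconds) between one clock period and the wire traversal time."""
    period_ns = 1e3 / clock_mhz           # e.g. 800 MHz -> 1.25 ns period
    delay_ns = path_mm / speed_mm_per_ns  # time for the signal to cross the array
    return (period_ns - delay_ns) * 1e3

# Assuming a 10 mm worst-case path at an effective 10 mm/ns:
print(cycle_margin_ps(800, 10, 10))  # 250.0 ps of slack at 800 MHz
print(cycle_margin_ps(750, 10, 10))  # ~333 ps of slack at 750 MHz
```

Under these made-up numbers, dropping from 800 MHz to 750 MHz buys roughly 83 ps of extra margin per cycle, which is exactly the kind of fix that rescues a path that only marginally fails timing.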
It was a bet, like any other bet, but the downclock isn't what is worrisome, it's the half baked OS. I hope they just fix things in time. I'm optimistic they'll get it right, but it might be a few months after launch.
They have a very ambitious goal with the Xbox One OS.
Well, if this progression goes on, MS could end up with a Wii U-level console. So that's worrisome too.
I would say that's somewhat exaggerated. Keep in mind that the X1 is closed hardware with custom strengths such as the Data Move Engines. Having seen nightmarish hardware like the PS3 push titles like GT5 or MotorStorm Apocalypse at 12whatever x 1080, I can easily see games on the Xbone running at native 1080p. That would depend on the game, the skill of the studio, and the complexity of the effects and assets, of course.
Would the PS4 outperform it, and in specific games even double the frame-rate? If this rumour is true (and I don't buy it much), probably, and frequently.
Would this downgrade (again, if it's true) lead to a 1080 vs 720 situation? I doubt it. But man, the differences in AA, now that's where I'd expect a great leap.
There are many theories. IMO, the more I think about this one, the more it makes sense, and I generally discount a lot of what is read on B3D:
This explanation would require one to believe that MS either originally designed the eSRAM pool with maximum transistor density in mind and the design failed (for leakage/thermal/yield reasons), or designed the pool with a smaller process in mind, and when yields sucked they had to shift to a larger process.
If this turns out to be true, the story of this misadventure will be very very interesting.
Can't wait for the rings of death to make another appearance.
She had it coming.
A shitty OS that gobbles up RAM?
Now the PS4 only needs to support the Oculus Rift, and then it can play every X1 game with the same graphics but in stereo 3D without any problems xD
More importantly, will this cause a launch delay for the Xbone to sort stuff out, or will they sacrifice more GPU power instead? In essence, can more time afford them a solution that is not detrimental to performance?
Thuway was banned for teasing nonsense and posting speculation that all Xbox E3 demos would be downgraded as fact.
To get to Wii U levels, you need a new, lower-powered, shittier GPU with 1 GB of DDR3 allocated to games and no eSRAM. That's a far cry from what the Xbone is.
Some random internet find. I was searching for "tja gif". "Tja" roughly means "well..." in German.
throwing a drink and physical violence are equatable?
Depends on the drink.
A shitty OS that gobbles up RAM?
Someone should tell MS to stop reading Sony's 2006 book of stupid things to do.
Playsation_ said: She had it coming.
Well, if the "Thomas Was Alone" developer went to the Sony offices to present his new PS4 project with an Oculus Rift... maybe it's because his game will need one...
I'm pretty sure MS wrote the book on that one.
I don't think going embedded memory with DDR3 was that bad a decision. What I really don't get is why they went with 6T-SRAM instead of eDRAM. The size difference is humongous. Are there production/process advantages to this that I am unaware of?
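The "humongous" size difference follows directly from the cell structures: a 6T-SRAM cell spends six transistors per bit, while eDRAM uses one transistor plus a capacitor. A quick count for a 32 MB pool (the rumored eSRAM size):

```python
# Transistor counts for a 32 MB on-die memory pool, comparing 6T-SRAM
# (six transistors per bit) with eDRAM (one transistor plus a capacitor
# per bit; the capacitor is not counted here).

BITS = 32 * 1024 * 1024 * 8  # 32 MB in bits

sram_transistors = BITS * 6   # 6T cell
edram_transistors = BITS * 1  # 1T1C cell

print(f"{sram_transistors:,}")   # 1,610,612,736 -> over 1.6 billion transistors
print(f"{edram_transistors:,}")  # 268,435,456
```

That 1.6 billion figure also lines up roughly with the earlier claim that about a quarter of a 5-billion-transistor APU is eSRAM. The usual counterargument on the process side is that eDRAM needs a special capacitor process that most logic foundries don't offer, while 6T-SRAM works on a standard logic process.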
That would have been one catastrophic mistake to make. That's a fundamental design mistake, not just a yield issue.
The Wii U has 35 MB of embedded RAM of various types.
What we are seeing today with MS is their endgame. MS is betting that they are the one software company that can bring all the components of life and entertainment together, and by doing so will become the only information-gathering conglomerate with a major television viewing angle.
Guys, "please understand": nothing makes money like advertisement sales, and MS is looking for a piece of that pie. Not gaming.