Playsation_
Banned
I'm wondering if this is why MS is pushing DRM so heavily...
MS is having trouble making enough Xbox Ones with the current configuration, which may lead either to a delay... or to lowering the specs in order to make them in sufficient numbers.
Bad yields like this mean, for example, that for every 10 consoles they try to make, 5 don't make it out of the factory. Which is very expensive.
OK thanks guys, that helps a lot.

They try to manufacture as many chips as they can fit on a silicon wafer. They then test the chips to see if they are up to spec. Low yield means more chips than expected are not meeting the spec. Their options are to delay the launch; lower the spec (down-clocking) so more chips can meet it; accept the low yields and be supply-constrained and cost-inefficient; or use brute force and try to pump out as many as they can, which is also cost-inefficient.
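The economics described above can be sketched with some made-up numbers. Every wafer costs the same whether its dies pass or fail, so the good dies have to absorb the cost of the bad ones (the wafer cost, die count, and yield figures here are purely illustrative, not actual manufacturing data):

```python
def cost_per_good_die(cost_per_wafer, dies_per_wafer, yield_rate):
    """Effective cost of each sellable chip: the whole wafer's cost
    spread over only the dies that pass testing."""
    good_dies = dies_per_wafer * yield_rate
    return cost_per_wafer / good_dies

# Hypothetical: a $5000 wafer holding 200 dies.
healthy = cost_per_good_die(5000, 200, 0.80)  # 80% yield -> $31.25 per good die
poor    = cost_per_good_die(5000, 200, 0.50)  # 50% yield -> $50.00 per good die
print(f"80% yield: ${healthy:.2f}   50% yield: ${poor:.2f}")
```

Dropping from 80% to 50% yield raises the per-chip cost by 60% in this toy example, which is why "eat the cost" vs. "downclock so more dies pass" is the trade-off being discussed.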
Something I've never understood is why people are so quick to hang us insiders. We literally go out of our way to give you something to talk about and a glimpse behind the scenes. As far as I know, none of them (Bruce, Cboat, Matt, or Gopher) has any intention of starting a console proxy war. We're just telling the Xbone community what we heard.
Well, MS already did the pooled UMA design thing with the X360 while adding a fast EDRAM buffer that benefits more than it hurts, AFAICT, just looking at released titles over the years. Just looking at the stuff rumors and leaks have revealed, MS, having a long view with their DX roadmap and experience from their own studios, seems to have designed their hardware to address the problem of feeding the units work data all of the time, as well as trying to wipe out most of the cost of the cache being killed to feed commonly used data. That seems to be a more direct approach than the more general-purpose one Sony took, AFAICT, but I'm no hardware or graphics tech-head.

MS, it seems to me, built their version of the core hardware both they and Sony share to be potentially much greater at maximizing the benefit of the partially resident textures that UE4, idTech 5/6, and CE3+ support. In a general way, Sony added and then maxed out their pipes to the single RAM pool and the CPU/GPU caches, while MS did much the same, except for focusing on a high-speed on-chip cache and adding more dedicated copy/load/store/compress units to ease the burden on the CPU for data movement. MS' solution seems more complicated, but when these newer APU designs' goals are accounted for, memory virtualization should make managing it a relative snap and probably mostly hidden from programmers, unless they opt for more micromanagement of prefetching and of how data is laid out to be consumed in the 32MB cache.

I don't see how ESRAM is a mistake when it's clear that it has major benefits for actually utilizing the hardware and trying to limit the loss going from potential performance to achieved performance. That's the same goal as building out a high-bandwidth connection to GDDR5 RAM and adding more ways to access cache, bypass it, and move data through it.
If Sony's approach was really just flat-out superior, why wouldn't MS have just gone with that considering that it's so similar to what they had in the X360 and that it would be less complex as well as less costly? MS sees the benefit and I'm not convinced that it's just a Rube Goldberg machine method of achieving the same goal because they're using DDR3 for their main work RAM.
Perhaps it's a mistake, but we're not quite near launch yet, with less than six months or so to go. MS could just take a big hit on cost upfront by taking what they get out of current manufacturing and just eat the higher cost/lower yields until the process is smoothed out over the next year, but not necessarily downclocking to make target shipments and time window. It's not uncommon for new consoles to be in less than great shape so close to release, as MS was behind with X360 and was forced to demo in-progress games with less than ideal performance at E3 '05 with their beta kits, IIRC, which don't quite approximate the final hardware's speed and behavior completely. We could be looking at something very similar with X1. In any case, rumors are rumors, and even when they are right, they can miss a lot of important details that can make them seem more major than they really are. I'm not going to underestimate MS' plans and execution when they came out fine twice before.
So how would that be a 3rd console curse when the SNES had already sold less than the NES, and the GameCube sold less than the N64?
Well so much for SenjutsuSage's "insider" haha.
More like a lack of the people who made Xbox what it is today for many people. Most of the guys who created the original Xbox aren't around anymore.
If this rumour is true, how/when do you think we'll find out?
Who is Senjutsu? I have never heard of him. What did he say?
When it is released and someone takes it apart and does a total breakdown. I don't see MS letting this information become public, even though the mass market wouldn't know what it meant anyway.
Basically that he had a source that said there is no issue with MS having to down clock anything and is willing to take a ban bet on it
I'll help out. I can confirm for a fact that it's false. If I'm wrong, I'll take a ban. I can't say how I know this for sure, but I'm pretty damn sure.
Nope, I don't speak nearly as often on here about what I know for certain on the new Xbox. In fact, I try not to be the subject of any leaks, because I'd more or less be betraying someone's trust, but in the case of this most recent thing about a downclock of the GPU, I literally begged permission to say definitively that it's absolutely not true.
The only thing that's true is that they are having a bit of a headache with the ESRAM. THAT is certainly true, but (all speculation after this) I think that's more down to the manufacturing difficulty of the ESRAM and yields, not actually using a finished and working version of the GPU for software related purposes.
LOL. Nice dig
You can't tell clock speed from a teardown.
It's a mistake due to an exceptionally simple reason. An APU is all-in-one. If one of the things inside it is a fab dud, the whole APU is a dud. The more things you cram into an APU, the higher the risk it'll be a dud.

I don't see how ESRAM is a mistake when it's clear that it has major benefits for actually utilizing the hardware and trying to limit the loss going from potential performance to achieved performance.
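The "more blocks, more risk" point can be made precise: if defects strike blocks independently, the whole die yields only when every block yields, so the APU's yield is the product of the per-block yields. The per-block numbers below are hypothetical, just to show how the product drops:

```python
from math import prod

def apu_yield(block_yields):
    """If any one block on the die is a dud, the whole APU is a dud,
    so (assuming independent defects) overall yield is the product."""
    return prod(block_yields)

# Hypothetical per-block yields for, say, CPU, GPU, and a big eSRAM array:
overall = apu_yield([0.95, 0.90, 0.85])
print(f"overall yield: {overall:.1%}")  # three decent blocks -> ~72.7% combined
```

Even when every individual block yields well, the combined figure is noticeably worse, which is the argument against cramming a large extra structure like the 32MB eSRAM onto the same die.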
Because the PS4 was initially supposed to have just 4 GB of GDDR5, a solution considered inferior to 8 GB of DDR3 plus special RAM according to some metrics. Nobody apart from Cerny and a couple of others fully believed they could get 8 GB in time. Hirai accepted the GDDR5 solution and hoped for a jackpot, since Sony badly needed one. They got it.

If Sony's approach was really just flat-out superior, why wouldn't MS have just gone with that considering that it's so similar to what they had in the X360 and that it would be less complex as well as less costly?
MS probably made a survey and both developers and gamers said "What we really want is more transistors!!!!!!!"
DAT FOCUS GROUP
Last night he was claiming he knew the down clock wasn't true. Of course based on his posting history here and B3D, he doesn't "know" as much as "want".
lol i love how he has to beg his source to allow him to come to xbone's rescue.

Nope, I don't speak nearly as often on here about what I know for certain on the new Xbox. In fact, I try not to be the subject of any leaks, because I'd more or less be betraying someone's trust, but in the case of this most recent thing about a downclock of the GPU, I literally begged permission to say definitively that it's absolutely not true.
The only thing that's true is that they are having a bit of a headache with the ESRAM. THAT is certainly true, but (all speculation after this) I think that's more down to the manufacturing difficulty of the ESRAM and yields, not actually using a finished and working version of the GPU for software related purposes.
How can Microsoft be so bad when it comes to hardware? They have so much money; why can't they hire people who are smart at this?
2 DMA units that can swizzle textures seem par for the course for GCN iirc.
So in reality they added one extra DMA unit which can decode JPG and one that can do LZ77.
These aren't great advancements in tech; DMA is decades old.
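For anyone wondering what "swizzling" textures means: it's reordering a row-major texture into a layout where texels that are neighbors in 2D also sit near each other in memory, which is friendlier to caches. The textbook example is Morton (Z-order) indexing, interleaving the bits of the x and y coordinates; I'm not claiming this is the exact pattern GCN's DMA engines use, just a sketch of the idea:

```python
def morton_index(x, y, bits=16):
    """Interleave the bits of (x, y) into a single Z-order index:
    x's bits land in the even positions, y's bits in the odd ones."""
    index = 0
    for i in range(bits):
        index |= ((x >> i) & 1) << (2 * i)
        index |= ((y >> i) & 1) << (2 * i + 1)
    return index

def swizzle(texture, width, height):
    """Reorder a row-major texture (flat list, power-of-two square)
    into Morton order."""
    out = [None] * (width * height)
    for y in range(height):
        for x in range(width):
            out[morton_index(x, y)] = texture[y * width + x]
    return out
```

Doing this rearrangement with a dedicated DMA unit means the CPU and GPU shader cores never have to burn cycles on the copy.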
So has he been banned?
I still can't believe MS have designed such a mess of a console, especially after how well the last two were made.
So now it's looking like it'll be downclocked on top of eSRAM yield issues? This sounds like a disaster.
Excellent explanation of what is meant by "the curse":

People call it a "curse" primarily in jest. It's the "curse" of hubris, feeling like your fans will blindly follow you into the next generation regardless of your decisions, and thinking you can strong-arm your way into another successful generation.
Nintendo overplayed their hand when they continued to treat 3rd parties like shit, didn't recognize that an 800-pound gorilla was entering the industry whether they liked it or not, and that carts were a dead medium for home consoles due to cost and storage capacity. They failed to reach an accord with Sony on a new CD-based system, failed to convince 3rd parties to stick with the later-released, high-cost-media N64, and had more expensive games with worse profit margins due to carts vs. CDs.
Sony overplayed their hand when they thought MSRP was irrelevant, everyone would worship at the temple of Sony, and that they could Trojan horse blu-ray on everyone. Also, that ease of development was irrelevant because everyone tolerated the PS2's eccentricities so they'd tolerate the even more eccentric design of the PS3 because it's a Playstation.
SEGA overplayed their hand by releasing a muddled and confusing hardware design stuck halfway between the 2D and 3D transitions, with horrible marketing and a surprise launch. This came on the back of souring the SEGA fanbase with crazy Genesis add-ons though.
It isn't a curse, it's a market trend where the third hardware cycle seems to be when first parties who have seen some recent success have built a thick enough echo chamber to be oblivious to the market's demands. Nintendo has honestly never popped this bubble, the Gamecube had all the same problems as the N64 and the Wii only saw success with non-gamers and extremely casual gamers. They continue to survive thanks to a VERY dedicated core.
SEGA wasn't a strong enough company overall to dig out of the Saturn hole and were no longer financially capable of battling Sony, resulting in an exit from the console hardware market.
Sony is the first console manufacturer we're seeing make a real attempt to learn from these mistakes and dig their way out of the echo chamber. That so far has involved a change at the top of their consumer electronics division, their CEO/corporate president, the head of their worldwide studios, and the very people entrusted with product design. Sony handing the PS4 to Cerny and not their traditional stable of hardware engineers is the equivalent of Nintendo telling Miyamoto to just focus on games and letting real hardware guys design an efficient, powerful console. It's a massive directional shift and the first we've seen in the industry. How it pans out will likely change how the rest of the players in this industry work moving forward.
You think? We heard nothing but bad news about the Xbone, while hearing mostly good stuff about the PS4, and you just think that you would choose the PS4 now?

I think the PS4 is back in front for me. If I HAD to choose only one console right now based on what we know, it would be the PS4.
Nobody said selling less than your predecessor is 'third console curse' exclusive!

So how would that be a 3rd console curse when the SNES had already sold less than the NES, and the GameCube sold less than the N64?
I don't think going with embedded memory plus DDR3 was that bad a decision. What I really don't get is why they went with 6T SRAM instead of eDRAM. The size difference is humongous. Are there production/process advantages to this I am unaware of?
The very first time the 32MB eSRAM was mentioned I remember thinking "That makes no sense at all, that can't be right??", and to this day I'm still trying to figure out why they've gone that way. They could have used 32MB pseudo-static eDRAM, taking up one third of the transistors and the difference in latency wouldn't have been that significant.
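The transistor-count argument above is easy to check on the back of an envelope: a 6T SRAM cell spends six transistors per bit, while an eDRAM cell gets by with one transistor (plus a capacitor). Note that actual die area doesn't track transistor count exactly, since DRAM capacitors and SRAM cells have very different geometries, so treat this as a rough comparison only:

```python
MIB = 1024 * 1024
bits = 32 * MIB * 8           # a 32 MiB array, in bits

sram_transistors  = bits * 6  # 6T SRAM: six transistors per bit
edram_transistors = bits * 1  # eDRAM: one transistor per bit (plus a capacitor)

print(f"6T SRAM: {sram_transistors / 1e9:.2f}B transistors")   # ~1.61 billion
print(f"eDRAM:   {edram_transistors / 1e9:.2f}B transistors")  # ~0.27 billion
```

That ~1.6 billion transistor figure for the SRAM option alone is a sizeable fraction of the whole APU's budget, which is the heart of the "humongous size difference" complaint.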
Why? Nothing is confirmed yet. We'd need an answer before banning him for being wrong.
On top of that, Beyond3D is taking this news with a much bigger grain of salt; they're waiting for more established insiders (mainly a guy named Matt?) to confirm this rumor before going to the next rumor stage.
It is interesting reading over there, they do not have a high opinion of our GAF "insiders."
Currently no sane gamer should consider the Xbone, except if you get all the consoles anyway or really really love Halo.
I'm being serious here, so I would like you to explain for us plebs considering your constant lack of enthusiasm for MS' approach.
Other than the RROD, I can't think of MS having hardware problems. Or am I missing something?
I can see that, as I myself guessed months ago in another of these threads. Still, it seems like an awfully big miss for Sony to get and MS not to see, considering that they're shipping at nearly the same time with the same visibility into hardware availability. Still, it just seems ridiculously patchwork if it really is just to accommodate DDR3.

Because no one predicted that 8GB of GDDR5 would be available when the consoles came out. Sony all along planned 4GB of GDDR5, and even earlier 2GB. They got lucky with 8GB because it is super fresh tech, like January of this year, and their old 4GB layout was a clamshell design, so they just switched it.
If they wanted their media hub they needed 8GB. The only way back then was to use 8GB of DDR3; there was no 8GB of GDDR5. Even 4GB at that time was still not out.
If they wanted to go with 8GB of DDR3, they needed to use eDRAM or eSRAM to "patch" (not "fix") the problem with DDR3 bandwidth.
The real question now is why eSRAM and not eDRAM. Some people mentioned that eDRAM can only be produced in a few fabs and that it will be harder to shrink in the future.
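The DDR3 bandwidth problem being "patched" here falls out of a simple formula: peak bandwidth is bus width in bytes times the transfer rate. The transfer rates below are the figures commonly reported for the two consoles at the time (DDR3-2133 and GDDR5-5500, both on 256-bit buses), not confirmed specs:

```python
def bandwidth_gbs(bus_bits, megatransfers_per_s):
    """Peak bandwidth in GB/s: (bus width in bytes) x (transfers per second)."""
    return bus_bits / 8 * megatransfers_per_s * 1e6 / 1e9

ddr3  = bandwidth_gbs(256, 2133)  # ~68.3 GB/s
gddr5 = bandwidth_gbs(256, 5500)  # 176.0 GB/s
print(f"DDR3-2133, 256-bit:  {ddr3:.1f} GB/s")
print(f"GDDR5-5500, 256-bit: {gddr5:.1f} GB/s")
```

With the same bus width, DDR3 delivers well under half of GDDR5's bandwidth, which is the gap the on-die eSRAM/eDRAM buffer is meant to paper over for bandwidth-hungry work like the framebuffer.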
Yeah, sounds like a potential killer mistake if it's as bad as interpretations of the rumored situation sound.

It's a mistake due to an exceptionally simple reason. An APU is all-in-one. If one of the things inside it is a fab dud, the whole APU is a dud. The more things you cram into an APU, the higher the risk it'll be a dud.
Uh, the same Matt as this one?
wasn't superDAE from beyond3d? enough said about "sources"