ADANIEL1960
Neo Member
Can anybody explain, technically, why an overclocked 3-core Broadway derivative is a bad thing for the Wii U CPU?
Useless outside of emulation.
Wonder what changes Nintendo made to dodge licenses this time.
No, it doesn't get the RAM wrong.
1 gig is currently all games can use. Might be 2 gigs of RAM in the thing, but that doesn't mean much if only 1 gig is usable for games.
And honestly, mentioning GPGPU stuff is kind of misleading. That functionality has existed since 2004; the 360 has the capability. On Wii U it's likely expanded, but just having GPGPU functionality isn't much more of a tell on the power of the system than using a unified shader model GPU.
We need to know what the general bandwidth and fillrate are, how many stream processors, triangle counts, etc.
CPU: “Espresso” CPU on the Wii U has three enhanced Broadway cores
Whatever this is supposed to mean regarding the CPU specs, it has not been confirmed.
GPU: “GPU7” AMD Radeon™-based High Definition GPU. Unique API = GX2, which supports Shader Model 4.0 (DirectX 10.1 and OpenGL 3.3 equivalent functionality)
No new information on this line has been confirmed. GPGPU was omitted.
Memory: Mem1 = 32MB Mem2 = 1GB (that applications can use)
I don't believe the eDRAM amount is confirmed yet. Was "applications" referring to games or OS software? Either way, a whole extra gig of RAM was omitted, and this cannot be overlooked.
Storage: Internal 8 GB with support for SD Cards (SD Cards up to 2GB/ SDHC Cards up to 32GB) and External USB Connected Hard Drives
8GB internal flash storage, USB-connected HDD and SD card support were already known. The 32GB internal storage model was omitted.
Networking: 802.11 b/g/n Wifi
Already known.
Video Output: Supports 1080p, 1080i, 720p, 480p and 480i
Already known.
Video Cables Supported:
Compatible cables include HDMI, Wii D-Terminal, Wii Component Video, Wii RGB, Wii S-Video Stereo AV and Wii AV.
Already known.
USB: Four USB 2.0 Ports
Already known.
It might not support modern parallelism techniques that the PS3 and Xbox 360 support. That is probably the biggest thing.
Edit: When next gen rolls around, it will be even further behind in that department.
Old architecture.
Being multicore would kind of suggest support for parallelism techniques.
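For what it's worth, coarse-grained parallelism at the level being discussed just means fanning frame work out across cores. A minimal, purely illustrative sketch (nothing here is Wii U SDK code; the three-way split only mirrors the rumored three cores, and the workload is a made-up stand-in):

```python
# Minimal sketch of data parallelism on a multicore CPU: split a frame's worth
# of work across three worker processes, one per (hypothetical) core.
from multiprocessing import Pool

def simulate_chunk(entities):
    # Stand-in for per-frame game work (AI, physics, etc.)
    return sum(e * e for e in entities)

if __name__ == "__main__":
    workload = list(range(30_000))
    chunks = [workload[i::3] for i in range(3)]  # split across 3 cores
    with Pool(processes=3) as pool:
        results = pool.map(simulate_chunk, chunks)
    print(sum(results))
```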
Weren't Intel Core 2 and the Core "i" series derivatives of the old P3 architecture?
It's weird. The comparisons show the 6570 performing on par with the 4850 (sometimes noticeably better, sometimes noticeably worse), whereas the spec differences suggest the 4850 should just outperform it very noticeably.
http://www.hwcompare.com/10468/radeon-hd-4850-1gb-vs-radeon-hd-6570-oem-1gb/
Oh well. I've always trusted direct FPS comparisons over what the specs may suggest, so yeah...
Thanks for the clarification, again.
Honestly, if they did indeed know the amount of RAM the final system actually had, then not mentioning that the total system RAM was 2GB was and is disingenuous. That is like leaking the 360 and PS3 memory and only mentioning the amount of RAM that was specifically usable by games at the time of launch. No one does that, because that number can change over time. Why they chose to do that in this "leak" gave me more than a short pause.
Has no one else noticed that these sorts of Wii U "leaks" always seem to happen right before Nintendo has a big conference revealing more info about the hardware? I don't believe that is a coincidence, and not in the sense that they were able to receive, early, the information Nintendo was about to reveal. I do not believe that anything in the OP that is supposed to be new information about the Wii U hardware has been confirmed by Nintendo or its partners. The fact that there were glaring omissions that were confirmed by Iwata two days later gives it even less credibility.
Whoever the source is for the leak in the OP: either they don't know much about the final Wii U hardware, or they do and were being purposely disingenuous in describing it. In either case, if I were vgleaks I wouldn't use that person as a trusted source for accurate information in the future.
Is it just me, or have Wii U threads calmed down a lot since the conference? The power of Bayonetta 2? Anyway, even though the info in this thread is technically outdated, I love reading all this stuff, so keep it going.
I'm not arguing about that...
Atom processors are P3.
Because the conference did more to shut up certain detractors than give them ammo to keep flaming and trolling this Nintendo platform.
It didn't really provide anything new about the hardware that was particularly concrete, beyond the use of the term "GPGPU."
I'm making sure this time to say this in a clear way that doesn't cause confusion due to a previous exchange. Devs were given the features, but not the "specs". They weren't given things like the clock speeds or, for the GPU, the number of ALUs.
That doesn't seem particularly conducive to development.
bgassassin, if the GPGU's performance in the Wii U translates to HD 4850 performance on PC, then the overall experience should not be far from what a majority of gaffers have when they currently play on their PCs (the HD 4850 is 20% inferior to a GTX 560 Ti, which has been very popular lately).
If this seems correct to you, then 50% of all my issues with the Wii U would be solved.
Do Nintendo just expect devs to try what works instead of giving useful performance information? This can't be true.
Look at the blue Toad's forehead... the lighting is funky. I've seen this funkiness in other screens. NL, to me, is super unimpressive. Not ugly per se, but nothing to write home about.
Apparently yes. So far it sounds like MS is doing the same.
...No...obviously the Wii U documentation has all the technical information developers need.
Though it is true that Nintendo did keep most of the info very close to its chest before actual final dev kits were sent out.
I understand that. But for a rumor topic labeled "Wii U final specs", this implies that the information comes from an insider who either has access to final Wii U dev kits or was told what the final consumer hardware would be. Unless you are the source of this leak, I don't see why my claiming the source may have been purposely disingenuous, if he did have access to final hardware specs, would matter.
It's labelled rumor for a reason. Nintendo are never going to confirm any of this.
Yet people are discussing this rumor as if it were confirmed. The fact that it is not confirmed has nothing to do with whether or not Nintendo will confirm full specs in the future. Eventually, just like with the Wii, someone will crack it open and give us far more accurate general specs.
And the source of this rumor has been in this thread and another "insider" member has corroborated it.
It's all real time, they can't bake anything given the editor.
They do not have proper GI. Details here:
http://www.eurogamer.net/articles/digitalfoundry-lbp2-tech-interview
Who is the source of this rumor and who do they work for? I must have missed it.
I'm not saying this disrespectfully, but I honestly do not care how many "insiders" cosign this rumor. For me, that does not validate or invalidate this rumor, because I know how GAF and, more broadly, message board communities work.
In my eyes, the only people qualified to confirm Wii U specs pre-release are Nintendo, one of its partners working directly on the Wii U, or a confirmed Wii U developer who comes out and says "These are the Wii U specs" rather than some vague generalization. Everything else I take with a grain of salt. That doesn't mean I don't have my own speculation about what the Wii U specs most likely are, but I know the difference between rumor, speculation and confirmation.
!!!BACK-OF-THE-ENVELOPE ALERT!!!
Assuming R7xx architecture (because of the "DX 10.1" reference; originally 55nm) shrunk to 40nm with ~30% power savings, 25W puts us at roughly a 35W 55nm-chip equivalent, which is about the halfway point between the Radeon HD 4550 and Radeon HD 4650. IOW 240 stream processors, 24 texture units, 600MHz.
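Spelling out that arithmetic (a sketch only: the 25W budget and the 30% shrink saving are the speculative inputs above, and the TDPs are just the commonly cited figures for the 55nm parts):

```python
# Back-of-the-envelope: scale an assumed 25 W GPU budget at 40 nm back to its
# 55 nm equivalent, then see where that lands between known 55 nm R7xx parts.
# Every input here is speculative, not a confirmed Wii U figure.
assumed_gpu_watts_40nm = 25.0
shrink_power_saving = 0.30  # assumed power saving from the 55 nm -> 40 nm shrink

equivalent_55nm_watts = assumed_gpu_watts_40nm / (1 - shrink_power_saving)
print(f"55 nm equivalent: ~{equivalent_55nm_watts:.0f} W")  # ~36 W, i.e. the post's ~35 W

hd4550_tdp, hd4650_tdp = 25.0, 48.0  # commonly cited TDPs of the 55 nm parts
print(f"HD 4550 / HD 4650 midpoint: {(hd4550_tdp + hd4650_tdp) / 2:.1f} W")  # 36.5 W
```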
This means nothing. It's a computation based on a speculative consumption figure and preexisting products. Don't blame me if it's not pleasant to you somehow.
Not me. And I may be confused here, and don't take this the wrong way as I'm trying to understand, but if it doesn't matter, then why put forth the effort to say they may have been purposely disingenuous?
I am saying it shouldn't matter to you that I call that person disingenuous, for reasons stated in my first reply. I assumed that you were taking issue with me making that claim.
Two already have though.
Are you trying to say GPU or GPGPU? Also, I'm not sure where you got the 20% from, but I don't believe those two are anywhere near that close in performance.
Pardon me, I am playing out of my league and may use words alien to me to express my thoughts. I was referring to graphics computing only. I don't know if you can still split CPU and GPU functions in a GPGU; that's why I used the word.
If you do not mind, who are the two, what company do they work for, and what did they say?
Presumably that's in reference to lherre and Arkam, although I think the latter is an engineer rather than a developer?
How much power do you save by stripping out things like driving multiple displays etc?
For a example this?
I haven't seen one PS3/360 game that looks just as good lighting wise, and I think the textures and detail are great.
Well, you have to display information on two (or three) different screens with the Wii U. Having a GPU (or GPGU in our case) with Eyefinity is definitely a must.
Excuse me? What's so impressive about this pic? Looks awful.
You can tell whether or not the shadows are real time by moving the light source or tinkering with the radiosity.
Edit: I wonder if the Wii U can do any form of ambient occlusion. That would go a long way in building a great lighting and shadow system.
Just trying to simply correct a mistake. The term is GPGPU (general-purpose graphics processing unit), not GPGU.
Erf. Thanks for correcting me.
The GTX 560 Ti is much more than just 20% faster than a Radeon 4850. Even a GTX 460 is more than just 20% faster than a 4850. Do expect worse image quality on Wii U than on PCs, because the power will go to extra graphical eye candy rather than anti-aliasing, resolution, anisotropic filtering, and even higher-res textures, which PCs have the extra power to spend on. Not to say games will look bad; they'll look better than Uncharted, for example, or even The Last of Us (assuming the game is developed ground-up on Wii U as a graphical showcase, etc.). You may even get games that look better (image quality aside) than current PC games. This will be even more so with PS4/720.
My bad! Wikipedia listed 1263.4 GFLOPS for the GTX 560 Ti and AMD's website gave 1000 GFLOPS for the HD 4850. But I guess comparing FLOPS output by two different architectures, made by two different manufacturers, really is a dumb thing to do, as mentioned earlier today by a gaffer.
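Those paper numbers are easy to reproduce, which is exactly why they mislead (a quick sketch; the shader counts and clocks below are the commonly cited specs for these desktop cards, nothing Wii U-related):

```python
# Theoretical single-precision peak: shader units x clock x 2 FLOPs (multiply-add).
# Counts and clocks are the commonly cited specs for each card.
def gflops(shader_units: int, clock_mhz: float) -> float:
    return shader_units * clock_mhz * 2 / 1000.0

hd4850 = gflops(800, 625)     # 800 stream processors @ 625 MHz -> 1000 GFLOPS
gtx560ti = gflops(384, 1645)  # 384 CUDA cores @ 1645 MHz shader clock -> ~1263 GFLOPS

print(f"HD 4850:    {hd4850:.0f} GFLOPS")
print(f"GTX 560 Ti: {gtx560ti:.0f} GFLOPS ({gtx560ti / hd4850 - 1:.0%} more on paper)")
```

On paper that's only about a 26% gap, yet in real games the 560 Ti pulls far further ahead, because VLIW5 FLOPS and scalar FLOPS aren't equivalent; hence the "dumb thing to do".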
Read the last couple of pages.
The Wii U CPU is not outdated; it's custom-built, has been in design/production since 2009, and is a descendant of the Broadway CPU (Wii U CPU based on PowerPC 476?), but not just "a Broadway with 3 cores overclocked."
The Wii U GPU has features that are beyond DX10.1 and on par with DX11 effects (compute shaders). The Wii U GPU will show 3-4x the performance of the Xbox 360 GPU. That's really not that much when you think about it, since the GPU in the 360 is 7 years old now.
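For scale, here's what that 3-4x claim implies on paper, assuming Xenos's commonly cited ~240 GFLOPS theoretical peak (both the baseline and the multiplier are speculation, not confirmed specs):

```python
# Rough arithmetic on the "3-4x Xenos" claim. The ~240 GFLOPS Xenos peak is the
# commonly cited theoretical figure; nothing here is a confirmed Wii U spec.
xenos_gflops = 240.0
for multiplier in (3, 4):
    print(f"{multiplier}x Xenos -> ~{xenos_gflops * multiplier:.0f} GFLOPS")
# ~720-960 GFLOPS on paper, i.e. roughly HD 4700/4800-class desktop territory.
```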
I did, and I honestly don't see why the lighting is supposed to be so impressive. It looks about as good as the lighting in the average 360/PS3 game.
Edit: Also, what's this new info on the GPU that's been released?
Looks exactly like Banjo for the 360 if you ask me.
I think, as usual, we're confused by Nintendo's ambiguity. They haven't said "75W maximum PSU draw at wall", nor have they defined what "normal" use is. We have very little context to make any approximations.
Personally, my guess would be that the 45/75 numbers are console current. It doesn't make sense to quote a PSU figure for one and a console figure for the other. Either they are both PSU figures, and efficiency means the actual console is maybe 25-50W, or they're two example figures, e.g. 45W based on web browsing, Miiverse etc., and 75W when playing a game that is pushing the CPU and GPU and streaming to the GamePad.
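If those are wall (PSU) figures, the arithmetic behind that 25-50W guess looks roughly like this (a sketch assuming a typical 65-85% efficient external brick; the efficiency range is my assumption, not a Nintendo figure):

```python
# Translate wall-socket draw into console draw for an assumed 65-85% efficient PSU.
for wall_watts in (45.0, 75.0):
    low, high = wall_watts * 0.65, wall_watts * 0.85
    print(f"{wall_watts:.0f} W at the wall -> ~{low:.0f}-{high:.0f} W at the console")
# 45 W -> ~29-38 W; 75 W -> ~49-64 W, consistent with the 25-50 W ballpark above.
```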
Is that a good thing?
We don't know any of the above to be a fact. It's basically all pure speculation (and unlikely speculation, IMO, based on all the evidence we have so far) on your part.
That it's capable of more isn't in question (Arkam has already alluded to more).
Not that it's necessarily comparable, but what are the power draws for the 360 or PS3 when playing demanding games?
The 360 is like 90 watts, so around 2x the Wii U.