it only flopped once afaik
Do you really want the word "flops" on your cock? That would not be a confidence booster for most women.
Well, I had to remove my first one, Genesis16BitsHighDefinitionGraphics, because it scared away girls I could take home to the folks.
Just looked back and the number I saw was 31.7 GB/s. Weird indeed, but it fit the other info I had. I guess the eDRAM possibly has its own clock, discrete from the GPU.
Welcome to 238 pages of figuring it out we did way back when.
http://www.neogaf.com/forum/showthread.php?t=511628
99% sure it's 176; Wikipedia was just edited by a hopeful. The die was just slightly fatter than it should be, which threw some people, but the fabrication plant differences account for it, as well as any other per-shader changes they made.
8 x 20 shader units = 160 ALUs; 160 x 550 MHz x 2 = 176 GFLOPS. The chance of it packing double the shader units through magic is negligible.
that's why I'm not letting myself expect anything for NX. Nintendo can always be Nintendo Special.
You're comparing a CPU to a GPU now. ~38GB/s with those GFLOPS is too slow to run games like Fast Racing Neo even at the resolution it runs at. Hell, I doubt it would run the main menu at 30fps. No programmer from Nintendo or Shin'en would be able to pull off the visuals we have seen on the system, because it would have been a huge bottleneck for the system.
I would take blu's word and Shin'en's word.
it only flopped once afaik
Amazing
I thought it was basically something along the lines of
550MHz*8ROPs*4Bpp*(read+write) = 34.375GB/s
Stop raising the WUST signal, dood! /returns to DOA cave.
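If anyone wants to sanity-check that arithmetic, here it is in Python. Note the 34.375 figure comes from treating MHz x bytes as decimal MB/s and then dividing by 1024 at the end (mixed units, but it reproduces the number quoted above):

```python
# ROP bandwidth estimate quoted above: clock * ROPs * bytes/pixel * (read + write)
clock_mhz = 550        # GPU clock
rops = 8               # render output units
bytes_per_pixel = 4    # 32-bit color
accesses = 2           # read + write

mb_per_s = clock_mhz * rops * bytes_per_pixel * accesses
print(mb_per_s)         # 35200 (MB/s, decimal)
print(mb_per_s / 1024)  # 34.375 -> the "34.375GB/s" in the post
```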
So what's the point of the eDRAM? That's not much better than the slow MEM2 DDR3.
Weaker than the 360 at 240 GFLOPS? Maybe they're closer in the real world because of the efficiency of the newer architecture, but damn. The 360 was a great design.
Hey, are they gonna give us some detailed specs this time with NX or are we gonna have to do it all over again?
Hey, are they gonna give us some detailed specs this time with NX or are we gonna have to do it all over again?
I'm not the person to ask for the nitty gritty details, but it is over twice as much bandwidth, it saves cost on motherboard complexity, and reduces latency (CPU also has access, remember). There are also 8 separate channels as opposed to 2 for the DDR3, which should make for less wasted cycles.
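A rough sketch of the "over twice as much bandwidth" claim, assuming the commonly cited MEM2 configuration of DDR3-1600 on a 64-bit bus (that configuration is my assumption, not something stated in this thread):

```python
# Hypothetical comparison of eDRAM vs. main-memory DDR3 peak bandwidth.
ddr3_gb_s = 1600e6 * 8 / 1e9      # DDR3-1600, 64-bit bus: 1600 MT/s * 8 B = 12.8 GB/s
edram_gb_s = 550e6 * 8 * 8 / 1e9  # 550 MHz * 8 ROPs * 8 B (color + z) = 35.2 GB/s
print(edram_gb_s / ddr3_gb_s)     # ~2.75x
```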
You're forgetting the z.
550MHz * 8 ROPs * (4Bpp color + 4Bpp z) * (read + write) = 70GB/s
550MHz * 4 ROPs * (4Bpp color + 4Bpp z) * (read + write) = 35GB/s
Now, I too seem to recall Latte having 8 ROPs, but I could be wrong.
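Spelling out both cases with the thread's convention of MHz x bytes = MB/s:

```python
def rop_bandwidth_gb_s(clock_mhz, rops, bpp_color=4, bpp_z=4, accesses=2):
    """Worst-case ROP bandwidth: clock * ROPs * (color + z bytes) * (read + write)."""
    return clock_mhz * rops * (bpp_color + bpp_z) * accesses / 1000  # MB/s -> GB/s

print(rop_bandwidth_gb_s(550, 8))  # 70.4 -> the ~70GB/s case
print(rop_bandwidth_gb_s(550, 4))  # 35.2 -> the ~35GB/s case
```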
Yeah. Architecturally it is better, but that is quite a large deficit to fill. Wii U, beyond having more RAM, is kinda worse than last gen if visual output is anything to go by...
No it isn't. 176 VLIW5 gigaflops (plus whatever enhancements they made to that architecture) are better than 240 R600 gigaflops, period. Wii U also has a DX10.1-equivalent API, which is far closer to DX11 than DX9, and Shader Model 4.1. We had more than one source who said that the Wii U GPU is better than what's in last-gen consoles.
No it isn't. 176 vliw5 gigaflops (plus whatever enhancement they made to that architecture) are better than 240 R600 period. And we had more than one source who said that the Wii U GPU is better than what's in last gen consoles.
Any link to that source? And did they specify "what" is better in it?
I am being pretty serious about this. Rendering-wise (the quality of rendering), no Wii U game is reaching the heights of the Crysis series, GTA V, or the Naughty Dog games. If that is because a lot of their games focus on 60fps rendering, then so be it. But the 30fps Wii U games have been completely visually underwhelming from my perspective. Even the 60fps games have me scratching my head at certain junctures.
Hey, are they gonna give us some detailed specs this time with NX or are we gonna have to do it all over again?
I think you already know the answer to that.
I came here for the 1 flop joke and I was not disappointed
Even funnier than that are the comments thinking it was the best thing ever when it was obvious and done before.
Any link to that source? And did they specify "what" is better in it?
Shinen said it was several generations ahead or something like that, which it technically is, because it's DX10.1 and uses comparatively modern technology.
So what's the cost difference? I'm assuming the DDR3 costs more at higher clocks than the eDRAM.
You have to do a cost analysis.
You're forgetting the z.
550MHz * 8 ROPs * (4Bpp color + 4Bpp z) * (read + write) = 70GB/s
550MHz * 4 ROPs * (4Bpp color + 4Bpp z) * (read + write) = 35GB/s
Now, I too seem to recall Latte having 8 ROPs, but I could be wrong.
Hmmm. Does it have to be read+write? And just to play devil's advocate a little, what about the AMD GPUs which have 8 ROPs and less bandwidth? Looking at Wikipedia, the RV730 XT, for instance, features a bus to GDDR4 @ 32 GB/s.
It's gonna be a bunch of Amiibos that combine like Voltron.
If it's amiibo powered, I'm (somewhat sadly) in pretty good shape! God help me.
Basically that. It lets them get away with less external I/O, which also has power consumption implications. I wouldn't worry too much about latency. The highest bandwidth consumer will be the ROPs, and the GPU will be hiding latency as much as possible while the CPU... does stuff. I'm not particularly convinced the CPU has a significant role here considering the upgrades they already did to Gekko. (cue Blu shooting me down)
That said, eDRAM isn't necessarily cheap, but they must have had a good contract with Renesas at least to deem it a worthwhile design. With the CPU's SOI eDRAM, IBM would probably have been jumping at the chance to make use of their fabs (before they sold them to GF), even for such an elfin chip size.
Clock * shaders * 2 = flops.
550 MHz * 320 * 2 = 352 GFLOPS
Clock * shaders * 2 = flops.
550 MHz * 160 * 2 = 176 GFLOPS
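Both candidate numbers fall out of the same formula (two ops per shader per cycle, i.e. one multiply-add):

```python
def gflops(clock_mhz, shaders):
    # clock * shaders * 2 (one multiply + one add per cycle), scaled to GFLOPS
    return clock_mhz * shaders * 2 / 1000

print(gflops(550, 320))  # 352.0 -> the Wikipedia number
print(gflops(550, 160))  # 176.0 -> the 8 x 20 ALU count
```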
Developers have indicated that the Wii U isn't as powerful in graphics terms as the PlayStation 3 and Xbox 360.
Separate development sources, speaking under condition of anonymity, told GamesIndustry International that when it comes to visuals, the Wii U is "not as capable" as Sony and Microsoft's current generation of home consoles.
"No, it's not up to the same level as the PS3 or the 360," one developer said of Nintendo's first high definition home console. "The graphics are just not as powerful."
Another source stated: "Yeah, that's true. It doesn't produce graphics as well as the PS3 or the 360. There aren't as many shaders, it's not as capable. Sure, some things are better, mostly as a result of it being a more modern design. But overall the Wii U just can't quite keep up."
It's 176 GFLOPS, confirmed, and the games don't look anything beyond last gen. Wii U still hasn't even matched last gen's best-looking games, which falls right in line with its specs.
Nintendo clearly has the most efficient flops.
haha. That pic is a downer but I laughed. Sorry random Splatoon person.
We know the Xbox 360 GPU was 240 GFLOPS and the PS3's GPU was 192 GFLOPS. It's really hard to believe the Wii U GPU is that much higher, at 352 GFLOPS.
176 GFLOPS seems a lot more believable.
While not specific on specs, the GamesIndustry.biz and Eurogamer articles from early 2012 citing developer sources would be right in line with the lower number.
http://www.eurogamer.net/articles/2012-04-03-wii-u-not-as-capable-as-ps3-xbox-360-report
http://www.gamesindustry.biz/articl...ess-powerful-than-ps3-xbox-360-developers-say
Yeah. Architecturally it is better, but that is quite a large deficit to fill. Wii U, beyond having more RAM, is kinda worse than last gen if visual output is anything to go by...
So awesome that GAF did all the research on this in that epic thread.
Looking back, it's still sort of astounding that Nintendo launched with such anemic HW, given that smartphone SoCs with GPUs in the same power range (~S800) were coming out in mid-2013, with more features and (likely) better CPU performance. Wouldn't it have been easier to just contact QCOM for some semi-custom stuff?
So this would be bandwidth available directly to the ROPs, assuming they're tightly linked a la 360, correct? Such as that claiming a 256GB/s connection between eDRAM and ROPs, but a more modest 32GB to everything else on-GPU.
Yes. Keep in mind that if the Z/ROPs are not integrated into the eDRAM block (the way they were on Xenos' daughter die), but sit with the GPU so that zexels have to travel from the z-buffer to the GPU and back, then the efficiency of your lossless z compression scheme across the bus could drop arbitrarily - from a guaranteed rate for Xenos, to no compression at all for pathological cases. That's because, in contrast to a primitive's z's, the z's from the footprint of the primitive in the buffer don't need to exhibit any coherency whatsoever.
Hmmm. Does it have to be read+write? And just to play devil's advocate a little, what about the AMD GPUs which have 8 ROPs and less bandwidth? Looking at Wikipedia, the RV730 XT, for instance, features a bus to GDDR4 @ 32 GB/s.
I've been discussing the worst-case scenario for the console part: read+write of uncached, uncompressible data. Clearly, on average, your ROPs don't work at 100% utilisation unless you get ultra-simplistic shading or just zexels (which, btw, is a valid use-case). As re that 730XT, it's just an unbalanced part; it will be doing much worse with its 6GPix/s as it will also need to feed its TMUs from the same 32GB/s pool.
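To put rough numbers on why the 730XT looks unbalanced (a back-of-the-envelope sketch; the 6 GPix/s comes from 8 ROPs at the part's 750 MHz core clock, and the 4 B color + 4 B z, read+write worst case matches the formula earlier in the thread):

```python
# Worst-case ROP bandwidth demand vs. available bus bandwidth for the RV730 XT.
fill_gpix_s = 0.750 * 8                      # 8 ROPs @ 750 MHz = 6.0 GPix/s
worst_case_gb_s = fill_gpix_s * (4 + 4) * 2  # 4 B color + 4 B z, read + write
print(worst_case_gb_s)                       # 96.0 GB/s demanded vs. 32 GB/s available
```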
According to https://en.wikipedia.org/wiki/List_of_AMD_graphics_processing_units#Comparison_table:_Console_GPUs, Wikipedia says 352.
But then the 5550, with the same "specs", draws a max of 39 W, which is above the Wii U's total TDP of 33 W.
On Gaf I've read the figure of 176 gflops. Is that confirmed?
It's probably more accurate to label Xenos as 216GFLOPs given the vec4+1 nature of the ALUs, but anyways. There will be some other difficulties in comparing to the vec5 of the r7xx generation.
Similarly, there are some quirky... quirks of G7x that would make the theoretical flops laughable.
I'm not sure if the following is the same as what you mean by laughable (I would agree tho).
Sony claimed a massive number for PS3's Nvidia RSX GPU at E3 2005 - 1.8 TFLOPS (that's the "same" number as PS4's GPU today) - which I think we can all agree was from adding up every single function on the GPU, programmable and fixed-function, better known as "NvFlops". Totally laughable. No different from Microsoft at GDC 2000 claiming 140 GFLOPS for the original Xbox.