
Is the Wii U GPU 176 GFLOPS or 352 GFLOPS?

Architecture usually plays a larger role than pure numbers. FLOPS are like the new BITS, honestly. Just because something has more doesn't mean it's better (not saying it isn't better for having more, but I think you get what I'm saying).

Definitely. Newer chips are capable of waaaaaaaaay more than overclocked-to-hell old processors lol. Architecture is key; I was just referring to the fact that cellphones are much cheaper than their price tags let on.
 

Lonely1

Unconfirmed Member
That SoC costs like $40-50.




LPDDR4 has the same bandwidth (dual-channel: 2x17 GB/s) as the ~30 GB/s of the Wii U. Also, they only overheat because smartphones are just a few millimetres thick; in a console case there would be no problems.
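
For what it's worth, the 2x17 GB/s figure pencils out if you assume LPDDR4-4266 on two 32-bit channels (that config is an assumption, the post doesn't specify it). Quick sanity check:

```python
# Rough LPDDR4 bandwidth check. Assumed config: LPDDR4-4266, two 32-bit channels.
transfers_per_sec = 4266e6
channel_width_bytes = 32 / 8     # 32-bit channel
channels = 2
bw_gbs = transfers_per_sec * channel_width_bytes * channels / 1e9
print(f"{bw_gbs:.1f} GB/s")      # -> 34.1 GB/s, i.e. roughly 2x17 GB/s
```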

So, in 2016 Nintendo could release a more powerful console with the same power envelope as the Wii U? You don't say! :p
 

Anth0ny

Member
it only flopped once afaik

 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
IIRC, it was leaked on Beyond3D, and there was also the other leak which bgassassin shared.
Aside from it making no sense whatsoever*, I don't recall such a leak. Mind if I ask you for pointers?

* 550MHz on a narrow-for-eDRAM 512-bit bus would be 35GB/s.
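
The footnote's arithmetic checks out: bandwidth is just bus width times clock. A minimal sketch using the numbers blu gives:

```python
# eDRAM bandwidth for the hypothetical 512-bit bus in the footnote above.
bus_width_bits = 512
clock_hz = 550e6                 # Wii U GPU clock
bw_gbs = (bus_width_bits / 8) * clock_hz / 1e9
print(f"{bw_gbs:.1f} GB/s")      # -> 35.2 GB/s, the ~35 GB/s cited
```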
 

kabel

Member
So, in 2016 Nintendo could release a more powerful console with the same power envelope as the Wii U? You don't say! :p

I just want to say that the Wii U is laughably underpowered.
It came out in 2012, when we already had the 28nm process.
Why not a SoC from AMD, like Sony/Microsoft?


Actually I think they will use AMD SoCs for the NX this time.
No IBM CPU+some crappy GPU anymore pls....

So... Xenoblade X is possible on phones... I need proof of that.. ;)

I bet that would be possible with some minor changes + Vulkan/Metal API.
The problem would be that after 30 mins of playing you would have to charge the phone again xD
 

Lonely1

Unconfirmed Member
For reference, the Shield Tablet is only ~7 frames/s slower than the Snapdragon 820 (dev kit, optimal conditions) on the Manhattan 3.1 benchmark, and the former has problems running Trine 2 at lower fidelity than the Wii U.
 
For reference, the Shield Tablet is only ~7 frames/s slower than the Snapdragon 820 (dev kit, optimal conditions) on the Manhattan 3.1 benchmark, and the former has problems running Trine 2 at lower fidelity than the Wii U.
Sometimes people like to compare apples to oranges... Especially here on GAF.
 
Can a computer stream a game to, say, an iPad or similar system
without WiFi or being connected by a cable?

There is much more to the Wii U than power specs, and that's what makes it unique.
 

kabel

Member
For reference, the Shield Tablet is only ~7 frames/s slower than the Snapdragon 820 (dev kit, optimal conditions) on the Manhattan 3.1 benchmark, and the former has problems running Trine 2 at lower fidelity than the Wii U.

I think this has something to do with the fact that, up until now, there was no real low-level API on Android/smartphone SoCs. Just the crappy OpenGL ES. Of course the Wii U has one.

With Vulkan we will see how much (or how little ^^) unused power there is.
 

Piggus

Member
http://i.imgur.com/nsuQhHG.jpg
Wii U GPU 176 GFLOPS or 352 GFLOPS, I really don't care, we need more beautiful/creative games like Splatoon!

These games would be a lot more beautiful if the aliasing and low resolution weren't murdering the IQ and sawing through your eyeballs.
 

LordOfChaos

Member
Welcome to the 238 pages of figuring this out that we did way back when.

http://www.neogaf.com/forum/showthread.php?t=511628

99% sure it's 176; Wikipedia was just edited by a hopeful. The die was just slightly fatter than it should be, which threw some people, but the fabrication plant differences account for it, as well as any other per-shader changes they made.

[Image: wiiudie_blocks.jpg, annotated Wii U GPU die shot with the shader blocks marked out]


8 x 20 shader units = 160 ALUs, at 550 MHz = 176 GFLOPS. The chance of it packing double the shader units through magic is negligible.
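
For anyone wondering where 176 comes from, the usual peak-FLOPS formula for GPUs of this era is ALUs x 2 (a fused multiply-add counts as two FLOPs) x clock:

```python
# Peak GFLOPS = ALUs x 2 FLOPs per cycle (multiply-add) x clock in GHz.
alus = 8 * 20            # 8 blocks x 20 shader units, per the die shot
clock_ghz = 0.550        # Wii U GPU clock
print(alus * 2 * clock_ghz)   # -> 176.0; hitting 352 would need twice the ALUs
```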


The gradual letdown was sadly hilarious. "600 GFLOPS, worst case scenario. They can't even order a part under 300... That, uh, looks... Shit."

That's why I'm not letting myself expect anything for NX. Nintendo can always be Nintendo Special.
 

Rodin

Member
176GFLOPs seems more likely, but it was never actually confirmed.

That SoC costs like $40-50.




LPDDR4 has the same bandwidth (dual-channel: 2x17 GB/s) as the ~30 GB/s of the Wii U. Also, they only overheat because smartphones are just a few millimetres thick; in a console case there would be no problems.

1) We don't know the Wii U eDRAM bandwidth. It's definitely more than 30 GB/s; we simply don't know how much more. There was this dev who said he had access to like 38 GB/s IIRC, but it was an odd statement and I don't think anyone ever treated it as fact.
2) Throttling on mobile devices is a thing.

At this point, does it really matter?

Of course not.
 
Can a computer stream a game to say an ipad or similar system
without wifi or being connected to a cable?

There is much more to the Wii U than power specs, and that's what makes it unique.

You're talking about data transmission, and the Wii U basically has built-in hardware for that purpose, but attempting to somehow knock WiFi transmission seems a bit strange. I can control my PS4 from work with my Vita because it uses WiFi, whereas I can't take the Wii U tablet out of the room, let alone the house :/

Power is a relatively useless metric these days anyway. As mentioned earlier, just because the Wii U is weak on paper doesn't mean it won't have good-looking games. It's about utilization of what's there.
 
You're talking about data transmission, and the Wii U basically has built-in hardware for that purpose, but attempting to somehow knock WiFi transmission seems a bit strange. I can control my PS4 from work with my Vita because it uses WiFi, whereas I can't take the Wii U tablet out of the room, let alone the house :/

Power is a relatively useless metric these days anyway. As mentioned earlier, just because the Wii U is weak on paper doesn't mean it won't have good-looking games. It's about utilization of what's there.

Never tried playing my PS4 on Vita,
but how bad is the input lag? The Wii U is flawless, 0 input lag, I am sure.
 
LPDDR4 has the same bandwidth (dual-channel: 2x17 GB/s) as the ~30 GB/s of the Wii U. Also, they only overheat because smartphones are just a few millimetres thick; in a console case there would be no problems.

What!? 30 GB/s is extremely slow for eDRAM, even back when the PS2 had it, which was like 48 GB/s.

That's too slow to run Mario Kart 8 or even Wind Waker HD.
 
Does it also do data transmission?
My Comcast ISP sucks: 4-5 Mbps.

When you're at home, it'll use the local network for transmission, so it's anything connected to your router, regardless of your ISP, and those speeds are typically fast as hell. Think PS4 -> router -> Vita (input from Vita) -> router -> PS4. Now when you talk about accessing it from outside your house (like from work, or the library), that's when your ISP comes into play, and that will most definitely affect the connection. Think PS4 -> router -> World Wide Web -> Vita (input from Vita) -> World Wide Web -> home router -> PS4.

The Vita can also talk directly to the PS4 without a network at all, but I haven't used that mode. I usually use remote play in the bathroom lol.
 
It doesn't matter, 176 GFLOPS or 352 GFLOPS.
Either way it's pretty low.

Even smartphones are way more powerful. (Snapdragon 820 -> 588 GFLOPS)



That's theoretical though. Thermal throttling won't allow a Snapdragon in a smartphone environment to run at the clock required for 588 GFLOPS.
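
The throttling point is easy to put numbers on: peak FLOPS scale linearly with clock, so whatever fraction of the rated clock the phone can actually sustain, the quoted peak drops by the same fraction. The percentages below are illustrative assumptions, not measured Snapdragon 820 data:

```python
# Illustrative throttling math only; the sustained-clock fractions are made up.
peak_gflops = 588        # the Snapdragon 820 figure quoted above
for sustained in (1.00, 0.85, 0.70):
    print(f"{sustained:.0%} clock -> {peak_gflops * sustained:.0f} GFLOPS")
```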
 

FZZ

Banned
GFLOPS are the new gigahertz, which were the new polygons, which were the new bits.

You do you, gamers
 

LordOfChaos

Member
probably a double vs single precision thing going on

It's 176 single precision. Even half precision would not double the number in the HD 4000 series architecture.

That's too slow to run Mario Kart 8 or even Wind Waker HD.

Just because you feel like it, or do you have something more there? The PS2, IIRC, had an extremely wide eDRAM bus even by today's standards, and we can't see that on the Wii U. The clock speed advantages of today were partially offset by that.

Intel's Crystalwell has about 50GB/s bandwidth serving an 800Gflop GPU, so I don't know what's unbelievable about the Wii U being at 38.

Besides, you have dev comments putting it at ~38GB/s, and Fourth Storm saying 30, so somewhere around there is damn sure to me.
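
To put numbers on both halves of that: the PS2 GS got its bandwidth from an extremely wide bus at a modest clock, and framed as bandwidth per GFLOP, ~38 GB/s on a 176 GFLOPS part is actually generous next to Crystalwell. Quick check, using the commonly cited PS2 GS specs (2560-bit bus at 150 MHz) plus the numbers from this post:

```python
# PS2 GS eDRAM: very wide bus, modest clock.
print(2560 / 8 * 150e6 / 1e9)    # -> 48.0 GB/s, the figure quoted earlier

# Bandwidth per GFLOP, from the numbers in the post above:
print(50 / 800)                  # Crystalwell: ~0.06 GB/s per GFLOP
print(38 / 176)                  # Wii U at 38 GB/s: ~0.22 GB/s per GFLOP
```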
 
Aside from it making no sense whatsoever*, I don't recall such a leak. Mind if I ask you for pointers?

* 550MHz on a narrow-for-eDRAM 512-bit bus would be 35GB/s.

How can Nintendo maintain the framerates of Monolith Soft's Xenoblade X with such a paltry bandwidth? Did Renesas have silicon that slow, clocked at the same frequency as the GPU?
 
Just looked back and the number I saw was 31.7 GB/s. Weird indeed, but it fit the other info I had. I guess the eDRAM possibly has its own clock, discrete from the GPU.
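
Working backwards from 31.7 GB/s supports the separate-clock guess. Assuming the 512-bit bus width from blu's earlier footnote (an assumption, not a confirmed spec), the implied eDRAM clock lands below the 550 MHz GPU clock:

```python
# Implied eDRAM clock for 31.7 GB/s on an assumed 512-bit (64-byte) bus.
bw = 31.7e9
bytes_per_cycle = 512 / 8
print(bw / bytes_per_cycle / 1e6)    # -> ~495 MHz, vs. the 550 MHz GPU clock
```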
 
It's 176 single precision. Even half precision would not double the number in the HD 4000 series architecture.



Just because you feel like it, or do you have something more there? The PS2, IIRC, had an extremely wide eDRAM bus even by today's standards, and we can't see that on the Wii U. The clock speed advantages of today were partially offset by that.

Intel's Crystalwell has about 50GB/s bandwidth serving an 800Gflop GPU, so I don't know what's unbelievable about the Wii U being at 38.

Besides, you have dev comments putting it at ~38GB/s, and Fourth Storm saying 30, so somewhere around there is damn sure to me.

You're comparing a CPU to a GPU now. ~38 GB/s with those GFLOPS is too slow to run games like Fast Racing Neo, even at the resolution it runs at. Hell, I doubt the main menu would run at 30fps. No programmer from Nintendo or Shin'en would be able to pull off the visuals we have seen on the system, because it would have been a huge bottleneck.

I would take blu's word and Shin'en's word.
 

poodaddy

Member
OP, I feel your pain, breh. I learned some time ago to never get technical with GAF on console hardware. It's about as productive as fighting a brick wall. Most GAFfers don't give a damn about educating themselves on gaming hardware; they just wanna play the games and then come on here to talk shit about them. Tech forums might be a better place for this kind of question, but then you gotta put up with the PC master race assholes calling you a peasant or some shit. It's difficult to find people who really want to discuss this kind of stuff :/ I feel you though, OP.

EDIT: Why isn't making fun of a legitimate technical question considered shitposting? Two pages of dumbass, unproductive drive-by posts, and we finally get a good poster on page 2 who posts facts and adds to the discussion. Immature as fuck, man.
 
When you're at home, it'll use the local network for transmission, so it's anything connected to your router, regardless of your ISP, and those speeds are typically fast as hell. Think PS4 -> router -> Vita (input from Vita) -> router -> PS4. Now when you talk about accessing it from outside your house (like from work, or the library), that's when your ISP comes into play, and that will most definitely affect the connection. Think PS4 -> router -> World Wide Web -> Vita (input from Vita) -> World Wide Web -> home router -> PS4.

The Vita can also talk directly to the PS4 without a network at all, but I haven't used that mode. I usually use remote play in the bathroom lol.

WTH! Really... so why aren't people buying more Vitas then?
I'm sold. Which model/version/bundle etc. do you recommend?
 