
Wii U is supposedly running a chip based on the RV770, according to Engadget.

AndyD

aka andydumi
Plinko said:

If it can do 4 SD streams at once total, how will they fit 1 HD and multiple SD streams at once? It just seems the capacity isn't there. Maybe I read it wrong.
 
HomerSimpson-Man said:
At that resolution you're really stretching the limits of your 4850, especially with a huge 1080p frame buffer and only 512MB of RAM, running The Witcher 2 at near-max settings. You're better suited to lower resolutions and easing off the settings some more.

I actually bought a gaming laptop with a GTX 460M that I played it on, and it ran quite well there at 1080p, much better than on my desktop. I was just seeing what I could get out of my desktop for the most part, and I would never actually play with those settings. My thinking was more that if the Wii U is using that card, and I had trouble running a current PC game on it at 1080p, it made me wonder how good the Wii U could really be. It didn't seem THAT promising, but a lot depends on how they modify the card, and on the fact that more can be done with standardized hardware.
 
I took a seat and I feel much better now. It was too much of a geekfest for me at that moment lol.
HomerSimpson-Man said:
I looked at the 480GTX and 580GTX, and the wattage and heat output are scary at over 350 watts. The 480GTX was released in the spring of last year and the 580GTX in fall/winter, and the difference is like 10 watts. It's no wonder Dennis doubts it in a next gen console.

The 480GTX is now a year old, but that thing is still a monster of a card that goes for like $350-400.
Are you sure about that?

I'm pretty sure 580GTX was the perfect card for the PS4 and that it would run under 100 watts once it was optimized....>_>

People call Nintendo fans crazy...
Zombie James said:
Is that just for the card or total system consumption?
Benchmarks that I've seen put the 580GTX between 280 and 320 watts under load, and most computers using that card are paired with a 750 watt+ power supply.

Those things are absolute animals. I want one.
AceBandage said:
So, about a 4850, just like we've been hearing.
That's far more than just a "50% increase."
Ohhh StuBuuuuuurns.
:p
Best wait for confirmation.
 
nib95 said:
Disappointing. Old tech in new hardware. I hope they don't plan to charge much for it...
Yes, because consoles have never done that before. Do you choose to ignore the IBM CPU news too? Or is 2010 too old?
 

hellclerk

Everything is tsundere to me
nib95 said:
Disappointing. Old tech in new hardware. I hope they don't plan to charge much for it...
Read the thread. This has been standard practice for the gaming industry since the beginning. The PS3 and 360 are in fact the anomalies here.

Also, it shouldn't cost more than $350 (at least that's my current limit for a day 1 purchase).
 
HomerSimpson-Man said:
Apparently just the GPU.

150 idle, 360+ under full load. @_@

Of course load varies from game to game, and your card won't constantly get slammed running a game, but still, the ground level and ceiling for the card...wow.

http://www.bit-tech.net/hardware/graphics/2010/11/09/nvidia-geforce-gtx-580-review/8

See, this is why I asked. Some sites don't label their charts correctly for power consumption. Others, like Anandtech, do: http://www.anandtech.com/show/4008/nvidias-geforce-gtx-580/17

Total SYSTEM power consumption = 173W idle, 389W Crysis, 452W Furmark.
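One way to read those total-system numbers: subtracting the idle figure gives a rough upper bound on how much the load adds, though the CPU also ramps up under game load, so the card-only draw sits somewhere below that delta. A quick sketch using the figures quoted from the linked review:

```python
# Back-of-envelope reading of total-SYSTEM power figures quoted above
# (GTX 580 test rig). The deltas only bound the card's contribution,
# since CPU and other components also draw more under load.
idle_system = 173      # W, whole system at idle
crysis_system = 389    # W, whole system running Crysis
furmark_system = 452   # W, whole system running FurMark

print(crysis_system - idle_system)   # 216 W added under game load
print(furmark_system - idle_system)  # 279 W added under FurMark
```

So even a worst-case stress test implies a load delta well under the 360+ watt "card only" figures floating around the thread.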
 
doomed1 said:
Read the thread. This has been standard practice for the gaming industry since the beginning. The PS3 and 360 are in fact the anomalies here.

Also, it shouldn't cost more than $350 (at least that's my current limit for a day 1 purchase).
The Wii was also an anomaly, going in the opposite direction. I'm starting to think the system will be underpowered compared to the current rumors to accommodate the crazy new controller, but I hope Nintendo surprises me.
 

RoboPlato

I'd be in the dick
A 4870 would be a fantastic baseline for the Wii U, especially if the processor is as capable as rumors make it sound. If the amount of RAM is good, which Nintendo seems to know the value of, the machine will be pretty capable.
 

PantherLotus

Professional Schmuck
I have to admit this thread makes me feel better about the WiiU capabilities, but worse about me ever really understanding hardware.

I think the "bleeding edge" meta discussion is fucking retarded, though. Whether or not a piece of hardware is bleeding edge, it's rare indeed for anything in production to be both at the forefront of technology AND accessible to a large audience.
 
Thought it might be an R800; the low end of that line would be similar to mid-to-high R700s, and they ran cooler. The 4870 was a pretty hot card.

Guess the R700 would be much cheaper though.
 

nib95

Banned
GoldenEye 007 said:
Yes, because consoles have never done that before. Do you choose to ignore the IBM CPU news too? Or is 2010 too old?

I don't care about what's happened in the past. I'm comparing it to NOW. With the current hardware climate and current console specs, this is a sorry excuse for hardware from Nintendo. Come the new Playstation and Microsoft consoles, once again Nintendo will be far away in the distance charging silly money for dated hardware and visuals/tech.
 
nib95 said:
I don't care about what's happened in the past. I'm comparing it to NOW. With the current hardware climate and current console specs, this is a sorry excuse for hardware from Nintendo. Come the new Playstation and Microsoft consoles, once again Nintendo will be far away in the distance charging silly money for dated hardware and visuals/tech.


Not even close to the same situation as last gen. Not even close.
 

hellclerk

Everything is tsundere to me
Saint Gregory said:
The Wii was also an anomaly, going in the opposite direction. I'm starting to think the system will be underpowered compared to the current rumors to accommodate the crazy new controller, but I hope Nintendo surprises me.
Not really. The NES and SNES were both running on decade-old hardware. Now, we know for certain that the WiiU is more powerful than the PS3 and 360, so claiming it's "underpowered" is foolish, silly, and creates empty confirmation bias. Until the PS4 and Xbox 720 or whatever are announced, there is neither proof nor reason to believe the WiiU will be underpowered in any way, shape or form. This "problem" is one entirely made up in your head.
Unless you're talking PC, but PCs tend to ALWAYS make consoles seem underpowered.

nib95 said:
I don't care about what's happened in the past. I'm comparing it to NOW. With the current hardware climate and current console specs, this is a sorry excuse for hardware from Nintendo. Come the new Playstation and Microsoft consoles, once again Nintendo will be far away in the distance charging silly money for dated hardware and visuals/tech.
You mean the ones that don't exist yet and only live in your imagination?
 
nib95 said:
I don't care about what's happened in the past. I'm comparing it to NOW. With the current hardware climate and current console specs, this is a sorry excuse for hardware from Nintendo. Come the new Playstation and Microsoft consoles, once again Nintendo will be far away in the distance charging silly money for dated hardware and visuals/tech.

You really have very little hardware knowledge. Ace, we could use that thread you were talking about the other day

Edit: And what? The 360 and PS3 were cutting edge for, what, a couple of months? And then they were outclassed again. I think you're going to be very disappointed with what Sony and Micro bring to the table next gen. While they'll certainly push boundaries, it won't be a PS2-to-PS3 level leap. It's not possible, and you couldn't afford it if it was. Even if Epic can manage to put that Samaritan demo onto a single die, the power draw will probably still be crazy high, not to mention expensive.

I expect next gen visuals to be something on the order of The Witcher 2 on max settings, or a little more than that. Hell, I'm playing Max Payne on max settings right now and I think it looks great lol
 

Plinko

Wildcard berths that can't beat teams without a winning record should have homefield advantage
Willy105 said:
Why would it be compared to the PS3? It's a handheld.

Someone did it earlier in the thread. Why? I have no idea.
 

guek

Banned
nib95 said:
I don't care about what's happened in the past. I'm comparing it to NOW. With the current hardware climate and current console specs, this is a sorry excuse for hardware from Nintendo. Come the new Playstation and Microsoft consoles, once again Nintendo will be far away in the distance charging silly money for dated hardware and visuals/tech.

what are you? 12 or just ignorant? or both!
 
nib95 said:
I don't care about what's happened in the past. I'm comparing it to NOW. With the current hardware climate and current console specs, this is a sorry excuse for hardware from Nintendo. Come the new Playstation and Microsoft consoles, once again Nintendo will be far away in the distance charging silly money for dated hardware and visuals/tech.
Don't you have to be at least 13 years old to post on GAF?
 
DragonKnight said:
You really have very little hardware knowledge. Ace, we could use that thread you were talking about the other day


I told you guys. Mandatory thread about how consoles and game development works.
But noooo.
:p
 
AceBandage said:
I told you guys. Mandatory thread about how consoles and game development works.
But noooo.
:p
What do you mean "nooooo"? Make the damn thread and put the work into convincing the mods it's of value.
 

Plinko

Wildcard berths that can't beat teams without a winning record should have homefield advantage
So, has anybody found out whether this article is actually just a rehash of the information from the previous rumor article?

If so, this means absolutely nothing.

I can't wait for tomorrow's thread based on an article about how the GPU is identical to the Wii and then Thursday's thread based on an article about how the system could do holographic display.
 
Zombie James said:
See, this is why I asked. Some sites don't label their charts correctly for power consumption. Others, like Anandtech, do: http://www.anandtech.com/show/4008/nvidias-geforce-gtx-580/17

Total SYSTEM power consumption = 173W idle, 389W Crysis, 452W Furmark.
That is seriously fucking mindblowing. I think Mr_Brit is fucking insane at this point. I was willing to entertain the possibility before...but that thing is a goddamned monster that should not exist on this planet.

O_O
nib95 said:
I don't care about what's happened in the past. I'm comparing it to NOW. With the current hardware climate and current console specs, this is a sorry excuse for hardware from Nintendo. Come the new Playstation and Microsoft consoles, once again Nintendo will be far away in the distance charging silly money for dated hardware and visuals/tech.
You are in the extreme minority now. Anyone who actually expects the WiiU to pack more than a 4870 is insane, ignorant of PC hardware, or trolling.

Which one are you?
 
shadyspace said:
What do you mean "nooooo"? Make the damn thread and put the work into convincing the mods it's of value.


There are far more qualified people who would have better insight (particularly in game development). I'd be glad to help throw a topic together, but without prior mod consent it would just become a trolling ground and a place for people to point out where I'm wrong.
It needs to be a collaboration between a few different knowledgeable people.
 

nib95

Banned
DragonKnight said:
You really have very little hardware knowledge. Ace, we could use that thread you were talking about the other day

Sorry, but with current hardware prices on the PC front, minimum spec in my personal opinion should have been R900, especially with new, completely revised architecture around the corner. It's not only more feature rich and better performing, but far more efficient as well in terms of heat output and energy consumption.

I guess I'll form a proper opinion once we know the full specs of this custom GPU. Because if adapted enough, it could technically still have room for a good margin of added prowess.
 
nib95 said:
Sorry, but with current hardware prices on the PC front, minimum spec in my personal opinion should have been R900. It's not only more feature rich and better performing, but far more efficient as well in terms of heat output and energy consumption.

I guess I'll form a proper opinion once we know the full specs of this custom GPU. Because if adapted enough, it could technically still have room for a good margin of added prowess.


You really can't compare consoles to PCs, though.
First off, PCs can deal with heat and power a lot better than consoles. An R900 would melt a console.
Second, consoles are closed-function machines. Games are programmed specifically for them and don't have to worry about different specs or overhead like PCs do.
 

wsippel

Banned
nib95 said:
I don't care about what's happened in the past. I'm comparing it to NOW. With the current hardware climate and current console specs, this is a sorry excuse for hardware from Nintendo. Come the new Playstation and Microsoft consoles, once again Nintendo will be far away in the distance charging silly money for dated hardware and visuals/tech.
"Based on R700" is what we hear all the time. All AMD GPUs are based on some older AMD GPUs. That tells us nothing. Whatever ends up in WiiU most certainly isn't any R700 as you know it. Nintendo has a quite talented chip design team in the US. What probably happened was that two or three years ago, Nintendo contracted AMD to design a chip for their next system. Nintendo sent their design guys from NTD over, AMD took whatever was bleeding edge at the time, and they started working on it together for the next few years - the result being whatever powers WiiU. It's probably not even finished right now - doesn't get any more "bleeding edge", right?

Granted, that tells us absolutely nothing about the performance, but whatever those two teams came up with is almost certainly not a 2008 GPU anymore.
 

witness

Member
Well, that's pretty good, better than I expected to hear. Now give us that RAM and full processor info and we'll be set. Exciting graphical potential.
 

Raistlin

Post Count: 9999
Plinko said:
That's what I took from this article. This absolutely confirms the hardware is capable of pumping out data to multiple controllers.

This is huge positive news.
Actually, I wouldn't take that as confirmation of anything. While I love Engadget, sometimes they jump to conclusions without knowing the full picture, or drop news in a way that invites erroneous implications because they haven't given the full details.


Assuming the GPU can output 4 SD streams, we still don't know:

1) What implications this has for the main screen. Would it still have the power to do much at that point? Depending on the game (Catan, for example), you wouldn't need it ... but it's quite possible this sort of scenario would seriously limit the sorts of content.

2) Even if the GPU can do this, what sort of RAM or other bottlenecks may exist? Again, limiting what may be possible in the scenario.

3) Most importantly, this says absolutely nothing about the actual bandwidth of the wireless connection between the controllers and the console. Four WVGA streams would account for raw bandwidth on the order of a full 1080p feed being streamed wirelessly. That's as much data as commercially available wireless HDMI solutions carry, and those are quite expensive and only now becoming reliable. Assuming Nintendo could do that cheaply seems like a stretch.
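For a rough sense of those numbers, here's a back-of-envelope raw-bandwidth comparison. The 854x480 (WVGA), 60 fps, and 24-bit color figures are assumptions for illustration only, not confirmed specs, and any real wireless link would use compression:

```python
# Back-of-envelope raw (uncompressed) video bandwidth.
# 854x480 WVGA, 60 fps, 24-bit color are illustrative assumptions;
# a real wireless link would compress these streams heavily.
def raw_bandwidth_gbps(width, height, fps=60, bits_per_pixel=24):
    """Raw pixel data rate in gigabits per second."""
    return width * height * fps * bits_per_pixel / 1e9

four_wvga = 4 * raw_bandwidth_gbps(854, 480)   # four controller streams
one_1080p = raw_bandwidth_gbps(1920, 1080)     # single 1080p60 feed

print(f"4x WVGA@60: {four_wvga:.2f} Gbps")   # ~2.36 Gbps
print(f"1080p60:    {one_1080p:.2f} Gbps")   # ~2.99 Gbps
```

Either way, both land in the multi-gigabit range that cheap wireless links struggle with, which is the point of concern above.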
 

Vinci

Danish
AceBandage said:
There are far more qualified people who would have better insight (particularly in game development). I'd be glad to help throw a topic together, but without prior mod consent it would just become a trolling ground and a place for people to point out where I'm wrong.
It needs to be a collaboration between a few different knowledgeable people.

I'd love to see the thread, mostly because I'm not as familiar with hardware as I should be. Great learning experience. Also, it might lure brain_stew out of hiding.
 
I mean, let's think about the scenario here: top of the line gaming rigs cost several hundred dollars, shrinking and customizing always adds even more cost, and console RAM is expensive. You think Sony and Micro are going to sell at losses upwards of 200-300 dollars? Both are going to maximize performance and minimize loss. We can assume both will sell at a loss, but certainly not one in the $100+ arena anymore. Hell, with the NGP not turning profitable until 2015, I don't expect the PS4 to be the PS3 of next gen. Unless the entire PlayStation division wants to be in the red until 2018 lol.
 

ASIS

Member
I have but one question: how will this compare to the next generation PS4 and X720 (assuming they go all out on power like they did this gen)? Will it be a generation behind like the Wii was? Will it be the "dreamcast" of next gen? Or more like the PS2 in comparison to the GC and Xbox?
 

Vinci

Danish
DragonKnight said:
I mean, let's think about the scenario here: top of the line gaming rigs cost several hundred dollars, shrinking and customizing always adds even more cost, and console RAM is expensive. You think Sony and Micro are going to sell at losses upwards of 200-300 dollars? Both are going to maximize performance and minimize loss. We can assume both will sell at a loss, but certainly not one in the $100+ arena anymore. Hell, with the NGP not turning profitable until 2015, I don't expect the PS4 to be the PS3 of next gen. Unless the entire PlayStation division wants to be in the red until 2018 lol.

Sony doesn't mind. That company loves us.
 

nib95

Banned
AceBandage said:
You really can't compare consoles to PCs though.
First off, PCs can deal with heat and power a lot better than consoles. An R900 would melt a console.
Second, consoles are closed function machines. Games are programed specifically for them and don't have to worry about different specs or overhead like PCs do.

Firstly, it was my understanding that R900 was more efficient than RV770: a smaller process node and better efficiency in both heat and power.

R900 would not "melt" the console at all. There's a lot of variance in what the architecture can provide, so depending on clock speed, specs and so forth, it needn't run very hot at all. The higher end of the 69xx, perhaps, but it's the same or worse with the higher end of the 48xx family.

And I appreciate the second part of your post, but even then, it's about future-proofing your console, and me as a consumer getting better value for money. This console's supposed to last what, 4-5 years? Going with an architecture that dates to 2008 seems a bit regressive imo. I would have expected hardware that pushed the industry a bit further forward technically or graphically.
 
AceBandage said:
You really can't compare consoles to PCs though.
First off, PCs can deal with heat and power a lot better than consoles. An R900 would melt a console.
Second, consoles are closed function machines. Games are programed specifically for them and don't have to worry about different specs or overhead like PCs do.

Edit: Yes. People reading "OMG old GPU" don't realize two things:

--the WiiU is a closed environment
--Nintendo will certainly optimize the GPU
 
Nintendo-4Life said:
I have but one question: how will this compare to the next generation PS4 and X720 (assuming they go all out on power like they did this gen)? Will it be a generation behind like the Wii was? Will it be the "dreamcast" of next gen? Or more like the PS2 in comparison to the GC and Xbox?


That depends on two things:

1. What feature set the Wii U's GPU has. If it adds things like full tessellation (the R770s can do it, but not well), then it'll be like the PS2 of the generation.
2. Just how crazy Sony and MS are willing to be. We've heard rumblings that MS doesn't want to take much of a loss, if any, on their next console, but we'll see.
 

[Nintex]

Member
Nintendo-4Life said:
I have but one question: how will this compare to the next generation PS4 and X720 (assuming they go all out on power like they did this gen)? Will it be a generation behind like the Wii was? Will it be the "dreamcast" of next gen? Or more like the PS2 in comparison to the GC and Xbox?
I'm in the PS2 comparison camp, but it depends on Sony and Microsoft. It all comes down to their timing and investment. If they take a hit on hardware again they could easily outgun Nintendo; the question is whether that's the wisest thing to do.
 

antonz

Member
Pretty much no matter what, the Wii U won't get left behind, because the consoles will all be using the same engines: Unreal 3/3.5, CryEngine 3, etc. It'll just be a matter of scaling, the same way CryEngine 3 looks better on PC than it does on consoles.
 

wsippel

Banned
nib95 said:
Firstly, it was to my understanding that R900 was more efficient than RV770. Smaller nm processes and better efficiency in both heat and power.
Well, it's not. The process node has nothing to do with the GPU family, and the main difference, efficiency-wise, is between VLIW5 (R700) and VLIW4 (recent Radeons) from what I can tell. Raw performance per watt is pretty much identical. Depending on the workload, VLIW4 can be more or less efficient. Neither is a silver bullet; neither is inherently more efficient.
 