
Wii U is supposedly running a chip based on the RV770, according to Engadget.

Darkman M

Member
Shin Johnpv said:
Doesn't mean anything in a closed environment.

Your PC performance with similar parts does not equal what those parts can do in a closed environment.


I'm talking about 1080p resolution, large textures, max shadows, max everything. Consoles today can barely get games running at 720p right.
 
Speculation is nice and all, but it doesn't lead anywhere. Even if this (old) rumour were true, the only thing we would know is that the Wii U GPU is RV770-based, which is barely better than nothing. We still don't know the actual specifications (clocks, eDRAM, SPUs, TMUs, ROPs, etc.), so any attempt at determining its performance level now is basically pointless.

Even if we had those details, we still wouldn't know how the multiple-screen rendering will affect performance. It's fair to say that those expecting multiple Wii U controller support are going to end up disappointed, and so will those expecting the hardware in the Wii U to be vastly superior to the other current generation consoles. Given the size of the box, the power envelope for the various components, and Nintendo's penchant for day-one profitability, they can't aim that high. Those expecting a box with a Power 7 CPU and a top-of-the-line RV770 GPU are clearly in denial.

The sensible thing to expect is a two-year-old mid-range GPU and a custom Power 7-based CPU, downclocked and with fewer cores. Nintendo only needs a bit of headroom to achieve performance parity with the other consoles while being able to stream to the controller. After all, it's not like third parties are suddenly going to choose the Wii U as their lead platform this far into the generation, which is the one thing that would make the Wii U versions better. When it comes to Nintendo, their games have always been about art style more than technical prowess, so they don't need state-of-the-art hardware either.
 

elsk

Banned
Sure it sounds good compared to 5-6 year old hardware (PS3/Xbox360). But when the new Sony and MS consoles are released, the Wii U is going to be in the same situation the Wii was in all these years.
 

Truth101

Banned
elsk said:
Sure it sounds good compared to 5-6 year old hardware (PS3/Xbox360). But when the new Sony and MS consoles are released, the Wii U is going to be in the same situation the Wii was in all these years.

Not really.

Any game made for the PS4/720 should be easy to scale down to the Wii U, which was not the case for the Wii.
 
Unfortunately I'm gonna have to call bullshit on Engadget's claim, since it flies in the face of what AMD has stated about the GPU. AMD stated that "The custom AMD Radeon™ HD GPU reflects the best characteristics of AMD's graphics technology solutions: high-definition graphics support; rich multimedia acceleration and playback; and multiple display support." and "The AMD custom graphics processor features a modern and rich graphics processing core, allowing the new console to shine with new graphics capabilities." These two phrases point to the GPU being based on at least the Cypress core, since it is the first AMD GPU to natively support Eyefinity, AMD's proprietary multi-display technology.
 
Really confused with this rampage of new info that keeps tipping the scales.

In 2 days we have gone from "Wellll, we talked to some devs and some of them said it's only 50% more powerful than da HD twins", to which I thought "What does that mean? What's being measured? :/".

And now this thread ... IDK whats going on.
 

schuelma

Wastes hours checking old Famitsu software data, but that's why we love him.
AzureNightmare said:
Unfortunately I'm gonna have to call bullshit on Engadget's claim, since it flies in the face of what AMD has stated about the GPU. AMD stated that "The custom AMD Radeon™ HD GPU reflects the best characteristics of AMD's graphics technology solutions: high-definition graphics support; rich multimedia acceleration and playback; and multiple display support." and "The AMD custom graphics processor features a modern and rich graphics processing core, allowing the new console to shine with new graphics capabilities." These two phrases point to the GPU being based on at least the Cypress core, since it is the first AMD GPU to natively support Eyefinity, AMD's proprietary multi-display technology.

So is that good or bad
 
elsk said:
Sure it sounds good compared to 5-6 year old hardware (PS3/Xbox360). But when the new Sony and MS consoles are released, the Wii U is going to be in the same situation the Wii was in all these years.
Let the big boys handle this one, Junior.
 

jonno394

Member
elsk said:
Sure it sounds good compared to 5-6 year old hardware (PS3/Xbox360). But when the new Sony and MS consoles are released, the Wii U is going to be in the same situation the Wii was in all these years.

Erm, i'm more than happy to keep playing PS3 graphic quality games for the next 5 years tbh, especially Nintendo first party titles.

I don't think graphics need to get that much better visually. We're reaching a ceiling, and I don't see the point in consistently striving for reality. I want the next gen machines to just improve processing power and allow for more developed AI and game engines.

Graphics can take a back seat for me.
 

StevieP

Banned
elsk said:
Sure it sounds good compared to 5-6 year old hardware (PS3/Xbox360). But when the new Sony and MS consoles are released, the Wii U is going to be in the same situation the Wii was in all these years.

In which universe? The Wii is 90's tech overclocked. The Wu is modern tech. Even if Sony decides they will not adhere to the laws of physics, and create that Kutaragi Grill that draws 400w and will sell for $800 they're not going to be putting out a system that will destroy the Wu. Nor dedicated gaming PCs. It's just not possible without doing pretty much all of those things.

Improved IQ is a given, leaving a console in the dust is not. The only place you have an argument here is third party support, which is mostly a question mark. But again using Sony as an example, the PS3 is the lowest-selling console this generation but still has ports that it shares with 360/PC for the exact reason I detailed above: because it's possible.
 
elsk said:
Sure it sounds good compared to 5-6 year old hardware (PS3/Xbox360). But when the new Sony and MS consoles are released, the Wii U is going to be in the same situation the Wii was in all these years.
How so?
This statement has never been made before and intrigues me.
 

Zzoram

Member
The 4870 is roughly 3x more powerful than the X1900 for PC games. The Wii U should be at least twice as fast as the 360 if it's really a 4870, even if it's a bit underclocked.
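For perspective on that "3x" figure, a back-of-the-envelope comparison of theoretical shader throughput, using published desktop-part numbers (a sketch only; the Wii U's actual shader count and clocks are unknown):

```python
# Peak single-precision shader throughput from public spec-sheet figures.
# These are desktop-part numbers, not confirmed Wii U specs.

def gflops(alus: int, clock_mhz: float, ops_per_cycle: int = 2) -> float:
    """Theoretical GFLOPS: ALUs x clock x ops per cycle (2 for a fused MAD)."""
    return alus * clock_mhz * ops_per_cycle / 1000.0

xenos = gflops(240, 500)   # Xbox 360 Xenos: 48 shader units x 5 ALUs each
hd4870 = gflops(800, 750)  # Radeon HD 4870 (RV770): 800 stream processors

print(f"Xenos:   {xenos:.0f} GFLOPS")         # 240 GFLOPS
print(f"HD 4870: {hd4870:.0f} GFLOPS")        # 1200 GFLOPS
print(f"Paper ratio: {hd4870 / xenos:.1f}x")  # 5.0x
```

On paper that's 5x, not 3x; real-game results depend on bandwidth, memory and drivers, which is why a gaming-based estimate comes out lower.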
 
AzureNightmare said:
Unfortunately I'm gonna have to call bullshit on Engadget's claim since it flies in the face of what AMD has stated about the GPU. AMD stated that "The custom AMD Radeon™ HD GPU reflects the best characteristics of AMD’s graphics technology solutions: high-definition graphics support; rich multimedia acceleration and playback; and multiple display support." and "The AMD custom graphics processor features a modern and rich graphics processing core, allowing the new console to shine with new graphics capabilities." These two phrases right here point to the GPU at least being based on the Cypress core, since it is the first AMD GPU to natively support eyefinity, AMD's proprietary multi display technology.

I wouldn't read too much into that; it's just PR gibberish. Eyefinity support is pointless for this kind of device, because it's only ever going to use two different displays. For all we know, the rumour could be accurate, but it's still as vague as it gets.


Zzoram said:
The 4870 is roughly 3x more powerful than the X1900 for PC games. The Wii U should be at least twice as fast as the 360 if it's really a 4870, even if it's a bit underclocked.

It's a custom GPU. Still, the 4870 is not a GPU, but a stand-alone graphics card. A big part of a card's performance has to do with the available bandwidth and the amount of memory, specifications which remain unknown for the Wii U. Even if we knew the exact specifications of the GPU Nintendo will be using, we would still have to know the precise details about their memory subsystem and the parts they're using. In any case, I don't buy that they're using anything but a mid-range GPU.
 

Nirolak

Mrgrgr
Zzoram said:
The 4870 is roughly 3x more powerful than the X1900 for PC games. The Wii U should be at least twice as fast as the 360 if it's really a 4870, even if it's a bit underclocked.
Though it is worth noting that the 4870's power is barely used by most PC games, so you could see much more than a 3x increase in a console, or just from a hardware perspective.

But again, this doesn't jibe with the power assessments so far, so there's something up here.
 

Azure J

Member
AzureNightmare said:
Unfortunately I'm gonna have to call bullshit on Engadget's claim, since it flies in the face of what AMD has stated about the GPU. AMD stated that "The custom AMD Radeon™ HD GPU reflects the best characteristics of AMD's graphics technology solutions: high-definition graphics support; rich multimedia acceleration and playback; and multiple display support." and "The AMD custom graphics processor features a modern and rich graphics processing core, allowing the new console to shine with new graphics capabilities." These two phrases point to the GPU being based on at least the Cypress core, since it is the first AMD GPU to natively support Eyefinity, AMD's proprietary multi-display technology.

There are just two things that have kept me from considering this a possibility. First off, literally everyone who has cracked off a rumor about the GPU in the system has mentioned R700/RV770, the former more prominently than the latter. The second is that console GPU designs aren't, and don't have to be, just an off-the-shelf store part slapped into a box. The console makers can look at the chips, architectures and everything in between and make modifications to further suit their own design goals. It isn't without precedent: the Xenos GPU in the 360, while based on the Radeon X1900 and thus a DX9 part, also had additions like a tessellator, which allowed it to bring DX9+ visuals to the table.

I'd totally like to believe that Nintendo pulled a coup and picked up something like, say, the 5770, which is the same thing as a 4870 except with half the bus width and the more modern bullet points (a tessellation unit, Eyefinity support, Shader Model 5 & OpenGL 3.2 [not too sure about these last two, but I think they debuted in this gen of Radeons...]), but when everything out there currently states that Nintendo was enamored with the R700 class, and customizing GPU baselines with features from newer GPU models isn't a new or seemingly hard thing to do, it gets really hard to take that leap of faith and believe otherwise.
 

Instro

Member
Gravijah said:
So, does that make the PC Broly since it requires the might of multiple consoles to take it down?
The PC is more like Majin Buu, it gains strength by adding new parts. Its potential is limitless.
 

Ravidrath

Member
Not sure why people are surprised the chip is a few years old.

Did people really expect some cutting edge thing in a system the size of a Wii?
 

StevieP

Banned
Ravidrath said:
Not sure why people are surprised the chip is a few years old.

Did people really expect some cutting edge thing in a system the size of a Wii?

It's a lot longer than the Wii.
 
AzureJericho said:
There are just two things that have kept me from considering this a possibility. First off, literally everyone who has cracked off a rumor about the GPU in the system has mentioned R700/RV770, the former more prominently than the latter. The second is that console GPU designs aren't, and don't have to be, just an off-the-shelf store part slapped into a box. The console makers can look at the chips, architectures and everything in between and make modifications to further suit their own design goals. It isn't without precedent: the Xenos GPU in the 360, while based on the Radeon X1900 and thus a DX9 part, also had additions like a tessellator, which allowed it to bring DX9+ visuals to the table.

I'd totally like to believe that Nintendo pulled a coup and picked up something like, say, the 5770, which is the same thing as a 4870 except with half the bus width and the more modern bullet points (a tessellation unit, Eyefinity support, Shader Model 5 & OpenGL 3.2 [not too sure about these last two, but I think they debuted in this gen of Radeons...]), but when everything out there currently states that Nintendo was enamored with the R700 class, and customizing GPU baselines with features from newer GPU models isn't a new or seemingly hard thing to do, it gets really hard to take that leap of faith and believe otherwise.
The problem I have with that rumor is that it seems to just be everyone echoing each other.
 

Krowley

Member
AzureNightmare said:
The problem I have with that rumor is that it seems to just be everyone echoing each other.


But you would probably have about the same effect if it were the truth. Everybody would be saying basically the same thing.
 

Gahiggidy

My aunt & uncle run a Mom & Pop store, "The Gamecube Hut", and sold 80k WiiU within minutes of opening.
XPE said:
Well I've been running a 4870 for a couple of years now and it's done everything asked of it.


So a 4890 should be a very capable card.
That's great to hear. You should give it a raise.
 

KrawlMan

Member
Nirolak said:
Though it is worth noting that the 4870's power is barely used by most PC games, so you could see much more than a 3x increase in a console, or just from a hardware perspective.

But again, this doesn't jibe with the power assessments so far, so there's something up here.

I really wish Nintendo would just cut the crap and tell us what's up with this system. :(
 

ElFly

Member
Nirolak said:
Though it is worth noting that the 4870's power is barely used by most PC games, so you could see much more than a 3x increase in a console, or just from a hardware perspective.

But again, this doesn't jibe with the power assessments so far, so there's something up here.

The guy who made the "50% more powerful" leak could just be looking at the MHz the thing runs at.

Like, for example, a Radeon HD 4670 runs at 750 MHz (according to Wikipedia), while the Xenos runs at 500 MHz (Wikipedia again), which is precisely 50% higher.

But, like someone else said, the chip in the Wii U will be a custom one, roughly based on an existing one, so anything goes.
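ElFly's arithmetic checks out as a pure clock-speed comparison, sketched below (Wikipedia clock figures; clock alone says little about real performance, so this is purely illustrative):

```python
# Relative clock-speed increase between two GPUs (Wikipedia figures).
# Clock speed ignores shader counts, bandwidth, etc.; illustrative only.

def percent_increase(new: float, old: float) -> float:
    """Percentage by which `new` exceeds `old`."""
    return (new - old) / old * 100.0

xenos_mhz = 500.0    # Xbox 360 Xenos core clock
hd4670_mhz = 750.0   # Radeon HD 4670 core clock

print(f"{percent_increase(hd4670_mhz, xenos_mhz):.0f}% higher clock")  # 50%
```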
 

Hcoregamer00

The 'H' stands for hentai.
AzureJericho said:
There are just two things that have kept me from considering this a possibility. First off, literally everyone who has cracked off a rumor about the GPU in the system has mentioned R700/RV770, the former more prominently than the latter. The second is that console GPU designs aren't, and don't have to be, just an off-the-shelf store part slapped into a box. The console makers can look at the chips, architectures and everything in between and make modifications to further suit their own design goals. It isn't without precedent: the Xenos GPU in the 360, while based on the Radeon X1900 and thus a DX9 part, also had additions like a tessellator, which allowed it to bring DX9+ visuals to the table.

I'd totally like to believe that Nintendo pulled a coup and picked up something like, say, the 5770, which is the same thing as a 4870 except with half the bus width and the more modern bullet points (a tessellation unit, Eyefinity support, Shader Model 5 & OpenGL 3.2 [not too sure about these last two, but I think they debuted in this gen of Radeons...]), but when everything out there currently states that Nintendo was enamored with the R700 class, and customizing GPU baselines with features from newer GPU models isn't a new or seemingly hard thing to do, it gets really hard to take that leap of faith and believe otherwise.

I have been staying away from this discussion, in part because of the head-banging level of hyperbole in some of the graphics arguments. You have a very good point, one that people often overlook.

Regardless of what Nintendo uses as a base, they will ultimately have it custom made to their specifications. Assuming they are using an RV770 as a base, it makes sense for them to take the chip as a baseline and then make whatever changes they deem necessary for their goals.

The RAM configuration, the kind of RAM, the clock speed of the GPU, etc. may all be different. People tend to forget that before the Wii, Nintendo engineers were very good at optimizing a console's graphics capabilities. Give them an RV770 as a base and I bet we'll see some interesting stuff in the final machine.
 

Azure J

Member
Nirolak said:
Though it is worthy to note that the 4870's power is barely used by most PC games, so you could get a lot more of an increase than 3x in a console or just from a hardware perspective.

But again, this doesn't jive with the power assessments so far, so there's something up here.

I agree we're missing something here. If the people claiming 50% more were going by offhand measures, or were inexperienced folks looking at something running off-screen, I would back this more, but the fact of the matter is these are devs working on the system stating this. It seems as though there are a few possible conclusions here:

- If this is right, then Nintendo nerfed the potential of the GPU somewhat. (For what purpose, though?)

- If the dev kits weren't finalized and what was shown at E3 was more for the sake of showing something running than a demonstration of the absolute final hardware with finalized clocks, RAM amounts and the like (since every other bit of hardware has to be locked down at this point), then we could be in for a surprise next year when the system comes out.

- There is also the possibility that another chip in the R700 family was used and tweaked to Nintendo's design goals. The issue here is whether it belongs to the highest end of the series (4830, 4850, 4870), the performance-per-watt sweetheart (RV740, aka the 4770), or the rest of the family, which ranges from twice a 360 to barely a 360.
 

AColdDay

Member
guek said:
if it's a 4890?



ascended saiyan. it can kick lots of android ass, but still can't match up to perfect cell (the ps4? DUN DUN DUUUUUN!)
Guek with a 5 star post
 

AColdDay

Member
AzureJericho said:
Where have you been, he's been on point with the DBZ references in this thread. :lol
I read through the rest of the thread with awe. We are on the ground floor of something big.

Would the Xbox 360 be Vegeta? He always finds himself surpassed by Goku (Xbox getting surpassed by Gamecube after the price drop, Wii surpassing 360).
 

OMT

Member
Also:

NES - Ten-year-old CPU, PPU uses around four-year-old tech
Genesis - Ten-year-old CPU, recycled seven-year-old VDC
SNES - Seven-year-old CPU, custom PPU based on five-year-old tech
Saturn - CPU uses five-year-old tech, same with VPU
PlayStation - CPU uses seven-year-old tech, 3D relies on CPU, graphics processor about the same
N64 - CPU uses five-year-old tech, same with graphics/sound processor
Dreamcast - CPU development of eight-year-old tech, PVR uses three-year-old tech
PS2 - CPU based on five-year-old tech, Graphics Synthesizer completely new
GCN - CPU based on four-year-old tech, Flipper completely new
Xbox - CPU based on two-year-old tech, XGPU completely new
Xbox 360 - CPU a derivation of completely new Cell processor, GPU completely new
PS3 - CPU "Cell" completely new, GPU uses year-old tech
Wii - CPU based on nine-year-old tech, Hollywood based on five-year-old tech

The PS360 aren't the rule, they're the aberrations. Looks like they attempted to push Nintendo out of the market, and got sidestepped. There's no reason to believe that they'll push for the bleeding edge next generation.
 
OMT said:
Also:

NES - Ten-year-old CPU, PPU uses around four-year-old tech
Genesis - Ten-year-old CPU, recycled seven-year-old VDC
SNES - Seven-year-old CPU, custom PPU based on five-year-old tech
Saturn - CPU uses five-year-old tech, same with VPU
PlayStation - CPU uses seven-year-old tech, 3D relies on CPU, graphics processor about the same
N64 - CPU uses five-year-old tech, same with graphics/sound processor
Dreamcast - CPU development of eight-year-old tech, PVR uses three-year-old tech
PS2 - CPU based on five-year-old tech, Graphics Synthesizer completely new
GCN - CPU based on four-year-old tech, Flipper completely new
Xbox - CPU based on two-year-old tech, XGPU completely new
Xbox 360 - CPU completely new, GPU completely new
PS3 - CPU completely new, GPU completely new
Wii - CPU based on nine-year-old tech, Hollywood based on five-year-old tech

The PS360 aren't the rule, they're the aberrations. Looks like they attempted to push Nintendo out of the market, and got sidestepped. There's no reason to believe that they'll push for the bleeding edge next generation.
Son of a poop. This should be quoted in future threads.

Of course, I think the reason the PS3/360 pushed cutting-edge tech is that if they had gone the Wii route, the visuals would not have made nearly as much of a jump. I know that sounds really obvious, but when that's your selling point it makes sense to push the powerful stuff.
 
OMT said:
Also:

NES - Ten-year-old CPU, PPU uses around four-year-old tech
Genesis - Ten-year-old CPU, recycled seven-year-old VDC
SNES - Seven-year-old CPU, custom PPU based on five-year-old tech
Saturn - CPU uses five-year-old tech, same with VPU
PlayStation - CPU uses seven-year-old tech, 3D relies on CPU, graphics processor about the same
N64 - CPU uses five-year-old tech, same with graphics/sound processor
Dreamcast - CPU development of eight-year-old tech, PVR uses three-year-old tech
PS2 - CPU based on five-year-old tech, Graphics Synthesizer completely new
GCN - CPU based on four-year-old tech, Flipper completely new
Xbox - CPU based on two-year-old tech, XGPU completely new
Xbox 360 - CPU completely new, GPU completely new
PS3 - CPU completely new, GPU completely new
Wii - CPU based on nine-year-old tech, Hollywood based on five-year-old tech

The PS360 aren't the rule, they're the aberrations. Looks like they attempted to push Nintendo out of the market, and got sidestepped. There's no reason to believe that they'll push for the bleeding edge next generation.
That's some useful information you've got there, and I appreciate you doing all the work and posting it instead of me having to go do it myself.
I'd always wondered how this leap compared to previous gen's leaps
 
I think Engadget is off. I think it's based on a 5770. The 5770 has all the right features: low power requirements and the Eyefinity display tech required for the controller.
 
OMT said:
Xbox 360 - CPU completely new, GPU completely new
PS3 - CPU completely new, GPU completely new

What? The 360 CPU is hardly new; it's a PowerPC derivative. It's custom made, but you can't possibly say that it's new technology. And the RSX is pretty much not what you would call new either; it's essentially a modified NV47 GPU. The only new, cutting-edge parts this gen are Cell and Xenos.
 

Instro

Member
Clevinger said:
PS Vita shows at least Sony is still interested in the bleeding edge.
I've been wondering about this for a while, but how new are the CPU and GPU in the Vita? Both have been around for at least a year, haven't they?
 

OMT

Member
slopeslider said:
That's some useful information you've got there, and I appreciate you doing all the work and posting it instead of me having to go do it myself.
I'd always wondered how this leap compared to previous gen's leaps

Thanks guys. Kept me distracted for a couple of hours. As you can see, console development progressed at a fairly steady rate until Microsoft entered the fray.
 