
Wii U To Use 'Modern', 'Custom' AMD GPU With Multiple Display Support (5000+ series?)

Mr_Brit

Banned
AMD announce Wii U GPU partnership with Nintendo

Today at E3, AMD (NYSE: AMD) announced its support for Nintendo’s newly-announced Wii U™ system, as a new way to enjoy HD console gaming entertainment. The custom AMD Radeon™ HD GPU reflects the best characteristics of AMD’s graphics technology solutions: high-definition graphics support; rich multimedia acceleration and playback; and multiple display support. As an industry leader, AMD has supplied the game console market with graphics expertise and ongoing support for more than 10 years.

“We greatly value our synergistic relationship with the AMD design team. The AMD custom graphics processor delivers the best of AMD’s world-class graphics expertise. AMD will support our vision of innovating play through unique entertainment experiences," said Genyo Takeda, senior managing director, Integrated Research & Development of Nintendo Co. Ltd.

“AMD shares Nintendo’s excitement for the new HD entertainment experience planned for the Wii U console,” said David Wang, corporate vice president of Silicon Engineering, AMD. “We’re proud to provide our leading-edge HD multimedia graphics engine to power the new entertainment features of the console. Nintendo is a highly-valued customer and we look forward to the launch in 2012.”

AMD custom graphics enable the new Nintendo system to provide exciting, immersive game play and interaction for consumers around the world. The AMD custom graphics processor features a modern and rich graphics processing core, allowing the new console to shine with new graphics capabilities.

The multiple display support makes it sound like at least a 5000-series chip, so it's probably DX11 spec compliant.
 

Daschysta

Member
Good news if true. Knowing how the low end of the 5xxx series compares to the low end of the 4xxx would be helpful in estimating the low end here...

Though Nintendo could have just customized a 4xxx to do that, I guess.
 

clav

Member
DirectX 11 compliance wouldn't matter since this is not a Windows system.

Regardless, it's great news that AMD is once again Nintendo's chosen graphics supplier. This will probably ensure backwards compatibility all the way back to the GameCube.
 

antonz

Member
Ninman said:
Doesn't multiple displays refer to the streaming on the controller?
Streaming on the controller is not one of the best characteristics of AMD's graphics technology solutions. Eyefinity, on the other hand, is very much a hyped characteristic of AMD tech solutions.
 

_bla_

Member
claviertekky said:
DirectX 11 compliance wouldn't matter since this is not a Windows system.
It matters, because DX11 also means Shader Model 5 shaders and hardware tessellation are supported.
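For the curious, here's roughly what that looks like on the OpenGL side. A minimal sketch in C, nothing Nintendo-specific; it assumes a GL 4.0 context with entry points loaded via GLEW, and the shader source strings are placeholders you'd supply yourself:

/* Illustrative only: wiring up the two new tessellation stages that
 * GL 4.0 exposes on DX11-class hardware such as the Radeon HD 5xxx.
 * Assumes a current GL 4.0 context; compile-status checks omitted. */
#include <GL/glew.h>

GLuint make_tessellated_program(const char *vs_src, const char *tcs_src,
                                const char *tes_src, const char *fs_src)
{
    struct { GLenum type; const char *src; } stages[] = {
        { GL_VERTEX_SHADER,          vs_src  },
        { GL_TESS_CONTROL_SHADER,    tcs_src },  /* new in GL 4.0 */
        { GL_TESS_EVALUATION_SHADER, tes_src },  /* new in GL 4.0 */
        { GL_FRAGMENT_SHADER,        fs_src  },
    };
    GLuint prog = glCreateProgram();
    for (int i = 0; i < 4; i++) {
        GLuint sh = glCreateShader(stages[i].type);
        glShaderSource(sh, 1, &stages[i].src, NULL);
        glCompileShader(sh);
        glAttachShader(prog, sh);
    }
    glLinkProgram(prog);
    glPatchParameteri(GL_PATCH_VERTICES, 3); /* tessellated draws use patches */
    return prog;
}

You'd then draw with glDrawArrays(GL_PATCHES, ...) and let the hardware handle the subdivision.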
 

klier

Member
We'll have a powerful CPU and a powerful GPU too.

PS3 vs Wii U is like Dreamcast vs Xbox 1 (graphics)??
 
Would be so nice if Nintendo and AMD were working on a custom R8xx GPU with DX11-class features (I know Nintendo uses OpenGL). That would put the Wii U even further ahead of the 360/PS3 in graphics performance and features, but we'll have to wait and see what developers leak. I wish I knew more.
 

clav

Member
_bla_ said:
It matters, because DX11 also means Shader Model 5 shaders and hardware tessellation are supported.
That would matter for OpenGL, correct?

DirectX is a proprietary format used for Windows systems. Why would DirectX work with Nintendo-based hardware unless you're implying the Wii OS runs on MS code, which would be preposterous?
 

Wallach

Member
antonz said:
Streaming on the controller is not one of the best characteristics of AMD's graphics technology solutions. Eyefinity, on the other hand, is very much a hyped characteristic of AMD tech solutions.

No, but it's also a flowery press release, so they probably just took the opportunity to use it. These devices are designed to be connected to televisions in the living room; there's not a lot of reason to bother with multiple display support.
 

guek

Banned
klier said:
We'll have a powerful CPU and a powerful GPU too.

PS3 vs Wii U is like Dreamcast vs Xbox 1 (graphics)??

It really depends on a lot of unknown factors. If the Wii U uses a POWER7-based CPU and a 5xxx GPU, it seems likely that the gap will be closer to PS2 -> Xbox/GC.

That is, unless Sony and MS decide to go all out again, or don't launch until something like 2015.
 
markot said:
Wonder if anyone will use Nvidia next gen >.>


I'm hoping that with the PS4, Sony uses Nvidia's Maxwell GPU architecture, due in 2013. It's two GPU generations beyond what we have now with Fermi. Before Maxwell there is Kepler, due in 2011 (could be delayed, who knows).
 

Log4Girlz

Member
Considering it's using a POWER7, the GPU would have to be pretty powerful to justify it. I would say realistically 3x the overall power of a current HD system. I don't think we'll see that power for some time though; most multi-plat games will look exactly the same as on the older HD twins... but with time we will start seeing the impressive shit come out.
 

Mr_Brit

Banned
claviertekky said:
That would matter for OpenGL, correct?

DirectX is a proprietary format used for Windows systems. Why would DirectX work with Nintendo-based hardware unless you're implying the Wii OS runs on MS code, which would be preposterous.
We're talking about the GPU supporting DX11-spec features such as tessellation, unified shaders, and compute shaders, not about the Wii U using DirectX.
 

clav

Member
Mr_Brit said:
We're talking about the GPU supporting DX11-spec features such as tessellation, unified shaders, and compute shaders, not about the Wii U using DirectX.
So specifically that would be OpenGL 4.0 and above?
 

Cooter

Lacks the power of instantaneous movement
Log4Girlz said:
Considering it's using a POWER7, the GPU would have to be pretty powerful to justify it. I would say realistically 3x the overall power of a current HD system. I don't think we'll see that power for some time though; most multi-plat games will look exactly the same as on the older HD twins... but with time we will start seeing the impressive shit come out.


What does 3x overall power mean in games? Should we expect the 720/PS4 to have 3x the overall power of the Wii U?
 
Cooter said:
What does 3x overall power mean in games? Should we expect the 720/PS4 to have 3x the overall power of the Wii U?


Nobody knows yet; console architectures are constantly in flux, and there's no way to know right now. Too many factors to consider.
 
Log4Girlz said:
Considering it's using a POWER7, the GPU would have to be pretty powerful to justify it. I would say realistically 3x the overall power of a current HD system. I don't think we'll see that power for some time though; most multi-plat games will look exactly the same as on the older HD twins... but with time we will start seeing the impressive shit come out.

Isn't there already a pretty big difference in multiplat games on the PC vs. console? At least in terms of framerate at 1080p. I don't see why devs wouldn't take advantage of a more powerful console.

Hell, if I were running Nintendo, I'd ASK devs to please take advantage of our console, and make the Wii U version the DEFINITIVE version on console.
 
Log4Girlz said:
Considering it's using a POWER7, the GPU would have to be pretty powerful to justify it. I would say realistically 3x the overall power of a current HD system. I don't think we'll see that power for some time though; most multi-plat games will look exactly the same as on the older HD twins... but with time we will start seeing the impressive shit come out.

Couldn't they have the Wii U versions running at 60fps compared to the 30fps that you see on the PS3 and 360? That would be quite a big upgrade even if the games are still built with the PS3/360 in mind at first.
 

guek

Banned
SolidSnakex said:
Couldn't they have the Wii U versions running at 60fps compared to the 30fps that you see on the PS3 and 360? That would be quite a big upgrade even if the games are still built with the PS3/360 in mind at first.

A THQ developer (forgot his name) has already said that the Wii U version of Darksiders 2 will be the definitive one due to hardware power and additional features.
 
guek said:
A THQ developer (forgot his name) has already said that the Wii U version of Darksiders 2 will be the definitive one due to hardware power and additional features.

Source? I'd like to see this.
 

ymmv

Banned
If the Wii U is indeed a lot more powerful than the current PS3/360 GPUs, what could happen is that it becomes the graphical baseline for next-gen games. In that case it won't matter that the next Xbox and PlayStation have better graphics cards, because multiplatform developers will make games for the lowest common denominator. Sony and MS will get games that run smoother but look only slightly better.

In a (mostly) multiplatform world, graphics won't be a decisive buying factor for consumers anymore; other things will be: multimedia capabilities, internet connectivity, gameplay methods, controller, etc.
 

clav

Member
Dreams-Visions said:
Sony will kinda have to if they want to keep BC.
Unless Sony is capable of pulling off some ninja code like Microsoft did with the 360's Xbox 1 emulator, yeah, Sony won't change the GPU brand for the PS4.

PS2 emulation is still a huge question mark even today.
 

bobbytkc

ADD New Gen Gamer
Dreams-Visions said:
Sony will kinda have to if they want to keep BC.


I don't really see why. You can swap GPU brands on a PC without repercussion, and current console GPUs are really close to PC GPUs.
 

Instro

Member
Interesting news there. Could be 5xxx, could be a 4xxx series that's customized to the point that it has many features of newer GPU lines.

Hopefully some future dev leaks get us a better picture.
 
bobbytkc said:
I don't really see why. You can swap GPU brands on a PC without repercussion, and current console GPUs are really close to PC GPUs.
See: Xbox --> Xbox 360 transition and what happened with Backwards Compatibility.

See: PS2 --> PS3 transition and what happened to Backwards Compatibility.

No really, you should look it up. It's pretty interesting.

Perhaps someone will explain why, but I don't feel up to it right now other than to say, "It don't work like PC, bruh. Dedicated code for features exclusive to xy hardware that doesn't exist for ab hardware."
 

iamblades

Member
_bla_ said:
It matters, because DX11 also means Shader Model 5 shaders and hardware tessellation are supported.

Shader Model 5 isn't even really a thing (all it added was the compute shader, which does nothing that couldn't be done already; it just means you can use MS's standard HLSL for it instead of BrookGPU, OpenCL, Close to Metal, CUDA, or whatever alternate language your hardware supports), and it has no impact on non-DirectX code. AMD/ATI parts have also supported hardware tessellation for a LOOONG time, so that's really a non-issue.

DX 11 level hardware really adds nothing for Nintendo's purposes.
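To put that concretely, here's what GPU compute without any DirectX in sight already looks like: a bare-bones OpenCL vector add in plain C. Error checking is stripped and it assumes an OpenCL 1.x runtime with a GPU device; purely illustrative, not anything console-specific.

#include <stdio.h>
#include <CL/cl.h>

/* OpenCL C kernel: one work-item per element, no HLSL anywhere. */
static const char *kSrc =
    "__kernel void vadd(__global const float *a,\n"
    "                   __global const float *b,\n"
    "                   __global float *c) {\n"
    "    size_t i = get_global_id(0);\n"
    "    c[i] = a[i] + b[i];\n"
    "}\n";

int main(void)
{
    enum { N = 1024 };
    float a[N], b[N], c[N];
    for (int i = 0; i < N; i++) { a[i] = (float)i; b[i] = 2.0f * i; }

    cl_platform_id plat;
    cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    /* Upload inputs, allocate output. */
    cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                               sizeof a, a, NULL);
    cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                               sizeof b, b, NULL);
    cl_mem dc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof c, NULL, NULL);

    /* Build the kernel from source at runtime. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSrc, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "vadd", NULL);

    clSetKernelArg(k, 0, sizeof da, &da);
    clSetKernelArg(k, 1, sizeof db, &db);
    clSetKernelArg(k, 2, sizeof dc, &dc);

    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, dc, CL_TRUE, 0, sizeof c, c, 0, NULL, NULL);

    printf("c[10] = %.1f (expect 30.0)\n", c[10]); /* 10 + 2*10 */
    return 0;
}

Same math, same hardware; DX11's compute shader just gives you an HLSL spelling for it.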
 

Fredescu

Member
DoomXploder7 said:
so how much duct tape at minimum? I'm ignorant on all this tech stuff.
This tells us next to nothing. The lowest end 5000 series card is barely capable of playing games.
 

Izayoi

Banned
Well, that's a relief. I was really worried after the underwhelming tech demos.

Show me the money, Nintendo.
 

Fou-Lu

Member
guek said:
A THQ developer (forgot his name) has already said that the Wii U version of Darksiders 2 will be the definitive one due to hardware power and additional features.

It's so weird to hear stuff like this said about a Nintendo console. Yes, the GameCube was no pushover, and neither were any of their older systems, but you still didn't hear shit like this.
 

Raistlin

Post Count: 9999
Doesn't the Wii U only support one LCD controller? If so, that's really only two displays... which non-Eyefinity GPUs support.
 

bobbytkc

ADD New Gen Gamer
Dreams-Visions said:
See: Xbox --> Xbox 360 transition and what happened with Backwards Compatibility.

See: PS2 --> PS3 transition and what happened to Backwards Compatibility.

No really, you should look it up. It's pretty interesting.

Perhaps someone will explain why, but I don't feel up to it right now other than to say, "It don't work like PC, bruh. Dedicated code for features exclusive to xy hardware that doesn't exist for ab hardware."


I understand that. But the impression I got of the PS3 GPU is that it is pretty standard ("off the shelf", as people around these parts say). The PS2's GPU was a custom job. The Xbox does not utilise eDRAM like the 360 does, so the memory architecture has changed. I am just not aware that the PS3 GPU has any custom hardware features that would hamper it if Sony decides to swap GPU vendors.
 

linko9

Member
Izayoi said:
Well, that's a relief. I was really worried after the underwhelming tech demos.

Show me the money, Nintendo.

Really don't understand how the garden and Zelda demos were underwhelming (or sub-par...). They're running at 1080p and look pretty freaking good.
 
Ichor said:
It's so weird to hear stuff like this said about a Nintendo console. Yes, the GameCube was no pushover, and neither were any of their older systems, but you still didn't hear shit like this.

Isn't this on top of streaming to the controller? What if there's a game that just says "Nuh uhh, we ain't streaming. Here's ALL the horsepower we can throw at you."
 

antonz

Member
linko9 said:
Really don't understand how the garden and Zelda demos were underwhelming (or sub-par...). They're running at 1080p and look pretty freaking good.
I don't think a lot of people realize Nintendo was showing its stuff in 1080p.
 

iamblades

Member
Fredescu said:
This tells us next to nothing. The lowest end 5000 series card is barely capable of playing games.

The lowest-end 5000 series desktop part (the 5450) should be quite a bit more powerful than Xenos: more transistors, higher clock speed, more hardware features.

How a card performs in a Windows PC tells you nothing about how a similar GPU will perform in a dedicated console environment.

I doubt they'll go that low-end, though...
 

clav

Member
bobbytkc said:
I understand that. But the impression I got of the PS3 GPU is that it is pretty standard ("off the shelf", as people around these parts say). The PS2's GPU was a custom job. The Xbox does not utilise eDRAM like the 360 does, so the memory architecture has changed. I am just not aware that the PS3 GPU has any custom hardware features that would hamper it if Sony decides to swap GPU vendors.
http://en.wikipedia.org/wiki/CUDA

This architecture only exists on Nvidia graphics cards.

Think about how those proprietary calls would translate to an AMD card if the games were not coded for that. There would need to be a system call translator, which would not be efficient unless it were coded at a really low level.
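To make that concrete, here's a tiny hypothetical sketch in plain C against NVIDIA's driver API. Console games wouldn't literally look like this, but the point stands: none of these calls has an AMD counterpart.

#include <stdio.h>
#include <cuda.h>   /* NVIDIA's driver API; there is no AMD equivalent of this header */

int main(void)
{
    CUdevice dev;
    CUcontext ctx;
    char name[128];

    cuInit(0);                              /* loads the NVIDIA driver stack */
    cuDeviceGet(&dev, 0);                   /* first NVIDIA GPU in the system */
    cuCtxCreate(&ctx, 0, dev);              /* context is bound to that GPU */
    cuDeviceGetName(name, (int)sizeof name, dev);
    printf("Running on: %s\n", name);
    cuCtxDestroy(ctx);
    return 0;
}

Code written against an API like this would have to be intercepted and re-implemented call by call to run on the other vendor's hardware, which is exactly the translation-layer problem.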
 
Ichor said:
It's so weird to hear stuff like this said about a Nintendo console. Yes, the GameCube was no pushover, and neither were any of their older systems, but you still didn't hear shit like this.

Kinda nice to see it, even if it won't last long.

Too bad I had no access to Darksiders 1 other than people calling it a dark Zelda, but I will give it a try now that I know I will no doubt own a U.
 

Gravijah

Member
Net_Wrecker said:
Isn't this on top of streaming to the controller? What if there's a game that just says "Nuh uhh, we ain't streaming. Here's ALL the horsepower we can throw at you."

They'll be able to make games for Wiimote+CC and stuff, so I don't see why not.
 

Izayoi

Banned
linko9 said:
Really don't understand how the garden and Zelda demos were underwhelming (or sub-par...). They're running at 1080p and look pretty freaking good.
antonz said:
I don't think a lot of people realize Nintendo was showing its stuff in 1080p.
I was not aware of this.

I thought that direct-feed Zelda shot was 720p native, though?

I'm confused.
 
guek said:
A THQ developer (forgot his name) has already said that the Wii U version of Darksiders 2 will be the definitive one due to hardware power and additional features.

That's what I'm expecting. I'll be switching over from the PS3 to the Wii U for multiplatform games once it's released.
 