
Zoramon089 Rumor: WiiU GPU is a modified Radeon E6760 [AMD: We haven't said anything]

Ryoku

Member
I think CoD: Black Ops 2 is a good example of the graphical improvement, isn't it? The X360 and PS3 versions of the first CoD:BO ran at 600p and 540p respectively, while CoD:BO2 will run at 1080p on Wii U. Taking into consideration that the first generation of games is never well optimized, I'd say it's a good demonstration of the Wii U's power.

I don't think it has really been 100% confirmed at this point that it runs at 1080p, despite what PR has said thus far ("full HD"). Regardless, I don't think CoD is the best benchmark for a system's capabilities.
 

Reallink

Member
Is there any particular reason a card like this wouldn't be able to run PS360 third-party fare at 1080p, or even scaled 1600x900? I'm still trying to figure out how even the most inept, under-budgeted developer could be struggling with 720p if these specs have any truth behind them.
 
In the right hands this little machine will kick out some great results. I am in it for the Nintendo exclusives... especially from the Galaxy team and Retro Studios. The rest of the third-party support is just a cherry on top of my ice cream. ;0)

When PS4/720 come out I will see how open they are as a platform, and if they are too much of a walled garden for my taste then I am going Wii U/PC next gen.
 
2.5x the GFLOPS of Xenos isn't a generational leap? I like how arbitrary terms are being used as if they had absolute definitions...

It is not a generational leap, historically speaking; otherwise the Wii U would be sporting a GTX 680/Radeon 7970-class GPU. We're not seeing the typical 8-10x+ jump in processing power.
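Quick back-of-the-envelope on that, using the rough figures floating around this thread (a Python sketch, purely illustrative; the 240 and 576 GFLOPS numbers are the approximate/rumored ones quoted here, not confirmed specs):

xenos_gflops = 240.0   # Xbox 360 Xenos, rough figure quoted in this thread
e6760_gflops = 576.0   # E6760 theoretical peak (the rumored Wii U GPU)

print(e6760_gflops / xenos_gflops)          # 2.4 -- the jump we're actually looking at
print(xenos_gflops * 8, xenos_gflops * 10)  # 1920.0 2400.0 -- what an "8-10x" leap would mean

By that yardstick a "typical" generational leap would put the Wii U in roughly 2 TFLOPS territory, i.e. 7970-class, which is the point above.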
 

Lord Error

Insane For Sony
Drop the FLOPS debate when comparing GPUs with different architectures. It really doesn't help. To give you an idea of performance, an E6760 with 576 GFLOPS is slightly more powerful than a 4850 with 1 TFLOPS.
I've seen you post this before, but are there any benchmarks that show that?

IIRC, a 4850 can run Crysis on High at 1080p @ 40FPS, and at 720p at 60+ FPS. Nothing we've seen on Wii U indicates this level of performance.
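For context on where those peak numbers even come from: theoretical GFLOPS for these AMD parts is just stream processors x 2 FLOPs (one multiply-add) x clock. A minimal Python sketch, assuming the usual published specs (480 SPs @ 600MHz for the E6760 per the AnandTech quote further down, 800 SPs @ 625MHz for the HD 4850):

def peak_gflops(stream_processors, clock_ghz):
    # Theoretical peak: each SP can retire one multiply-add (2 FLOPs) per cycle
    return stream_processors * 2 * clock_ghz

print(peak_gflops(480, 0.600))  # E6760:   576.0 GFLOPS
print(peak_gflops(800, 0.625))  # HD 4850: 1000.0 GFLOPS

Same formula, but very different real-world utilization between the architectures, which is exactly why peak FLOPS alone settle nothing.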
 

Ryoku

Member
Is there any particular reason a card like this wouldn't be able to run PS360 third-party fare at 1080p, or even scaled 1600x900? I'm still trying to figure out how even the most inept, under-budgeted developer could be struggling with 720p if these specs have any truth behind them.

Well, I'd think developers prefer more consistent framerates on a console than on a PC (where the player makes that choice). I will say that I was able to run Crysis 2 on max settings (DX9, 1680x1050) at playable framerates on PC with a 4850. Obviously, this may be unrepresentative of the Wii U's capabilities, but we don't really know what GPU is inside the damn thing (I'm not taking these AMD tech support emails as 100% fact).
 
I don't think it has really been 100% confirmed at this point that it runs at 1080p, despite what PR has said thus far ("full HD"). Regardless, I don't think CoD is the best benchmark for a system's capabilities.

Hey, I know one thing, 720p/60 fps/pointer controls is enough capability for me.
 

Tagg9

Member
This realistically doesn't mean much, as I'm sure the GPU has been modified beyond recognition at this point.
 

Paracelsus

Member
It's more important to have someone explain the tech behind the GPU, what it can and can't do. For example, looking at the next-gen tech demos, can it run Agni's Philosophy and UE4 without downporting?
 

Daschysta

Member
Is there any particular reason a card like this wouldn't be able to run PS360 third-party fare at 1080p, or even scaled 1600x900? I'm still trying to figure out how even the most inept, under-budgeted developer could be struggling with 720p if these specs have any truth behind them.

Well, one has to assume first that the developers have any intention of doing anything beyond porting the exact same game, and even then many would favor AA/better framerate/more effects over higher resolution. If the graphics card rumor is accurate, then with substantially more RAM (which can hopefully be opened up even more, to the tune of 1.25 GB/1.5 GB), a lot of eDRAM, and more modern architecture and API support, second-gen Wii U games should see a big leap.

We have to keep in mind that there have been indications that Nintendo nailed down the specs very late and was selective about who they gave full information to. It may simply be that the studios doing ports didn't know exactly what to shoot for, so they ended up just making sure the port was identical, to be safe.

Everything points to the Wii U being a fair bit more powerful than current gen; second-gen games should be very pretty.
 

pramath

Banned
How many more pictures do you want? Here's my inbox after I searched/filtered "AMD"

78C9W.png

Got to give you props OP.

You stuck by your thread and have gone out of your way to prove everything you're saying.

Good job.

You may have cracked the puzzle here.
 

Durante

Member
It's more important to have someone explain the tech behind the GPU, what it can and can't do. For example, looking at the next-gen tech demos, can it run Agni's Philosophy and UE4 without downporting?
Not even remotely.

Is there any particular reason a card like this wouldn't be able to run PS360 third-party fare at 1080p, or even scaled 1600x900?
I can't think of any such reason, if it's running at full spec.
 

pramath

Banned
That's only about 2-3 times more powerful, and this thing has to make up for the lackluster CPU as well... so I wouldn't say it blows it away.

I'd say closer to 4, and an out-of-order CPU will beat the piss out of an in-order CPU. The clockspeed is meaningless.

If these specs are accurate the Wii U will outright stomp the 360 and PS3.
 
It's more important to have someone explain the tech behind the GPU, what it can and can't do. For example, looking at the next-gen tech demos, can it run Agni's Philosophy and UE4 without downporting?

I love Nintendo consoles and their games, and I know that anything from those engines that ran on Wii U would realistically be a downport. I'm okay with that, as most next-gen third-party games will be available on PS4/720 or PC, and I will own at least one of those. They will run and look much better on those platforms.
 

Durante

Member
We're talking about a modern CPU. You're being intentionally obtuse.
No. I'm pointing out how ridiculous the statement "The clockspeed is meaningless" is. Sure, an OOE CPU will generally feature significantly higher IPC than an in-order CPU. But that doesn't mean this advantage cannot be offset if the difference in clock speed is large enough.
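To put that in concrete terms: sustained throughput is roughly IPC x clock. The numbers in this Python toy model are made up purely to illustrate the trade-off, not real CPU figures:

def instructions_per_second(ipc, clock_hz):
    # First-order model: throughput scales with both IPC and clock
    return ipc * clock_hz

print(instructions_per_second(2.0, 1.2e9))  # out-of-order core, low clock:  2.4e9
print(instructions_per_second(0.8, 3.2e9))  # in-order core, high clock:     2.56e9 -- clock can offset IPC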
 

pramath

Banned
No. I'm pointing out how ridiculous the statement "The clockspeed is meaningless" is. Sure, an OOE CPU will generally feature significantly higher IPC than an in-order CPU. But that doesn't mean this advantage cannot be offset if the difference in clock speed is large enough.

Yeah, let's also not forget the Wii U CPU has 32MB of embedded RAM, which is more than three times what Xenon had.

In fact, all Nintendo systems with PowerPC processors have had generous amounts of eDRAM since the GameCube. It helps offset having to put in more external memory by having screaming-fast memory on the die.

Why do you think GameCube games had such fantastic loading times compared to the other systems?
 
The X360 GPU apparently has 240 GFLOPS; this one has 576, which is 2.4x more.


As I said, a 4850 runs Crysis on High at 1080p @ 40FPS, so yes, it is more powerful than the PS3 or X360. What I'm questioning is whether this E6760 GPU is really any more powerful than a 4850, like that poster suggested.
On Very High settings there's no way a 4850 runs Crysis at 40 fps.
 
No. I'm pointing out how ridiculous the statement "The clockspeed is meaningless" is. Sure, an OOE CPU will generally feature significantly higher IPC than an in-order CPU. But that doesn't mean this advantage cannot be offset if the difference in clock speed is large enough.

So, a Celeron overclocked to 4GHz could be better than an i7 at 2GHz? I agree with him, clock is meaningless.
 

pramath

Banned
More fuel to the fire:

AMD's E6760 is meant to run at 35W, which would definitely fit in line with Iwata saying that the console usually runs at around 45W.

http://www.anandtech...es-radeon-e6760

The E6760 replaces the RV730 based E4690 (4600/4600M) as AMD’s top of the line embedded video card, which at a couple of generations old makes the E6760 a bigger step up than we normally see in the desktop/mobile space; on top of 50% more SPs, it has all of the architectural enhancements from the Radeon 5000 and 6000 series. Utilizing a fully enabled Turks GPU with a core clock of 600MHz and a memory clock of 800MHz (3.2GHz data rate), in terms of performance the E6760 is effectively a 6750M suitable for soldering directly onto a motherboard, or in comparison to desktop parts it performs closely to a 6750 with significantly lower power consumption. The TDP of the card is 35W, owing to its mobile heritage. This includes the 1GB of GDDR5 on the MCM package.

This could explain why there is only going to be 1GB of memory for video game applications and 1GB for OS functions: the first 1GB automatically comes with the E6760, and the other 1GB is probably slower RAM.
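Rough sanity check on the power side, using only the two figures above. Keep in mind TDP is a worst-case rating, not typical draw, so this is loose at best (Python, purely illustrative):

gpu_tdp_w = 35          # E6760 TDP per the AnandTech quote, including its 1GB GDDR5
console_typical_w = 45  # Iwata's "usually runs around 45W" figure

print(console_typical_w - gpu_tdp_w)  # 10W nominally left for CPU, RAM, disc drive, I/O

Tight, but it only has to work out because a GPU almost never draws its full TDP under typical load.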
 

DrNeroCF

Member
Not even remotely.

I can't think of any such reason, if it's running at full spec.

There's no way the Wii U couldn't run the Unreal 4 demo with baked lighting and maybe slightly fewer particles.

With the focus on real-time everything in UE4, no one's going to be able to tell the difference if the same scene is faked using a fraction of the processing power (and what's to stop a dev from building the game in real time on powerful PCs, then hitting a button to bake everything in for Wii U, iOS, etc.?).
 

Durante

Member
Yeah, let's also not forget the Wii U CPU has 32MB of embedded RAM, which is more than three times what Xenon had.

In fact, all Nintendo systems with PowerPC processors have had generous amounts of eDRAM since the GameCube. It helps offset having to put in more external memory by having screaming-fast memory on the die.

Why do you think GameCube games had such fantastic loading times compared to the other systems?
Why did you just go off on an unrelated tangent about the memory subsystem? You remind me of a student that doesn't know the answer to a question and starts talking about something he does know in the hope of scoring some points.

And by the way, 32 MB of eDRAM is infinity times what Xenon had. Xenon had no embedded DRAM at all.

There's no way the Wii U couldn't run the Unreal 4 demo with baked lighting and maybe slightly fewer particles.
So what you're saying is, it could run a downport? If so, we agree!
 

Darryl

Banned
It's more important to have someone explain the tech behind the GPU, what it can and can't do. For example, looking at the next-gen tech demos, can it run Agni's Philosophy and UE4 without downporting?

Browsing the FFXIV threads, we've learned that the engine used in Agni's Philosophy is being partly used in the production of FFXIV 2.0 (they referred to the FFXIV engine as its sister, and said it uses 'the core'). FFXIV 2.0 is coming to PS3, so I think the possibility of the Agni's Philosophy engine being used in some form to make games on the Wii U isn't so crazy.
 

KageMaru

Member
I can't think of any such reason, if it's running at full spec.

Even if the Wii U GPU only has 8 ROPs, like current-gen GPUs?

Yeah, let's also not forget the Wii U CPU has 32MB of embedded RAM, which is more than three times what Xenon had.

In fact, all Nintendo systems with PowerPC processors have had generous amounts of eDRAM since the GameCube. It helps offset having to put in more external memory by having screaming-fast memory on the die.

Why do you think GameCube games had such fantastic loading times compared to the other systems?

Do we even know if the 32MB of eDRAM is directly linked to the CPU or GPU? Also, the load times on the GC had more to do with the disc drive than anything else.
 

Ryoku

Member
I've seen you post this before, but are there any benchmarks that show that?

IIRC, a 4850 can run Crysis on High at 1080p @ 40FPS, and at 720p at 60+ FPS. Nothing we've seen on Wii U indicates this level of performance.

Sorry for the late response. I'm going by 3DMark Vantage scores right now, as that's literally the only benchmark we have on the E6760.

E6760 = 5870
AMD 4850 = 5782/4797 (with variance, depending on brand).

http://www.em.avnet.com/en-us/desig...60-Embedded-Discrete-Graphics-Processors.aspx
http://www.pcper.com/reviews/Graphi...512MB-Preview-RV770-Discovered/3DMark-Vantage
http://www.hardwaresecrets.com/article/Sapphire-HD-4850-Video-Card-Review/576/7
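Putting relative numbers on those scores, just the arithmetic on the figures from the links above (Python):

e6760_score = 5870            # Vantage score from the Avnet page
hd4850_scores = (5782, 4797)  # best/worst 4850 results from the other two links

print(e6760_score / hd4850_scores[0] - 1)  # ~0.015 -> about 1.5% ahead of the best 4850 run
print(e6760_score / hd4850_scores[1] - 1)  # ~0.224 -> about 22% ahead of the worst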
 

Durante

Member
Be more precise. Agni's Philosophy as it was shown? Certainly not. But "will it run UE4" (the engine, not the techdemo)? It will. Epic even said so.
I answered the question as I understood it, and as I think most would read it: "Will it be able to run next-gen techdemos like UE4 and Agni's without downporting?"

And my answer is "Not even remotely."

(My answer for PS4/720, pending further information, is "remotely", by the way)
 

AzaK

Member
I think CoD: Black Ops 2 is a good example of the graphical improvement, isn't it? The X360 and PS3 versions of the first CoD:BO ran at 600p and 540p respectively, while CoD:BO2 will run at 1080p on Wii U. Taking into consideration that the first generation of games is never well optimized, I'd say it's a good demonstration of the Wii U's power.

There is no confirmation at all that it runs at 1080p. The only words we've heard are "Full HD", but when a marketing person says things like that, it could mean anything.

720, 1080, higher than Wii's SD, who knows.
 
People... stop arguing over graphics.

REMEMBER! There are folks on GAF who believed at E3 2011 that the graphics in the reel of upcoming Wii U games looked way worse than Xbox 360/PS3 titles, while the footage was actually from the Xbox 360/PS3...

Everything will look far worse to them if there's "Nintendo" written on it...
 

KageMaru

Member
People... stop arguing over graphics.

REMEMBER! There are folks on GAF who believed at E3 2011 that the graphics in the reel of upcoming Wii U games looked way worse than Xbox 360/PS3 titles, while the footage was actually from the Xbox 360/PS3...

Everything will look far worse to them if there's "Nintendo" written on it...

Not all of us fall into that ignorant category...
 
People... stop arguing over graphics.

REMEMBER! There are folks on GAF who believed at E3 2011 that the graphics in the reel of upcoming Wii U games looked way worse than Xbox 360/PS3 titles, while the footage was actually from the Xbox 360/PS3...

Everything will look far worse to them if there's "Nintendo" written on it...

And on the flip side of the coin, there are people calling Nintendo Land gorgeous. There is certainly hyperbole going around on both sides. Just don't set yourselves up for disappointment.
 
People... stop arguing over graphics.

REMEMBER! There are folks on GAF who believed at E3 2011 that the graphics in the reel of upcoming Wii U games looked way worse than Xbox 360/PS3 titles, while the footage was actually from the Xbox 360/PS3...

Everything will look far worse to them if there's "Nintendo" written on it...

Nintendo warriors, assemble!
 
A couple of other GAFers have gotten similar or identical confirmations of the Wii U using a modified embedded GPU/E6760:

I just got a response from AMD as well. I sent my request the same day Zoramon089 posted his email.

Although it is very similar to Zoramon089's, it looks like this person made sure not to mention the exact model.

1pn2s.png


I see other people have now had similar replies. Thought I'd post mine, as it was interesting, if predictably unrevealing :)

2Vm5U.jpg

I'd say there is some real meat to the bones of this rumor now.
 

Lord Error

Insane For Sony
On Very High settings there's no way a 4850 runs Crysis at 40 fps.
"High", not "Very High", but yes, I was looking at the wrong bar on Tom's Hardware's benchmark regardless. It's more like 25FPS at 1080p and 37FPS at 720p with settings on High. Even then, if the Wii U has a card that's a bit better than that, you'd think it would show through some pretty substantial performance improvements in multiplatform games.

A 5750 is what I had in my PC for some time, and looking at benchmarks, it's very similar to a 4850 in performance. I can vouch that this card does significantly better in every single multiplatform game compared to the current consoles.


Sorry for the late response. I'm going by 3DMark Vantage scores right now, as that's literally the only benchmark we have on the E6760.

E6760 = 5870
AMD 4850 = 5782/4797 (with variance, depending on brand).

http://www.em.avnet.com/en-us/desig...60-Embedded-Discrete-Graphics-Processors.aspx
http://www.pcper.com/reviews/Graphi...512MB-Preview-RV770-Discovered/3DMark-Vantage
http://www.hardwaresecrets.com/article/Sapphire-HD-4850-Video-Card-Review/576/7
I see, that is pretty nice. The last link shows that the score varies heavily depending on which resolution the test is run at, though, so I think it's still hard to compare, because we don't know which resolution the E6760 test was run at.
 

wsippel

Banned
I answered the question as I understood it, and as I think most would read it: "Will it be able to run next-gen techdemos like UE4 and Agni's without downporting?"

And my answer is "Not even remotely."

(My answer for PS4/720, pending further information, is "remotely", by the way)
Sorry, didn't want to sound like an ass, but Nintendo fans are kinda burned by "it can't even run the engine, let alone the games" statements. That shit even got me banned a while ago when Capcom made such a statement regarding MT Framework (turned out I was spot on at the time, as Capcom later ported Framework to Wii and 3DS). And Wii U can not only run UE3 perfectly fine, it will be able to run UE4. That in itself is kinda important, even if the games need to be (heavily) downgraded to achieve acceptable performance levels.
 
People... stop arguing over graphics.

REMEMBER! There are folks on GAF who believed at E3 2011 that the graphics in the reel of upcoming Wii U games looked way worse than Xbox 360/PS3 titles, while the footage was actually from the Xbox 360/PS3...

Everything will look far worse to them if there's "Nintendo" written on it...
I don't think you understand the dissent towards what Nintendo is doing with gaming hardware. People like me have higher expectations for visual and audio fidelity. Should I commit myself to the lowest common denominator just to play the games I'd find interesting?
 

jett

D-Member
Is that a mobile GPU? If true, I guess that explains some stuff. A friend of mine has a MacBook Pro with a 6750 (which I gather is about the same), and he has a hard time running games higher than 720p with a solid framerate.
 