
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

I finally got around to watching the 60fps Sonic video and damn, that looks amazing. It definitely looks like it's running smoother than any 3D Sonic game that I've played.

That said, unless we see the game running on PS360 we can't really make a determination of what it means about the Wii U's hardware. Ubisoft made it seem like Rayman Legends wasn't possible on lower-spec'd consoles, but I haven't heard many impressions that suggest sacrifices were made when porting that game to the HD twins.

I have the Steam version of Sonic Generations and on my PC it doesn't run as well as I would have expected for a platformer, so it seems pretty badly optimized. The big differences in art style could account for how much better Sonic Lost World looks to be performing on Wii U, but I can't think of another similarly styled game on the older platforms that runs that impressively.
 

krizzx

Junior Member
I finally got around to watching the 60fps Sonic video and damn, that looks amazing. It definitely looks like it's running smoother than any 3D Sonic game that I've played.

That said, unless we see the game running on PS360 we can't really make a determination of what it means about the Wii U's hardware. Ubisoft made it seem like Rayman Legends wasn't possible on lower-spec'd consoles, but I haven't heard many impressions that suggest sacrifices were made when porting that game to the HD twins.

I have the Steam version of Sonic Generations and on my PC it doesn't run as well as I would have expected for a platformer, so it seems pretty badly optimized. The big differences in art style could account for how much better Sonic Lost World looks to be performing on Wii U, but I can't think of another similarly styled game on the older platforms that runs that impressively.

Thanks for reminding me. I downloaded the video yesterday but forgot to watch it, and I have to agree. The next closest thing on last-gen hardware I can think of is Sonic Unleashed on the 360, but it was nowhere near that smooth or clean.

Also, I must reiterate that I did not see any noticeable aliasing.
 

fred

Member
I finally got around to watching the 60fps Sonic video and damn, that looks amazing. It definitely looks like it's running smoother than any 3D Sonic game that I've played.

That said, unless we see the game running on PS360 we can't really make a determination of what it means about the Wii U's hardware. Ubisoft made it seem like Rayman Legends wasn't possible on lower-spec'd consoles, but I haven't heard many impressions that suggest sacrifices were made when porting that game to the HD twins.

I have the Steam version of Sonic Generations and on my PC it doesn't run as well as I would have expected for a platformer, so it seems pretty badly optimized. The big differences in art style could account for how much better Sonic Lost World looks to be performing on Wii U, but I can't think of another similarly styled game on the older platforms that runs that impressively.

Well I think it would be fair to compare and contrast Generations with Lost World. Twice the framerate, higher poly models and far superior draw distance.

This is something I'd expect the Digital Foundry fellahs to have a look at, although they do seem to have an agenda regarding the Wii U for some strange reason so maybe that's why we haven't seen them looking at something that makes Latte look impressive compared to RSX and Xenon. [/tinfoilhat]
 

Luigiv

Member
The point is that FXAA is not without its benefits even over other AA. You spoke of it like it was the hell spawn of the devil.

It is. I'll take a few extra noticeable jaggies in a few exceptional instances over noticeable blur almost everywhere but a few exceptional instances. It's a poor compromise.
 

krizzx

Junior Member
Well I think it would be fair to compare and contrast Generations with Lost World. Twice the framerate, higher poly models and far superior draw distance.

This is something I'd expect the Digital Foundry fellahs to have a look at, although they do seem to have an agenda regarding the Wii U for some strange reason so maybe that's why we haven't seen them looking at something that makes Latte look impressive compared to RSX and Xenon. [/tinfoilhat]

They definitely have an agenda in regards to the Wii U GPU. They still haven't corrected that completely fictitious analysis claim they made when this thread first spawned, which said the Wii U was using an HD 4630 and that everyone could rule out any "next-gen pretensions" about the Wii U, when we now know that the GPU is completely custom made and uses components that range from the HD4000-HD6000 series as well as from Renesas. I lost all respect for DF after that.
 
That is irrelevant to this thread or its discussion.
Then why did you bring it up in the first place? You made it a point of discussion when you brought it up. If you don't want someone expressing doubt at your objectivity, then don't justify your posts with the reasoning that you're objective. You made it relevant.

My other posts and threads are made simply to contrast the overbearing onslaught of negative news and the general negative opinion directed toward Nintendo and their products that I see on the internet. It has reached ridiculous levels with journalists taking news that should have been positive and spinning it to look as negative as possible or cherry picking facts for negative purposes. If there were a more balanced opinion of the hardware or others posting more positive about it then I would not bother.
Why do you feel it's your job to try and "balance" bad news with good? How is that an honest discussion when you purposefully post stories for the express reason that they're good, rather than on the merit of their facts? Do you actually have examples of large sites doing what you say, of taking objectively good news and spinning it?

Whether you intend it or not, your reasoning sounds like Fox News; that the mainstream media is biased and they're there to balance it out. I don't mean that as criticism, but merely as an observation.

Though I must ask again. What is with all of the personal attacks? It's like my attempts to clear up the misconceptions about the Wii U hardware are causing people physical harm with some of the posts I've been seeing.
Honest question: do you think I'm personally attacking you? Because my intention is nothing but to try and rationally explain my views. If I'm not succeeding I'd like to know, so I can adjust my rhetoric to be more even-keeled.
 

krizzx

Junior Member
Then why did you bring it up in the first place? You made it a point of discussion when you brought it up. If you don't want someone expressing doubt at your objectivity, then don't justify your posts with the reasoning that you're objective. You made it relevant.


Why do you feel it's your job to try and "balance" bad news with good? How is that an honest discussion when you purposefully post stories for the express reason that they're good, rather than on the merit of their facts? Do you actually have examples of large sites doing what you say, of taking objectively good news and spinning it?

Whether you intend it or not, your reasoning sounds like Fox News; that the mainstream media is biased and they're there to balance it out. I don't mean that as criticism, but merely as an observation.


Honest question: do you think I'm personally attacking you? Because my intention is nothing but to try and rationally explain my views. If I'm not succeeding I'd like to know, so I can adjust my rhetoric to be more even-keeled.

Huh??? I didn't.
 

SmokyDave

Member
Though I must ask again. What is with all of the personal attacks? It's like my attempts to clear up the misconceptions about the Wii U hardware are causing people physical harm with some of the posts I've been seeing.
Dude that wasn't an attack, the guy just suggested you're not objective, and he's right. Your post history speaks volumes.

I know I'm a fine one to talk, but then I've never attempted to deny my evident bias.
 
That's a bit of hyperbole. I understand that people don't like how it blurs the final image a bit, but it's a good compromise. You get rid of jaggies at very little cost.

I'd rather have the jaggies and a sharp image than smoother lines with a blurred one. It kind of defeats the point of playing things at a higher resolution if you're going to blur the whole image.
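(For anyone curious what the "blur everywhere" complaint actually refers to: the sketch below is a toy approximation of the idea behind FXAA-style post-process AA, not the real FXAA shader. The function name and constants are made up for illustration. It flags pixels whose local luma contrast crosses a threshold and blends them toward their neighbours; because texture detail trips the same test as geometric edges, detail gets softened too.)

```python
# Toy approximation of FXAA-style post-process AA (NOT the real FXAA shader):
# flag pixels whose local luma contrast exceeds a threshold, then blend them
# toward the average of their neighbours.
import numpy as np

def toy_post_aa(img, contrast_threshold=0.05, blend=0.5):
    """img: float RGB array of shape (H, W, 3) with values in [0, 1]."""
    # Approximate luma from RGB (Rec. 601 weights).
    luma = img[..., 0] * 0.299 + img[..., 1] * 0.587 + img[..., 2] * 0.114

    # Local contrast: spread between the centre pixel and its 4 neighbours.
    p = np.pad(luma, 1, mode="edge")
    n, s = p[:-2, 1:-1], p[2:, 1:-1]
    w, e = p[1:-1, :-2], p[1:-1, 2:]
    lo = np.minimum.reduce([luma, n, s, w, e])
    hi = np.maximum.reduce([luma, n, s, w, e])
    is_edge = (hi - lo) > contrast_threshold

    # Blend "edge" pixels toward their neighbourhood average. Texture detail
    # passes the same contrast test as geometric edges, which is where the
    # perceived whole-image softening comes from.
    pr = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode="edge")
    neighbour_avg = (pr[:-2, 1:-1] + pr[2:, 1:-1] +
                     pr[1:-1, :-2] + pr[1:-1, 2:]) / 4.0
    out = img.copy()
    out[is_edge] = (1 - blend) * img[is_edge] + blend * neighbour_avg[is_edge]
    return out
```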
 
....
And for 720p to "remain" the sweetspot? It never was.

All Wii U games that aren't launch games or ports either run at 720 60fps or 1080p.
I'm going to skip your rant, the 'I'm a graphics design expert' part and just focus on the two lines that responded to what I said.

Don't you think it's weird that you state 720p never was the sweetspot resolution for Wii U games to begin with,
yet for your argument you first had to exclude every launch game and every multiplatform game ever released on the Wii U?
Then you restate that the new Wii U games are either running in 720p (which you said never was the sweetspot?!) or 1080p?
And the best part is that, even then, the existence of Lego City, which is 720p/30fps, still proves that last sentence wrong.

So, in the end, 720p still seems to be the sweetspot for the majority of Wii U games.
 

krizzx

Junior Member
Dude that wasn't an attack, the guy just suggested you're not objective, and he's right. Your post history speaks volumes.

I know I'm a fine one to talk, but then I've never attempted to deny my evident bias.

My posting mostly in Nintendo-related threads equates to me not being objective? That is a personal attack if I've ever seen one. I'm not the topic of this thread, but you are attacking me and what you presume my stance is rather than addressing the facts and details presented.
 

TheD

The Detective
Though I must ask again. What is with all of the personal attacks? Its like my attempts to clear up the misconceptions about the Wii U hardware are causing people physical harm with some of the posts I've been seeing.

The only misconceptions are the ones you are spreading!

You clearly lack the technical knowledge to make statements about how powerful the WiiU is, yet you keep on posting BS because you cannot bear to face the facts!

Just about every one of your posts is pure WiiU fanboyism!
 

krizzx

Junior Member
M°°nblade;77913121 said:
I'm going to skip your rant, the 'I'm a graphics design expert' part and just focus on the two lines that responded to what I said.

Don't you think it's weird that you state 720p never was the sweetspot resolution for Wii U games to begin with,
yet for your argument you first had to exclude every launch game and every multiplatform game ever released on the Wii U?
Then you restate that the new Wii U games are either running in 720p (which you said never was the sweetspot?!) or 1080p?
And the best part is that, even then, the existence of Lego City, which is 720p/30fps, still proves that last sentence wrong.

So, in the end, 720p still seems to be the sweetspot for the majority of Wii U games.


Alright, no gloves with this one. Your entire post follows a straw man argument.

This "exact" strawman argument in question was my response to your statement that the Wii U's sweet spot was "720p 30fps", to which I responded that it was untrue because 90%+ of the games made for the Wii U (as in, not a port or a launch game) have been confirmed to be either 720p 60fps or 1080p (frame rate depends). You eliminated the fps and then proceeded to claim that I was making the statement that the common Wii U resolution would not be 720p, as I bolded, which is a completely different argument.

Lego City was a launch-period game, made in the time period that devs and Nintendo themselves have attested involved incomplete dev tools that gave even Nintendo trouble with the hardware, and I am quite certain I said any game that wasn't a "launch game" or port. Please learn to read what I write and stop filling it in with your own distorted version of what I said that you find easier to argue against.

And I am done replying on that note, until you improve your responses. I do not engage in discussion with people who twist words and distort or ignore facts that they find inconvenient.

The only misconceptions are the ones you are spreading!

You clearly lack the technical knowledge to make statements about how powerful the WiiU is, yet you keep on posting BS because you cannot bear to face the facts!

Just about every one of your posts is pure WiiU fanboyism!

And here we have another. I've never made any claim about how powerful the Wii U "is". I've only made suggestions as to what it appears to be capable of and how powerful it "could be" based on these details. This is what being objective is. I've never made any absolute claims without providing visual evidence, relevant technical documentation or direct commentary from directly related professionals to support it. I've never even attempted to claim the Wii U is some magical god machine the way you are trying to make my posts out to be. You seem to be one of the people who view anyone who makes a positive statement about the hardware as a fanboy.

Stop harassing me. This is the last time I'm saying this. I'm not the topic of this thread. Your bitterness at what I post does not dictate or define what I post, and if you continue to derail this thread with personal attacks I will let the mods handle it from here on out.
 
Huh??? I didn't.

Don't inflict your ideology on me. I've said it a dozen times. I'm not here for console war/fanboy garbage like that. Stop trying to twist my words.

You brought up that you don't post here to be a fanboy. I responded to that, since you brought it into the discussion. I'm not sure where the disconnect is.

My posting mostly in Nintendo-related threads equates to me not being objective? That is a personal attack if I've ever seen one. I'm not the topic of this thread, but you are attacking me and what you presume my stance is rather than addressing the facts and details presented.
As I wrote, it suggests that carrying on a fruitful discussion with you would be difficult because your posting history suggests that you are not interested in discussing the negatives of the WiiU, and so far, that's bearing out as you have started misconstruing my assertions (you don't seem to be objective, but I'm giving you the benefit of the doubt) as some kind of attack. That's not my intention, as I said.

Dude that wasn't an attack, the guy just suggested you're not objective, and he's right. Your post history speaks volumes.

I know I'm a fine one to talk, but then I've never attempted to deny my evident bias.
Smoky Dave with the assist. He's right; I'm not trying to attack you, I was making an observation, and one that I tried to carefully couch in language that was neither accusatory nor definitive.
 

krizzx

Junior Member
You brought up that you don't post here to be a fanboy. I responded to that, since you brought it into the discussion. I'm not sure where the disconnect is.


As I wrote, it suggests that carrying on a fruitful discussion with you would be difficult because your posting history suggests that you are not interested in discussing the negatives of the WiiU, and so far, that's bearing out as you have started misconstruing my assertions (you don't seem to be objective, but I'm giving you the benefit of the doubt) as some kind of attack. That's not my intention, as I said.


Smoky Dave with the assist. He's right; I'm not trying to attack you, I was making an observation, and one that I tried to carefully couch in language that was neither accusatory nor definitive.

That was a response, in case you did not read it, to a post that suggested otherwise. I did not "bring it up"; the person I was responding to did. Such is why it was written in the form of a response with a quote at the top of the post. I would think that would naturally follow.

You are telling me I did something that I did not. And as I pointed out, I'm not the topic of this thread. You are not addressing the Wii U GPU, you are addressing "me" with negative accusations. That is a personal attack. I will tell you as I just told the other two.
 
That was a response, in case you did not read it, to a post that suggested otherwise. I did not "bring it up"; the person I was responding to did. Such is why it was written in the form of a response with a quote at the top of the post.

You are telling me I did something that I did not.
Fair enough; I stand corrected. I still stand by my observations in response to that, but let's not derail this any further. If you feel I've attacked you, PM me and we can discuss it.

I'll say it one last time: it wasn't an accusation.
 
Alright, no gloves with this one. Your entire post follows a straw man argument.

This "exact" strawman argument in question was my response to your statement that the Wii U's sweet spot was "720p 30fps", to which I responded that it was untrue because 90%+ of the games made for the Wii U (as in, not a port or a launch game) are either 720p 60fps or 1080p (frame rate depends). You eliminated the fps and then proceeded to claim that I was making the statement that the common Wii U resolution would not be 720p, as I bolded, which is a completely different argument.
First of all, I eliminated nothing. You dropped the fps criterion by saying '720p never was the sweetspot' and I just responded to what you literally wrote.

Secondly, my statement that the majority of Wii U games run in 720p 30fps remains true. If you decide to limit the number of games by excluding every multiplatform game (since the Wii U is almost never the lead platform) and every Wii U exclusive released more than 4 months ago, you are no longer speaking about the majority of Wii U games or the common Wii U game. Just 5 or 6 first/second party games.
 
Part of what I do is graphics design and I have a good eye for detail.

No doubt you have an eye for detail, but I don't think you understand the technologies behind what you are seeing.

And this is coming from someone who truly believes that the Wii U is not nearly as weak as people make it out to be (I also believe the PS4 and XBone are nowhere near as powerful as people make them out to be).
 

krizzx

Junior Member
Fair enough; I stand corrected. I still stand by my observations in response to that, but let's not detail this any further. If you feel I've attacked you, PM me and we can discuss it.

I'll say it one last time: it wasn't an accusation.

To say I said something that I didn't is an accusation, but alright.

As for your observations, they are also incorrect. I have no problem with discussing anything that can be negative or positive. I have a problem with baseless negative claims, though. I've asked every person I've responded to who made a negative claim to substantiate it. All I care about are the facts and the details. If you say the Wii U is weaker than the 360/PS3, coherently demonstrate it, and if I find no evidence that contradicts the claim then I will accept it. All that matters is that it is founded on fact and that all angles are explored.

If someone suggests something negative and they back it up with relevant facts, then I will have no problem discussing it with them. The thing is that 90% of the people who do speak negatively of the hardware provide nothing or, worse, they provide extremely off details that don't properly support the conclusion they make and often say something completely different. In my experience most people who come in here to make negative claims just do it to bash. They base it on little more than desires and presumptions.

If someone makes an argument and there are holes in it, I will poke every single one of those holes. I would expect them to do no less to me.

Just like the FXAA argument earlier. I have no emotional investment in FXAA. I did not agree with the way it was being condemned, though, so I defended it. Same with PPC vs x86, or DDR3 vs GDDR5 and so on. My opinions of these things are independent of consoles and companies, but most people's aren't, so they automatically assume that my reasoning must be the same as theirs.

I believe DDR3 is far superior to GDDR5 where graphics are not concerned. That has nothing to do with the fact that the PS4 has it and the Wii U/XboxOne use DDR3. I thought that before I knew what any of the next gen consoles would use. That is my opinion as a PC user. Though if I say DDR3 is superior under those circumstances now, I get treated like a console troll who's attacking the divinity of the PS4.

I liked ATI, but I believe Nvidia had the overall superior GPUs. If I say that now, I'm treated like I'm attacking the next-gen consoles.

Same with my opinion of PowerPC vs x86. I've always thought AMD CPUs were terribly inefficient as well, but now that's treated like fanboy logic.

People define my opinions to their liking based on their preference. The reality is different.
 
I believe DDR3 is far superior to GDDR5 where graphics are not concerned. That has nothing to do with the fact that the PS4 has it and the Wii U/XboxOne use DDR3. I thought that before I knew what any of the next gen consoles would use. That is my opinion as a PC user. Though if I say DDR3 is superior under those circumstances now, I get treated like a console troll who's attacking the divinity of the PS4.
I'm setting aside the rest of your post, though I read it, so we don't go down that path again. Like I said, benefit of the doubt.

What I am curious about is why you believe DDR3 is better than GDDR5 for non-graphics uses. What, specifically, don't you like about it for general use as main memory? Is there a technical reason? It seems like using ESRAM or EDRAM is meant to replicate the faster throughput of graphics memory found on modern cards, so what's the benefit of using a mixed architecture to offset the slower speed of DDR3 rather than using something with a higher total speed?

Besides, correct me if I'm wrong, but the Xbox One (and perhaps the WiiU?) use unified memory, so there is no dedicated graphics memory like you'd have on a GPU versus regular memory. So if that's the case, graphics are the concern, correct? The small pools of other RAM are specifically there to make up for the speed deficiency in using memory ill-suited for graphical work.
 

krizzx

Junior Member
I'm setting aside the rest of your post, though I read it, so we don't go down that path again. Like I said, benefit of the doubt.

What I am curious about is why you believe DDR3 is better than GDDR5 for non-graphics uses. What, specifically, don't you like about it for general use as main memory? Is there a technical reason? It seems like using ESRAM or EDRAM is meant to replicate the faster throughput of graphics memory found on modern cards, so what's the benefit of using a mixed architecture to offset the slower speed of DDR3 rather than using something with a higher total speed?

Besides, correct me if I'm wrong, but the Xbox One (and perhaps the WiiU?) use unified memory, so there is no dedicated graphics memory like you'd have on a GPU versus regular memory. So if that's the case, graphics are the concern, correct? The small pools of other RAM are specifically there to make up for the speed deficiency in using memory ill-suited for graphical work.

The biggest issues are latency and energy consumption. If you read any of my posts, especially in the CPU thread, you will know that I prefer more efficient hardware to hardware that just boasts big numbers.

For small, articulate things, or rapid gathering and writing of small, separate, individual units of data, DDR3 is far superior. That is what you are doing when running things that aren't graphics. GDDR5 is wasted when it's not being used to process graphics. GDDR5 is graphics RAM meant to grab huge files more quickly. It falters in comparison when not grabbing huge clusters of data.

In the case of eDRAM, as Shin'en said, the 32 MB is more than enough because it's so fast. The latency is also lower. I will always take lower latency over higher bandwidth as it is more efficient. Same with PowerPC vs Intel/AMD x86. I'll take higher performance per watt over the most extreme overall performance.

GDDR5's high memory bandwidth is more of a must to compensate for its lack of efficiency. At the same bandwidth, DDR3 would give better performance than GDDR5 for pretty much everything that isn't graphics.

To top it all off, DDR3 is less expensive. It is just overall superior.
 

Xanonano

Member
The biggest issues are latency and energy consumption. If you read any of my post, especially in the CPU thread, you will know that I prefer more efficient hardware to hardware that just boasts big numbers.
OK then, feel free to quote the figures you've found for the latency and energy consumption of the two types of RAM that allowed you to make this conclusion.
 

krizzx

Junior Member
OK then, feel free to quote the figures you've found for the latency and energy consumption of the two types of RAM that allowed you to make this conclusion.

Why?

I'm not trying to convince anyone to follow my opinion. He asked for my reasoning and I gave it to him.
 
After being quite negative with regards to the possible power of WiiU I have to say I have been very, very impressed with Wonderful 101.

Played about four hours of it yesterday and the game has some really technically impressive sections.

720p native, solid 60fps, no screen tearing and some nice depth of field, fire and lighting effects. The game also looks super clean with some awesome-looking textures.

Definitely the most technically impressive WiiU game so far.
 

muteant

Member
After being quite negative with regards to the possible power of WiiU I have to say I have been very, very impressed with Wonderful 101.
.
It'd be a great-looking game from the isometric perspective if it weren't for the jaggies, jaggies everywhere. I feel the same way about Pikmin 3, except that Pikmin should probably be considered great-looking regardless.

Really concerned that the Wii U hardware is for whatever reason handicapped when it comes to AA. Someone in the Gamescom Showfloor impressions thread mentioned that while gorgeous, MK8 suffers from persistent aliasing. It looks like a big problem for Smash Wii U, too. I'd like to hear from someone specifically looking for the phenomenon to comment on how 3DWorld and Tropical Freeze fare in this context.
 

fred

Member
After being quite negative with regards to the possible power of WiiU I have to say I have been very, very impressed with Wonderful 101.

Played about four hours of it yesterday and the game has some really technically impressive sections.

720p native, solid 60fps, no screen tearing and some nice graphical effects with regards to the depth of field, fire and lighting effects. The game also looks super clean with some awesome looking textures.

Definitely the most technically impressive WiiU game so far.

More impressive than Pikmin 3..? The diffuse mapping in Pikmin 3 is incredible.

Got The Wonderful 101 in the post today but am finishing Pikmin 3 before I give it a go. Just got the pink flying Pikmin, which are awesome even though they can't punch their way out of a paper bag lol
 
Why?

I'm not trying to convince anyone to follow my opinion. He asked for my reasoning and I gave it to him.
Well, it is a technical thread, so technical details are apropos. I appreciate your explanation, but it would be nice to see technical documentation that reaffirms the assertions you made. With some light Googling, it's proving difficult to find any articles that compare the two. In addition, I've never heard much about efficiency issues with GDDR5, nor of it struggling with small data pieces.

It's nice to know why you believe DDR3 is better, but I'd like to see the receipts on the technical details, if for no other reason than my own edification. I know little about games programming since the kind of programming I do doesn't concern itself with base hardware.
 

krizzx

Junior Member
Well, it is a technical thread, so technical details are apropos. I appreciate your explanation, but it would be nice to see technical documentation that reaffirms the assertions you made. With some light Googling, it's proving difficult to find any articles that compare the two. In addition, I've never heard much about efficiency issues with GDDR5, nor of it struggling with small data pieces.

It's nice to know why you believe DDR3 is better, but I'd like to see the receipts on the technical details, if for no other reason than my own edification. I know little about games programming since the kind of programming I do doesn't concern itself with base hardware.

Well, alright. Honestly, I've grown tired of providing details and links because, as was seen with my Sonic comparisons, most people either ignore them or outright dismiss them without any evidence to the contrary. It has become annoying. Also, I thought this was common knowledge to anyone who knew anything about RAM on the technical side. Generally, if they don't even know the basics, then showing them more advanced info would be to no avail.

All that, and this topic has been derailed enough already with talk not relating to the GPU.

Also, it's too difficult to find at this point. Google is flooded with nothing but links to sites comparing the Xbone and PS4, as opposed to just the base memory modules independent of consoles. I really don't feel like making the effort to dig through all of that to find a link to something I don't have any reason to prove to begin with.

http://www.techspot.com/community/t...-between-ddr3-memory-and-gddr5-memory.186408/ I don't feel like digging any deeper than that.
 
M°°nblade;77880033 said:
Reading the last couple of pages, you guys are trying way too hard to use the 'Wii U games have a higher IQ than current gen consoles' line as proof that the console is significantly more powerful than said consoles, because they simply don't.

Regarding the resolution:
Inform yourself better. Go check the B3D rendering resolution pixel counter list instead of just cherry-picking the 1080p games to force a point. 95% of all multiplatform games are running at the exact same resolution as the PS360 versions. This even means sub-HD (880x720) for games like CoD. The exclusive Wii U games you tout as 'an increasing amount of 1080p games!' are just 1 game (SSBM) and a bunch of Gamecube, 3DS and even PS2 ports. It doesn't prove anything, as the PS3 also received a few less demanding exclusive releases (even at launch!), a handful of PS2 HD classics and PSN games that were running in native 1080p.
http://beyond3d.com/showthread.php?p=1113344

Regarding the framerate:
Again, inform yourself better. DF framerate analyses show that the framerate of multiplatform games hovers between the Xbox 360 (usually highest) and the PS3 (usually lowest) versions. I see no changes in upcoming releases like Splinter Cell. Nintendo and some other companies targeting 60fps for exclusive Wii U games is great, but again ... this isn't something unique that differentiates Wii U games from current gen. Especially not when these are the same developers that were targeting 60fps as well on PS360 hardware. It's as dumb as claiming that the Xbone is more powerful than the PS4 simply because Microsoft has announced more exclusive 60fps games.

I'm sure Wii U games will look better over time. I expect the average Wii U game to have better lighting and textures than the average PS360 game. These expectations come from the footage I see and my understanding that the Wii U GPU is more modern (full dx10 feature set) and the console has twice the amount of RAM available for games. However, claiming there is, or expecting there will be, an IQ difference is based on literally nothing. The games don't show it, and from what I understand it greatly depends on bandwidth, where the Wii U doesn't have a real advantage over PS360 hardware. Since PS360 image quality didn't increase over the years as developers got more experienced with the hardware, it's safe to say that we won't see a positive evolution regarding Wii U IQ either. People should expect 720p/30fps to remain the sweetspot IQ for the majority of Wii U games since the Wii U hardware simply doesn't have the specs to double the framerate or the rendering resolution. The Wii U is designed to be a 720p/30fps machine. If it wasn't, you already would have noticed.

Also, Krizzx, you claim you post in this thread to understand the Wii U GPU. I don't think that's true. The only reason you are here is because you want to tell a story. A story that the Wii U GPU is much more powerful than it really is. It's dogmatic and thus has nothing to do with learning anything new.

Facts!! I have stated the facts and I try to be as objective as possible.
Facts:
- You are referring to all the launch ports built around incomplete tools and limited budgets. Wow, great argument!! (this is a fact based on declarations by Criterion)
- I am getting tired of writing that all the coming games are 720p60, that we even have 60fps videos and that they look amazing. So I am just listing two, Bayo 2 and MK8. Bayo 2 is a step above Bayo 1 in all aspects and it is running at full 60fps without any dips, and it is a game that I like more for comparisons as it is somewhat representative of X360/PS3 libraries.
- Shinen said they did not optimize their code for Nano Assault.
-Sales of the Wii U are limiting the resources 3rd parties set for Wii U ports (fact, this is the main reason for missing content or features on multiple titles)

So please don't accuse me of not stating facts while you purposely share only certain info to accommodate your line of thinking while omitting other info that just does not support it.

IMO the Wii U is built around 720p, so we will see 720p games with better effects, shaders and lighting @60fps if developers take the time. 1080p would mean forcing things and getting fewer bells and whistles, so I prefer going the 720p way; even Shin'en went this route with Nano. I am really excited to see what this team pulls off with their next titles.
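(For context on the 720p-vs-1080p trade-off being argued here, the raw pixel budgets are easy to compute. This is just back-of-the-envelope arithmetic on resolution and framerate; it ignores per-pixel shading cost, bandwidth and overdraw.)

```python
# Raw pixel throughput for the resolutions/framerates under discussion,
# including the sub-HD 880x720 figure quoted earlier for CoD.
resolutions = {
    "880x720 (sub-HD, e.g. CoD)": 880 * 720,    # 633,600 px
    "1280x720 (720p)":            1280 * 720,   # 921,600 px
    "1920x1080 (1080p)":          1920 * 1080,  # 2,073,600 px
}
for name, pixels in resolutions.items():
    for fps in (30, 60):
        print(f"{name} @ {fps}fps: {pixels * fps / 1e6:7.1f} Mpixels/s")

# 1080p has 2.25x the pixels of 720p, so 1080p60 costs ~4.5x the raw pixel
# throughput of 720p30 -- one reason "720p with more effects" is the cheaper
# place to spend a fixed GPU budget.
print("1080p / 720p pixel ratio:", (1920 * 1080) / (1280 * 720))  # 2.25
```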
 
Well, it is a technical thread, so technical details are apropos. I appreciate your explanation, but it would be nice to see technical documentation that reaffirms the assertions you made. With some light Googling, it's proving difficult to find any articles that compare the two. In addition, I've never heard much about efficiency issues with GDDR5, nor of it struggling with small data pieces.

It's nice to know why you believe DDR3 is better, but I'd like to see the receipts on the technical details, if for no other reason than my own edification. I know little about games programming since the kind of programming I do doesn't concern itself with base hardware.

Well, the latency of GDDR5 is somewhat higher, although it's not as huge a difference as some make it out to be. Generally, as RAM has gotten faster, the latency has gotten higher. GDDR5's main benefit is its bandwidth - that it can be clocked higher and that the chips can operate on a 32-bit I/O. So to make another general statement, DDR3 will probably prove more useful for low-bandwidth, random CPU lookups, while in graphics workloads GDDR5 definitely takes the prize.
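(The bandwidth side of that statement is just the standard peak-bandwidth formula: effective data rate per pin times bus width. The part configurations below are illustrative assumptions picked to show the gap, not the actual memory setup of any of the consoles being discussed.)

```python
# Standard peak-bandwidth arithmetic: GB/s = data rate (GT/s per pin) *
# bus width (bits) / 8. Example configurations are illustrative assumptions.
def peak_bandwidth_gb_s(data_rate_gtps, bus_width_bits):
    """Theoretical peak bandwidth in GB/s."""
    return data_rate_gtps * bus_width_bits / 8.0

# A hypothetical DDR3-1600 64-bit channel: 1.6 GT/s * 64 / 8 = 12.8 GB/s
print(peak_bandwidth_gb_s(1.6, 64))

# A hypothetical 5.5 GT/s GDDR5 setup on a 256-bit bus: 5.5 * 256 / 8 = 176 GB/s
print(peak_bandwidth_gb_s(5.5, 256))
```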
 
As far as I know, the main reason why high latencies are observed on PC graphics cards' memory is that the memory controllers are designed to combine several accesses into a larger batch, trading off latency for throughput. That's because for GPUs bandwidth is more important. This is probably not an issue for the PS4, which has a separate bus for CPU accesses. Latencies might still be a bit higher than those of DDR3, but I don't think it will hurt the CPU performance much.
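(If anyone wants to sanity-check latency claims like these, the conversion from a datasheet CAS latency to wall-clock time is just cycles divided by the command clock. The CL and clock values plugged in below are placeholder examples to show the conversion, not measured figures for either console's memory.)

```python
# Wall-clock CAS latency: latency_ns = CL (command-clock cycles) / clock (GHz).
# The figures below are placeholder examples, not numbers for specific parts.
def cas_latency_ns(cl_cycles, command_clock_ghz):
    return cl_cycles / command_clock_ghz

# e.g. a common DDR3-1600 CL11 part (800 MHz command clock):
print(cas_latency_ns(11, 0.8))     # 13.75 ns

# A hypothetical GDDR5 part with a higher CL but a faster command clock
# ends up in the same ballpark, slightly higher:
print(cas_latency_ns(20, 1.375))   # ~14.5 ns
```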
 

joesiv

Member
Idk guys I can tell it's not as good by some of the pixels and I've seen quite a few games in my day.
IMO, in games that use FXAA when the game also isn't rendering at a native resolution to start with, the blur is horrendous. But just using FXAA in general isn't too bad, better than no AA.


- Shinen said they did not optimize their code for Nano Assault.
To be clear... I thought he said that he didn't optimize his shaders, not that he didn't optimize his code. There's a big difference, but I might have missed the part where he wasn't talking specifically about shaders and was talking about the engine code...
 

krizzx

Junior Member
Shin'en only used one core of the CPU, they did not use the DSP, they didn't do any optimization to shaders, and they did not manage the memory at all because they never hit its peak, and they had the game running at 1080p 60fps (they lowered it to 720p and added a lot of post-fx, because they said 1080p gave no distinguishable visual improvement over 720p). The general tone was that they didn't run into any problems and thus didn't need to worry about optimization.
 
Shin'en only used one core of the CPU, they did not use the DSP, they didn't do any optimization to shaders, and they did not manage the memory at all because they never hit its peak, and they had the game running at 1080p 60fps (they lowered it to 720p and added a lot of post-fx, because they said 1080p gave no distinguishable visual improvement over 720p). The general tone was that they didn't run into any problems and thus didn't need to worry about optimization.

Forgot they used a forward renderer and switched to a deferred renderer for their next game.
 

krizzx

Junior Member
Oh, found some live gameplay footage of Bayonetta 2. Wish I could find one that isn't YouTube. I want to see it in 60fps.

http://www.youtube.com/watch?v=i1kgJHL8Ns8#t=11

Though, I think we can now say clearly that the model in debate is not a cutscene. It goes straight into combat with her. Them 130k polygons.

Even though YouTube caps videos at 30 FPS, it still looks extremely fluid, and the buildings cast shadows, which is something you couldn't see in the trailer. The water is reflective as well.
 
Oh, found some live gameplay footage of Bayonetta 2. Wish I could find one that isn't YouTube. I want to see in 60fps.

http://www.youtube.com/watch?v=i1kgJHL8Ns8#t=11

Though, I think we can now say clearly that the model of debate is not a cutscene. Its goes straight into combat with her.

Them 130k polygons.

Dude, Gamersyde has had two since E3: an 11-minute-long 60fps video of the full demo and a shorter one, very worthy of a download. And it makes a whole lot of difference.
 

krizzx

Junior Member
Dude, Gamersyde has had two since E3: an 11-minute-long 60fps video of the full demo and a shorter one, very worthy of a download. And it makes a whole lot of difference.

What? I didn't know about these. I had no internet access for the month of June though. You got any links?
 
What? I didn't know about these. I had no internet access for the month of June though. You got any links?

Sure. They are off-screen, but great nonetheless.

Full demo 11 minute:
http://www.gamersyde.com/news_e3_bayonetta_2_full_demo_video-14250_en.html

Shorter gameplay:
http://www.gamersyde.com/news_e3_bayonetta_2_gameplay-14228_en.html

When I saw them it made me investigate the performance and look at screens from Bayo1, also finding out that Bayo 1 is believed to be 60fps on x360 while really it is 40-45fps. The Bayo 2 demo is believed to be full 60fps without drops going by all the impressions I read, you will have to believe me there.
 
Sure. They are off screen though but great nonetheless..

Full demo 11 minute:
http://www.gamersyde.com/news_e3_bayonetta_2_full_demo_video-14250_en.html

Shorter gameplay:
http://www.gamersyde.com/news_e3_bayonetta_2_gameplay-14228_en.html

When I saw them it made me investigate the performance and look at screens from Bayo1, also finding out that Bayo 1 is believed to be 60fps on x360 while really it is 40-45fps. The Bayo 2 demo is believed to be full 60fps without drops going by all the impressions I read, you will have to believe me there.

Actually, the Digital Foundry analysis of the Xbox 360 version shows that the game ran mostly between 50 and 55 FPS with the hardest dips down to around 40.
 
Oh, found some live gameplay footage of Bayonetta 2. Wish I could find one that isn't YouTube. I want to see in 60fps.

http://www.youtube.com/watch?v=i1kgJHL8Ns8#t=11

Though, I think we can now say clearly that the model of debate is not a cutscene. Its goes straight into combat with her. Them 130k polygons.

Even though youtube caps videos at 30 FPS, it still look extremely fluid and the building cast shadows which is something you couldn't see in the trailer. The water is reflective as well.

Did the jet take them under an overpass? I don't remember seeing that in the E3 demo.
 

TheD

The Detective
And here we have another. I've never made any testament to how powerful the Wii U "is". I've only made suggestions as to whats it appears to be capable of and how powerful it "could be" based on these details. This is what being objective is. I"ve never made any absolute claims without providing visual evidence, relevent technical documentation or direct commentary from directly related professionals to support it. I've never even attempted to claim the Wii U is some magical god machine the way you are trying to make my posts out to be. You seem to be one of the people who view anyone who makes positive statement about the hardware a fanboy.

Stop harassing me. This is the last time I'm saying this. I'm not the topic of this thread. Your bitterness at what I post does not dictate or define what i post and if you continue to derail this thread with personal attacks I will let the mods handle it from here on out.

Yes you have!
You are not in any way objective!
You have made tons of claims about things like:
Alright, we got some more Sonic fodder to analyze.

A video and a ton of screens.

http://gamingtrend.com/2013/08/22/sonic-dark-world-screens-and-video/



I can definitely say beyond a shadow of a doubt that the Wii U supersedes the PS3/360 in texture quality. This is one area where there is a tremendous leap from the last gen.

The polygon count in Sonic is also clearly a step up from Generations again. I believe we have more than enough evidence overall at this point in time to conclude that Latte has far superior polygon-drawing capability. Not just a small bump. Perhaps we should take another look at the dual graphics engine possibility.

None of the screen shots you posted provide evidence of WiiU being better, yet you claim without a shadow of a doubt that it is better!

You have attacked people for daring to post a TF2 screenshot that points out that there is nothing special about the lighting!

STOP POSTING BULLSHIT IF YOU DO NOT WANT TO BE CALLED OUT ON IT!
 
Yes you have!
You are not in any way objective! JUST ABOUT ALL YOUR POSTS ARE WIIU FANBOYISM!

STOP POSTING BULLSHIT IF YOU DO NOT WANT TO BE CALLED OUT ON IT!

From my point of view, all you post here is 'bullshit' that is inflammatory or combative with krizzx. I mean, that's pretty much all you do. The guy just wants to ask questions and have a discussion based on those questions. You're just coming off as an angry hater.
 

TheD

The Detective
From my point of view, all you post here is 'bullshit' that is inflammatory or combative with krizzx. I mean, that's pretty much all you do. The guy just wants to ask questions and have discussion based on those questions. You're just coming off as an angry hater.

How dare you attack me for calling out a huge fanboy that keeps on shitting up this thread!

He does not ask questions at all!
He makes unfounded statements and attacks anyone that disagrees!
 