Yeah, but it will be interesting to see what Microsoft's first party games and Nintendo's first party games will look like on the Xbox One and Wii U at E3 2013.
Nintendo doesn't focus on graphics.
Doesn't matter; Mario Galaxy looked impressive on the Wii, and Metroid Prime, Wind Waker and more looked very impressive on GameCube.
Nintendo doesn't need to focus on graphics to have great looking games.
I don't think we have enough information to dismiss any theory. Fourth Storm put a lot of hard work into his analysis and no one has refuted it yet. We have alternate theories, but I don't think we'll know whose theory pans out until we see what Nintendo has to offer at E3. It should tell us immediately whether we can expect something beyond what the PS3 and 360 have already shown us. More importantly, the gameplay needs to make the GamePad look unique and novel. Not every game necessarily, but at least a couple need to be appealing enough that it makes you wonder why no one thought of that before.
It's impossible to directly compare two different architectures.
What Nintendo has to offer has no bearing on the physical components of the hardware, though I fear people will try to draw such conclusions. As I have said many times, Nintendo never tries to max out their hardware. Nothing they release will show the full potential of the console.
Are you saying the Galaxy games, DKCR or Skyward Sword do not max out the Wii?
"10x? LOL in what world? No man."
Probably needs a lot of power to handle 1080p. So I would put it near 4-5x better when all is said and done.
"probably around 8-10x in real terms (not paper numbers)"
Yes, because marketing departments rarely embellish useless numeric modifiers.
Microsoft themselves quantified the overall Durango as around 8x as powerful as the 360 in that Wired article. Since the Wii U is roughly equal to a 360 so far, imo...
Probably needs a lot of power to handle 1080p. So I would put it near 4-5x better when all is said and done.
That's still a huge leap over Wii U.
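The "1080p needs a lot of power" point can be put in rough numbers. This is only a back-of-the-envelope pixel-count comparison; real performance depends on much more than raw pixel throughput (shaders, bandwidth, CPU), so treat it as a lower bound on the extra work, not a performance ratio.

```python
# Rough pixel-count ratios between common output resolutions.
# Raw fill-rate arithmetic only; not a measure of overall performance.
resolutions = {
    "480p": 640 * 480,     # 307,200 pixels
    "720p": 1280 * 720,    # 921,600 pixels
    "1080p": 1920 * 1080,  # 2,073,600 pixels
}

for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels, {pixels / resolutions['480p']:.2f}x 480p")

# 1080p pushes 2.25x the pixels of 720p and 6.75x the pixels of 480p,
# which is why a straight resolution bump alone eats a lot of GPU headroom.
print(resolutions["1080p"] / resolutions["720p"])  # 2.25
```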
Mario Galaxy ran at 60 FPS with no load times and no performance hiccups. That alone lets you know that it's not even remotely pushing the system. Heck, a person hacked the Wii version and got it to run split screen with Luigi as a second character, and there was still no sign of frame drop. Nintendo always plays it safe.
http://www.youtube.com/watch?v=Pqef07U0LNs
That doesn't mean it doesn't push the system.
"If you can name a single Wii game that pushed more than 20 million polygons at 60 FPS with bump mapping, normal mapping, dynamic lighting, bloom and self shadowing (Rebel Strike did this on the GC), then you may be looking at a game that came close to maxing the Wii. Just finding a Wii game that did normal mapping is a rarity. Few devs attempted to get all the console had to offer and none of them really succeeded. Factor 5 probably would have, but they went bankrupt mid-production of Lair for the Wii, which I think they said was going to run at 720p on the console (the Wii is capable of outputting in HD, in case you weren't aware)."
Normal mapping is overrated on a system like the Wii; you have loads of theoretical polygons for an SD console, so you might as well use them instead of faking them (Metroid Prime 3, for instance, as opposed to The Conduit doing normal maps in those enclosed rooms). Talk about diminishing gains.
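For scale on that 20 million figure: assuming it means polygons per second (the usual way Rogue Squadron-era numbers were quoted, though the post doesn't say), the implied per-frame budget at 60 FPS is easy to work out. A rough sketch under that assumption, not a measured figure:

```python
# Per-frame polygon budget implied by a sustained polys-per-second figure.
# Assumes the quoted 20 million is per second, which the original post
# does not state explicitly.
polys_per_second = 20_000_000
fps = 60

polys_per_frame = polys_per_second // fps
print(polys_per_frame)  # 333333 polygons per frame at 60 FPS
```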
I doubt they came close and how did DKCR and Skyward Sword get on that list? They no more maxed out the Wii than Mario Sunshine maxed out the GameCube or Mario 64 maxed out the N64. Power and performance have never been Nintendo's focus even in games. They are all about art and creativity.
"Lair" would never run at 720p on the Wii, lol; for starters framebuffer memory is too small (2 MB) it's small for 640x480+z-buffer hence the chronic dithering and almost no one using AA settings (720p would place that under an even bigger strain/require extra RAM from the main 24 MB 1T-SRAM bank); nor would it be called Lair, because that's a Sony trademark; just like Shadows of the Eternals isn't Eternal Darkness 2 because Nintendo owns that. Also bare in mind Factor 5 were trying to attract investment, they'd say anything at that point. Yes, they were capable of pulling something standing appart graphically, but that's not the only way to use your resources.
"IIRC they wanted to 'bypass' the 2MB framebuffer and use the 24MB main RAM instead to enable 720p."
The problem with Wii's inability to go above 480p had nothing to do with the size of the embedded framebuffer.
"IIRC they wanted to 'bypass' the 2MB framebuffer and use the 24MB main RAM instead to enable 720p."
Even if they could, what's the point?
"The problem with Wii's inability to go above 480p had nothing to do with the size of the embedded framebuffer."
That's interesting, thanks for the insight.
The embedded framebuffer (eFB) is not what the Cube/Wii show on your TV - the eFB is a back buffer. What you see on the TV is the primary framebuffer (xFB), which, lo and behold, resides in MEM1 (24MB). You can use the eFB as a tile buffer and build up an xFB of much larger resolution, not unlike how the 360 would build up its primary buffer while using its scarce 10MB of back buffer. The problem on the Wii was in the VI (video interface) - the TV signal encoder, which simply could not do > 480p.
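The size constraint is easy to sanity-check. A rough sketch, assuming 24-bit color plus 24-bit Z per pixel (one common eFB configuration; exact formats and eFB capacity vary slightly):

```python
import math

EFB_BYTES = 2 * 1024 * 1024   # ~2 MB embedded framebuffer (approximate)
BYTES_PER_PIXEL = 3 + 3       # 24-bit color + 24-bit Z (assumed format)

def fb_bytes(width, height):
    # Total back-buffer footprint for one full-resolution render target.
    return width * height * BYTES_PER_PIXEL

sd = fb_bytes(640, 480)    # 1,843,200 bytes -> fits in the eFB
hd = fb_bytes(1280, 720)   # 5,529,600 bytes -> does not fit
tiles = math.ceil(hd / EFB_BYTES)  # eFB-sized tiles to build a 720p xFB

print(sd <= EFB_BYTES)  # True
print(hd <= EFB_BYTES)  # False
print(tiles)            # 3
```

So a 480p buffer fits in one pass, while a 720p xFB would take roughly three eFB-sized tiling passes, which matches the tile-buffer idea above.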
"Microsoft's marketing department just confirmed they embellish useless numeric modifiers 3-9x less often than Sony doesn't."
The science is irrefutable, counselor!
"The dudes/reports saying they somehow tested 720p were talking out of their asses?"
Maybe they tested render target resolutions of 720p, which I've never tried, but again, it should not be impossible from what I recall. How they'd manage to take that to the TV is entirely beyond my knowledge though, and I've actually experimented with that, unsuccessfully.
What??? That makes no sense at all. That is not even possible. Where did you get that from?
Didn't Marcan state a while ago that the eDRAM on Latte is actually not eDRAM at all, but still 1T-SRAM? 1T-SRAM is often marketed as eDRAM, because that's technically what it is, but it has slightly lower density yet is quite a bit faster (lower, SRAM like latency) at the same clock speed and bus width.
"i'm pretty sure he's mistaken"
I don't know. He wrote that he's seen Renesas 1T-SRAM, and what's on Latte looks just like it. Also, I'm not sure it would be possible to run Wii software if the memory timing isn't exactly like it should be. And it's impossible to get the exact same timing using regular eDRAM. It needs to be either real SRAM or some form of pseudostatic RAM with no or completely hidden refresh cycles.
It doesn't really have to be MoSys 1T-SRAM. Fujitsu FCRAM or Numonyx PSRAM are pretty much exactly the same thing as far as I know.
"oh ok i thought mosys were the only makers of 1tsram, maybe then"
I just checked, and Renesas apparently offers both regular and pseudostatic eDRAM.
"so what could this do to our guesses at edram bandwidth?"
This isn't about bandwidth, it's about latency.
Based on the information available now, XB1 can draw up to 100W. If I remember correctly, Wii U can draw up to 75W (this is not the real usage). So I think it is quite safe to say that 8x-10x more power on XB1 is a little bit out of the question, even compared to the real wattage: 33W, and assuming XB1 will come close to 100W (in an ideal world). Isn't it?
"so it wont make any difference? we still looking at 70 or 140 then?"
I know there is a difference, and it's big enough for pseudostatic RAM to be a thing, but I don't know how big the difference actually is. Nintendo apparently loves the stuff though; they've used it in every console since the GameCube and every handheld since at least the DSi (the DS supposedly used real SRAM).
how much better for latency is it?
I think the SoC is 40nm too, which definitely puts a stop to some ideas about what 33watts can and can't do, as we know that XBone is 1.2TFLOPs.
http://www.neogaf.com/forum/showthread.php?t=560464 according to wired it's 40nm.
Those are very different scenarios.
Mario 64 was a launch title, on a generation where Nintendo hardware was hard to develop for, with lots of trial and error involved too, seeing as it was the first acclaimed "full" 3D game.
Mario Sunshine was not a launch title; it already built on Nintendo's experience with, say... Pikmin.
And I agree Skyward Sword doesn't push the system, but TP did, for the GC. It's a matter of ambition: SS is more satisfied in being confined, so it does less; it's not trying to prove anything, so it's not pushing the envelope in that sense.
Sure, they had overhead left (and optimizations possible), but they're using the hardware in a very thoughtful manner.
And AFAIK Factor 5 didn't use it.
"Lair" would never run at 720p on the Wii, lol; for starters framebuffer memory is too small (2 MB) it's small for 640x480+z-buffer hence the chronic dithering and almost no one using AA settings (720p would place that under an even bigger strain/require extra RAM from the main 24 MB 1T-SRAM bank); nor would it be called Lair, because that's a Sony trademark; just like Shadows of the Eternals isn't Eternal Darkness 2 because Nintendo owns that. Also bare in mind Factor 5 were trying to attract investment, they'd say anything at that point. Yes, they were capable of pulling something standing appart graphically, but that's not the only way to use your resources.
That's like saying Majora's Mask on the N64 didn't properly use the extra RAM just because, instead of extra resolution and ultra-detailed textures, it used it to track characters and the like. The Rogue Squadron games had very little game logic going on, and no complex physics either; lots of TEV manipulation is fine, but most can think of better-rounded methods of using said hardware in a more balanced manner.
Any game that runs at 60 FPS could get double the performance at 30. If you never hit an instance of slowdown, then that means you were nowhere near peaking the limits of the hardware.
"That's a completely nonsensical metric. Completely maxing out a console would be stupid. No one wants to play a game at some ridiculously low framerate. Maxing should be looked at in terms of how optimized it is for a particular framerate (30 or 60), not at any arbitrary rate, because devs specifically target those two."
When did 30 FPS become "ridiculously low"? I guess that means Mario Sunshine was unplayable and Killzone 4 will be.
Also, you are missing the entirety of what I'm saying. The main point was that when Mario Galaxy achieved 60 FPS, it did it with a ton of leeway. It didn't even try to "push" the system. People are letting their opinion of the game itself prevent them from seeing it on a technical level.
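The 60-vs-30 FPS headroom claim is really about frame-time budget. A quick sketch; it assumes the per-frame work scales with available frame time, which is a simplification (CPU-bound or bandwidth-bound games don't scale so cleanly):

```python
# Frame-time budgets at the two framerates devs typically target.
frame_ms_60 = 1000 / 60  # ~16.7 ms per frame at 60 FPS
frame_ms_30 = 1000 / 30  # ~33.3 ms per frame at 30 FPS

# Dropping from 60 to 30 FPS doubles the time available per frame,
# which is the sense in which a locked-60 game "leaves headroom".
print(round(frame_ms_30 / frame_ms_60, 1))  # 2.0
```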
Isn't the 100w just for the SoC though? I'm on my phone so can't easily find where I read it yesterday.
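The wattage argument above is just ratio arithmetic, and it's worth being explicit that power draw does not translate linearly into performance (process node, architecture and efficiency all differ), so at best it loosely bounds the claim. Using the figures quoted in this thread (not independently verified):

```python
# Ratios between the power-draw figures quoted in this thread.
# NOTE: watts are not performance; different process nodes and
# architectures get very different work done per watt.
xb1_max_w = 100   # claimed XB1 maximum draw
wiiu_max_w = 75   # claimed Wii U rated maximum
wiiu_real_w = 33  # claimed Wii U measured draw

print(round(xb1_max_w / wiiu_max_w, 2))   # 1.33x on rated maximums
print(round(xb1_max_w / wiiu_real_w, 2))  # 3.03x against measured draw
```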
I wonder if Watch Dogs will show the difference, or possibly Ghosts.
Are the architectures really that different? Both are using eDRAM to make up for their slow main RAM pools.
8X? HAHAHAHA no. 4-5X at MOST.