
Rumor: Wii U final specs

Not that it's necessarily comparable, but what are the power draws for the 360 or PS3 when playing demanding games?

They've changed a lot as those consoles have gone through revisions. Off the top of my head, ~200 watts at launch; more like 100 watts or less now.

 
That it's capable of more isn't in question (Arkam has already alluded to more).

The question is what.

The part I quoted seemed to be all unconfirmed speculation. He's saying it's not enhanced Broadway cores (it may be; imo it likely is), that the GPU is 3-4x PS360 (again, VERY optimistic and not shown in any games yet), that the GPU is DX11 (again, imo it's going to end up a DX10 GPU), etc.
 

The_Lump

Banned
Then you have never seen Banjo-Kazooie: Nuts & Bolts

[Banjo-Kazooie: Nuts & Bolts screenshots]


Yep, that's not NL's best shot tbh. This one's better (although it's a rubbish off-screen shot).

[off-screen Nintendo Land screenshot]


But either way, it's beside the point. It's so subjective. Some people can't see the improvements in lighting; some people think it looks the same as what they've seen before.

There are many effects which can improve the overall IQ of a game that a PS360 cannot do, and taken individually they might not be much. But overall they can be a huge difference maker.

Personally, I believe there are a lot of lighting effects on show in NL, and a couple in ZU, which aren't possible on current-gen consoles (ambient occlusion, true DOF), but that doesn't mean other parts of the games' visuals are patently lacking or don't lend themselves to showing off graphical oomph.

Remember, we're not comparing artistic direction here. You can't say "NL looks better than Nuts & Bolts" as you're comparing two very different art styles. Imo you can say, however, that some of the lighting in NL is a step up.
 

Oblivion

Fetishing muscular manly men in skintight hosery
No. Not for a console releasing almost 7 years later.

Didn't think so.




Got a general question for you techie folks. It's something that's puzzled me for years, but I never actually bothered asking: why is the jump in clock speeds for the GPUs in the 360/PS3, compared to the GC/Xbox, not as great as the jump in clock speeds for the CPUs?

Xbox GPU: 250 MHz
Xenos: 500 MHz

Xbox CPU: 733 MHz
Xenon: triple-core 3.2 GHz


Now, before anyone jumps on me: I get that clock speeds by themselves don't mean much, and that other things like architecture, new feature sets and all that other stuff may be more important. I get that. But the same could be said about the CPUs. They got new architectures, new feature sets, etc. But their clock speeds are significantly higher.

Is there a reason for this?
 

SmokyDave

Member
Honestly, I'm really not seeing the 'wow' in either Nintendoland shot.

I want to see something that makes me question whether it's a bullshot or not.
 
Why is the jump in clock speeds for the GPUs in the 360/PS3, compared to the GC/Xbox, not as great as the jump in clock speeds for the CPUs? [...] Is there a reason for this?

Not a techie, but... um, I don't really know. Just that GPU speeds were increasing more slowly than CPU speeds back then? Plus, they went with some REALLY high-clock, low-IPC CPUs... here, 7 years later, we're still barely pushing the upper 3 GHz range on the fastest PC CPUs.

For that matter, it'll probably be the reverse this time. PS4's GPU is rumored at 800 MHz, while I wouldn't expect the CPUs to be >3.2 GHz (I'd expect them to be under, in fact; possibly something like 2.4 GHz Jaguar cores).

BTW, the CPUs in the 360 and PS3 are widely regarded as garbage (yeah, understanding you can do some things with Cell's SPUs). They're in-order, which makes them very difficult to program for, and that's pretty uncommon anywhere else in the CPU world.

A 2.4 GHz Jaguar will probably be 2-3x a Xenon core on a per-core basis, according to someone on B3D, and we could be looking at 8 of them in Durango. And Jaguar cores would be among the puniest current CPU cores available...

The clock speeds are gaudy, but in general the 360/PS3 CPU architecture is trash.
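
Putting rough numbers on that estimate (a quick back-of-the-envelope sketch in Python; the core count and per-core ratio are just the rumored figures above, not confirmed specs):

```python
# Hypothetical aggregate CPU throughput from the rumored numbers above:
# 8 Jaguar cores vs. Xenon's 3, at ~2.5x Xenon per core (midpoint of the
# "2-3x" B3D estimate). Purely illustrative.

XENON_CORES = 3        # Xbox 360: three in-order cores @ 3.2 GHz
JAGUAR_CORES = 8       # rumored Durango configuration
PER_CORE_RATIO = 2.5   # assumed Jaguar-vs-Xenon per-core speedup

aggregate = (JAGUAR_CORES * PER_CORE_RATIO) / XENON_CORES
print(f"~{aggregate:.1f}x Xenon overall")  # -> ~6.7x
```

So even "puny" cores can add up to several times the old CPU, if the rumored count holds.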
 

FyreWulff

Member
Plus, they went with some REALLY high-clock, low-IPC CPUs... here, 7 years later, we're still barely pushing the upper 3 GHz range on the fastest PC CPUs. [...]

Yeah, CPUs have hit a GHz wall, hence the take-off of multi-core.
 
Why is the jump in clock speeds for the GPUs in the 360/PS3, compared to the GC/Xbox, not as great as the jump in clock speeds for the CPUs? [...] Is there a reason for this?


It's my understanding that the actual clock speeds of CPUs aren't really that high. They're in the hundreds of MHz like GPUs, but there's a multiplier on-chip that gives you those "3.2 GHz". GPUs don't work that way.

I'm sure someone more tech-savvy can explain this better.
 
Why is the jump in clock speeds for the GPUs in the 360/PS3, compared to the GC/Xbox, not as great as the jump in clock speeds for the CPUs? [...] Is there a reason for this?

Mainly because GPUs are much more focused on parallel execution. So, instead of doubling the clock rates, you can double the number of shader units/ROPs/etc. to achieve the same doubling of performance. That's often more efficient than a clock speed increase.

Also, the Xbox 1 CPU had OoOE and higher IPC than Xenon cores.
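
A toy calculation makes the trade-off concrete (Python sketch; the 10 flops/cycle figure is the commonly cited vec4+scalar MADD rate for Xenos, used here only as an assumption):

```python
# Theoretical shader throughput: doubling the ALU count buys exactly the
# same peak as doubling the clock, which is why GPUs grow wide, not fast.

def peak_gflops(alus, clock_mhz, flops_per_alu_per_cycle=10):
    # 10 flops/cycle ~ vec4 + scalar MADD, the figure usually quoted for Xenos
    return alus * flops_per_alu_per_cycle * clock_mhz / 1000.0

print(peak_gflops(48, 500))                           # ~240 GFLOPS (Xenos-like)
print(peak_gflops(96, 500) == peak_gflops(48, 1000))  # True: same peak
```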
 

Shaheed79

dabbled in the jelly
Presumably that's in reference to lherre and Arkam, although I think the latter is an engineer rather than a developer?

If it is those two, then do you know which companies they work for, and what the two of them said, definitively, about the Wii U's final specs? I'm not talking about things like cosigning dev kit rumors, directly or indirectly, or making nondescript and vague generalizations that don't tell us anything of real relevance, like "enhanced Broadway". I'm talking about them actually saying something like "This is what I know the final Wii U specs to be", from having direct access to either the final dev kit or the retail unit, and then stating those specifications.

Knowing exactly which developer they work for is very important, simply because some developers are not privy to the same classified information, or to the most recent dev kits, that other studios Nintendo greatly trusted were. This is why we have seen conflicting reports between anonymous devs and named devs who were confirmed to be working closely with Nintendo.

I can imagine that the developers who were not trusted enough by Nintendo to have certain privileged information, and who in turn had to settle for older, less capable dev kits, weren't all that happy with Nintendo. Probably for good reason. More than likely, they didn't mind expressing their disappointment in the hardware they were forced to work with as an "anonymous" developer source. Developers are human, and it shouldn't be all that surprising that some of them may resort to such actions in order to deal with their individual frustration with Nintendo appearing to treat them as second-class developers.

It is because of this possible, even likely, scenario that I never gave much credence to anonymous developer sources, and neither should anyone else. So it becomes exceedingly relevant to know which company a source works for, and then to understand the relationship said company has with the console maker they are commenting on. This applies to PS4/720 as well.
 

The_Lump

Banned
Honestly, I'm really not seeing the 'wow' in either Nintendoland shot.

I want to see something that makes me question whether it's a bullshot or not.

There aren't any. And I don't think anyone was 'wow'ing. That's my point, really. Although it's obvious (to me) that there are lighting effects which are new, it's subtle and isn't going to make people shit the bed. Yet.

Err... why aren't people just using the screens from the press page?

Cos they suck and everyone's seen them :D I'll try and lift a proper capture from the Euro ND video. It had some nice new footage, and it's where I got that shot from.

People aren't reading back to see how the conversation got to this point. No one was trying to compare games; just admiring some nice new, subtle lighting effects is all.

And the fact it's an off-screen shot matters not. I was talking about the lighting and shadows - which are clearly visible.
 
It's my understanding that the actual clock speeds of CPUs aren't really that high. They're in the hundreds of MHz like GPUs, but there's a multiplier on-chip that gives you those "3.2 GHz". GPUs don't work that way.

I'm sure someone more tech-savvy can explain this better.

I think you are mixing that up with RAM clock speeds. They are indeed often specified with "effective" clock rates which are higher than the physical ones.
 
They've changed a lot as those consoles have gone through revisions. Off the top of my head, ~200 watts at launch; more like 100 watts or less now.

Could Nintendo have chosen a bigger box, thus enabling more power consumption without the cooling problem, and given us better performance for about the same price?
 
I think you are mixing that up with RAM clock speeds. They are indeed often specified with "effective" clock rates which are higher than the physical ones.

No. CPUs have what's called a PLL (phase-locked loop, http://www.sentex.ca/~mec1995/gadgets/pll/pll.html ) on the chip that multiplies the clock speed. This is (was?) a very expensive and complicated part of a chip, and it's probably the reason why GPUs don't have them; that, and the fact that it's probably not necessary given the actual GPU architecture.
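
Conceptually, it's just multiplication (a minimal Python sketch; the bus clock and multiplier are illustrative values, not any console's real ones):

```python
# The core clock is the external reference clock multiplied up on-chip by
# the PLL, e.g. a 200 MHz bus with a 16x multiplier gives 3.2 GHz.

BUS_MHZ = 200     # external reference clock (illustrative)
MULTIPLIER = 16   # on-chip PLL multiplier (illustrative)

core_ghz = BUS_MHZ * MULTIPLIER / 1000
print(f"{core_ghz:.1f} GHz core clock")  # -> 3.2 GHz
```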
 
Lighting on the U is clearly a step up from last gen - if you can't see it or won't admit it, that's your problem. On mobile, so I can't post it, but there's a 7-minute off-screen Pikmin Adventure video posted on NintendoWorldReport's YT channel which is a pretty nice example and has lots going on; someone might want to post it.
 
No. CPUs have what's called a PLL (phase-locked loop, http://www.sentex.ca/~mec1995/gadgets/pll/pll.html ) on the chip that multiplies the clock speed. This is (was?) a very expensive and complicated part of a chip, and it's probably the reason why GPUs don't have them; that, and the fact that it's probably not necessary given the actual GPU architecture.

The PLL generates/transforms the clock speed for the CPU, but that doesn't make it any less real. If a CPU is specified with 3 GHz, (most of) the pipeline is truly physically clocked at that 3 GHz.
 

v1oz

Member
Let me make something very clear. The majority of GPUs/CPUs are what can be defined as "enhanced" versions of the GPU/CPU from their previous iterations.
Broadway was an enhanced version of Gekko. Truth be told, we didn't see much improvement going from Gekko to Broadway. Remember, Gekko is based on a variant of the PPC we saw in G3 Apple Macs nearly 15 years ago. It's almost like someone tried to bring back an enhanced version of the 68000 and keep that architecture going.
 
Can people praising that Nintendoland screenshot at least say what aspects they find so amazing rather than just putting out one liners? I'm struggling to see what is so amazing about it, I'm guessing it being really bright is making people mistake that for some kind of lighting improvement.
 

Neff

Member

I was struck by this image from the video also. The lighting seems extremely natural and complex compared to PS360. I'm very much looking forward to seeing the hardware perform first hand.
 

The_Lump

Banned
Broadway was an enhanced version of Gekko. Truth be told, we didn't see much improvement going from Gekko to Broadway. Remember, Gekko is based on a variant of the PPC we saw in G3 Apple Macs nearly 15 years ago. It's almost like someone tried to bring back an enhanced version of the 68000 and keep that architecture going.

Which is why it's patently untrue. It's impossible to add OoOE and SMT, clock it higher than 1 GHz, and still call it a Broadway.

Can people praising that Nintendoland screenshot at least say what aspects they find so amazing rather than just putting out one liners? I'm struggling to see what is so amazing about it, I'm guessing it being really bright is making people mistake that for some kind of lighting improvement.


Read the last 3 pages dude. If you don't agree, no problem. But read back and see what people have said.
 

v1oz

Member
For example, this?

[Nintendo Land: Mario Chase screenshot]


I haven't seen one PS3/360 game that looks just as good lighting-wise, and I think the textures and detail are great.
There is hardly any geometry there. The models are so basic with texture maps for eyes and facial features. Even the iPhone could render that same exact scene.
 

Shaheed79

dabbled in the jelly
The lighting in most PS3 and 360 games actually wasn't all that great. The games with the best lighting usually had to sacrifice in other areas, like rendering resolution, texture quality, AA or something else.

Anyone who got to play the superior PC versions of many 360/PS3 games should have immediately noticed the large difference in the lighting models used in each build. UE3 in particular had rather simplified lighting models, but instead used several tricks to give off the effect of respectable lighting.

There is hardly any geometry there. The models are so basic with texture maps for eyes and facial features. Even the iPhone could render that same exact scene.
Everything but the lighting model, yes it could. I'm not familiar with the iPhone 5 though, so maybe it can do the lighting as well. Realistically, people won't begin seeing what the Wii U is truly capable of until second-generation games that are built from the ground up for Wii U (or ported from PC to Wii U), and only from good developers.
 

Theonik

Member
Not very hard. Disney pays as little as possible themselves. If you notice, their movies are "Disney DVD" and "Disney Blu-ray"; they don't actually pay to use the DVD/BD trademarks.

You can make DVD/BD-compliant discs, but if you want that delicious trademarked logo on them or on your player, you have to pay the licensing fee for that trademark and meet all the requirements that come with it, i.e. supporting Java and all that stuff.
Errr, Disney is on the board of directors of the Blu-ray Disc Association... And it's not just licensing trademarks; that's part of what you get, but there are lots of technical things you get in relation to the format itself. They can't use the format without paying royalties, but they can license patents from companies in the association that have them and use a derivative format, like they did on the GameCube and Wii.
 

mrklaw

MrArseFace
I was struck by this image from the video also. The lighting seems extremely natural and complex compared to PS360. I'm very much looking forward to seeing the hardware perform first hand.

Same, it's the one that clearly stood out. Hope it's realtime :)

The textures give a nice feel of cloth/sack, and the lighting is nice and soft on the hats.
 

The_Lump

Banned
Same, it's the one that clearly stood out. Hope it's realtime :)

The textures give a nice feel of cloth/sack, and the lighting is nice and soft on the hats.

Yeah, it would be nice if someone could cut that part of the video out so others can see. (On my phone, so I can't.)
 
The PLL generates/transforms the clock speed for the CPU, but that doesn't make it any less real. If a CPU is specified with 3 GHz, (most of) the pipeline is truly physically clocked at that 3 GHz.

Oh I know; the other guy was just asking why CPU clock speeds kept going higher and higher and GPUs didn't, and the fact that GPUs don't have a PLL is why.
 

FyreWulff

Member
Errr, Disney is on the board of directors of the Blu-ray Disc Association... And it's not just licensing trademarks; that's part of what you get, but there are lots of technical things you get in relation to the format itself. They can't use the format without paying royalties, but they can license patents from companies in the association that have them and use a derivative format, like they did on the GameCube and Wii.

I'm aware. Their DVDs and Blu-rays still technically aren't fully spec-compliant (because they've done stuff to act as their own home-cooked protection scheme, amongst other things), because it's Disney. They do what they want.

edit: their DVD shenanigans were pointed out on the VideoLAN forums:

http://forum.videolan.org/viewtopic.php?f=2&t=85150

Examining the discs more closely, they report sizes like 80GB. The most likely explanation is a deliberately broken TOC with multiple references to the same physical files. In addition to the incorrect size, the broken TOC also causes any DVD player to report a huge array of titles on the disc... only a few of which actually work. This has the effect that the only way to be able to pick the correct titles on these DVDs is to navigate through the DVD menu.

Looking at dmesg (Linux system log) confirms that faulty sectors have been deliberately added to the disc at certain points. When playing, the OS tries re-reading them again and again, and eventually reports read errors. But while it is trying to re-read, no data flows from the disc to the player, which is why the freezes happen.

Obviously, the protection scheme is either blatantly breaking the DVD standard, or utilizing some loopholes in it. Apparently the idea is that if the player reads only those sectors the DVD "playlist" requires, it will skip the faulty sectors and work just fine. The problem is that computer-based players tend to have a linear read-ahead cache - possibly already at the OS level - which doesn't account for any jump commands in the "playlist". If the cache size is such that part of it would be filled from the faulty sectors when playing the end of the chapter... I'm sure all of us see the problem.

(The explanation may have technical inaccuracies, but I think the basic idea is correct.)

The only workaround is to image the whole disc to the hard drive, ignoring read errors, and then play the image file (which will not generate any further read errors). Unfortunately, because this involves breaking (ineffective) encryption, doing it is illegal in most countries these days.

The usual ultimate irony is, of course, that only legitimate customers like us have these problems. Not sure if that is funny anymore or just sad.
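
The mechanism described there is easy to model (a toy Python sketch; the sector numbers and cache size are made up):

```python
# A disc "playlist" that jumps over deliberately bad sectors plays fine,
# but a linear read-ahead cache ignores the jump commands, reads into the
# bad region, and stalls while the drive retries -- the freeze described.

BAD_SECTORS = {90, 91, 92}                            # deliberate defects
playlist = list(range(0, 90)) + list(range(93, 120))  # jumps over them

def read_sector(n):
    if n in BAD_SECTORS:
        raise IOError(f"unreadable sector {n}")
    return f"<data {n}>"

for n in playlist:                 # spec-following playback: no errors
    read_sector(n)

READ_AHEAD = 8                     # a PC player prefetching linearly
try:
    for n in range(85, 85 + READ_AHEAD):
        read_sector(n)             # blindly reads past the jump point
except IOError as err:
    print("playback freezes while the drive retries:", err)
```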
 

Matt

Member
Errr, Disney is on the board of directors of the Blu-ray Disc Association... And it's not just licensing trademarks; that's part of what you get, but there are lots of technical things you get in relation to the format itself. They can't use the format without paying royalties, but they can license patents from companies in the association that have them and use a derivative format, like they did on the GameCube and Wii.

I was going to say: Disney calls them "Disney Blu-ray" in order to keep the Disney branding at the forefront, but their discs also have the standard Blu-ray branding and logos on them.
 

Hoo-doo

Banned
Lighting on the U is clearly a step up from last gen - if you can't see it or won't admit it, that's your problem. On mobile, so I can't post it, but there's a 7-minute off-screen Pikmin Adventure video posted on NintendoWorldReport's YT channel which is a pretty nice example and has lots going on; someone might want to post it.

You realize off-screen footage will often make games look better than they really do, right?

Off-screen footage of GTA IV, for example, will make it look pretty great, but when you get direct-feed screenshots/video where you can actually see the aliasing, low-res textures, lack of draw distance and low geometry, it becomes a whole different thing.
 

Shaheed79

dabbled in the jelly
For example, just look at what Capcom has been able to do with the 3DS after becoming accustomed to the hardware: from their first-gen games like RE: Mercenaries and SSFIV, to their second-gen games like RE: Revelations and MH3G, to their third-gen EX Troopers (I'm going to give them a little more time before I judge MH4, since it appears to be a different engine designed for larger open environments). Capcom has been able to pull off visuals on the 3DS that many people thought were impossible based on its launch software, and EX Troopers/Revelations could even pass for decent-looking Vita games.

I think people are going to be surprised at just what the Wii U can actually do visually once the A grade devs become accustomed to the hardware.
 

Log4Girlz

Member
Why is the jump in clock speeds for the GPUs in the 360/PS3, compared to the GC/Xbox, not as great as the jump in clock speeds for the CPUs? [...] Is there a reason for this?

CPUs often have to deal with code which cannot be parallelized, which means that to get a remarkably faster CPU you had to dial up the MHz, so their architectures are optimized to run at very high clock speeds. GPUs, on the other hand, deal with work that is very easily parallelized, so their architectures are optimized for that... a GPU running at CPU clock speeds would be uncoolable as a result.
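
A toy illustration of that split (Python; the "work" functions are stand-ins, not real shader code):

```python
# Serial code is a dependency chain: each step needs the previous result,
# so only a faster clock helps. Per-pixel work is independent, so adding
# more units (cores / shader ALUs) helps directly.
from multiprocessing import Pool

def serial_chain(x, steps=100_000):
    for _ in range(steps):
        x = (x * 31 + 7) % 1_000_003   # each step depends on the last
    return x

def shade(pixel):
    return (pixel * 31 + 7) % 1_000_003  # independent per-pixel work

if __name__ == "__main__":
    print(serial_chain(1))             # can't be split across cores
    with Pool(4) as pool:              # 4 workers ~ 4 "shader units"
        print(sum(pool.map(shade, range(100_000))))
```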
 

majik13

Member
Honestly, I'm really not seeing the 'wow' in either Nintendoland shot.

I want to see something that makes me question whether it's a bullshot or not.

The shading and lighting on the cloth tunics look very natural; they feel soft. Though some of it may be due to the screengrab. Looks like there could be some GI going on, or some kind of environment bounce or reflection.

edit: also notice the little rip/tear in the hat of the blue Link.
Also, looking at some screens and videos, it doesn't look as nice. Oh well.
 

gofreak

GAF's Bob Woodward
The shading and lighting on the cloth tunics look very natural; they feel soft. Though some of it may be due to the screengrab. Looks like there could be some GI going on, or some kind of environment bounce or reflection.

I'm not seeing any first bounce approx in those shots.

I think I see an improved version of the bloom/'rim bloom' in some of the previous Mario games. I see reflections. I see some clean gouraud shading...or at least something that dampens the specular component, for a softer shading on cloth etc. But there's nothing technically stand-out about that. Nice textures. No soft shadowing.

I think it's a nice blend of well chosen assets and parameter choices, but technically I'm not sure I see what some others are claiming to see. Someone want to point it out?
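
For what "dampening the specular component" means in practice, here's a generic Blinn-Phong sketch (not Nintendo's actual shader, just the textbook model):

```python
# Textbook Blinn-Phong: the specular highlight is what reads as hard and
# glossy; dial its weight down and the same surface reads as soft cloth.

def blinn_phong(n_dot_l, n_dot_h, k_diff=0.8, k_spec=0.5, shininess=32):
    diffuse = k_diff * max(n_dot_l, 0.0)
    specular = k_spec * max(n_dot_h, 0.0) ** shininess
    return diffuse + specular

print(blinn_phong(0.7, 0.95))               # glossy: strong highlight
print(blinn_phong(0.7, 0.95, k_spec=0.05))  # damped: soft, cloth-like
```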
 

Ryoku

Member
About Nintendo Land, I, and others, are referring not to the complex environment and stuff (because it's not that complex, honestly). We're referring to the lighting solution. I'm mostly basing my views of it off of a video of the Nintendo Land hub-world, which showcases the lighting extremely well. It's still not a direct-feed video, but in my opinion, it's the best we have to showcase the lighting in Nintendo Land so far. Nice subsurface scattering, particle reflections, soft shadows, and very nice radiosity effects.

This is the link: http://youtu.be/tPt27Ozbc4s?t=4m3s stops at 5:09

You can also (not really) use the E3 trailer: http://www.youtube.com/watch?v=gKGlHmEGql0 The Ridley animatron showcases the subsurface scattering I mentioned.

Not sure about the hub-world, but the games themselves are 60FPS, which kind of leads me to think that the hub-world is also 60FPS with a lot of characters and geometry going on.
 

The_Lump

Banned
I'm not seeing any first bounce approx in those shots.

I think I see an improved version of the bloom/'rim bloom' in some of the previous Mario games. I see reflections. I see some clean gouraud shading...or at least something that dampens the specular component, for a softer shading on cloth etc. But there's nothing technically stand-out about that. Nice textures. No soft shadowing.

I think it's a nice blend of well chosen assets and parameter choices, but technically I'm not sure I see what some others are claiming to see. Someone want to point it out?


In motion, it almost looks like deferred shading. Which again isn't revolutionary, but combined with the apparent presence of true DOF, soft shadows, and a smooth 60fps at 720p, it will look rather nice. Even in the feed from the ND, it struck me as a noteworthy shot.
 
I'm not seeing any first bounce approx in those shots.

I think I see an improved version of the bloom/'rim bloom' in some of the previous Mario games. I see reflections. I see some clean gouraud shading...or at least something that dampens the specular component, for a softer shading on cloth etc. But there's nothing technically stand-out about that. Nice textures. No soft shadowing.

I think it's a nice blend of well chosen assets and parameter choices, but technically I'm not sure I see what some others are claiming to see. Someone want to point it out?

3-4 pages back.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Light bouncing of any sort falls under "GI approx".
That said, the bounced light displayed in the shot is wonky: the golf ball gets only green (from either side), while the sackboy in the middle gets only red :/
 

gofreak

GAF's Bob Woodward
In motion, it almost looks like deferred shading. Which again isn't revolutionary, but combined with the apparent presence of true DOF, soft shadows, and a smooth 60fps at 720p, it will look rather nice. Even in the feed from the ND, it struck me as a noteworthy shot.

It looks nice, sure. I'm just not seeing the back-up for some of the technical claims about the sophistication of the lighting.

Looking at the video above - again, I see nice bloom, bright lighting, perhaps more-frequent-than-average use of reflections and gloss, some sharp specular on some of the surfaces etc. That shouldn't be confused for GI or GI approximation though. I didn't see any evidence of that in the video. Just a perhaps unusual blend of techniques and art choices amongst what is more typical.
 

Log4Girlz

Member
About Nintendo Land, I, and others, are referring not to the complex environment and stuff (because it's not that complex, honestly). We're referring to the lighting solution. I'm mostly basing my views of it off of a video of the Nintendo Land hub-world, which showcases the lighting extremely well. It's still not a direct-feed video, but in my opinion, it's the best we have to showcase the lighting in Nintendo Land so far. Nice subsurface scattering, particle reflections, soft shadows, and very nice radiosity effects.

This is the link: http://youtu.be/tPt27Ozbc4s?t=4m3s stops at 5:09

You can also (not really) use the E3 trailer: http://www.youtube.com/watch?v=gKGlHmEGql0 The Ridley animatron showcases the subsurface scattering I mentioned.

Look at 4:42 at the handle of what appears to be a potted plant. Shadows suck in this game.
 
Is there a nice, clean 1280x720 in-game shot of Nintendoland anywhere? I can't seem to access the press page thingy.

Wouldn't help, imo. All the screens I've seen posted in the press-page thread seem to have been bullshotted to hell and back. Kinda annoyed about it.

So off screen and video is all we have I think.
 
I was struck by this image from the video also. The lighting seems extremely natural and complex compared to PS360. I'm very much looking forward to seeing the hardware perform first hand.

Doesn't look that special if you compare it with the Puppeteer trailer.
 

The_Lump

Banned
It looks nice, sure. I'm just not seeing the back-up for some of the technical claims about the sophistication of the lighting.

Looking at the video above - again, I see nice bloom, bright lighting, perhaps more-frequent-than-average use of reflections and gloss, some sharp specular on some of the surfaces etc. That shouldn't be confused for GI or GI approximation though. I didn't see any evidence of that in the video. Just a perhaps unusual blend of techniques and art choices amongst what is more typical.


Could be. We just won't know for sure, I guess, for another month or so.

To me, it's the fact it's doing all this at once, at a nice framerate and resolution, that's most impressive.

Also, check out the Euro Nintendo Direct. Some nice footage of NL (including that Triforce shot), which is where I think I'm seeing deferred lighting and/or ambient occlusion. Could be wrong, but it's quite striking in motion.
 

gofreak

GAF's Bob Woodward
3-4 pages back.

I read from the posting of that screen from the Zelda game in NL.

I saw people throw around claims - 'GI', 'radiosity' - but no one explained where it was evident in the shots posted. At least as far as I read?

If someone wants to highlight a surface in the screenshots posted so far where they see a contribution from indirect lighting, please do. I've looked, but haven't found it myself. So far the things I've seen people cite - 'soft' shading on cloth surfaces, reflections, gloss, shine - are not explained by that. But I'm open to someone pointing it out.
 