
Wii U Speculation Thread The Third: Casting Dreams in The Castle of Miyamoto

I understand your viewpoint, but if the visuals on the main screen are only 10% better than the Xbox 360 then as far as I'm concerned, it's on par. The fact that it's also rendering to the padlet doesn't matter.

One thing does concern me. If the CPU has the same performance as the 360 (based on rumour), and it's working on scenes for both screen and padlet, I can see that in some instances, games will perform less well than on PS360.

Which rumor, exactly, states that the Wii U CPU performs identically to the Xbox 360 CPU?
 

antonz

Member
I agree overall. While it might be great for Nintendo, it's a burden on third parties. In the end, the costs greatly outweigh the benefits.

The fact that Nintendo has been so actively speaking with 3rd parties as of late, and their recent dedication towards providing middleware and incentives for development on their platform, leads me to believe we won't see a low-clocked, fixed-function GPU.

It wouldn't be abnormal for Nintendo to make such a move though.

Nintendo's track record is just so bad with it that the middleware moves etc. do worry me. The 3DS at one point had a fully fleshed-out GPU, then somewhere late along the line they went with the PICA etc.

Yes, the 3DS does well, and fortunately enough for them the PICA is advanced enough to get MT Framework Mobile etc. on it, but the console space is a different beast than handhelds, and if they want dedicated western support they can't be doing the usual.
 
Which rumor, exactly, states that the Wii U CPU performs identically to the Xbox 360 CPU?

I recall wsippel posted something about some middleware documentation he read, stating the Wii U CPU was comparable to the Xbox 360's, occasionally superior.

Sounded like Havok or something, maybe, as he stated said middleware was CPU-limited. Also sounded like he possibly got it from a public website, e.g. official public documentation, though he didn't share it.
 

guek

Banned
Yeah, he said it was functionally identical in all areas, never inferior, and specifically superior at certain tasks. It might be that as time goes on, better performance will be achievable, but it won't be anything substantial.
 

aeroslash

Member
I understand your viewpoint, but if the visuals on the main screen are only 10% better than the Xbox 360 then as far as I'm concerned, it's on par. The fact that it's also rendering to the padlet doesn't matter.

One thing does concern me. If the CPU has the same performance as the 360 (based on rumour), and it's working on scenes for both screen and padlet, I can see that in some instances, games will perform less well than on PS360.

Aren't most games GPU-limited? With a CPU in the same range as the 360's, what would be the same are the calculations of things like AI and physics. Which IMO would only have to be calculated once, so in this aspect, the second screen wouldn't affect anything.

So, we need to look at the GPU to know how games will perform. But I think it is safe to say that AI and physics will be mostly the same as this gen.
 
Fixed function would be stupid vs. simply adding more shaders. Things like TEV can theoretically be extremely efficient and powerful, but it's such a burden having to work so much harder than you would with programmable shaders. It's not out of the realm of possibility that the GPU could include a TEV unit in some capacity, but having the GPU be fixed function in a big way just flies in the face of everything we've heard from 3rd parties. Easy to develop for and dev-friendly = programmable shaders and easy architecture, less fixed function.

So I'm inclined to believe that Nintendo added some of their own "features" to the GPU which may or may not be fixed function, because making an HD console in 2012 that's predominantly fixed function would be a worse move than the Cell was for Sony.
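To make the contrast concrete, here's a rough C++ sketch of the two models (conceptual only, not actual GPU code, and the tone-curve example is just something I made up): a TEV-style stage evaluates one fixed equation and you merely select its inputs, whereas a programmable shader runs whatever per-pixel code you write.

```cpp
struct RGB { float r, g, b; };

// One TEV-style combiner stage: the hardware evaluates this ONE fixed
// equation, out = d + lerp(a, b, c); you only choose which registers feed
// a/b/c/d. (Simplified: real TEV stages also have op/bias/scale selects.)
RGB tevStage(RGB a, RGB b, RGB c, RGB d) {
    auto lerp = [](float x, float y, float t) { return x * (1.0f - t) + y * t; };
    return { d.r + lerp(a.r, b.r, c.r),
             d.g + lerp(a.g, b.g, c.g),
             d.b + lerp(a.b, b.b, c.b) };
}

// A programmable shader is just arbitrary per-pixel code: any math you want,
// instead of chaining fixed stages to approximate it.
RGB programmablePixel(RGB albedo, RGB light) {
    RGB out { albedo.r * light.r, albedo.g * light.g, albedo.b * light.b };
    out.r /= 1.0f + out.r;   // e.g. a custom tone curve, awkward to express
    out.g /= 1.0f + out.g;   // with fixed combiner equations
    out.b /= 1.0f + out.b;
    return out;
}
```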

Things like heat are neither here nor there, though. Nintendo could totally redesign the Wii U to accommodate hotter chips (maybe on a smaller fab).
 

Nibel

Member
Don't forget that the subscreen eats into the power of the console.

I can see games which don't use it for complex stuff looking much better on the TV.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Aren't most games GPU-limited?

No. It completely depends on the engine. Crysis, for instance, is mostly CPU-bound.

And if you're going to have rich visuals on the padlet, the CPU will still have to render 2 distinct viewpoints. You don't have to worry about doubling up on AI or game logic, but you'll still have to render to two separate TVs.

That's a burden that PS360 does not have to worry about.
 

aeroslash

Member
No. It completely depends on the engine. Crysis, for instance, is mostly CPU-bound.

And if you're going to have rich visuals on the padlet, the CPU will still have to render 2 distinct viewpoints. You don't have to worry about doubling up on AI or game logic, but you'll still have to render to two separate TVs.

That's a burden that PS360 does not have to worry about.

That's exactly what I was trying to say. Wouldn't it be the GPU rendering the two scenes instead of the CPU?

Edit: sorry, I mostly know about graphics features, but I don't know exactly how the system architectures work.
 

guek

Banned
cyberheater, your boundless pessimism never gets old. I'm just going to say you're almost certainly going to be wrong and are just being a Debbie Downer, as always.
 

wsippel

Banned
I recall wsippel posted something about some middleware documentation he read, stating the Wii U CPU was comparable to the Xbox 360's, occasionally superior.

Sounded like Havok or something, maybe, as he stated said middleware was CPU-limited. Also sounded like he possibly got it from a public website, e.g. official public documentation, though he didn't share it.
The document stated that their engine achieves the same performance (and exceeds it in some cases), which mostly leads to the conclusion that the CPU is at least on par and superior in some areas, but not completely outclassing Xenon. Remember, though, that the Xbox 360 version had a lot more development time and is therefore further optimized, and the Wii U can also offload stuff to the DSP (and the I/O processor, I guess).
 

MDX

Member
This would be amazing, although I don't see it happening. Lighting is one of the most taxing operations in the CG world, but improving it gives very apparent results. That would really make the graphics shine above what we have seen in this current gen. Hope the GI in the Garden and Zelda demos is because of something like that!

I was speculating that the Zelda demo was hinting that Nintendo has found a special solution for lighting.
 

MDX

Member
It's really stupid when you see all those "also available on Wii" game commercials, and at the end of the commercial you hear "ONLY FOR KINECT FOR XBOX 360". Microsoft is cashing in on Wii customers.

Thoughts? Agree or disagree?

They say that NOA is saving their marketing money for the Wii U and 3DS. In a perfect world, third parties would have become the primary developers for the Wii during its last years and made use of the large user base. Third parties dropped that ball a long time ago. Nintendo dropped it when they didn't bring Xenoblade and the rest of their games over to other regions much sooner.

Supposedly they are not going to let the Wii die off so soon, but where are the games? I see why they are tight-lipped about the Wii U, but they have no reason to be tight-lipped about the Wii.
 

onesvenus

Member
Wouldn't it be the GPU rendering the two scenes instead of the CPU?
You are correct. The only extra work the CPU has to do when rendering both scenes instead of one is a small number of extra graphics calls.
cyberheater said:
the CPU will still have to render 2 distinct viewpoints
Are you implying that when games are displayed in 3D, CPU tasks such as AI etc. change? Because that's already done when rendering two different perspectives.
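A minimal sketch of why (hypothetical game loop, every name here is made up for illustration): the simulation runs once per frame regardless of how many views get rendered, and the second view only adds another round of draw-call submission on the CPU, while the pixel work falls to the GPU.

```cpp
#include <cstdio>

struct World  { int tick = 0; };          // stand-in for AI/physics/game state
struct Camera { const char* name; };

// CPU-side simulation: runs once per frame no matter how many screens exist.
void updateSimulation(World& w, float /*dt*/) { ++w.tick; }

// CPU-side command submission for one view; the pixel work itself is GPU-side.
void submitDrawCalls(const World& w, const Camera& c) {
    std::printf("frame %d: submitting draw calls for %s\n", w.tick, c.name);
}

int main() {
    World world;
    Camera tv{"TV"}, pad{"controller screen"};
    for (int i = 0; i < 3; ++i) {
        updateSimulation(world, 1.0f / 60.0f); // AI/physics: computed ONCE
        submitDrawCalls(world, tv);            // view 1
        submitDrawCalls(world, pad);           // view 2: the only extra CPU
    }                                          // cost is this second submission
}
```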
 

HylianTom

Banned
cyberheater, your boundless pessimism never gets old. I'm just going to say you're almost certainly going to be wrong and are just being a Debbie Downer, as always.

Hey, I like having him around too!

Seeing people eat crow - be it the optimists or pessimists - is going to be one of the highlights of E3.

A lot of folks around here could see 360-times-seven, and they'll still do the whole "yeah, but.. it's Nintendo, so.." thing. But cyber strikes me as a level-headed guy, willing (and happy) to eat crow if he ends up being incorrect; he may seem to be a pessimist, but I just can't group him in with the crowd of young'uns who automatically bash Nintendo.

{And I've predicted something of a half-step between 360 and 720 (bullshit-quantification-that-I'm-still-not-happy-with: 360-times-three, whatever that means..), but if I'm off, I'm ready to chow on some crow as well.}
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
CPUs and GPUs are pretty poor at handling ray-tracing, though, and there is dedicated ray-tracing hardware out there, such as this academic design and a ray-tracing co-processor from Caustic Graphics, who are now owned by Imagination Technologies (who make the PowerVR GPUs used in phones like the iPhone). Most of these technologies (and there are quite a few others, not to mention papers on the use of FPGAs for the purpose) are intended for full ray-traced rendering, though, calculating tens of millions of rays per second. A ray-processing unit on a standard GPU would only need to be able to calculate a few hundred thousand rays per second to match the performance of the video above, and perhaps a million or so to do so at a good framerate.

I don't know if this is technically feasible (blu will probably arrive shortly to tell me the many reasons it isn't), but it's interesting to speculate what might be coming down the tracks in terms of new GPU hardware.
Since you bring up the RT/PT subject: the issue with that is not so much in the amount of rays that need to be cast, the amount of bounces, or the parallelization of the problem (it's well parallelizable). It's in the narrowing down of potential ray targets. Apparently the 'bruteforce' approach of checking every ray against every poly in the scene is not an option outside of simple examples (bad linear complexity), so RT's 'holy grail' has pretty much been an efficient search algorithm, with the added condition that it should have low repartitioning overhead (so it can handle dynamic scenes, not just 'still lifes').
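To put rough numbers on that bruteforce cost, a toy C++ sketch (made-up types, with the actual ray/triangle test stubbed out):

```cpp
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };
struct Ray  { Vec3 origin, dir; };
struct Tri  { Vec3 a, b, c; };

// Stand-in for a real ray/triangle test (e.g. Moeller-Trumbore).
bool intersects(const Ray&, const Tri&) { return false; }

// Bruteforce is O(rays * tris): 1M rays against a 1M-triangle scene is
// 10^12 tests per frame, which is hopeless. Hence the search structures
// (BVHs, kd-trees), whose catch is exactly the repartitioning cost when
// the scene moves every frame.
std::size_t traceBruteForce(const std::vector<Ray>& rays,
                            const std::vector<Tri>& tris) {
    std::size_t hits = 0;
    for (const Ray& r : rays)
        for (const Tri& t : tris)   // every ray visits every triangle
            if (intersects(r, t)) ++hits;
    return hits;
}
```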

That said, I'm not expecting any RT-dedicated transistors in the WiiU ; )
 
There's one thing I can't wrap my head around. Two and three generations ago, Nintendo had the most powerful hardware (N64) or arguably the most competent hardware (GC) and still managed to make money. This was before the huge successes with Wii and DS. Why does it seem like the general opinion is that Nintendo can't afford to make a console that rivals PS4/720? Have hardware development costs risen that much? Are the tech guys at Nintendo simply incompetent? Do they try too hard to cram their system into a small box?
 

Sadist

Member
There's one thing I can't wrap my head around. Two and three generations ago, Nintendo had the most powerful hardware (N64) or arguably the most competent hardware (GC) and still managed to make money. This was before the huge successes with Wii and DS. Why does it seem like the general opinion is that Nintendo can't afford to make a console that rivals PS4/720? Have hardware development costs risen that much? Are the tech guys at Nintendo simply incompetent? Do they try too hard to cram their system into a small box?
;)
 

antonz

Member
There's one thing I can't wrap my head around. Two and three generations ago, Nintendo had the most powerful hardware (N64) or arguably the most competent hardware (GC) and still managed to make money. This was before the huge successes with Wii and DS. Why does it seem like the general opinion is that Nintendo can't afford to make a console that rivals PS4/720? Have hardware development costs risen that much? Are the tech guys at Nintendo simply incompetent? Do they try too hard to cram their system into a small box?

I think they are letting some aspects of Japanese life overwhelm their common sense. They talk about how the console is supposed to be invisible and all that, so they keep working on these sleek small boxes that can be hidden away.

This is totally a Japanese mentality. The rest of the world doesn't give a shit as long as we aren't talking desktop-tower-sized cases. They design their tiny little case and basically back themselves into a corner on hardware because of that "console needs to be tiny" mentality they now have. Hardware costs are not even close to being an issue. Nintendo could have easily and quite cost-effectively made a Wii U that would blow what seems to be the current Wii U out of the water.
 

HylianTom

Banned
There's one thing I can't wrap my head around. Two and three generations ago, Nintendo had the most powerful hardware (N64) or arguably the most competent hardware (GC) and still managed to make money. This was before the huge successes with Wii and DS. Why does it seem like the general opinion is that Nintendo can't afford to make a console that rivals PS4/720? Have hardware development costs risen that much? Are the tech guys at Nintendo simply incompetent? Do they try too hard to cram their system into a small box?

In order to make a console that reaches the heights that the competition's do, Nintendo would have to either charge a high price for their console or take a significant loss on each console sold.. and Nintendo doesn't seem to like either option.

That, and Sony and Microsoft aren't gaming-centric companies. Especially in Microsoft's case (less so Sony lately), these two megacompanies can absorb gaming-related losses with profits from their non-gaming divisions. Nintendo has no such luxury - they have no other divisions standing ready to bail them out if they slip. I won't mince words: Microsoft can essentially buy its position in the gaming industry, whereas Nintendo has to earn its position. Sony, meanwhile, demonstrated with the PS3 why losses on each unit sold can hurt in dramatic fashion; had it been Nintendo taking those kinds of losses, they'd be in deep Birdo doo-doo.

It should also be noted: back when Nintendo was more "on par" with the competition (har har), its competition was another gaming company that also didn't have the luxury of buying its way into the industry. The death of Sega's role in the hardware industry serves as a reminder of how high the stakes are, and lingers in the back of everyone's mind, a powerful memory.

Either way, Nintendo arrives at Technological Level X at some point in time.. but they wait until it's less risky. I'm willing to wait the extra five years or so to get to that level, and I find it hysterical that graphics that were "excellent" just a few years ago are regarded as "crap" just because something better comes out. It strikes me as petty dick-measuring more than anything else.. unless these folks were on the sidelines of gaming, saying to themselves, "oh.. look at those 360-level graphics! Original Xbox graphics were crap, so I didn't play any video games back then, but now I finally feel like graphics are good enough to start my gaming hobby!"

I know that it sounds glib, but I say this in all seriousness: if someone is truly convinced that graphics and power are so incredibly important, they'd be on a PC, excuses be damned.

I don't mind them following this conservative business model, as it means that they are much more likely to be around well into the future. If 30 years goes by and I'm retired and still playing Nintendo games because they played it safe, I'll be thankful for Nintendo giving me a lifetime of those consistently excellent, memorable games.

File it under "silly fanman logic," that's fine. Those are my humble thoughts on the matter.
 

peter

Banned
Another thing I'd like to see on the next Wii U:

http://users.telenet.be/ganon/sonic/sonic.png

I think this is something that Sega needs to do. They are great at gfx.
Look at Mario, he also has many different games, like Mario RPG. I'd like to see Sonic in an open world, something like Wonder Boy in Monster World.
 
SquiddyBiscuit said:
Really?
I might be completely wrong now, but don't most consoles reveal price and date at E3?
Sometimes. Wii and DS didn't, though. If Wii U isn't releasing as late as November/December it would become more likely. GameCube and Xbox both had their dates revealed (don't recall about price), but both ended up pushed back a little afterwards anyway.
Gahiggidy said:
By the way, how many FLOPS does the 3DS run at?
I've just tried searching for this info, but no luck. Searching for PICA200 FLOPS finds people talking about systems flopping. Trying PICA200 "floating point" doesn't seem to find it either. It doesn't seem to be listed in the regular specs as seen here.
Thrillhouse said:
There's one thing I can't wrap my head around. Two and three generations ago, Nintendo had the most powerful hardware (N64) or arguably the most competent hardware (GC) and still managed to make money. This was before the huge successes with Wii and DS. Why does it seem like the general opinion is that Nintendo can't afford to make a console that rivals PS4/720? Have hardware development costs risen that much? Are the tech guys at Nintendo simply incompetent? Do they try too hard to cram their system into a small box?
The N64 was superior in some ways to PS1/Saturn at a similar price, but it was also 1.5 years newer and avoided the cost of a disc drive. GameCube stood up to its pricier competition because one competitor was a 1.5-year-old machine with an esoteric design and the other was pretty rushed. These things don't apply so much now. You've got Nintendo and Microsoft working with the same hardware partners (and Sony to a lesser extent with PS3/PS4), so if one is willing to release a year later, sell at a higher price, and lose more on each one they sell, they should definitely be able to get fancier results.
 

HylianTom

Banned
And now, something I like to do every once in a while: a look back at news from the Wii launch year.

On March 29th and 30th, 2006:

Revolution's Horsepower: Studios give us the inside scoop on the clock rates for Broadway and Hollywood. How do the CPU and GPU stack up on paper?
http://wii.ign.com/articles/699/699118p1.html

Insiders stress that Revolution runs on an extension of the Gekko and Flipper architectures that powered GameCube, which is why studios who worked on GCN will have no problem making the transition to the new machine, they say. IBM's "Broadway" CPU is clocked at 729MHz, according to updated Nintendo documentation. By comparison, GameCube's Gekko CPU ran at 485MHz. The original Xbox's CPU, admittedly a different architecture altogether, was clocked at 733MHz. Meanwhile, Xbox 360 runs three symmetrical cores at 3.2GHz.

Revolution's ATI-provided "Hollywood" GPU clocks in at 243MHz. By comparison, GameCube's GPU ran at 162MHz, while the GPU on the original Xbox was clocked at 233MHz. Sources we spoke with suggest that it is unlikely the GPU will feature any added shaders, as has been speculated.

The overall system memory numbers we reported last December have not greatly fluctuated, but new clarifications have surfaced. Revolution will operate using 24MBs of "main" 1T-SRAM. It will additionally boast 64MBs of "external" 1T-SRAM. That brings the total number of system RAM up to 88MBs, not including the 3MB texture buffer on the GPU. By comparison, GameCube featured 40MBs of RAM not counting the GPU's on-board 3MBs. The original Xbox included 64MBs total RAM. Xbox 360 and PlayStation 3 operate on 512MBs of RAM.


Meanwhile, Iwata was already talking about the new console, this time to CNN:
Nintendo president vows cheap games: Iwata discusses price hikes and ways to keep costs down as gaming enters the next generation.
http://money.cnn.com/2006/03/29/commentary/game_over/column_gaming/?cnn=yes

Satoru Iwata, president of Nintendo of Japan, told me last week that while the company has no control over what its partners ask for their games, "I cannot imagine any first party title could be priced for more than $50."

The Revolution won't have a hard drive, though, raising questions about where owners will be able to store games obtained from the system's "Virtual Console" (a service that will allow them to download virtually every game for previous Nintendo home systems, as well as select titles from the Sega Genesis and TurboGrafx systems).

Last week, Nintendo announced a partnership arrangement with Sega and Hudson Software to offer some of their games via the Virtual Console. Later, Iwata hinted more announcements might be coming – though he wasn't willing to discuss this too thoroughly.

~~~~~

It's fascinating to see the contrast between then and now. Seeing Iwata say something, anything, this early in the year back then.. it strikes me as odd.
 

antonz

Member
There is quite the difference in behavior between the two pre-launch periods, that's for sure. I kinda feel like Nintendo felt more threatened then and had to put on a strong face, because everyone could see on paper that the Wii was a joke as far as power compared to the competition.

This time around they seem to be on par with last gen, so they have a different set of issues to address.
 
I said it a long, long time ago: I want to play a Zelda game where I can get lost and have no earthly idea of where I am. If I found myself in that situation, I wouldn't be scared or worried - I'd be enthralled.

I know that many roll their eyes at comparisons between Elder Scrolls games and Zelda games (and yes, much of that "What Zelda could learn from Skyrim" shit was nauseating), but if I could have the overworld of, say, Oblivion combined with the caves, tombs, fortresses, palaces, ruins, and towns of the best Zelda games, I'd be in absolute heaven.

Give me a huge variety of landscapes, a large selection of trees/bushes/vegetation, large ponds/streams/lakes with explorable beds, a metric ton of sidequests, perhaps a main quest where I have to make choices that will affect my abilities or paths in the story.. then give me some cool items early on so that I actually have time to enjoy them in different settings, and give my Link character different skills that can be improved upon as the game progresses, maybe even points in the game where I have to choose a specialty (bow-heavy Link? sword-heavy Link? magic-heavy Link?). That's what I would want in my ideal Zelda game.

(I just read that, and damn if I didn't ramble. But I seldom go on about where I'd like to see Zelda go. So I'm gonna leave it there. Not like any of that's ever going to happen.)

The Elder Scrolls games are the exact opposite of what I want Zelda to be like. I can appreciate that Oblivion and Skyrim are good, but I would go off Zelda in a heartbeat if it became a meandering traipse across huge landscapes with nothing in them, constant grinding and stat/ability tweaking. My view is: let Nintendo make Zelda whatever way they want, and let Bethesda RPGs be Bethesda RPGs.
 

HylianTom

Banned
The Elder Scrolls games are the exact opposite of what I want Zelda to be like. I can appreciate that Oblivion and Skyrim are good, but I would go off Zelda in a heartbeat if it became a meandering traipse across huge landscapes with nothing in them, constant grinding and stat/ability tweaking. My view is: let Nintendo make Zelda whatever way they want, and let Bethesda RPGs be Bethesda RPGs.

I didn't mention those things, but it looks like we agree on stat tweaking and grinding for Zelda games. Great.

I suspect that Nintendo will stick to making not-too-large, Habitrail-inspired Zelda overworlds such that the possibility of getting lost or having to explore for too long is pretty insignificant. So fear not.
 

DarkChild

Banned
This would be amazing, although I don't see it happening. Lighting is one of the most taxing operations in the CG world, but improving it gives very apparent results. That would really make the graphics shine above what we have seen in this current gen. Hope the GI in the Garden and Zelda demos is because of something like that!
There is no GI in those demos.
 
There's one thing I can't wrap my head around. Two and three generations ago, Nintendo had the most powerful hardware (N64) or arguably the most competent hardware (GC) and still managed to make money. This was before the huge successes with Wii and DS. Why does it seem like the general opinion is that Nintendo can't afford to make a console that rivals PS4/720? Have hardware development costs risen that much? Are the tech guys at Nintendo simply incompetent? Do they try too hard to cram their system into a small box?

It's certainly not incompetence.
You don't spend billions on something unless that's exactly what you want.

It's more that they're trying to stop the hemorrhaging of the industry.
Look at this thread.
A list of every studio that has closed in the last 6 years. Most within the last 3.
Gaming cannot support these giant budget wars, just as much as it can't support $0.99/free gaming.
We need that middle ground. Those great experiences that don't cost $200 million to make.
Companies can't keep operating in an industry where one failure puts them out.
And MS making "AAAA" games and trying to push huge leaps in console performance that just keep increasing production costs is not helping.

I know this comes off as fanboy rhetoric, but I look at that list. I hear about THQ struggling and losing people from amazing studios like Vigil and Relic, and I just shake my head.
It can't keep going this way. Something will give. And it won't be pleasant.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
GI is not something you can notice as easily as motion blur, and I thought AlStrong already said the demos didn't have GI.
The garden demo has very apparent second-order (bounced) light on the bird as it flies over the orchards. Whether that's mimicked via local sources or there's a fancier algorithm at work is unknown.
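For what it's worth, a toy sketch of what 'mimicked via local sources' could mean - purely a guess at the kind of trick, not anything we actually know about the demo. You add a cheap fill light from below, tinted by whatever the bird is flying over (green grass, blue water), and it reads as bounce light without casting a single ray:

```cpp
#include <algorithm>

struct Vec3 { float x, y, z; };

static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Hypothetical fake one-bounce shading: a direct sun term plus a "fill light"
// from below, tinted by the ground color sampled under the object.
// sunDir and toGround point from the surface toward the light source.
Vec3 shadeWithFakeBounce(Vec3 albedo, Vec3 normal,
                         Vec3 sunDir, Vec3 sunColor,
                         Vec3 groundColor, float bounceStrength) {
    float sun      = std::max(0.0f, dot(normal, sunDir));
    Vec3  toGround = { 0.0f, -1.0f, 0.0f };            // light coming from below
    float bounce   = std::max(0.0f, dot(normal, toGround)) * bounceStrength;
    return { albedo.x * (sunColor.x * sun + groundColor.x * bounce),
             albedo.y * (sunColor.y * sun + groundColor.y * bounce),
             albedo.z * (sunColor.z * sun + groundColor.z * bounce) };
}
```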
 

udivision

Member
It's certainly not incompetence.
You don't spend billions on something unless that's exactly what you want.

It's more that they're trying to stop the hemorrhaging of the industry.
Look at this thread.
A list of every studio that has closed in the last 6 years. Most within the last 3.
Gaming cannot support these giant budget wars, just as much as it can't support $0.99/free gaming.
We need that middle ground. Those great experiences that don't cost $200 million to make.
Companies can't keep operating in an industry where one failure puts them out.
And MS making "AAAA" games and trying to push huge leaps in console performance that just keep increasing production costs is not helping.

I know this comes off as fanboy rhetoric, but I look at that list. I hear about THQ struggling and losing people from amazing studios like Vigil and Relic, and I just shake my head.
It can't keep going this way. Something will give. And it won't be pleasant.

That'd make more sense if Nintendo had crazy awesome third party support.
I don't think they're doing it for the good of the industry, so I almost feel like that argument holds little weight, and is more of a coincidence...
 

DarkChild

Banned
The garden demo has very apparent second-order (bounced) light on the bird as it flies over the orchards. Whether that's mimicked via local sources or there's a fancier algorithm at work is unknown.
I can notice it, although I don't understand why the green bounce from the grass would be so strong. Never mind, we will find out.

EDIT.

Or is it from water?
 

aeroslash

Member
GI is not something you can notice as easily as motion blur, and I thought AlStrong already said the demos didn't have GI.

What? When the bird is flying over the lake there's very apparent GI on the bird's belly. You can see the color changing as the light bounces.

Edit: What blu said..

Edit 2:

I can notice it, although I don't understand why the green bounce from the grass would be so strong. Never mind, we will find out.

EDIT.

Or is it from water?

Then why did you say there wasn't GI? It's pretty clear. I don't think it is too strong; it has a very nice but subtle effect. You definitely have to look for it in order to see it, which in my opinion means it's OK.
 
That'd make more sense if Nintendo had crazy awesome third party support.
I don't think they're doing it for the good of the industry, so I almost feel like that argument holds little weight, and is more of a coincidence...

How does that make sense?
Third parties are stupid, we've established that. They refused to support the lead console. They keep on hiking up their budgets just to lay off hundreds of employees. They make horrible business decisions when it comes to DLC.

Ever notice that while every other company is laying people off, including MS and Sony, Nintendo is expanding? There's a reason for that: they know how the industry is supposed to work, even if that's contrary to what third parties think.
 

DarkChild

Banned
Then why did you say there wasn't GI? It's pretty clear. I don't think it is too strong; it has a very nice but subtle effect. You definitely have to look for it in order to see it, which in my opinion means it's OK.
Because I only watched the Zelda demo, and because this still doesn't mean it is dynamic GI; it could be "cheating" like Crytek or Epic do on consoles.
 

udivision

Member
How does that make sense?
Third parties are stupid, we've established that. They refused to support the lead console. They keep on hiking up their budgets just to lay off hundreds of employees. They make horrible business decisions when it comes to DLC.

Ever notice that while every other company is laying people off, including MS and Sony, Nintendo is expanding? There's a reason for that: they know how the industry is supposed to work, even if that's contrary to what third parties think.

Sorry, I didn't read it as "Nintendo realizes the fallacy of the crazy budget drive, which is why they're keeping themselves in check." That's closer to what you were saying and is true. For some reason I thought you were making the argument that Nintendo was doing it for the good of the industry.
 

ozfunghi

Member
No. It completely depends on the engine. Crysis, for instance, is mostly CPU-bound.

And if you're going to have rich visuals on the padlet, the CPU will still have to render 2 distinct viewpoints. You don't have to worry about doubling up on AI or game logic, but you'll still have to render to two separate TVs.

That's a burden that PS360 does not have to worry about.

When I'm playing around with physics tests (through the CPU) in my 3D application, it doesn't matter how many different viewports are on. The same should go for AI. So if it is a different view of the same scene, I doubt your hypothesis is correct. Or please elaborate.

If there is a totally different scene on the uPad, then obviously you have a point.
 

Donnie

Member
I understand your viewpoint, but if the visuals on the main screen are only 10% better than the Xbox 360 then as far as I'm concerned, it's on par. The fact that it's also rendering to the padlet doesn't matter.

Clearly it does matter when trying to define the power of the console. It implies that the system is at least twice as powerful as the 360. I don't know how many times I've had to say this to you, but that extra power doesn't have to be used to render a second scene on the controller. It can be used to improve the image on the TV if that's what the developer wants to do.

And if you're going to have rich visuals on the padlet, the CPU will still have to render 2 distinct viewpoints. You don't have to worry about doubling up on AI or game logic, but you'll still have to render to two separate TVs.

That's a burden that PS360 does not have to worry about.

You're getting confused here. The CPU doesn't render any viewpoint; the GPU does that. Any extra use of the CPU when producing two scenes will be marginal, and more than made up for by the fact that it'll be more powerful and won't have to process sound - the 360 uses at least 16% of its CPU (one of Xenon's six hardware threads, so 1/6 ≈ 16.7%) for sound processing.
 

aeroslash

Member
Clearly it does matter when trying to define the power of the console. It implies that the system is at least twice as powerful as the 360. I don't know how many times I've had to say this to you, but that extra power doesn't have to be used to render a second scene on the controller. It can be used to improve the image on the TV if that's what the developer wants to do.



You're getting confused here. The CPU doesn't render any viewpoint; the GPU does that. Any extra use of the CPU when producing two scenes will be marginal, and more than made up for by the fact that it'll be more powerful and won't have to process sound - the 360 uses at least 16% of its CPU (one of Xenon's six hardware threads, so 1/6 ≈ 16.7%) for sound processing.


Thanks for clearing this up. I was sure this was the case. Let's see if cyberheater reads it.
 