
jaosobno
Member
(11-19-2012, 03:20 PM)
jaosobno's Avatar

Originally Posted by nikatapi

Don't expect a huge reduction in the OS footprint given the internet browser's tab feature. Having multiple tabs open takes quite a lot of memory, in addition to having all the Miiverse/Nintendo Network stuff running all the time.

You could always limit the maximum number of open tabs.
skinnyrattler
Junior Member
(11-19-2012, 03:23 PM)
skinnyrattler's Avatar

Originally Posted by Captain Smoker

Thanks for this thread.

It's funny how no one comments in this thread because it doesn't have some lurid statement in the thread title. (but that's ok, better discussions)

go on.

I feel like we need a separate but equal Wii U thread on this board. Not that I don't want to hear criticism, but I need level-headed criticism; I'm trying to make game-buying decisions. I watched a video of CoD and saw a good game. I worry the push for tech prevents people from being level-headed about the difference between excellent graphics and good graphics. Reading the other thread, I got the impression that most third party games are dog shit and shouldn't be bought for any reason.
lwilliams3
Member
(11-19-2012, 03:27 PM)
lwilliams3's Avatar
Thanks for making this thread, blu. It's nice to be able to take part in a civil conversation about what we know about the Wii U so far. I hope that we will get some more facts this week.
mrklaw
MrArseFace
(11-19-2012, 03:29 PM)
mrklaw's Avatar

Originally Posted by nikatapi

Don't expect a huge reduction in the OS footprint given the internet browser's tab feature. Having multiple tabs open takes quite a lot of memory, in addition to having all the Miiverse/Nintendo Network stuff running all the time.

1) reload tabs if you're playing a game and have other stuff open
2) shut the browser down

All you have to store are the current tabs' URLs; you can resume from there.
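What mrklaw describes amounts to serializing open tabs down to their URLs and reloading pages on demand. A toy sketch of that idea (the tab structures and `fetch` callback are hypothetical, not anything from Nintendo's actual browser):

```python
def suspend_tabs(tabs):
    """Free tab memory by keeping only each tab's URL."""
    return [tab["url"] for tab in tabs]

def resume_tabs(urls, fetch):
    """Reload pages from their saved URLs when the browser reopens."""
    return [{"url": url, "page": fetch(url)} for url in urls]
```

The session's memory footprint drops from every open page to a short list of strings, at the cost of a reload when you come back.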
outunderthestars
He's not our sharpest knife.
(11-19-2012, 03:29 PM)
outunderthestars's Avatar

Originally Posted by Goodlife

How many GBs do the top-end PC games require?

Take BF3 for example.... for its max settings it requires 4GB of RAM.
Bear in mind that a good chunk of that is going to be required for running the OS etc. while the game plays.

The question shouldn't be how much RAM cutting-edge games currently require, but how much RAM they will require in 5 years. The Wii U looks like it will end up being as woefully under-powered as the original Wii was. :(
jaosobno
Member
(11-19-2012, 03:30 PM)
jaosobno's Avatar
Any news that we could get more in depth analysis of WiiU this week? CPU and GPU details are still very obscure and reviews are very general in their description of hardware.

Although, Nintendo could just cut the crap and provide us with the final specs. We're gonna get them anyway.
Durante
I'm taking it FROM here, so says Mr. Stewart
(11-19-2012, 03:33 PM)
Durante's Avatar

Originally Posted by blu

  • AMD R700-originating GPU (R700 is AMD architecture codename 'Wekiva'), evolved into its own architecture (AMD architecture codename 'Mario'), relying on MEM1 for framebuffer purposes, but also for local render targets and scratch-pad purposes.
  • Memory access specifics: both MEM1 and MEM2 are read/write accessible by the CPU, both subject to caching.

Can you quote the sources for these 2? Not doubting them really, seems reasonable, but since you put them as "hard facts" I'd assume you have solid sources -- and I hadn't seen these confirmed anywhere.

Originally Posted by skinnyrattler

Reading the other thread, I got the impression that most third party games are dog shit and shouldn't be bought for any reason.

You have to be more specific. With some, like CoD, it seems the issues are smaller, but for some (the worst offender being Epic Mickey 2) the performance truly is "dog shit".

But generally I think discussing software issues (or the lack thereof) should be kept out of this thread, let's focus on the hardware.
Last edited by Durante; 11-19-2012 at 03:35 PM.
blu
Member
(11-19-2012, 03:34 PM)
blu's Avatar

Originally Posted by ozfunghi

Blu, since you are one of the few sane people that actually know what they are talking about, do you have any ideas as to how Nintendo expects developers to put the puzzle together, so that the so called bottlenecks (such as BW for instance) can be overcome to output games of 360+ quality?

If I read the statements from Shin'en, they seem happy about the way the memory is set up, about the CPU/GPU balance etc... yet what nitwits like me see in those numbers looks anything but optimistic. I mean, it can't be as simple as slapping 32 MB of eDRAM on there, right? Which is only a 22 MB increase over the XB360.

You shouldn't look at the WiiU's eDRAM as the 360's eDRAM increased. The latter was merely a framebuffer (though Xenos could output directly to main RAM as well, but you wouldn't use that for normal fb purposes). On the WiiU the eDRAM will likely also host local render targets that never have to leave their spot during their lifetime (ie. never have to be resolved to main RAM). That means such render targets could use BW and latency beyond the capabilities of consoles we have seen. For example, a 2k * 2k depth-buffer render target (which, depending on the shadow technique employed, could result in very nice shadows at 720p) would occupy a tad over 15MB - depending on the rendering particularities of the game (size of main render targets, other uses for eDRAM, etc), that could result in quality of shadows never seen on consoles before. But this is an ultra-trivial example - way more interesting are the things that the CPU and GPU can do between themselves. Think CPU reading back render targets at impossible speeds, CPU passing down data to the GPU (issuing/modifying draw calls) ultra fast, etc. All those things require modifications to existing pipelines, though.
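blu's "tad over 15MB" figure checks out with simple arithmetic, assuming "2k" means 2000 pixels per side and a 32-bit depth format (both are readings of his example, not confirmed specs):

```python
EDRAM_MIB = 32  # Wii U eDRAM pool size in MiB, per the thread's hard facts

def render_target_mib(width, height, bytes_per_pixel=4):
    """Size of a render target in MiB (default: 32 bits per pixel)."""
    return width * height * bytes_per_pixel / (1024 ** 2)

shadow = render_target_mib(2000, 2000)  # ~15.26 MiB, "a tad over 15MB"
assert shadow < EDRAM_MIB  # fits, with room left over for the main framebuffers
```

A 720p 32-bit color buffer is only ~3.5 MiB more, so a large shadow map plus the main targets can plausibly coexist in the 32 MiB pool.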


Originally Posted by Durante

Can you quote the sources for these 2? Not doubting them really, seems reasonable, but since you put them as "hard facts" I'd assume you have solid sources -- and I hadn't seen these confirmed anywhere.

Those hard facts are in the 'leaks vouched by gaffers' category. And I personally vouch for those two. You can take them on faith, or reject them, but I'm not at liberty to reveal the sources.
Last edited by blu; 11-19-2012 at 03:36 PM.
dan2026
Banned
(11-19-2012, 03:34 PM)
I tell you what, if I were Sony and Microsoft, I'd be watching this saga unfold very carefully.
cyberheater
PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 XBOX PS4 PS4
(11-19-2012, 03:34 PM)
cyberheater's Avatar

Originally Posted by Durante

Can you quote the sources for these 2? Not doubting them really, seems reasonable, but since you put them as "hard facts" I'd assume you have solid sources -- and I hadn't seen these confirmed anywhere.

I was wondering the same. I can see why the GPU would have access but not sure if the CPU would have direct access.
Kenka
Member
(11-19-2012, 03:35 PM)
Kenka's Avatar
Great OP blu.

Originally Posted by blu

You shouldn't look at the WiiU's eDRAM as the 360's eDRAM increased. The latter was merely a framebuffer (though Xenos could output directly to main RAM as well, but you wouldn't use that for normal fb purposes). On the WiiU the eDRAM will likely also host local render targets that never have to leave their spot during their lifetime (ie. never have to be resolved to main RAM). That means such render targets could use BW and latency beyond the capabilities of consoles we have seen. For example, a 2k * 2k depth-buffer render target (which, depending on the shadow technique employed, could result in very nice shadows at 720p) would occupy a tad over 15MB - depending on the rendering particularities of the game (size of main render targets, other uses for eDRAM, etc), that could result in quality of shadows never seen on consoles before. But this is an ultra-trivial example - way more interesting are the things that the CPU and GPU can do between themselves. Think CPU reading back render targets at impossible speeds, CPU passing down data to the GPU (issuing/modifying draw calls) ultra fast, etc. All those things require modifications to existing pipelines, though.

By "pipelines" you are talking about middleware, engines and stuff? If true, then it might be harder to port X360 games to the console than previously thought.
FoxHimself
Member
(11-19-2012, 03:36 PM)
FoxHimself's Avatar

Originally Posted by jaosobno

Any news that we could get more in depth analysis of WiiU this week? CPU and GPU details are still very obscure and reviews are very general in their description of hardware.

Although, Nintendo could just cut the crap and provide us with the final specs. We're gonna get them anyway.

We still don't know precise 3DS specs. Don't hold your breath.
dark10x
60 fps 60 fps 60 fps 60 fps 30 fps 60 fps 60 fps 30 fps
(11-19-2012, 03:36 PM)
dark10x's Avatar

Originally Posted by outunderthestars

The question shouldn't be how much RAM cutting-edge games currently require, but how much RAM they will require in 5 years. The Wii U looks like it will end up being as woefully under-powered as the original Wii was. :(

There is a silver lining here...

There were plenty of nice looking Wii games released but nearly all of them were compromised by the awful Wii video output. The low resolution and poor video DAC resulted in ugly, blurry image quality across the board. When those same games were viewed using Dolphin running at a higher resolution, however, many of them became quite attractive.

So with the WiiU now able to output in HD resolutions we have removed the awful 480p barrier which limited the original Wii. Even if Nintendo were to release games pushing visuals no more advanced than what they delivered on Wii, at least the image quality could be much cleaner. That alone will make a huge difference, I think.
dumbo
Junior Member
(11-19-2012, 03:38 PM)

Originally Posted by blu

Think CPU reading back render targets at impossible speeds, CPU passing down data to the GPU (issuing/modifying draw calls) ultra fast, etc. All those things require modifications to existing pipelines, though.

Probably an obvious question - it seems the tablet receives a compressed feed from the console, so I assume that its screen is also rendered by the primary GPU (especially if they share assets etc.)... is it likely that the framebuffer etc. for the tablet is also stored/managed in the eDRAM?
Relix
he's Virgin Tight™
(11-19-2012, 03:38 PM)
Relix's Avatar
Thank you. It was getting tiresome wading through all the shitty posts in the other threads.
Goodlife
Member
(11-19-2012, 03:39 PM)
Goodlife's Avatar

Originally Posted by outunderthestars

The question shouldn't be how much RAM cutting-edge games currently require, but how much RAM they will require in 5 years. The Wii U looks like it will end up being as woefully under-powered as the original Wii was. :(

PS360 are 7 years old now. The 360 has 512MB of RAM, the PS3 less than that.
I know they're starting to show their age a bit, but I'd hardly call them "woefully under-powered", especially when they're still knocking out games like Halo 4.
Corky
Nine out of ten orphans can't tell the difference.
(11-19-2012, 03:39 PM)
Corky's Avatar
So was there any rumor that was close to nailing the specs?
z0m3le
Junior Member
(11-19-2012, 03:43 PM)
z0m3le's Avatar

Originally Posted by outunderthestars

The question shouldn't be how much RAM cutting-edge games currently require, but how much RAM they will require in 5 years. The Wii U looks like it will end up being as woefully under-powered as the original Wii was. :(

Considering that the Wii U is likely designed for 720p (RAM speed hints at this), the reality that the system's RAM will be freed up for games as time goes on (happens with all consoles), and the general clunkiness of the Wii U's OS... I would assume half of the RAM could be freed up relatively easily, even if it means moving browser tabs into flash memory later on.

1080p is a bit over 2.25x this number, so you are looking at ~3.75GB of game RAM needed for 1080p to match just 1.5GB of RAM for 720p.

It might not work out this way for Nintendo, but they are likely expecting 3rd parties to release 1080p versions of their Wii U games on PS4/XB3. If ~500-600 GFLOPS of GPU power is available to the Wii U, a similar game running at 1080p would need 1.5-1.8 TFLOPS from the GPU. This isn't accounting for any other bottlenecks a system comparison would likely have, but we're without real numbers for the Wii U, and the numbers for the other "current gen" consoles are completely imaginary (they can change and be finely tweaked at this point even if I had a dev kit in front of me).
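The 2.25x figure in the post above is just the pixel-count ratio between the two resolutions:

```python
pixels_720p = 1280 * 720    # 921,600 pixels
pixels_1080p = 1920 * 1080  # 2,073,600 pixels

ratio = pixels_1080p / pixels_720p  # exactly 2.25
```

So, to a first approximation, a game that fills 1.5 GB of framebuffer-heavy memory at 720p needs roughly 2.25x as much pixel work at 1080p; the post rounds that scaling up a little for overhead.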
suikodan
Member
(11-19-2012, 03:43 PM)
suikodan's Avatar
I have some questions regarding the system:

MiiVerse: I can access it via the main menu. However when I'm in-game and try to post something to MiiVerse, it says to go and configure it first. What gives? Am I missing something?

WiiMote pointer: Can I use the WiiMote to point at the screen in the main menu for example? I don't want to use the Wiipad at all times, especially when I'm playing coop Mario with my wife. I know I can use the + to navigate but it feels like using a mouse only for its buttons.

Also, I wasn't able to play the Zelda minigame in NintendoLand since it wanted me to point the WiiMotes at the screen, but even as I was pointing, it didn't seem to register even though the sensor bar sits on top of my TV and I was pointing at the right place.

Thanks to anyone that can answer me.
Durante
I'm taking it FROM here, so says Mr. Stewart
(11-19-2012, 03:44 PM)
Durante's Avatar

Originally Posted by dumbo

Probably an obvious question - it seems the tablet receives a compressed feed from the console, so I assume that its screen is also rendered by the primary GPU (especially if they share assets etc.)... is it likely that the framebuffer etc. for the tablet is also stored/managed in the eDRAM?

That's very likely, at least for games that do more complex graphics on the gamepad. Given what we know about the main RAM pool BW it wouldn't be a good idea to keep a framebuffer there (at least if you also want to use that BW for other things).

However, I could imagine that games could try to optimize their eDRAM usage by only keeping things like that in the eDRAM for a fraction of a frame, and put other data in afterwards.
jaosobno
Member
(11-19-2012, 03:50 PM)
jaosobno's Avatar

Originally Posted by blu

The shared access to the MEM1 pool by the GPU and CPU alike indicates the two units are meant to interact at low latency, not normally seen in previous console generations. Definitely a subject for interesting debates, this one is.

So, eDRAM is used as sort of ultra high speed buffer for data transfer between CPU and GPU? So where is the framebuffer? In eDRAM too?
Durante
I'm taking it FROM here, so says Mr. Stewart
(11-19-2012, 03:52 PM)
Durante's Avatar
Talking about the eDRAM, can we agree that its use is basically required just to achieve parity with PS3/360? Because if so, then I can't imagine that being a very good thing for developer support. In the end it boils down to manual caching / memory management, which was very unpopular on PS3. Clearly it's easier to deal with a single 32 MB buffer than 6 256 kB buffers, but the central idea is still that of a user-managed scratchpad memory.

Now, personally, I love that idea and the associated programming challenges and opportunities (and loved it back in Cell, and when doing GPGPU), but I wonder if the general game developer mindset has changed enough to make it viable as a central pillar of a system's design.

I wish we had the exact bandwidth/latency of the eDRAM to the GPU and CPU.
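The "user-managed scratchpad" Durante describes boils down to a per-frame bump allocator over a fixed fast-memory pool: the developer, not the hardware, decides what lives in the 32 MB and when it gets evicted. A minimal sketch (purely illustrative, sizes in bytes; nothing here is Nintendo's actual API):

```python
class Scratchpad:
    """Toy bump allocator over a fixed fast-memory pool (e.g. a 32 MiB eDRAM)."""

    def __init__(self, size):
        self.size = size
        self.offset = 0

    def alloc(self, nbytes):
        """Hand out the next free region, or fail when the pool is exhausted."""
        if self.offset + nbytes > self.size:
            raise MemoryError("scratchpad exhausted; resolve something to main RAM")
        start = self.offset
        self.offset += nbytes
        return start

    def reset(self):
        """Reclaim everything at once, e.g. at the start of each frame."""
        self.offset = 0
```

This is exactly the kind of bookkeeping that was unpopular on the PS3's SPE local stores, though one contiguous 32 MiB pool is far easier to carve up than six 256 kB ones.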
gofreak
GAF's Bob Woodward
(11-19-2012, 03:52 PM)
gofreak's Avatar

Originally Posted by USC-fan

GPU aka GPGPU

R700-based at 40nm

We have performance-per-watt figures for an R700-based GPU at 40nm.

The 4770 at 40nm is 12 GFLOPS per watt on a 137 mm² die with 826M transistors.
The eDRAM takes up at least 37 mm² and the Wii U GPU is 156 mm².

That leaves at most 100-110 mm² for the GPU, since you have other things on the die also.

So now we have the power number, a GPU on a 40nm process and still R700-based. Man, I called all of this....

If you used the whole 33 watts for just the GPU you are looking at 396 GFLOPS, which is impossible.

More likely it's using 20-25W at most for the GPU, so you have a range of 240-300 GFLOPS. The Xbox 360 and PS3 are 240-250 GFLOPS.

My copy/paste from the anandtech thread

Anyone else want to comment on this?

It would be pretty disappointing if the GPU was in the 1.x range of PS3/360 rather than 2+x at least.

Where are the folks with supposed 'insider knowledge'?
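USC-fan's back-of-the-envelope numbers above are easy to reproduce; note every input is his estimate (perf/watt of a different card, guessed GPU power share), not a confirmed spec:

```python
GFLOPS_PER_WATT = 12      # HD 4770-class R700 at 40nm, per the quoted post
MEASURED_WALL_POWER = 33  # watts drawn by the whole console under load

# Upper bound: pretend every watt went to the GPU
upper_bound = MEASURED_WALL_POWER * GFLOPS_PER_WATT  # 396 GFLOPS

# More realistic: the GPU gets only 20-25 W of the budget
gpu_estimate = [watts * GFLOPS_PER_WATT for watts in (20, 25)]  # [240, 300]
```

As Durante notes in the reply below the quote, a wider chip at lower voltage could beat this perf/watt figure, so the range is a floor-ish guess rather than a ceiling.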
muu
Member
(11-19-2012, 03:52 PM)

Originally Posted by dan2026

This is what concerns me. When the new Xbox and PS consoles come out and are comparable in power, the Wii U is going to have to take over the market for devs to give a damn about it.

At the very best you'll probably see crap ports of games built for other systems.

This assumes gangbuster sales of the competitors as well as Nintendo failing. Technical issues prevented proper Wii ports, but the greater issue was that the PS2 had a landslide victory preceding the PS3. Dev life cycles are too long to wildly change directions. We are not in the same situation today, and much of this 'fear' is unfounded. And this belongs in a sales thread.

From a tech standpoint I'm more interested in the GamePad. What do we know about the thing at this point? Is it the frequency range that causes the performance degradation through walls, or something else? What's the prospect for range extenders in the future?
Last edited by muu; 11-19-2012 at 03:55 PM.
Durante
I'm taking it FROM here, so says Mr. Stewart
(11-19-2012, 03:54 PM)
Durante's Avatar

Originally Posted by gofreak

Anyone else want to comment on this?

I think it can be a bit faster than that, if it's lower clocked / lower voltage but wider. Remember that power consumption scales linearly with die size but quadratically with voltage. Also, the 33 watts are for NSMBU; it would be nice to have a measurement for something slightly more challenging such as ZombiU.

Still, the pre-release 600 GFlop rumours are now very unlikely.
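Durante's point follows from the standard CMOS dynamic-power relation, P ≈ C·V²·f: going wider grows capacitance (and die area) roughly linearly, while voltage costs quadratically. A sketch with made-up unit values:

```python
def dynamic_power(capacitance, voltage, frequency):
    """Classic CMOS dynamic power: P = C * V^2 * f (arbitrary units here)."""
    return capacitance * voltage ** 2 * frequency

base = dynamic_power(1.0, 1.0, 1.0)          # reference design: 1.0
wider_slower = dynamic_power(2.0, 0.8, 0.8)  # 2x the silicon, lower V and clock: ~1.02
```

Twice the execution hardware at reduced voltage/clock lands at nearly the same power budget, which is why a wide-but-slow GPU can exceed a naive GFLOPS-per-watt extrapolation from a higher-clocked part.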
KageMaru
Member
(11-19-2012, 03:58 PM)
KageMaru's Avatar
Great thread blu! Hopefully the amount of shit is kept to a minimum in this thread.

One thing I've always been curious about is the one dev comment saying the CPU has fewer threads than the CPUs in the PS360. So I'm curious if each core is one thread, or if maybe the one core with the larger cache is dual-threaded while the other two cores are single-threaded, equaling 4 threads total and still fewer than the CPUs in the current consoles.

Originally Posted by z0m3le

Considering that the Wii U is likely designed for 720p (RAM speed hints at this), the reality that the system's RAM will be freed up for games as time goes on (happens with all consoles), and the general clunkiness of the Wii U's OS... I would assume half of the RAM could be freed up relatively easily, even if it means moving browser tabs into flash memory later on.

1080p is a bit over 2.25x this number, so you are looking at ~3.75GB of game RAM needed for 1080p to match just 1.5GB of RAM for 720p.

It might not work out this way for Nintendo, but they are likely expecting 3rd parties to release 1080p versions of their Wii U games on PS4/XB3. If ~500-600 GFLOPS of GPU power is available to the Wii U, a similar game running at 1080p would need 1.5-1.8 TFLOPS from the GPU. This isn't accounting for any other bottlenecks a system comparison would likely have, but we're without real numbers for the Wii U, and the numbers for the other "current gen" consoles are completely imaginary (they can change and be finely tweaked at this point even if I had a dev kit in front of me).

Rendering is not that simple and it's HIGHLY unlikely we'll be seeing PS4/720 down-ports running at 1080p on the Wii U. If anything, a lower resolution will be necessary to help compensate for the gap in power between the consoles.
Last edited by KageMaru; 11-19-2012 at 06:08 PM.
randomengine
protecting the average person from the looming threat of Japanese cultural contamination
(11-19-2012, 03:59 PM)
randomengine's Avatar
Here is my concern.

Blu-ray and blu-ray-like disc readers are slow. That's been the major problem with PS3, but games can be installed on the HDD on the PS3.

I fear that with the small amount of storage, and the blu-ray-like drive, that games will have issues that won't be easily resolved through installing.

Is the Wii U blu-ray-like drive much faster than the PS3 one?
pestul
Member
(11-19-2012, 03:59 PM)
pestul's Avatar

Originally Posted by jaosobno

Any news that we could get more in depth analysis of WiiU this week? CPU and GPU details are still very obscure and reviews are very general in their description of hardware.

Although, Nintendo could just cut the crap and provide us with the final specs. We're gonna get them anyway.

Probably not.
Basileus777
Member
(11-19-2012, 04:00 PM)
Basileus777's Avatar

Originally Posted by Kenka

Great OP blu.


By "pipelines" you are talking about middleware, engines and stuff? If true, then it might be harder to port X360 games to the console than previously thought.

There's also this quote from Digital Foundry which is also worrisome.

One developer working on a key AAA franchise port told us anonymously that the Nintendo toolchain is "fighting us every step of the way", suggesting that plenty of work still needs to be done in getting development workflow up to scratch. Will the tools improve in time? Will publishers have the time and the financial incentive to stick with it?

gofreak
GAF's Bob Woodward
(11-19-2012, 04:01 PM)
gofreak's Avatar

Originally Posted by Durante

I think it can be a bit faster than that, if it's lower clocked / lower voltage but wider. Remember that power consumption scales linearly with die size but quadratically with voltage. Also, the 33 watts are for NSMBU; it would be nice to have a measurement for something slightly more challenging such as ZombiU.

Still, the pre-release 600 GFlop rumours are now very unlikely.

Thanks. I'll mentally peg it at 350-400Gflops for now. We'll see if any die shots come out, might help us get more specific.
mrklaw
MrArseFace
(11-19-2012, 04:01 PM)
mrklaw's Avatar

Originally Posted by Durante

Talking about the eDRAM, can we agree that its use is basically required just to achieve parity with PS3/360? Because if so, then I can't imagine that being a very good thing for developer support. In the end it boils down to manual caching / memory management, which was very unpopular on PS3. Clearly it's easier to deal with a single 32 MB buffer than 6 256 kB buffers, but the central idea is still that of a user-managed scratchpad memory.

Now, personally, I love that idea and the associated programming challenges and opportunities (and loved it back in Cell, and when doing GPGPU), but I wonder if the general game developer mindset has changed enough to make it viable as a central pillar of a system's design.

I wish we had the exact bandwidth/latency of the eDRAM to the GPU and CPU.

this is my thinking. It seems a no-brainer that eDRAM is essential for parity, given the slower main memory.

And then you're into how much developers will push to make the most of the hardware. Ticking clock on that one I guess, with the 720/PS4 over the horizon.

Actually, it's more 'how much publishers are willing to pay developers to push'. I'm sure developers would love to play with the architecture, but many ports will be done on contracts with costs that won't allow that freedom.
pulsemyne
Member
(11-19-2012, 04:02 PM)
What seems odd is the different memory types used in different machines. One machine stripped down was using Samsung gDDR3 while another was using normal DDR3. Seems a bit odd. The Samsung part operates at 1.35 volts while the DDR3 operates at 1.5 volts.
Durante
I'm taking it FROM here, so says Mr. Stewart
(11-19-2012, 04:03 PM)
Durante's Avatar

Originally Posted by KageMaru

Rendering is not that simple and it's HIGHLY unlikely we'll be seeing PS4/720 down-ports running at 1080p on the Wii U. If anything, a lower resolution will be necessary to help compensate for the gap in power between the consoles.

I think his point was that games targeting 1080p on the big consoles could be ported to run at 720p on Wii U.

I think it's not a bad idea in general, and I made posts to that effect a while ago. The issues are that
- it assumes developers will be going for 1080p on those consoles
- it only helps in scaling graphics, which are pretty easy to scale in the first place. General purpose code is much harder to scale, and I doubt the Wii U CPU (and its bandwidth) will help there vis-a-vis PS4/720 (Even though the latter will probably also feature disappointing CPUs, at least IMHO)

Originally Posted by pulsemyne

What seems odd is the different memory types used in different machines. One machine stripped down was using Samsung gDDR3 while another was using normal DDR3. Seems a bit odd. The Samsung part operates at 1.35 volts while the DDR3 operates at 1.5 volts.

The small "g" is just something that manufacturer uses to denote a specific target application.
The Boat
Member
(11-19-2012, 04:04 PM)
The Boat's Avatar

Originally Posted by randomengine

Here is my concern.

Blu-ray and blu-ray-like disc readers are slow. That's been the major problem with PS3, but games can be installed on the HDD on the PS3.

I fear that with the small amount of storage, and the blu-ray-like drive, that games will have issues that won't be easily resolved through installing.

Is the Wii U blu-ray-like drive much faster than the PS3 one?

The Wii U's drive reads at 22.5 MB/s; the PS3's reads at 9 MB/s.
EDIT: I also don't think storage will be a problem.
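At the quoted transfer rates (vouched-for numbers from this thread, not official specs), the difference in streaming time is easy to put in perspective; the 500 MB "level" below is a hypothetical asset size chosen purely for illustration:

```python
WIIU_DRIVE_MBPS = 22.5  # MB/s, as quoted above
PS3_DRIVE_MBPS = 9.0    # MB/s, the PS3's 2x Blu-ray drive

def seconds_to_read(megabytes, rate_mbps):
    """Time to stream a payload off the disc at a sustained rate."""
    return megabytes / rate_mbps

LEVEL_MB = 500  # hypothetical level's worth of assets
wiiu_time = seconds_to_read(LEVEL_MB, WIIU_DRIVE_MBPS)  # ~22 s
ps3_time = seconds_to_read(LEVEL_MB, PS3_DRIVE_MBPS)    # ~56 s
```

Being 2.5x faster off-disc is why the lack of mandatory installs is less alarming than the PS3 comparison first suggests.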
randomengine
protecting the average person from the looming threat of Japanese cultural contamination
(11-19-2012, 04:05 PM)
randomengine's Avatar

Originally Posted by The Boat

The Wii U's drive reads at 22.5 MB/s; the PS3's reads at 9 MB/s.
EDIT: I also don't think storage will be a problem.

That sounds a hell of a lot better. :D
randomengine
protecting the average person from the looming threat of Japanese cultural contamination
(11-19-2012, 04:08 PM)
randomengine's Avatar
I think the Wii U will be more successful with this level of power than most people think. I think it is ingenious actually.

Think about it. The next generation is going to come out. Costs will explode, but profits will lag for quite a long period of time. Wii U will be the safe bet. It extends the life of the current generation, so everyone who wants to make games at the current level of power can do so, much more cheaply and at a higher profit, while at the same time not making games for an old, dead system. I can see a lot of smaller Japanese developers who absolutely don't want to absorb next-gen costs to use Wii U as a safe haven.
ozfunghi
Member
(11-19-2012, 04:09 PM)
ozfunghi's Avatar

Originally Posted by Basileus777

There's also this quote from Digital Foundry which is also worrisome.

Actually... I don't find that worrisome; it shows the potential is a lot better than what we're seeing right now in straight ports (made by small teams on short notice).
AmFreak
Member
(11-19-2012, 04:13 PM)

Originally Posted by 1-D_FTW

I have a serious question for buyers: looking at that Anandtech benchmark, would you guys have gladly sacrificed backwards compatibility for a better CPU (and fewer bottlenecks)? Can't help but think they would have been better off going with even an ARM-based CPU if power efficiency was their top concern. Do you feel this was a valid tradeoff?

I fail to see how the slow CPU has anything to do with BC. The Wii's CPU is a standard PowerPC CPU with extensions to its FPU. They should have been able to work something out. Even if it was impossible, they could have:
a) built a separate Broadway chip onto the board.
b) made only one of the 3 cores Broadway-compatible.
c) increased the clock of the chip (may not be possible).
d) increased the core count.
TheGuardian
Member
(11-19-2012, 04:15 PM)
TheGuardian's Avatar

Originally Posted by randomengine

I think the Wii U will be more successful with this level of power than most people think. I think it is ingenious actually.

Think about it. The next generation is going to come out. Costs will explode, but profits will lag for quite a long period of time. Wii U will be the safe bet. It extends the life of the current generation, so everyone who wants to make games at the current level of power can do so, much more cheaply and at a higher profit, while at the same time not making games for an old, dead system. I can see a lot of smaller Japanese developers who absolutely don't want to absorb next-gen costs to use Wii U as a safe haven.

Nothing is preventing publishers from just having some less demanding games be cross-generation, with a PS3/360/WiiU version and a pumped-up PS4/720/PC version (higher resolution, better framerate, more effects, etc.)
Kenka
Member
(11-19-2012, 04:15 PM)
Kenka's Avatar
I am hesitant; I don't know exactly what Nintendo was trying to achieve if the gap in raw power between the PS3 and the WiiU is indeed narrow. An explanation would be that on one hand they can keep costs and heat output low while offering similar graphics, and on the other hand they can push expensive control devices. That would mean that they really believed in the GamePad, since they have gone the low-power console route to favour a control scheme.

But the problem is that this same console seems not to be easy to develop for, and doesn't cater to technicians who have worked on past-generation hardware, which it should in order to fulfill its potential! I hope the latter assumption is incorrect, but the memory architecture suggests Nintendo went its own way.
Father_Brain
Samus made me a Widower :(
(11-19-2012, 04:19 PM)
Father_Brain's Avatar

Originally Posted by ozfunghi

Actually... I don't find that worrisome; it shows the potential is a lot better than what we're seeing right now in straight ports (made by small teams on short notice).

I'd keep my expectations very, very low about whether that potential will be realized, particularly if we're talking about Western third parties.
Basileus777
Member
(11-19-2012, 04:19 PM)
Basileus777's Avatar

Originally Posted by ozfunghi

Actually... I don't find that worrisome; it shows the potential is a lot better than what we're seeing right now in straight ports (made by small teams on short notice).

The potential may be there, but will publishers put the money and effort into it, especially if Nintendo's own tools are an obstacle? That's what is concerning.
RukusProvider
Member
(11-19-2012, 04:19 PM)
RukusProvider's Avatar
Important to keep in mind that the GPU will have to serve 2 screens (TV and tablet). This will take resources from the main game presentation and ultimately limit it.

I wonder if any devs will make a game for the primary screen only to use maximum system power?
mrklaw
MrArseFace
(11-19-2012, 04:20 PM)
mrklaw's Avatar

Originally Posted by randomengine

I think the Wii U will be more successful with this level of power than most people think. I think it is ingenious actually.

Think about it. The next generation is going to come out. Costs will explode, but profits will lag for quite a long period of time. Wii U will be the safe bet. It extends the life of the current generation, so everyone who wants to make games at the current level of power can do so, much more cheaply and at a higher profit, while at the same time not making games for an old, dead system. I can see a lot of smaller Japanese developers who absolutely don't want to absorb next-gen costs to use Wii U as a safe haven.

smaller/indie devs have been producing great looking games on PS3/360 already this gen, and that will continue next gen. You don't have to spend a fortune to develop a game.


Originally Posted by Durante

I think his point was that games targeting 1080p on the big consoles could be ported to run at 720p on Wii U.

I think it's not a bad idea in general, and I made posts to that effect a while ago. The issues are that
- it assumes developers will be going for 1080p on those consoles
- it only helps in scaling graphics, which are pretty easy to scale in the first place. General purpose code is much harder to scale, and I doubt the Wii U CPU (and its bandwidth) will help there vis-a-vis PS4/720 (Even though the latter will probably also feature disappointing CPUs, at least IMHO)

The small "g" is just something that manufacturer uses to denote a specific target application.


if PS4/720 developers go for 720p, for image-quality reasons or to be able to throw more at the screen, then that argument for the Wii U is weakened.
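The raw pixel-count gap behind the "1080p on the other consoles, 720p on Wii U" porting argument quoted above is easy to check (editor's sketch, not from the thread):

```python
# Pixel counts for the two common HD render resolutions.
def pixels(width, height):
    return width * height

ratio = pixels(1920, 1080) / pixels(1280, 720)
print(ratio)  # 2.25 -- 1080p pushes 2.25x the pixels of 720p
```

So a straight resolution drop buys back a bit over half the fill-rate and shading work, but, as Durante notes, it does nothing for CPU-bound general-purpose code.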

Originally Posted by RukusProvider

Important to keep in mind that the GPU will have to serve two screens (TV and GamePad). This will take resources away from the main game presentation and ultimately limit it.

I wonder if any devs will make a game for the primary screen only, to use maximum system power?


there are several games already which run primarily on the TV and just mirror the image to the GamePad. I'm assuming that isn't redrawing anything, and that sending a mirrored display out to the GamePad is trivial in terms of processor cost.
SapientWolf
Member
(11-19-2012, 04:26 PM)
SapientWolf's Avatar

Originally Posted by Durante

Talking about the eDRAM, can we agree that its use is basically required just to achieve parity with PS3/360? Because if so, then I can't imagine that being a very good thing for developer support. In the end it boils down to manual caching / memory management, which was very unpopular on PS3. Clearly it's easier to deal with a single 32 MB buffer than with six 256 kB buffers, but the central idea is still that of a user-managed scratchpad memory.

Now, personally, I love that idea and the associated programming challenges and opportunities (and loved it back in Cell, and when doing GPGPU), but I wonder if the general game developer mindset has changed enough to make it viable as a central pillar of a system's design.

I wish we had the exact bandwidth/latency figures from the eDRAM to the GPU and CPU.

Could that be the reason why the multiplayer portion of BLOPS 2 didn't seem to suffer from heavy framerate issues like the other launch titles? Because it was designed to have a framebuffer that can fit in eDRAM?

I don't see how the memory bandwidth wouldn't be a huge bottleneck in ports.
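The "user-managed scratchpad" pattern Durante describes can be sketched in a few lines (editor's illustration; the buffer size and names are made up). Software explicitly stages a tile of data from large, slow memory into a small, fast buffer, computes there, and writes the result back, instead of letting a hardware cache decide what lives in fast memory:

```python
# Stand-in for the 32 MB eDRAM: a small fast buffer, sized in elements here.
SCRATCHPAD_SIZE = 8

def process_with_scratchpad(main_memory, work):
    out = []
    for start in range(0, len(main_memory), SCRATCHPAD_SIZE):
        scratch = main_memory[start:start + SCRATCHPAD_SIZE]  # "DMA" tile in
        scratch = [work(x) for x in scratch]                  # compute in fast memory
        out.extend(scratch)                                   # "DMA" result out
    return out

print(process_with_scratchpad(list(range(10)), lambda x: x * 2))
# -> [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]
```

The programming burden Durante is pointing at is exactly this explicit tiling and copy scheduling, which a conventional cache hierarchy does automatically.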
Gahiggidy
My aunt & uncle run a Mom & Pop store, "The Gamecube Hut", and sold 80k WiiU within minutes of opening.
(11-19-2012, 04:34 PM)
Gahiggidy's Avatar
I remain optimistic. Nintendo has some very shrewd engineers: people who aren't influenced by peer pressure and just focus on designing a system that gets the most bang for the buck.
Electivirus
Member
(11-19-2012, 04:34 PM)
Electivirus's Avatar

Originally Posted by 1-D_FTW

I have a serious question for buyers: looking at that AnandTech benchmark, would you gladly have sacrificed backwards compatibility for a better CPU (and fewer bottlenecks)? I can't help but think they would have been better off going with even an ARM-based CPU if power efficiency was their top concern. Do you feel this was a valid tradeoff?

I haven't the slightest idea what half of that sentence means (I'm far from the most tech-oriented gamer out there), so probably not. BC is a pretty big requirement for me. :P
Durante
I'm taking it FROM here, so says Mr. Stewart
(11-19-2012, 04:36 PM)
Durante's Avatar

Originally Posted by SapientWolf

Could that be the reason why the multiplayer portion of BLOPS 2 didn't seem to suffer from heavy framerate issues like the other launch titles? Because it was designed to have a framebuffer that can fit in eDRAM?

I don't see how the memory bandwidth wouldn't be a huge bottleneck in ports.

The issue isn't whether or not the FB can fit -- every reasonable FB for the Wii U will fit in 32 MB. The issue is that to match e.g. 360 (it's a much more comparable base than PS3) developers will have to fit more than just the FB into the eDRAM.
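Durante's point that "every reasonable FB will fit" checks out with simple arithmetic (editor's sketch; 4 bytes per pixel for colour and a single 4-byte depth buffer are assumptions, and MSAA or extra G-buffer targets would grow these numbers quickly):

```python
MB = 1024 * 1024

def fb_bytes(width, height, bytes_per_pixel=4, buffers=2):
    """Render target size: one colour + one depth buffer, 4 bytes each."""
    return width * height * bytes_per_pixel * buffers

for label, (w, h) in {"720p": (1280, 720), "1080p": (1920, 1080)}.items():
    print(f"{label}: {fb_bytes(w, h) / MB:.1f} MB of 32 MB")
# 720p:  7.0 MB of 32 MB
# 1080p: 15.8 MB of 32 MB
```

Even a 1080p colour + depth pair uses under half the 32 MB, which is why the question is what *else* (intermediate render targets, frequently sampled textures) has to share the eDRAM to match the 360's bandwidth.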
orioto
Good Art™
(11-19-2012, 04:38 PM)
orioto's Avatar

Originally Posted by randomengine

I think the Wii U will be more successful with this level of power than most people think. I think it is ingenious actually.

Think about it. The next generation is going to come out. Costs will explode, but profits will lag for quite a long period of time. Wii U will be the safe bet. It extends the life of the current generation, so everyone who wants to make games at the current level of power can do so, much more cheaply and at a higher profit, while at the same time not making games for an old, dead system. I can see a lot of smaller Japanese developers who absolutely don't want to absorb next-gen costs to use Wii U as a safe haven.

This can be true in Japan, if the WiiU quickly becomes more popular than the PS3. But the problem in the west is that the WiiU (and that was the risk from the beginning) won't be perceived as anything "new" by gamers if it isn't at least slightly cooler graphically. This "sub-PS360" first impression is deadly in that case. A big portion of the audience for those games already has a PS3, a 360 or a PC, and that's before potential massive price drops make them even more mainstream. They won't see any reason to buy a WiiU, which has the disadvantages of "new" without the advantages...

Its fate could be decided right now.
That's why I'm curious whether this whole mess is a real long-term hardware problem or just a rushed-launch effect. At least we can see things are different this time than for the Wii; it's pretty clear. The Wii drew so much curiosity for its innovation that the tech talk wasn't predominant. But the WiiU is being judged by gamers in the same ring as the other consoles, with the same criteria, and that hurts. I always said Nintendo has to choose a side: keep playing the blue ocean, or play by the big boys' rules, but they can't do half of one and half of the other. We'll see.
Majukun
Member
(11-19-2012, 04:42 PM)
Majukun's Avatar

Originally Posted by randomengine

I think the Wii U will be more successful with this level of power than most people think. I think it is ingenious actually.

Think about it. The next generation is going to come out. Costs will explode, but profits will lag for quite a long period of time. Wii U will be the safe bet. It extends the life of the current generation, so everyone who wants to make games at the current level of power can do so, much more cheaply and at a higher profit, while at the same time not making games for an old, dead system. I can see a lot of smaller Japanese developers who absolutely don't want to absorb next-gen costs to use Wii U as a safe haven.

that's what Nintendo hoped would be the case with the Wii.

It wasn't.

It costs less to make one game for one console and port it to the others than to make exclusive games... basically that's what killed the Wii.

Thread Tools