IdeaMan
My source is my ass!
(09-18-2012, 06:09 PM)
IdeaMan's Avatar

Originally Posted by Always-honest

So now everybody has to act like an unreasonable jackass..?

IdeaMan, I have a question you might be willing to answer.
There seems to be a fair amount of jaggies in the direct capture screens. Do you know anything about the anti-aliasing capabilities of the Wii U? I'm a bit worried, to be honest. I'd hate to see the nice HD graphics ruined by horrible jaggies. Have you heard people talk about adding AA in the industry?

I've been told the reasons since E3 and have tried to reassure people who asked why there wasn't AA in a lot of games in several threads, by repeating the words of my sources: "there will be AA at launch". Truth be told, I haven't understood everything, so I needed some insight from people like alstrong or blu on AA before revealing more, and just forgot to do it as I'm busier these days than pre-E3.
Always-honest
always-end-with-a-swirl
(09-18-2012, 06:11 PM)
Always-honest's Avatar

Originally Posted by IdeaMan

I've been told the reasons since E3 and have tried to reassure people who asked why there wasn't AA in a lot of games in several threads, by repeating the words of my sources: "there will be AA at launch". Truth be told, I haven't understood everything, so I needed some insight from people like alstrong or blu on AA before revealing more, and just forgot to do it as I'm busier these days than pre-E3.

Okay, thanks. There is hope.
IdeaMan
My source is my ass!
(09-18-2012, 06:11 PM)
IdeaMan's Avatar

Originally Posted by Stewox

My biggest concern is that most of that 1GB of OS RAM will be sitting there doing nothing most of the time. I could go on to discuss this in detail, but it's a bit too early; I don't want to speculate too wildly and need more info first.

I hope they will put in place some kind of dynamic allocation of memory resources, so as not to waste, let's say, 0.5GB of RAM from this 1GB pool "in case this COD player would need to switch to the browser and store/cache the game". Well, it's true IF in that 1GB of RAM there is indeed a huge "ghost" room reserved for multi-tasking.

We can even imagine a "hardware hardcore mode", where a dev would choose to force a real pause/quit of their game rather than multi-tasking, to fully use this reserved space.

All this is relevant only if Nintendo does indeed use this 1GB of RAM for the OS this way.
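
To put the multi-tasking reservation idea in concrete terms, here's a minimal sketch (all names and numbers are invented for illustration; this is not any real Nintendo API):

    # Hypothetical sketch of a dynamic OS/game RAM split -- not a real API.
    OS_POOL_MB = 1024        # the reported 1GB OS reservation
    OS_BASELINE_MB = 256     # assumed: what the OS itself keeps resident

    def reclaimable_mb(suspended_app_mb=0):
        """RAM a game could borrow if the OS handed back its "ghost" room.

        suspended_app_mb: space held to keep a browser/app suspended.
        """
        return OS_POOL_MB - OS_BASELINE_MB - suspended_app_mb

    # "Hardware hardcore mode": the game forces a real pause/quit instead
    # of a suspend, so the whole ghost room goes back to the game.
    print(reclaimable_mb(0))    # 768 MB freed for the game
    print(reclaimable_mb(512))  # only 256 MB if a 512MB app stays suspended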
Last edited by IdeaMan; 09-18-2012 at 06:16 PM.
SapientWolf
Member
(09-18-2012, 06:13 PM)
SapientWolf's Avatar

Originally Posted by Stewox

My biggest concern is that most of that 1GB of OS RAM will be sitting there doing nothing most of the time. I could go on to discuss this in detail, but it's a bit too early; I don't want to speculate too wildly and need more info first.

If it's slower memory then it may not be much use for gaming. It would be pretty unwise to carve out 1GB of GDDR3 just for an OS.
Snakeyes
Member
(09-18-2012, 06:15 PM)
Snakeyes's Avatar

Originally Posted by superchunk

It's not like the 360 had "next-gen" visuals from its 3rd party current-gen ports in 2005.

I'd say they all looked at least slightly better (except for Gun) without taking the resolution bump into account.
Stewox
Banned
(09-18-2012, 06:15 PM)

Originally Posted by IdeaMan

I hope they will put in place some kind of dynamic allocation of memory resources, so as not to waste, let's say, 0.5GB of RAM from this 1GB pool "in case this COD player would need to switch to the browser and store/cache the game".

Just pray it's the same kind of memory. I'm positive.


EDIT:

Originally Posted by SapientWolf

If it's slower memory then it may not be much use for gaming. It would be pretty unwise to carve out 1GB of GDDR3 just for an OS.

Yeah, *beaten*.

Also, it's their multi-tasking suspend stuff that we were talking about; that's why it's such a big chunk, but they need to find solutions.
Last edited by Stewox; 09-18-2012 at 06:18 PM.
IdeaMan
My source is my ass!
(09-18-2012, 06:18 PM)
IdeaMan's Avatar

Originally Posted by Stewox

Just pray it's the same kind of memory.


EDIT:


Yeah, *beaten*.

Well, they could always offload some things that are less demanding in terms of RAM speed onto the slower chips, if it's designed that way.
Fourth Storm
Member
(09-18-2012, 06:25 PM)
Fourth Storm's Avatar

Originally Posted by SapientWolf

If it's slower memory then it may not be much use for gaming. It would be pretty unwise to carve out 1GB of GDDR3 just for an OS.

Indeed. That so much is reserved definitely points to DDR3. The other thing is (and this is where my knowledge of matters is more fuzzy, but I've read some beyond3D posts on the subject :P ) that increasing RAM size without significantly increasing bandwidth will only net you so much. There comes a point where the benefits become more limited in application (you could load up the next level with the extra RAM say, but you're not going to get more frames per second) and a 2 GB to 25-30 GB/s ratio seems way off for gaming applications. 1 GB w/ that bandwidth ballpark makes more sense.
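
A quick back-of-the-envelope on why that ratio looks off (a rough sketch using the numbers quoted above):

    # Time to touch every byte of RAM once at a given bandwidth -- a crude
    # way to see why capacity without bandwidth stops paying off.
    ram_gb = 2.0
    bandwidth_gb_per_s = 25.0      # the ballpark quoted above
    frame_budget_ms = 1000 / 60.0  # ~16.7 ms per frame at 60fps

    full_sweep_ms = ram_gb / bandwidth_gb_per_s * 1000
    print(full_sweep_ms)                    # 80.0 ms to read all of it once
    print(full_sweep_ms / frame_budget_ms)  # ~4.8 full frame budgets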
blu
Member
(09-18-2012, 07:26 PM)
blu's Avatar

Originally Posted by Fourth Storm

Indeed. That so much is reserved definitely points to DDR3. The other thing is (and this is where my knowledge of matters is more fuzzy, but I've read some beyond3D posts on the subject :P ) that increasing RAM size without significantly increasing bandwidth will only net you so much. There comes a point where the benefits become more limited in application (you could load up the next level with the extra RAM say, but you're not going to get more frames per second) and a 2 GB to 25-30 GB/s ratio seems way off for gaming applications. 1 GB w/ that bandwidth ballpark makes more sense.

More memory will not give you more frames per second anyway, unless we're talking scenarios where your scene assets do not fit in your RAM, so you'd have to do mid-frame fetches of assets from storage, in which case storage speed becomes the bottleneck. But how is 2GB not better than 1GB at the same BW? Yes, there might be a diminishing return with increasing the amount to, say, 32GB from 16GB as you have to fill up that memory with something. At 25GB storage you can try to preload most of your content off the disk, but at 22.5MB/s that's 18.5 minutes to read the entire disk, so you need to do that in the same streaming fashion anyway. But we're not talking such amounts here - we're talking 1GB vs 2GB. Trust me, the latter is definitely better than the former.
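
(For anyone checking the arithmetic, the 18.5-minute figure works out with decimal units:)

    # Sanity check on the disc-read figure above, decimal GB/MB assumed.
    disc_gb = 25
    read_mb_per_s = 22.5
    seconds = disc_gb * 1000 / read_mb_per_s
    print(seconds / 60)   # ~18.5 minutes to read the whole disc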
DieH@rd
Member
(09-18-2012, 07:37 PM)
DieH@rd's Avatar
Anyone remember "Crytek wants 8GB of RAM in next-gen consoles" and the shitstorm that ensued after that news? :)

People there thought that 2GB was enough; I for one immediately accepted the 8-gig vision [downscaled to 4 gigs if they're gonna stick with GDDR5]. Screw the complicated mobo design, this has to be done! :D
Xun
Member
(09-18-2012, 07:39 PM)
Xun's Avatar
I know the GPU should hopefully be quite meaty, but any idea of how the CPU will stack up against the PS3/360?
Fourth Storm
Member
(09-18-2012, 07:44 PM)
Fourth Storm's Avatar

Originally Posted by blu

More memory will not give you more frames per second anyway, unless we're talking scenarios where your scene assets do not fit in your RAM, so you'd have to do mid-frame fetches of assets from storage, in which case storage speed becomes the bottleneck. But how is 2GB not better than 1GB at the same BW? Yes, there might be a diminishing return with increasing the amount to, say, 32GB from 16GB as you have to fill up that memory with something. At 25GB storage you can try to preload most of your content off the disk, but at 22.5MB/s that's 18.5 minutes to read the entire disk, so you need to do that in the same streaming fashion anyway. But we're not talking such amounts here - we're talking 1GB vs 2GB. Trust me, the latter is definitely better than the former.

Point well taken. I agree more is better even at the same bandwidth, but you don't think there is a diminishing return beyond 1 or 1.5 GB of RAM w/ low bandwidth? From what I have read, the golden ratio of capacity:bandwidth could not be agreed upon, but having higher-bandwidth memory definitely seemed to impact performance. Forgetting about the reserved amount for system applications for a moment, wouldn't you rather have 1 GB of high-bandwidth GDDR5 than 2 GB of DDR3 at half the throughput? Obviously, performance is affected by the GPU and entire system as well (plus the benefits of the eDRAM, which should go beyond Xenos, as you so kindly explained), but we don't know at this point what the main graphical bottleneck of the system is.
FyreWulff
Banned
(09-18-2012, 07:44 PM)
FyreWulff's Avatar

Originally Posted by Stewox

... ff..finally, sort of, FCC IDs not revealed here obviously.

related post: http://www.neogaf.com/forum/showpost...&postcount=818

Definitely not bluetooth :)

This makes sense, as someone else on GAF crunched the numbers and figured out that the bandwidth needed to push the images out to the gamepads was about the same as Wireless-N's upper limit, so 5GHz networking makes sense.

Also means GamePads shouldn't be murdered by microwaves being run.
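
The rough math behind that claim, assuming an uncompressed 854x480 stream (the real link is surely compressed, so treat this as an upper bound):

    # Raw bandwidth for one uncompressed GamePad video stream.
    width, height = 854, 480
    bytes_per_pixel = 3   # 24-bit colour
    fps = 60

    mbit_per_s = width * height * bytes_per_pixel * fps * 8 / 1e6
    print(mbit_per_s)     # ~590 Mbit/s, right at 802.11n's 600 Mbit/s ceiling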
frankie_baby
Banned
(09-18-2012, 07:44 PM)

Originally Posted by Xun

I know the GPU should be hopefully quite meaty, but any idea of how the CPU will stack up against the PS3/360?

With the common opinion of "Wii U is using enhanced Broadways so it must be bad" going round, I doubt you'll get any sensible answer.
Fourth Storm
Member
(09-18-2012, 07:47 PM)
Fourth Storm's Avatar

Originally Posted by FyreWulff

This makes sense, as someone else on GAF crunched the numbers and figured out that the bandwidth needed to push the images out to the gamepads was about the same as Wireless-N's upper limit, so 5GHz networking makes sense.

Also means GamePads shouldn't be murdered by microwaves being run.

Yes, but that's the A/V signal. What is sending the actual input data to and from the Gamepad? I always assumed Bluetooth for that.
Thraktor
Member
(09-18-2012, 07:55 PM)
Thraktor's Avatar

Originally Posted by Stewox

... ff..finally, sort of, FCC IDs not revealed here obviously.

related post: http://www.neogaf.com/forum/showpost...&postcount=818

Definitely not bluetooth :)

Interesting that they're going with WUP for the product codes. In the past they've almost always used the console's codename, so I was expecting something like CFE.
Graphics Horse
graphics horse
graphics horse
does whatever a
graphics horse does
(09-18-2012, 08:16 PM)
Graphics Horse's Avatar

Originally Posted by Thraktor

Interesting that they're going with WUP for the product codes. In the past they've almost always used the console's codename, so I was expecting something like CFE.

Perhaps because this has been the longest ever delay between final name reveal and release, or feels like it.
Daschysta
Member
(09-18-2012, 08:19 PM)
Daschysta's Avatar

Originally Posted by SapientWolf

If it's slower memory then it may not be much use for gaming. It would be pretty unwise to carve out 1GB of GDDR3 just for an OS.

Unless they were being conservative and always intended to allocate more for gaming once they had a true idea of the footprint the OS would require in everyday use. They did precisely that with the 3DS; it's hardly unprecedented.
Rolf NB
Member
(09-18-2012, 08:21 PM)
Rolf NB's Avatar

Originally Posted by frankie_baby

With the common opinion of "Wii U is using enhanced Broadways so it must be bad" going round, I doubt you'll get any sensible answer.

But that is a sensible answer. Should be on par with Xbox 360 CPU-wise.
KojiKnight
Member
(09-18-2012, 08:23 PM)
KojiKnight's Avatar

Originally Posted by Thraktor

Interesting that they're going with WUP for the product codes. In the past they've almost always used the console's codename, so I was expecting something like CFE.

What does the P stand for? Remember, Wii U had several supposed codenames (Stream, Cafe, etc.)... The P still sounds to me like a leftover from some sort of codename.
Proc
Member
(09-18-2012, 08:24 PM)
Proc's Avatar
Pre-ordered the pro model today at EB Games... figured I'd reserve one just in case I end up wanting one. I'm on the fence right now.
Graphics Horse
graphics horse
graphics horse
does whatever a
graphics horse does
(09-18-2012, 08:26 PM)
Graphics Horse's Avatar

Originally Posted by Daschysta

Unless they were being conservative and always intended to allocate more for gaming once they had a true idea of the footprint the OS would require in everyday use. They did precisely that with the 3DS; it's hardly unprecedented.


I keep hearing this about the 3DS, but I thought it was the second CPU core they opened up for devs? Wasn't the available RAM 96MB from the start, or is it more now?
ColdBlooder
Banned
(09-18-2012, 08:28 PM)
ColdBlooder's Avatar

Originally Posted by Van Owen

We've already seen Nintendo games on Wii U, and they could be done on 360.

Iwata said at an investor meeting that NOT EVERY GAME GETS AN AAA BUDGET.

There's your explanation.
Van Owen
Member
(09-18-2012, 08:30 PM)
Van Owen's Avatar

Originally Posted by ColdBlooder

Iwata said at an investor meeting that NOT EVERY GAME GETS AN AAA BUDGET.

There's your explanation.

Well I'm glad they're taking Wii U's launch seriously with cheaply made games? lol

When Retro's game looks like a good PS3/360 title it will be "IT'S THEIR FIRST GAME ON NEW HARDWARE".
ColdBlooder
Banned
(09-18-2012, 08:31 PM)
ColdBlooder's Avatar

Originally Posted by KojiKnight

What's the P stand for? Remember, Wii U had several supposed codenames (Stream, Cafe, etc)... The P still sounds to me like a left over from some sort of code name.

Well, since the "Deluxe Set" is called the "Premium Set" in Japan and Europe, and you can see a black GamePad here, we could assume it means "Wii U Premium".

White hardware may be labeled "WUB" for Wii U Basic.
ColdBlooder
Banned
(09-18-2012, 08:33 PM)
ColdBlooder's Avatar

Originally Posted by Van Owen

Well I'm glad they're taking Wii U's launch seriously with cheaply made games? lol

When Retro's game looks like a good PS3/360 title it will be "IT'S THEIR FIRST GAME ON NEW HARDWARE".

Because the mainstream audience doesn't care about "LOOK AT DEM GRAFIX" like the few % of hardcorezzz here on NeoGAF do.
DragonSworne
Satoru Iwata and his Trilateral Commission cronies are suppressing the truth about Retro. Wake up, sheeple!
(09-18-2012, 08:33 PM)
DragonSworne's Avatar

Originally Posted by Van Owen

Well I'm glad they're taking Wii U's launch seriously with cheaply made games? lol

When Retro's game looks like a good PS3/360 title it will be "IT'S THEIR FIRST GAME ON NEW HARDWARE".

And when it looks better people will say "360 and PS3 could do it".
Graphics Horse
graphics horse
graphics horse
does whatever a
graphics horse does
(09-18-2012, 08:37 PM)
Graphics Horse's Avatar

Originally Posted by ColdBlooder

Well, since the "Deluxe Set" is called the "Premium Set" in Japan and Europe, and you can see a black GamePad here, we could assume it means "Wii U Premium".

White hardware may be labeled "WUB" for Wii U Basic.

Wub wub.

No, I don't think the premium and basic batteries have different product codes :p (WUP-012)

WUPad, unless the main unit is also WUP, then I dunno.
Thraktor
Member
(09-18-2012, 08:37 PM)
Thraktor's Avatar

Originally Posted by SapientWolf

If it's slower memory then it may not be much use for gaming. It would be pretty unwise to carve out 1GB of GDDR3 just for an OS.

They wouldn't be using GDDR3 in any case, but I'd put a good amount of money on it being a single pool of DDR3 with a 128 bit bus. The only questions are really the latency (my guess is this is what Nintendo's primarily concerned with) and clock speed (which is likely to be a clean multiple/divisor of GPU and CPU clocks).
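
For what it's worth, a 128-bit DDR3 bus lands right in the bandwidth ballpark discussed earlier. A quick sketch, assuming DDR3-1600 (the transfer rate is my guess, not a confirmed spec):

    # Peak bandwidth of a DDR3 interface: bus width x effective rate.
    bus_bits = 128
    transfers_mt_per_s = 1600   # assumed DDR3-1600

    gb_per_s = bus_bits / 8 * transfers_mt_per_s / 1000
    print(gb_per_s)             # 25.6 GB/s, squarely in the 25-30 GB/s range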
Van Owen
Member
(09-18-2012, 08:38 PM)
Van Owen's Avatar

Originally Posted by ColdBlooder

Because the mainstream audience doesn't care about "LOOK AT DEM GRAFIX" like the few % of hardcorezzz here on NeoGAF do.

lol ok, tell that to Wii third party sales that weren't casual crap.
frankie_baby
Banned
(09-18-2012, 08:41 PM)

Originally Posted by Rolf NB

But that is a sensible answer. Should be on par with Xbox 360 CPU-wise.

But three enhanced Broadways could be noticeably better than the 360 CPU, just different.
Shokio
Junior Member
(09-18-2012, 08:42 PM)
Shokio's Avatar

Originally Posted by DragonSworne

And when it looks better people will say "360 and PS3 could do it".

Lol pretty much. For some people, NO Wii U game will EVER look better than the 360 and PS3 due to their extreme bias.
KojiKnight
Member
(09-18-2012, 08:45 PM)
KojiKnight's Avatar

Originally Posted by ColdBlooder

Well since the "Deluxe Set" is called "Premium Set" in Japan and Europe and the fact that you see a black Gamepad here, we could assume it means "Wii U Premium"

White hardware may be labeled "WUB" For Wii U Basic

*goes to look at the back of the white controller image* In the meantime, a flaw in your logic: how will GAMES be listed? WUG? That would be the first time in Nintendo's history that a system, its accessories, and its games have had different three-letter identifiers.

(edit) The white controller also has WUP on the back.
Last edited by KojiKnight; 09-18-2012 at 08:47 PM.
Van Owen
Member
(09-18-2012, 08:47 PM)
Van Owen's Avatar

Originally Posted by Shokio

Lol pretty much. For some people, NO Wii U game will EVER look better than the 360 and PS3 due to their extreme bias.

You realize this could easily be flipped, right...?
KojiKnight
Member
(09-18-2012, 08:51 PM)
KojiKnight's Avatar
Are there any shots of the back of any Wii U boxes yet? I'd love to see if they too use the WUP three-letter code. I'm going to assume they do, though, since Nintendo has always used the three-letter code.

The only difference between the black and white controllers' serials is the initial three letters (LTA for white, LTB for black).
blu
Member
(09-18-2012, 08:51 PM)
blu's Avatar

Originally Posted by Fourth Storm

Point well taken. I agree more is better even at the same bandwidth, but you don't think there is a diminishing return beyond 1 or 1.5 GB of RAM w/ low bandwidth? From what I have read, the golden ratio of capacity:bandwidth could not be agreed upon, but having higher-bandwidth memory definitely seemed to impact performance.

Of course. I was discussing the 'same BW' scenario though.

Forgetting about the reserved amount for system applications for a moment, wouldn't you rather have 1 GB of high bandwidth GDDR5 than 2 GB of DDR3 at half the throughput?

It really depends on various use-case factors, such as the type/genre of game/app, art direction, etc. Basically, it varies from title to title. If you asked me to choose disregarding all such use-case factors, though, I'd probably go with the 2GB of DDR3 for a system like the WiiU where that RAM would not be the sole BW provider. If that condition was not true, though, and that RAM was the sole BW provider, then I'd go with the GDDR5.

Obviously, performance is affected by the GPU and entire system as well (plus the benefits of the eDRAM, which should go beyond Xenos, as you so kindly explained), but we don't know at this point what the main graphical bottleneck of the system is.

We surely don't. That said, even if the system was the most well-rounded console imaginable, the use-case factorisation applies here as well - some game will still be RAM BW limited, another - GPU ALU limited, yet another - CPU limited, and yet another - storage access speed limited (apparently various combinations of those factors during various runtime timestamps are also valid). The job of the engineers responsible for the hw design is not to make a console nobody could ever hit a bottleneck on. It's to avoid making a system where no matter what people do, they always hit the same few bottlenecks time and time again.
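
To make that concrete, a toy per-frame budget check (every workload number here is invented purely for illustration) shows how the limiting resource shifts from title to title:

    # Toy model: whichever resource a frame saturates first is that
    # title's bottleneck. All numbers are invented for illustration.
    def bottleneck(frame, budget):
        usage = {k: frame[k] / budget[k] for k in budget}
        return max(usage, key=usage.get)

    budget = {"ram_gb": 25.0 / 60,     # RAM BW available per 60fps frame
              "gpu_gflop": 400.0 / 60, # GPU ALU throughput per frame
              "cpu_gflop": 15.0 / 60}  # CPU throughput per frame

    racer   = {"ram_gb": 0.40, "gpu_gflop": 3.0, "cpu_gflop": 0.05}
    sandbox = {"ram_gb": 0.10, "gpu_gflop": 2.0, "cpu_gflop": 0.24}

    print(bottleneck(racer, budget))    # ram_gb: this title is BW limited
    print(bottleneck(sandbox, budget))  # cpu_gflop: this one is CPU limited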
Log4Girlz
I recently went to my friends house to check out his wii. I was generally impressed. It was larger than I expected though.
(09-18-2012, 08:53 PM)
Log4Girlz's Avatar
So if the whole system tends to run at 40 watts, what are we expecting out of the GPU, 15-20 watts?
Graphics Horse
graphics horse
graphics horse
does whatever a
graphics horse does
(09-18-2012, 08:56 PM)
Graphics Horse's Avatar

Originally Posted by KojiKnight

Are there any shots of the back of any Wii U boxes yet? I'd love to see if they too use the WUP 3 digit code, I'm going to assume they do though since Nintendo has always used the 3 letter code.

The only difference between the black and white controllers serial is the initial 3 letters (LTA for white, LTB for black).

This link mentions a 250-volt adapter with the product code WUP-002. Both adapters have WUP, so the Wii U box (guessing WUP-001) will too.
Shokio
Junior Member
(09-18-2012, 08:56 PM)
Shokio's Avatar

Originally Posted by Van Owen

You realize this could easily be flipped, right...?

Except those saying the games look better make much more sense, considering the stronger hardware.
Jonm1010
Member
(09-18-2012, 08:57 PM)
Jonm1010's Avatar

Originally Posted by ColdBlooder

Because the mainstream audience dosenīt care about "LOOK AT DEM GRAFIX" like the few % of hardcorezzz like here on NeoGAF.

I like Nintendo damage controllers.

"OMG, look how beautiful Mario Galaxy looks. Looks better than some 360 games."

Guy: "So Wii U is gonna be stomped by PS4/720."

"No one cares about graphics!"
ColdBlooder
Banned
(09-18-2012, 08:57 PM)
ColdBlooder's Avatar

Originally Posted by Shokio

Except those saying the games look better make much more sense, considering the stronger hardware.

Let him dream.

In his world, a ~600 GFLOP GPU with a far more modern feature set produces shittier graphics than a 241 GFLOP GPU which lacks most modern features.
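
Those headline numbers come from the standard theoretical-FLOPS formula: ALUs x FLOPs-per-ALU-per-clock x clock. Xenos's commonly cited figure falls out directly; the second line is a made-up example, since the Wii U's shader count and clock aren't public:

    # GPU theoretical GFLOPS = ALUs x FLOPs per ALU per clock x clock (GHz).
    def gflops(alus, flops_per_alu, clock_ghz):
        return alus * flops_per_alu * clock_ghz

    # Xenos (360): 48 ALUs, vec4+scalar MADD = 10 FLOPs/clock, 500 MHz.
    print(gflops(48, 10, 0.5))   # 240 GFLOPS, the commonly cited figure
    # A hypothetical 320-ALU part at 600 MHz (2 FLOPs per scalar ALU):
    print(gflops(320, 2, 0.6))   # 384 GFLOPS -- an example, not a spec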
Thunder Monkey
(09-18-2012, 08:59 PM)
Thunder Monkey's Avatar

Originally Posted by Van Owen

Well I'm glad they're taking Wii U's launch seriously with cheaply made games? lol

When Retro's game looks like a good PS3/360 title it will be "IT'S THEIR FIRST GAME ON NEW HARDWARE".

I have no doubt it will look like a top tier PS3/360 title at the very least.

Though, as EatChildren has said, it may not have a tech-focused artstyle.

But make no mistake, my friend. Retro and EAD Tokyo are not studios to be trifled with. When it comes to art design and technical competence, these two studios are Nintendo's best, bar none.

The games probably won't look a generation removed from the PS3 or 360's best. But they will be gorgeous games all the same.
ColdBlooder
Banned
(09-18-2012, 08:59 PM)
ColdBlooder's Avatar

Originally Posted by Jonm1010

I like Nintendo damage controllers.

"OMG, look how beautiful Mario Galaxy looks. Looks better than some 360 games."

Guy: "So Wii U is gonna be stomped by PS4/720."

"No one cares about graphics!"

Quote me where I said that!

And beautiful graphics are a nice addition, but if I had to choose, I would take superior gameplay over pretty graphics...

E3 2013 can't come soon enough, but not for the reason you might think.
blu
Member
(09-18-2012, 09:01 PM)
blu's Avatar

Originally Posted by Log4Girlz

So if the whole system tends to run at 40 watts, what are we expecting out of the GPU, 15-20 watts?

High 20s.
Van Owen
Member
(09-18-2012, 09:02 PM)
Van Owen's Avatar

Originally Posted by Shokio

Except those saying the games look better make much more sense, considering the stronger hardware.

By a small margin.
Log4Girlz
I recently went to my friends house to check out his wii. I was generally impressed. It was larger than I expected though.
(09-18-2012, 09:02 PM)
Log4Girlz's Avatar

Originally Posted by blu

High 20s.

Do we have an idea of what the chip could realistically do in terms of GFLOPS per watt? I have no idea what's reasonable nowadays.
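
One way to bracket it, using a known 40nm desktop part as the reference point (the efficiency number is a ballpark, not insider info):

    # GFLOPS-per-watt ballpark from a 40nm reference part.
    # Radeon HD 4770 (40nm): ~960 GFLOPS at ~80W TDP.
    gflops_per_watt = 960 / 80   # ~12 GFLOPS/W, desktop-class efficiency
    gpu_watts = 28               # blu's "high 20s" estimate above

    print(gflops_per_watt * gpu_watts)  # ~336 GFLOPS at that efficiency
    # An embedded, lower-clocked part could do better per watt; a very
    # conservative design could do worse. A range-finder, not a spec.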
Jonm1010
Member
(09-18-2012, 09:03 PM)
Jonm1010's Avatar

Originally Posted by ColdBlooder

Quote me where I said that!

And beautiful graphics are a nice addition, but if I had to choose, I would take superior gameplay over pretty graphics...

E3 2013 can't come soon enough, but not for the reason you might think.

Graphics and gameplay aren't mutually exclusive.

And it was just a harmless jab, not necessarily directed at you.
KojiKnight
Member
(09-18-2012, 09:04 PM)
KojiKnight's Avatar

Originally Posted by Graphics Horse

This link mentions a 250-volt adapter with the product code WUP-002. Both adapters have WUP, so the Wii U box (guessing WUP-001) will too.

Thank you, sir. That more or less confirms the three-letter code, but it still doesn't tell us what the P stands for... I'm probably far more interested in this than I should be, but I've always been fond of these codes ever since I read an article on what the DOL stood for on the back of GCN game cases (and later confirmed for myself that every other system Nintendo made did the same).
Thunder Monkey
(09-18-2012, 09:04 PM)
Thunder Monkey's Avatar

Originally Posted by Van Owen

By a small margin.

One might even say a marginal margin.

Look, there will be some things the Wii U undoubtedly does better. Texture res can be higher, world size can be larger. But we are still talking last-gen-plus visuals.

I happen to think that if you have trouble seeing your vision to fruition on hardware that capable... you're doing something really wrong.
XtremeXpider
Member
(09-18-2012, 09:07 PM)
XtremeXpider's Avatar

Originally Posted by ColdBlooder

In his world, a ~600 GFLOP GPU with a far more modern feature set produces shittier graphics than a 241 GFLOP GPU which lacks most modern features.

Are you sure of those specs?
