Size =/= Power
"Check the box." Guys, it's just a box.
/reggie
I'm just being a pessimist; size isn't everything, my gf tells me.
What's wrong with skinny arms? My gf loves them. I know some girls who love skinny arms and hate arms with too much muscle.
You must have skinny arms. There's a thread on the OT side that could help you with that.
Quote:
Yes, many publishers have reported losses over the years.
Well, AS A WHOLE, yes, publishers did make a lot of money this generation. But that's mostly Nintendo's profit. It would be like saying a classroom full of D-students has a B average just because one genius is in it.
There were years where the only reason the industry as a whole put any numbers on the board was because Nintendo counteracted all the losses from the other publishers. Heck even Activision lost money last year, and they had CODBLOPS, Starcraft II, AND WoW tithes.
Quote:
That's just a company selling their product, saying nice things about it.
Well, you don't have to believe what Nintendo stated about the Wii's size:
I agree. I think Nintendo has done what it can to keep the size down, but... what they have decided to pack in there makes it a problem. We also know that the Wii U can only be placed in one position, and it has more vents. And they have done that to capture the market that they lost with the Wii.
In the context I'm talking about Gameline wouldn't fit, though I admit I learned something new. It seems to have been for downloading games only at the time. Nintendo's was used for stock trading, weather updates, banking, and some other things we do on the internet.
A lot of changes had to be made this gen and we're stuck in the middle of a big economic crisis, but you are talking as if everyone in the industry except Nintendo has been bleeding money for the last six years. If that were the case, everyone would have left the industry already.
Just like in other big industries, in the long run there will be only a few big publishers left in the videogame industry making the big blockbuster titles and a bunch of indie developers making low budget games.
There were plenty of services like this, long before the Nintendo one, in the '80s. The Source, CompuServe or Quantum Link, for example.
What's the rumoured value for the Wii U?
32MB and no, the "Loop" won't have 100MB of eDRAM.
Found what I was looking for and as I thought it was in response to one of wsippel's posts.
http://forum.beyond3d.com/showpost.php?p=1573832&postcount=191
From the same place:
Quote:
Originally Posted by brain_stew
Chatter about a significant amount of eDRAM is spot on. Final hardware will have enough for either 720p w/ MSAA or 1080p rendering in a single pass. I don't think anyone will really be disappointed with the memory/cache setup once more specific details leak
So it is fine yippee
fat G-buffer DS
1920*1080*4xMSAA*4buffers*8bytes per pixel (FP16) is already 253MB.
1920*1080*4xMSAA*4bytes per pixel (32bpp Z) is 31MB.
LPP
Normals (FP16) + depth (32bpp), 4xMSAA -> 95MB
Divide by 2.25 for 720p
Though I'd like more explanation on that.
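For what it's worth, those figures check out. A quick Python sketch of the same arithmetic (resolutions, sample counts and per-pixel sizes taken straight from the numbers above; the helper function is just for illustration):

```python
# Framebuffer footprint: width * height * MSAA samples * buffers * bytes per pixel.
MB = 1024 * 1024

def fb_size_mb(width, height, msaa, buffers, bytes_per_pixel):
    """Raw framebuffer footprint in megabytes."""
    return width * height * msaa * buffers * bytes_per_pixel / MB

# Fat G-buffer deferred shading at 1080p, 4xMSAA:
# four FP16 render targets (8 bytes/pixel) plus a 32bpp Z-buffer.
color = fb_size_mb(1920, 1080, 4, 4, 8)       # ~253.1 MB
depth = fb_size_mb(1920, 1080, 4, 1, 4)       # ~31.6 MB

# Light pre-pass: FP16 normals (8 bytes) + 32bpp depth, 4xMSAA.
lpp = fb_size_mb(1920, 1080, 4, 1, 8 + 4)     # ~94.9 MB

# 720p needs 2.25x less memory, pixel-count ratio:
scale = (1920 * 1080) / (1280 * 720)          # 2.25

print(round(color), round(depth), round(lpp), scale)
```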
What about it? It's just a 32-bit Z-buffer.
This thread is a rollercoaster; some days it's pretty optimistic, and then pretty pessimistic.
Something I'd like to know, though: is it ever mentioned in the thread that an RV770LE was in the kits, as per wsippel's sources' claim waaaay back when?
(IIRC it was something like a 3.6GHz TriCore, AMD HD Radeon 4830, & 1GB of memory, right?)
Quote:
Originally Posted by wsippel
From what I've heard, it is GDDR3. Not GDDR5, not XDR2. Maybe my source is wrong, I don't know. That's not necessarily a problem of course, depending on how wide the bus is, and there might even be benefits (lower latency). No idea. The GPU supposedly has 640 SPs and is running at 500MHz, and the CPU is a triple-core PPC running at 3.5GHz with a metric ton of L2 cache. Again, no idea if it's actually true - just something I was told in private. It's all second-hand information, and the devkits are not even close to final, anyway.
I should probably rephrase to be sure I get the answer I want.
How much embedded memory is necessary to achieve 1080p @ 60fps with some form of good AA, plus two tablets that also run at their full resolution (800x540?) @ 60fps with good AA?
Will the rumored 32MB of eDRAM (or 1T-SRAM) cover that?
Is it possible that Nintendo would include 48MB or 64MB?
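Nobody has actually run the numbers on the combined case, so here's a rough Python sketch. Everything in it is an assumption: no MSAA, one 32bpp color buffer plus one 32bpp Z-buffer per render target, and the 800x540 tablet resolution guessed above taken at face value.

```python
MB = 1024 * 1024

def target_mb(width, height, msaa=1, bytes_per_sample=4 + 4):
    """Color (32bpp) + depth (32bpp) footprint for one render target, in MB."""
    return width * height * msaa * bytes_per_sample / MB

tv      = target_mb(1920, 1080)     # ~15.8 MB for 1080p, no AA
tablets = 2 * target_mb(800, 540)   # ~6.6 MB for both tablet streams

# ~22.4 MB total: fits in 32MB of eDRAM, but 1080p with 4xMSAA (~63 MB) would not.
print(tv + tablets)
```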
Actually, I think I need to find this information for posterity's sake. It was posted in this thread, but damn if it's not going to be a pain in the ass to backtrack and find it. :lol
The ride is really fun.
Okay, so let us assume these are the specs for the alpha hardware.
Final hardware should have a 4-core PPC at 3.5GHz with 32MB of eDRAM, an 800 SPU RV770 GPU at 600MHz (heavily modified to be DirectX 12 compliant), and 2GB of GDDR3 with an extra-wide bus (what was that 4-stream thing called?) or 1.5GB of GDDR5 or XDR2.
Someone told me that an early devkit supposedly used an off-the-shelf RV770LE. But that chip would certainly not be used in the final hardware either way.
DX12 isn't even coming next year. It hasn't even been announced yet. Besides, why would they modify an RV770 for that?
Now don't get me wrong, I'm no mega graphics whore, but MSAA only being doable at 720p does seem a little outdated, doesn't it? Don't the PS3/360 do all that already, or is it really a matter of the cost (time) of the process?
Also, can someone clarify what the "1080p rendering in a single pass" means? I'm taking that to mean "1080p with no anti-aliasing"
By "single pass", brain_stew means without tiling (the process of splitting each frame into smaller chunks and swapping them in and out of the framebuffer and the main memory to render the full frame). The X360 can only do 720p without AA in a single pass. If you want to add AA or render at 1080p, you need to tile the framebuffer (which many devs do). If what brain_stew says is true, the Wii U can either do 720p with AA in a single pass or 1080p with no AA in a single pass, which is a substantial improvement over the 360. Just like the 360, devs can tile the framebuffer if they want 1080p with AA (and they'll have a much easier time of it too, given the extra RAM).
Quote:
Though I'd like more explanation on that.
The explanation is that the amount of GPU-local eDRAM one can expect on the Wii U cannot meet the needs of deferred shading algorithms, not without some form of tiling.
It means that it doesn't have to tile the picture, like the Xb360 does.
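To put numbers on the tiling explanation: the 360's Xenos eDRAM is 10MB, and a small Python sketch shows why 720p with no AA fits in a single pass while adding 4xMSAA forces tiling. The helper is illustrative only (32bpp color + 32bpp depth assumed, and real predicated tiling has extra overlap overhead).

```python
import math

MB = 1024 * 1024
EDRAM_MB = 10  # Xenos daughter die

def tiles_needed(width, height, msaa, bytes_per_sample=4 + 4, edram_mb=EDRAM_MB):
    """How many tiles the framebuffer must be split into to fit in eDRAM."""
    frame_mb = width * height * msaa * bytes_per_sample / MB
    return math.ceil(frame_mb / edram_mb)

print(tiles_needed(1280, 720, 1))                 # 1: 720p no AA, single pass
print(tiles_needed(1280, 720, 4))                 # 3: 720p 4xMSAA must be tiled
print(tiles_needed(1920, 1080, 1, edram_mb=32))   # 1: 1080p no AA fits in 32MB
print(tiles_needed(1280, 720, 4, edram_mb=32))    # 1: 720p 4xMSAA fits in 32MB too
```

The last two lines match brain_stew's claim: 32MB would be enough for either 720p with MSAA or 1080p rendering in a single pass.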
lherre seemed to be pretty adamant about it being three cores so I'll stick with that.
And this comes from the early dev kit?
The system sounds nice enough for me.
Not to forget, maybe... just maybe... Nintendo still has some tricks up their sleeve regarding graphics.
Quote:
Okay, so let us assume these are the specs for the alpha hardware. Final hardware should have a 4-core PPC at 3.5GHz with 32MB of eDRAM, an 800 SPU RV770 GPU at 600MHz (heavily modified to be DirectX 12 compliant), and 2GB of GDDR3 with an extra-wide bus (what was that 4-stream thing called?) or 1.5GB of GDDR5 or XDR2.
More likely a 3-core POWER7 at a moderate clock speed (they're probably still using Xenon in the devkit) with 12MB of eDRAM, a 512 SPU Southern Islands-like core with 32MB of eDRAM, and 1GB of mysterious memory. I don't think the final hardware will be much faster than the devkit, just more modern. Shouldn't the devkit be intended to set some sort of realistic baseline?
It should. Though history knows cases of early devkits playing bad jokes on devs *cough* dual 970MP filling in for a Xenon *cough*.