
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis


prag16

Banned
it's possibly more to do with encryption/decryption; the flash itself shouldn't be as slow as it seems to be

This is something I put forth ages ago as a possible explanation for some of the slowness. It's the reason why Blackberry phones used to take so long to launch. They basically created a hash of the entire file system and ran a diff on every boot to make sure no rogue shit was present.

Security and antipiracy measures. Makes sense especially after how irreparably hacked up the Wii got. Yes, there's no way the flash memory should be that slow even if it's shitty cheap flash.
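For anyone unfamiliar with the mechanism being described, here's a minimal sketch of that kind of boot-time integrity check: hash every file, diff against a manifest recorded earlier, and flag anything that changed. The manifest path and function names are hypothetical; the point is that the cost grows with total bytes hashed, which is why such checks make booting slow.

```python
import hashlib
import json
import os

MANIFEST = "manifest.json"  # hypothetical: hashes recorded at install/update time

def hash_file(path, chunk=65536):
    """SHA-256 a file in chunks so large files don't blow up memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def scan(root):
    """Walk the tree and hash every file; cost grows with total bytes on disk."""
    return {os.path.join(d, name): hash_file(os.path.join(d, name))
            for d, _, files in os.walk(root) for name in files}

def verify(root):
    """Diff the current tree against the manifest; any mismatch is 'rogue'."""
    with open(MANIFEST) as f:
        expected = json.load(f)
    current = scan(root)
    return [path for path, digest in current.items()
            if expected.get(path) != digest]
```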
 

Donnie

Member
Hey, I factored that into my really silly math!



DMIPS is integer performance. Games are mostly FP-dependent, and I can tell you that increases in floating point performance have dramatically outpaced integer performance in subsequent CPU generations.

Furthermore, because FP compute dominates in games, throwing more INT units at games serves no purpose. This is the primary reason AMD's "8-core" CPUs get absolutely murdered by Intel's 4-core CPUs in games: AMD's "8-core" parts consist of 4 "modules" with 8 INT units but only 4 FP units, while Intel's 4-core CPUs have 4 INT units and 4 FP units, and Intel's FP performance beats the ever-loving shit out of AMD's. AMD having effectively double the INT performance of Intel per core (ignoring bottlenecking at the shared cache and the memory subsystem) doesn't do a thing for AMD in games.

So no, 4-5x more raw power on INT doesn't mean a whole lot. I'm willing to bet my guess of 13x more is in the right ballpark when you realize it's the FP units that will get pounded by games running on the Jaguar in Xbone/PS4. The FP units on a modern x86 design are another universe from the FP units on the PPC750.



Yeah, I guess. As you were then.

Strange then that Espresso's little brother (Broadway), which you equate to a Pentium II level CPU, has better floating point performance than Bobcat (Jaguar's little brother).

I realise Jaguar's SIMD implementation is improved over Bobcat's, BTW.
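To make the FP argument above concrete, here is a back-of-the-envelope peak-FLOPS comparison. Every figure (clocks, FLOPs per cycle) is an assumption for illustration, not a confirmed spec; the point is simply that peak FP scales with cores × clock × per-cycle throughput, and the per-cycle term is where modern x86 pulls away.

```python
# Back-of-the-envelope peak single-precision FP: cores x clock x FLOPs/cycle.
# Every figure below is an assumption for illustration, not a confirmed spec.
def peak_gflops(cores, clock_ghz, flops_per_cycle):
    return cores * clock_ghz * flops_per_cycle

# Espresso: 3 cores @ ~1.24GHz, paired singles (2 lanes x fused mul-add = 4)
espresso = peak_gflops(3, 1.24, 4)   # ~14.9 GFLOPS
# Jaguar: 8 cores @ ~1.6GHz, 128-bit FPU (4 lanes x separate mul + add = 8)
jaguar = peak_gflops(8, 1.6, 8)      # ~102.4 GFLOPS

print(jaguar / espresso)             # ~6.9x on paper under these assumptions
```

On these assumed numbers the paper gap is roughly 7x; real game workloads can widen or narrow that considerably, which is why estimates in this thread range up to 13x.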
 

krizzx

Junior Member
Shouldn't it be 137.5GB/s? And no, I don't think you are. But the macros don't necessarily need to be accessible all at once. There could be a DDR-style bus protocol which allows multiple slower macros to masquerade as a single fast device. Say, if there were 8 macros at 225MHz each, you could have a 1024bit bus running at 550MHz. As regards the actual macro count - I've got no idea.

That is interesting. Also, there is another question I asked a while back that was never properly answered. Couldn't the Wii U's RAM be dual channel, since it's two 512MB chips? That would double the performance over what people initially believed it was.
 
I don't know, but wasn't there a strangeness to the memory bus width that didn't make sense with 16 bit?

Also, did we ever figure out what was behind the claim in this quote from Ubisoft's Michel Ancel? http://www.kotaku.com.au/2012/10/according-to-raymans-creator-the-wii-u-is-surprisingly-powerful/

“[W]e improved the engine – but I think the console is quite powerful. Surprisingly powerful. And there’s a lot of memory. You can really have huge textures, and it’s crazy because sometimes the graphic artist – we built our textures in very high-definition. They could be used in a movie. Then we compress them, but sometimes they forget to do the compression and it still works!”

Whatever the configuration of the CPU/GPU combo is as it relates to the 1GB of RAM and its ability to do things like that, I don't think we've figured it out yet. That's at base what we're looking to solve in the end here, right?
 

A More Normal Bird

Unconfirmed Member
Is there a PC version of any of the recent Rayman titles? It would be helpful to know more about the textures. Depending on their resolution and the compression used, the difference between the compressed and uncompressed versions could be in the tens of megabytes, or it could be only a few.
 

wsippel

Banned
Shouldn't it be 137.5GB/s? And no, I don't think you are. But the macros don't necessarily need to be accessible all at once. There could be a DDR-style bus protocol which allows multiple slower macros to masquerade as a single fast device. Say, if there were 8 macros at 225MHz each, you could have a 1024bit bus running at 550MHz. As regards the actual macro count - I've got no idea.
2048 / 8 * 550,000,000 / 1024³ ≈ 131 GB/s
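Spelling that arithmetic out as a general formula (a sketch; the 2048-bit width is wsippel's working assumption here, not a confirmed figure):

```python
def bandwidth_gb_s(bus_bits, clock_hz):
    """Bytes per cycle times clock, expressed in GiB/s."""
    return bus_bits / 8 * clock_hz / 1024**3

print(bandwidth_gb_s(2048, 550_000_000))  # ~131.1, the figure quoted above
print(bandwidth_gb_s(1024, 550_000_000))  # ~65.5 if the bus were half as wide
```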
 

krizzx

Junior Member
It's funny that they aren't encountering any of the problems that others like to insist on.

Wayforward = no problems
Toki Tori 2 dev = no problems
Shin'en = no problems
Frozenbyte = no problems
Precursor Games = no problems

Why are the small devs having no problems while the big devs are crying like infants? Could it be what was originally stated by most of the unbiased from the beginning? That the devs simply didn't want to rewrite the code for the Wii U, and only did as much as was needed to get the game running?


“CryEngine 3 fully supports Wii U and it’s been a great system to work with so far,” Precursor Games CEO Paul Caporicci explained to Nintendo Enthusiast.

“And in regards to Shadow of the Eternals, beautiful visuals are just one part of the experience. There are many different pieces that all come together for this experience and the Wii U handles that complete package very well.

“So, we’ve had no problems so far and we’re not at all concerned about what the Wii U can handle.”

http://www.nintendo-insider.com/2013/05/21/precursor-games-say-wii-u-is-a-great-system-to-work-with/
 

Heyt

Banned
I have been reading this thread for a long while, but my lack of real knowledge on the subject makes following the discussion really hard. So... how would you explain to an uneducated idiot like me which point of the rollercoaster we are at now in terms of the console's power?

Sorry I can't add anything to the conversation and all.
 

krizzx

Junior Member
I don't know, but wasn't there a strangeness to the memory bus width that didn't make sense with 16 bit?

Also, did we ever figure out what was behind the claim in this quote from Ubisoft's Michel Ancel? http://www.kotaku.com.au/2012/10/according-to-raymans-creator-the-wii-u-is-surprisingly-powerful/

“[W]e improved the engine – but I think the console is quite powerful. Surprisingly powerful. And there’s a lot of memory. You can really have huge textures, and it’s crazy because sometimes the graphic artist – we built our textures in very high-definition. They could be used in a movie. Then we compress them, but sometimes they forget to do the compression and it still works!”

Whatever the configuration of the CPU/GPU combo is as it relates to the 1GB of RAM and its ability to do things like that, I don't think we've figured it out yet. That's at base what we're looking to solve in the end here, right?

In the end, we are looking to solve what the GPU is running. It's clear that most of the components are not RV7XX now, at least (Digital Foundry can flush their writeup down the toilet). Last I checked, none of the RV7XX parts were dual graphics engine GPUs.

Generally, though, we have only heard praise for the Wii U RAM, so it can't have the bandwidth problems that people are insisting on. It wouldn't make any sense if it did. We would have heard complaints by now given the state of the industry. No one has reported needing any kind of special tricks when it comes to memory. In fact, as you just posted, they said that it takes textures with no problems even when uncompressed. The NFS: Most Wanted dev made the same statement: they just flipped a switch and it was taking PC textures with no issues. B3D's bandwidth-starved claims and all of the sites that did the "RAM clock" comparisons with the PS3/360 were a lie.

It has always been reported that it is easy to use.
 
What are the chances 1T-SRAM bandwidth is higher?

If it is 131 GB/s, that would be excellent and the most we can realistically hope for.

Hey wsippel, mind posting a link to that Renesas datasheet? This is one aspect of the gpu that has perhaps gone underanalyzed in light of all the hooplah concerning shaders and such.
 
If it is 131 GB/s, that would be excellent and the most we can realistically hope for.

Hey wsippel, mind posting a link to that Renesas datasheet? This is one aspect of the gpu that has perhaps gone underanalyzed in light of all the hooplah concerning shaders and such.

Is that because of density or bus width, or maybe a combination of both?
 
Can the texture compression feature that the Toki Tori 2 dev mentioned help reduce the RAM disparity relative to the PS4/XBone, or is that a common feature in GPUs?
Just curious!
 

Absinthe

Member
Can the texture compression feature that the Toki Tori 2 dev mentioned help reduce the RAM disparity relative to the PS4/XBone, or is that a common feature in GPUs?
Just curious!

Great question! I was wondering the exact same thing today.

To build on your question, I wonder how much groundbreaking proprietary tech may be in the Wii U in one way or another in relation to that? As noted above, even the Rayman dev stated that they forgot to compress some massive textures a few times, and the system didn't even hiccup. That said, I am not trying to play up the "secret sauce" angle, I only wonder because of how custom/baffling everything seems to be.
 

krizzx

Junior Member
anybody else on here laugh at Wii U's sales skyrocketing after the Xboner's announcement?

I laughed harder at the way the title was edited and at how a lot of people came in insisting that it was because of the price drop and not the Xbox One announcement, despite price drops having been proven many times over not to work for the console.
 

MDX

Member
Great question! I was wondering the exact same thing today.

To build on your question, I wonder how much groundbreaking proprietary tech may be in the Wii U in one way or another in relation to that? As noted above, even the Rayman dev stated that they forgot to compress some massive textures a few times, and the system didn't even hiccup. That said, I am not trying to play up the "secret sauce" angle, I only wonder because of how custom/baffling everything seems to be.


No secret sauce necessary when in fact it is a custom design. Nintendo is not stupid. They have been in the game console business for donkey's years and know what they are doing. Unlike MS, they made a game console, not a companion for the TV.

Iwata: Instead of just designing a GPU, for example, you're making a game console.
 

A More Normal Bird

Unconfirmed Member
Great question! I was wondering the exact same thing today.

To build on your question, I wonder how much groundbreaking proprietary tech may be in the Wii U in one way or another in relation to that? As noted above, even the Rayman dev stated that they forgot to compress some massive textures a few times, and the system didn't even hiccup. That said, I am not trying to play up the "secret sauce" angle, I only wonder because of how custom/baffling everything seems to be.
Groundbreaking proprietary tech? Probably none. It's not something that's likely to come directly out of Nintendo, and if AMD had groundbreaking texture format tech it would be on their other GPUs. You shouldn't read too much into the use of uncompressed textures; as I mentioned above, the difference could be anything from a few meg to a few tens of megs. There are also the facts that it's a 2D title and that the quote indicates that the textures are compressed in the final product. I think Ancel was just providing an example of the benefits of the extra RAM on the Wii-U over PS3/360.
 

MDX

Member
When I read Iwata Asks:

Shiota
Yes. The designers were already incredibly familiar with the Wii, so without getting hung up on the two machines' completely different structures, they came up with ideas we would never have thought of. There were times when you would usually just incorporate both the Wii U and Wii circuits, like 1+1. But instead of just adding like that, they adjusted the new parts added to Wii U so they could be used for Wii as well.

Iwata
And that made the semiconductor smaller.

Shiota
Right. What's more, power consumption fell. That was an idea that only designers familiar with Wii could have put forth. We were able to make such a small semiconductor because so much wisdom bubbled up!


and if I look at Nintendo's main site:
http://www.nintendo.com/wiiu/features/tech-specs/

IBM Power®-based multi-core processor


I get the impression that the final CPU might not have been made at 45nm as previously stated. It's not mentioned anymore. Maybe, after the reveal in 2011, some breakthrough occurred, as mentioned in Iwata Asks, that gave Nintendo the opportunity to go with smaller chip fabrication, which allowed power consumption to be decreased and performance to be increased.

Their dev kits, however, were still based on the older designs (there were reports of overheating, AFAIR).

What didn't make sense to me was how late Nintendo was with their final dev kits when the console was revealed publicly in mid-2011. Even their games have been delayed. Something caused a hiccup in their plans.
 
The HD 7900 series is 4.3 billion transistors (32 compute units, or 2048 ALUs), the HD 7800 series is 2.8 billion transistors (20 compute units, or 1280 ALUs), and the HD 7700 series is 1.5 billion transistors (10 compute units, or 640 ALUs).

Going by those families, XB1 has 12 compute units, but should otherwise fall into the HD 7700 series with 768 ALUs, making the transistor count close to ~1.7 Billion. PS4 has 18 compute units and should fall into HD 7800 series (between 7850 and 7870) making the count ~2.5 Billion transistors.

Microsoft's 5 Billion transistors claim is about all silicon, and in case anyone was wondering Wii U's GPU should be around ~750 million transistors if Thraktor's measurements were correct earlier in the year, not including the eDRAM which took up around 220 million transistors: http://www.neogaf.com/forum/showpost.php?p=45095173&postcount=836

Altogether Wii U's MCM is around 1.2 Billion transistors.

If Wii U's MCM is 1.2B trans, that's one of the worst engineering jobs ever, considering it has yet to prove it outpowers a 360, which weighed in at ~500m (same for PS3).

My guess is the Wii U guys are assuming the wrong node. Or it's just that inefficient due to having backwards compatibility. Or something. Offhand, the 32MB of eDRAM could add ~200m transistors over the 10MB in the 360, which alone would kick the 360 to ~700m, I guess, if it had 32MB of eDRAM.

The rest of your stuff looks good. The Xbone likely has more transistors than the PS4 while having less compute grunt, so MS focuses on a spec Sony can't match. It's likely 1.6 billion or more transistors for 32MB of ESRAM, so "everything else" is "only" 3.4.

OT/That would seem to be a really stupid move on MS's part, and it may be, but I bet they did all the calculations. Those 1.6B trans could pack really densely, they integrate into the SoC (eDRAM doesn't), and they shrink well over time. They could also provide performance help in theory; at least they are much more flexible than the 360's eDRAM./OT
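For what it's worth, the per-family estimates quoted above are consistent with a simple linear fit: treat each GPU as a fixed "uncore" budget plus a per-CU cost, solved from the two known data points. A sketch, under that assumption:

```python
# Fit total = base (uncore) + per_cu * CUs to the two known GCN parts above,
# then check it against the XB1/PS4 estimates given in the post.
cus_7900, t_7900 = 32, 4.3e9
cus_7700, t_7700 = 10, 1.5e9

per_cu = (t_7900 - t_7700) / (cus_7900 - cus_7700)  # ~127M transistors per CU
base = t_7700 - cus_7700 * per_cu                   # ~230M for everything else

print(base + 12 * per_cu)  # XB1, 12 CUs -> ~1.75e9 (the ~1.7B above)
print(base + 18 * per_cu)  # PS4, 18 CUs -> ~2.52e9 (the ~2.5B above)
```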
 

z0m3le

Banned
Specialguy: the Xenos transistor count is 232 million for the GPU and ~100 million for the daughter die with the ROPs and 10MB of eDRAM, or around 332 million for both. The ~500 million transistor figure, I believe, is counting Xenon, the CPU.

Wii U's GPU with 35MB of eDRAM is ~975 million, and only ~750 million when you are looking at just the GPU parts; that number is based on AMD's GPU transistor density, AFAICT. This is one of the reasons I doubt we have seen anything from Wii U's GPU yet.

It is also one of the reasons I find 160 ALUs to be a bit crazy. If you want to talk about custom ALUs that are 90% bigger than R800's, that means it is certainly extremely over-the-top custom, which is why I don't think we are looking at 160 ALUs. Also, even with Flipper along for the ride, that is only 26 million transistors for the entire GPU minus the eDRAM on that chip. That should point to more going on than some R700 ALUs (which are smaller than R800 ALUs) plus some TEV logic for backwards compatibility; that isn't a well thought out idea, it's just pandering to the lowest possible specs.

As for Wii U not yet exceeding 360: ports are historically locked to their lead platform, and Wii U has other issues converting games to play nice, such as a different memory architecture and a CPU that can't do what Cell or Xenon could. It does more of other stuff that might be more useful once the GPU is utilized for GPGPU functions. It's impossible to get a performance level for the Wii U off of Nintendo Land, and that is the only ground-up title I think Wii U has seen. I might be wrong, but I assume future games will look a lot better.
 

tipoo

Banned
The HD 7900 series is 4.3 billion transistors (32 compute units, or 2048 ALUs), the HD 7800 series is 2.8 billion transistors (20 compute units, or 1280 ALUs), and the HD 7700 series is 1.5 billion transistors (10 compute units, or 640 ALUs).

Going by those families, XB1 has 12 compute units, but should otherwise fall into the HD 7700 series with 768 ALUs, making the transistor count close to ~1.7 Billion. PS4 has 18 compute units and should fall into HD 7800 series (between 7850 and 7870) making the count ~2.5 Billion transistors.

Microsoft's 5 Billion transistors claim is about all silicon, and in case anyone was wondering Wii U's GPU should be around ~750 million transistors if Thraktor's measurements were correct earlier in the year, not including the eDRAM which took up around 220 million transistors: http://www.neogaf.com/forum/showpost.php?p=45095173&postcount=836

Altogether Wii U's MCM is around 1.2 Billion transistors.



Nice, thanks. I would assume Microsoft isn't counting memory transistors, as that would make the count even crazier. So the CPU+GPU+eSRAM add up to 5 billion; the eSRAM will be taking 3x the transistors of the Wii U's eDRAM. Both are 32MB, so let's say 660 million transistors for the eSRAM in the One. Plus 1.7 billion for the GPU... Hmm, that still leaves a lot. Too much for 8 Jaguar cores to fill in, but we do also know there are a lot of custom chips like Move engines and whatnot going on in there too, plus the two chips in the Kinect, which they may be counting.

Do we know for sure if they said 5 billion just for the APU, or for all silicon?
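The cell-count arithmetic behind that comparison can be sanity-checked from textbook transistors-per-bit figures: 1T1C for eDRAM, 6T for SRAM (real arrays add redundancy and periphery logic, so these are lower bounds). Note that at 6T per bit the multiplier is closer to 6x, which matches the earlier ~1.6B eSRAM estimate rather than the 3x guess:

```python
BITS_PER_MIB = 2**20 * 8

def array_transistors(mib, transistors_per_bit):
    """Raw cell count only; real arrays add redundancy and periphery logic."""
    return mib * BITS_PER_MIB * transistors_per_bit

print(array_transistors(32, 1))  # 1T1C eDRAM: ~268M (near the ~220M cited)
print(array_transistors(32, 6))  # 6T SRAM:   ~1.61B (the "1.6B+" figure above)
```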
 

z0m3le

Banned
Nice, thanks. I would assume Microsoft isn't counting memory transistors, as that would make the count even crazier. So the CPU+GPU+eSRAM add up to 5 billion; the eSRAM will be taking 3x the transistors of the Wii U's eDRAM. Both are 32MB, so let's say 660 million transistors for the eSRAM in the One. Plus 1.7 billion for the GPU... Hmm, that still leaves a lot. Too much for 8 Jaguar cores to fill in, but we do also know there are a lot of custom chips like Move engines and whatnot going on in there too, plus the two chips in the Kinect, which they may be counting.

Do we know for sure if they said 5 billion just for the APU, or for all silicon?

All silicon, as they didn't mention what it was for. Yes, I believe they are also counting the Kinect chips. When all is said and done, I wouldn't be surprised if there was an ARM chip running one of the 3 OSs as well.
 
You shouldn't read too much into the use of uncompressed textures; as I mentioned above, the difference could be anything from a few meg to a few tens of megs.

No one would have read much into it if the developer who spoke of it hadn't been excited about it and intentionally brought it up to make a point. So there's every reason to.
 
Their dev kits, however, were still based on the older designs (there
were reports of overheating, AFAIR).

Many devkits are susceptible to overheating, I wouldn't read anything into this (of course, this entire thread is about reading far too much into far too little, but what can you do).
 
If Wii Us MCM is 1.2B trans, that's one of the worst engineering jobs ever considering it has yet to prove it outpowers a 360 which weighed in at ~500m (same for PS3).

Martin Sauter of Shin'en said in a recent Game Reactor interview:

'Nintendo hardware is not as bad as you sometimes read. I think it's WAY more powerful than it is communicated through the press.'

'I think the Wii U is a good compromise between price point, because don't forget you have the tablet controller. A great hardware base. And it's much better than everybody reads. It's better than Xbox. Sorry, it is better. You can squeeze lots out of it. But you have to work hard at it.'
'We really want to push the hardware... We really want to show something that's "Wow! This is something great."'
___

So while I can agree that there are no released or even fully revealed big-budget, ground-up titles for Wii U, and thus it has not been proven to gamers, let's not pretend or even suggest anymore that the Wii U may not even be on the level of the 360, please. We are in this to find the truth about what the console is capable of, right?
 
Martin Sauter of Shin'en said in a recent Game Reactor interview:

'Nintendo hardware is not as bad as you sometimes read. I think it's WAY more powerful than it is communicated through the press.'

'I think the Wii U is a good compromise between price point, because don't forget you have the tablet controller. A great hardware base. And it's much better than everybody reads. It's better than Xbox. Sorry, it is better. You can squeeze lots out of it. But you have to work hard at it.'
'We really want to push the hardware... We really want to show something that's "Wow! This is something great."'
___

So while I can agree that there are no released or even fully revealed big-budget, ground-up titles for Wii U, and thus it has not been proven to gamers, let's not pretend or even suggest anymore that the Wii U may not even be on the level of the 360, please. We are in this to find the truth about what the console is capable of, right?
Cool. Do you have a link to that interview?
 

disap.ed

Member
Martin Sauter of Shin'en said in a recent Game Reactor interview:

'Nintendo hardware is not as bad as you sometimes read. I think it's WAY more powerful than it is communicated through the press.'

'I think the Wii U is a good compromise between price point, because don't forget you have the tablet controller. A great hardware base. And it's much better than everybody reads. It's better than Xbox. Sorry, it is better. You can squeeze lots out of it. But you have to work hard at it.'
'We really want to push the hardware... We really want to show something that's "Wow! This is something great."'

That's the problem though: publishers want quick ports of XB360 games, not months of work to reach their level.
That's also the reason why we saw more technically worthy ports/titles from smaller devs (with NFS:MW being the only real exception to that rule, plagued with freezes though).
 
In the end though, people don't want to pay full price for Mass Effect 3 on Wii U when the trilogy is available on other consoles. They don't want Arkham City; it had been out for months already. A big miscalculation made early on by these port developers, I think, was assuming that early adopters of the Wii U were new to HD gaming. A lot of people owned a PS3 or 360 or both, and had already bought those games, or had passed on them. It isn't fair at all, imo, to use such ports as a gauge for sales, or as one for what the console is capable of.

Considering the CPU makeup of all of these consoles, and the fact that they are largely GPU-centric, even the more powerful consoles would probably have hiccups and problems if you took one of those games and just threw the 360 or PS3 code at it and expected results without some work, or proper dev tools. Gotta be fair.
 
The Shin'en dev also spoke briefly of diminishing returns, and ended up by saying:

'I think you can make great games with it. And I'm not sure if the much more powerful PS4 will produce much better looking games.'

'I think that steps will be much much smaller this next generation.'
 

krizzx

Junior Member
So it actually was using tessellation and there is no baking. My eyes did not deceive me. It will be nice to see what this looks/runs like at release. It's hard to tell if this is Wii U footage or PC footage, though. Could be both. Sometimes the screen is locked at 30 FPS. Other times it's up to 90. It would seem odd to lock a frame rate to 30 FPS on a PC.

http://www.youtube.com/watch?feature=player_embedded&v=D2cU8HlEcZY#!

They had good things to say about Wii U development, in case anyone missed (or intentionally skipped) them.
http://www.nintendo-insider.com/2013/05/21/precursor-games-say-wii-u-is-a-great-system-to-work-with/
 

fred

Member
Can the feature for texture compression that Toki Tori 2 dev mentioned help in reduce the disparity in relation to RAM from the PS4/XBone, or that is a common feature in GPUs?
Just curious!

Texture compression is a common feature for GPUs, but going by the WiiWare titles released last gen Nintendo have some crazy compression algorithms going on to squeeze a game like Jett Rocket into just 40MB. So yes, imo, Nintendo's texture compression should help in reducing the difference somewhat. How much though I wouldn't like to hazard a guess.

One other thing about the RAM that could be interesting is what Ancel said a while back about the Wii U having 'almost unlimited' RAM. I'm not sure about anyone else but to me that sounded like he was talking about something other than the difference between having 512MB and 1GB to play with.
 
Considering that old story about how back in the day Iwata stepped in with Pokemon Silver/Gold and compressed the game so well that they were able to fit Kanto and the original gyms into the game, it just may be that Iwata has made it a point to have hardware devs emphasize excellent compression methods.
 

wsippel

Banned
If it is 131 GB/s, that would be excellent and the most we can realistically hope for.

Hey wsippel, mind posting a link to that Renesas datasheet? This is one aspect of the gpu that has perhaps gone underanalyzed in light of all the hooplah concerning shaders and such.
The macros are definitely custom; there's nothing quite like them in their archives:

http://www.renesas.com/products/soc/asic/cbic/ipcore/edram/

The closest would be the 64Mbit UX8LD macro cut in half (128kW x 256b instead of 256kW x 256b), but I assume Nintendo is using the elusive high performance UX8GD variant that was once mentioned in a Renesas press release, designed for game consoles and supporting clock speeds up to 800MHz.
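As a quick sanity check on that macro spec (assuming the 128kW x 256b figure is right), eight such macros land exactly on the 32MB usually attributed to MEM1:

```python
words, bits_per_word = 128 * 1024, 256          # the "128kW x 256b" macro
macro_mib = words * bits_per_word / 8 / 2**20   # = 4 MiB per macro
print(macro_mib, 8 * macro_mib)                 # 8 macros = 32 MiB, i.e. MEM1
```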


Texture compression is a common feature for GPUs, but going by the WiiWare titles released last gen Nintendo have some crazy compression algorithms going on to squeeze a game like Jett Rocket into just 40MB. So yes, imo, Nintendo's texture compression should help in reducing the difference somewhat. How much though I wouldn't like to hazard a guess.

One other thing about the RAM that could be interesting is what Ancel said a while back about the Wii U having 'almost unlimited' RAM. I'm not sure about anyone else but to me that sounded like he was talking about something other than the difference between having 512MB and 1GB to play with.
We don't know if Nintendo has any proprietary texture compression formats. We only know they support S3TC and most likely 3Dc. Though NERD apparently developed a new wavelet-based image compression scheme a while ago. Wavelet compression is extremely efficient, but generally considered unsuited for textures, so it's probably unrelated.
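For reference, the S3TC ratios follow directly from the formats' fixed block sizes: every 4x4 texel block compresses to 8 bytes (DXT1) or 16 bytes (DXT5). A quick check:

```python
def s3tc_ratio(bytes_per_texel_raw, block_bytes):
    """A 4x4 block holds 16 texels; S3TC block size is fixed per format."""
    return 16 * bytes_per_texel_raw / block_bytes

print(s3tc_ratio(4, 8))   # RGBA8 -> DXT1: 8.0 (8:1)
print(s3tc_ratio(4, 16))  # RGBA8 -> DXT5: 4.0 (4:1)
print(s3tc_ratio(3, 8))   # RGB8  -> DXT1: 6.0 (hence the 4-6:1 range)
```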
 
So help me understand this compression thing. If the PS4 has a game that uses 3GB of RAM, should a good developer with Wii U hardware tools be able to compress it to fit within the Wii U's 1GB limit?

I don't think that we can say for sure, but that is the general idea. The PS4's 8 gigs of high quality RAM take a lot of work out of the equation for developers who want to just throw unoptimized code and assets around, and allow for potentially epic experiences when the work is put in. The Wii U, meanwhile, has an extremely efficient and acceptable pool of RAM that, when the hardware is understood and proper tools are available, can take PC assets, filter them through the 'Wii U process' without much loss of quality, and use them with no problem.

To what extent? We don't know yet. But I imagine that games like Bayonetta 2, and probably Retro's game will give us a better idea.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Sorry, I missed this one:

That is interesting. Also, there is another question I asked a while back that was never properly answered. Couldn't the Wii U's RAM be dual channel, since it's two 512MB chips? That would double the performance over what people initially believed it was.
It's 4 chips (of 4 Gb each), and we know the characteristics of those - we know what the cumulative throughput of those 4 chips is. It doesn't matter what bus config those sit at - the bus throughput cannot exceed the chips' cumulative throughput.
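In numbers, blu's point looks like this. Assuming the commonly reported configuration (four DDR3-1600 chips with 16-bit interfaces each, which is an assumption, not confirmed in this thread), the cumulative chip throughput, and therefore the ceiling on any bus arrangement, is:

```python
chips = 4
bits_per_chip = 16                  # assumed 16-bit interface per chip
transfers_per_sec = 1_600_000_000   # assumed DDR3-1600 (not confirmed here)

total_gb_s = chips * bits_per_chip * transfers_per_sec / 8 / 1e9
print(total_gb_s)  # 12.8 GB/s total, regardless of how the bus is arranged
```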
 

tipoo

Banned
So help me understand this compression thing. If the PS4 has a game that uses 3GB of RAM, should a good developer with Wii U hardware tools be able to compress it to fit within the Wii U's 1GB limit?

No one here can claim to know what the compression ratios would be. But 3GB of data in 1GB of memory, while the PS4 and One are unable to do the same? That doesn't sound likely. I'm thinking a double-digit percentage advantage if there is one, not three hundred percent.

And you're also assuming the PS4 employs no compression of its own; perhaps they have similar implementations. Heck, we don't even know that the PS4's isn't better.

And as mentioned, ALL graphics chips have had some texture compression ability for years, and it's gotten better with each generation. We also don't know if the developer was simply talking about that feature, which all GPUs would have, including the One and PS4.
 

wsippel

Banned
No one here can claim to know what the compression ratios would be. But 3GB of data in 1GB of memory, while the PS4 and One are unable to do the same? That doesn't sound likely. I'm thinking a double-digit percentage advantage if there is one, not three hundred percent.

And you're also assuming the PS4 employs no compression of its own; perhaps they have similar implementations. Heck, we don't even know that the PS4's isn't better.

And as mentioned, ALL graphics chips have had some texture compression ability for years, and it's gotten better with each generation. We also don't know if the developer was simply talking about that feature, which all GPUs would have, including the One and PS4.
As unlikely as it is, if Nintendo/NERD found a way to make wavelets work for texture compression, we'd be looking at ~100:1 instead of 4-6:1. And it might actually work with a Wii U like memory architecture: Read a wavelet compressed texture from MEM2, decompress/transcode to S3TC, write to MEM1, texture from there.
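Rough footprint math for that hypothetical pipeline, using the ratios wsippel cites (illustrative only, not measured figures):

```python
raw_mib = 3 * 1024        # e.g. 3GB of uncompressed texture data

print(raw_mib / 6)        # resident as S3TC (~6:1): ~512 MiB, tight in 1GB
print(raw_mib / 100)      # resident as wavelets (~100:1): ~31 MiB in MEM2,
                          # plus a small S3TC working set transcoded into MEM1
```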
 
What exactly is the purpose of this thread at this point?

I see a lot of speculation but I thought the purpose of this thread was to confirm, not to speculate?

And what exactly are you all trying to confirm?
 

tipoo

Banned
As unlikely as it is, if Nintendo/NERD found a way to make wavelets work for texture compression, we'd be looking at ~100:1 instead of 4-6:1. And it might actually work with a Wii U like memory architecture: Read a wavelet compressed texture from MEM2, decompress/transcode to S3TC, write to MEM1, texture from there.

That would be a nice fantasy, but nothing in line with what the Nano Assault dev saved, unless they were trying to use 10,000MB textures in the initial release :p

If they shave something off in the hundred megabyte range, I would assume we're talking double digit percentage compression.

And again, we're still not sure if this isn't an ubiquitous feature for other platforms too. Nvidia and AMD both introduced things like this years ago (or rather back when it was ATI).
 

Hermii

Member
What exactly is the purpose of this thread at this point?

I see a lot of speculation but I thought the purpose of this thread was to confirm, not to speculate?

And what exactly are you all trying to confirm?

I think the point is to discuss and speculate about Latte. It's impossible to 100% confirm anything unless a developer spills the beans, but it is possible to make some probable theories.
 

wsippel

Banned
That would be a nice fantasy, but nothing in line with what the Nano Assault dev saved, unless they were trying to use 10,000MB textures in the initial release :p

If they shave something off in the hundred megabyte range, I would assume we're talking double digit percentage compression.

And again, we're still not sure if this isn't an ubiquitous feature for other platforms too.
Shin'en mostly uses procedural textures, which means they can store infinite terabytes of textures in a few lines of code. You're confusing them with Two Tribes. And as you said, they probably didn't even try to use several gigabytes of textures in the first place. For all we know, that tweet could have been about squeezing 220MB down to 20MB. The whole game is only 536MB as it is.

WayForward also managed to shrink Mighty Switch Force substantially after release, but I believe that was audio related.
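For anyone wondering what "procedural textures" means in practice: the texture is computed from a formula at whatever resolution you like, so almost nothing ships on disc. A toy sketch (the function and pattern are invented for illustration, with no resemblance to Shin'en's actual tooling implied):

```python
import math

def marble(x, y, scale=0.05):
    """Return an 8-bit grey value for texel (x, y); no stored image needed."""
    v = math.sin(x * scale) + math.sin(y * scale * 1.7)
    return int((v + 2) / 4 * 255)

# Any resolution comes from the same few lines; only the code ships on disc.
texture = [[marble(x, y) for x in range(1024)] for y in range(1024)]
```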
 

v1oz

Member
In the end though, people don't want to pay full price for Mass Effect 3 on Wii U when the trilogy is available on other consoles. They don't want Arkham City; it had been out for months already. A big miscalculation made early on by these port developers, I think, was assuming that early adopters of the Wii U were new to HD gaming. A lot of people owned a PS3 or 360 or both, and had already bought those games, or had passed on them. It isn't fair at all, imo, to use such ports as a gauge for sales, or as one for what the console is capable of.

Considering the CPU makeup of all of these consoles, and the fact that they are largely GPU-centric, even the more powerful consoles would probably have hiccups and problems if you took one of those games and just threw the 360 or PS3 code at it and expected results without some work, or proper dev tools. Gotta be fair.
That Lego City game is a Wii U exclusive published by Nintendo. Designed from the ground up for the Wii U, it's not a port. But from what I read in reviews, it suffers from slowdown issues and serious pop-up. It's not a technically demanding game by any standards; it shouldn't be taxing modern hardware at all.
 

v1oz

Member
So it actually was using tessellation and there is no baking. My eyes did not deceive me. It will be nice to see what this looks/runs like at release. It's hard to tell if this is Wii U footage or PC footage, though. Could be both. Sometimes the screen is locked at 30 FPS. Other times it's up to 90. It would seem odd to lock a frame rate to 30 FPS on a PC.

http://www.youtube.com/watch?feature=player_embedded&v=D2cU8HlEcZY#!

They had good things to say about Wii U development, in case anyone missed (or intentionally skipped) them.
http://www.nintendo-insider.com/2013/05/21/precursor-games-say-wii-u-is-a-great-system-to-work-with/

That demo is all running on the PC. No word if all the tessellation and real time lighting will make it to the Wii U. The PS3/360 would not be able to run the game at those detail settings.
 
That Lego City game is a Wii U exclusive published by Nintendo. Designed from the ground up for the Wii U, it's not a port. But from what I read in reviews, it suffers from slowdown issues and serious pop-up. It's not a technically demanding game by any standards; it shouldn't be taxing modern hardware at all.

And loading times. Makes me wonder, too.
 
That Lego City game is a Wii U exclusive published by Nintendo. Designed from the ground up for the Wii U, it's not a port. But from what I read in reviews, it suffers from slowdown issues and serious pop-up. It's not a technically demanding game by any standards; it shouldn't be taxing modern hardware at all.

You left out a key point made.

Proper tools.

As a near-launch open-world title, even though they did have a small window where the current tools were available, do you expect that the game should have no frame rate issues or pop-in? I haven't played the game myself, but I have heard people say that the last parts of the game seem far more polished and have fewer issues than the earlier portions.
 