There are already games pulling off effects not possible on current-gen systems.
Such as?
It clearly can not!
Well, it's obvious that it can't, but it sure can outclass the last-gen systems like the Xbox 360 and PS3, though.
Nano Assault Neo is nothing special and not even close to beyond what the 360 and PS3 can do!
I don't know what's worse: people incoherently screaming at each other or an echo chamber cultivated for months. This thread is now only a strand of its former self...
Nano Assault Neo
Also, developers are on record saying that RAM is not a latency bottleneck in the Wii U and that the GPU is more powerful and modern than current-gen GPUs.
The only bottleneck in the system is the CPU, but this shouldn't stop it from producing better graphics than current gen.
What we are looking at is unoptimized code that was rushed for launch.
There is no need to rely on low level programming or prebaked/faked effects like current gen.
Will the results look the same? Maybe, but it is definitely easier to achieve on the Wii U.
Nano Assault is pulling off the effect without using tricks. It's API-level code, easily accessible by the game code.
You have no idea what you are talking about!
All real time rendering is using tricks!
GFX APIs are always accessible by game code!
The 360 and PS3 also have APIs and don't need to be programmed to the metal.
Pre-baking is used in games on all platforms; the Wii U is not even close to powerful enough to never need it.
And it is not like the 360 and PS3 don't have games without it.
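As an aside, the baked-vs-real-time distinction being argued here is easy to sketch. This is a toy illustration only (made-up surface normals, simple Lambertian N·L diffuse lighting), not anything from an actual console SDK: a baked renderer evaluates lighting once offline and stores it, so the light can never move; a dynamic renderer re-evaluates it every frame.

```python
import math

def lambert(normal, light_dir):
    """Diffuse (Lambertian) term: max(0, N . L)."""
    dot = sum(n * l for n, l in zip(normal, light_dir))
    return max(0.0, dot)

# "Baking": evaluate lighting once, offline, and store the results.
surface_normals = [(0, 1, 0), (0.6, 0.8, 0), (1, 0, 0)]
light_dir = (0, 1, 0)  # pointing straight up
lightmap = [lambert(n, light_dir) for n in surface_normals]

# At runtime a baked renderer just looks the stored value up...
baked = lightmap[1]

# ...while a dynamic renderer re-evaluates it per frame,
# so the light is free to move.
moved_light = (math.sin(0.5), math.cos(0.5), 0)
dynamic = lambert(surface_normals[1], moved_light)
```

The point of contention in the thread is just whether the hardware has enough headroom to take the second path everywhere instead of the first.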
True, but the Wii U GPU is capable of doing all the "real" effects without faking them in these ports from the current gen HD systems. Third Parties however are not going to be putting extra resources in getting those effects to work on the Wii U - a platform that just launched and has no guarantee that their extra effort and money spent on ports will give them any money back in return.
The Wii U GPU is not an e6760, granted, but the GPU itself will perform very similarly to that model, with some "custom Nintendo tweaks" added in. 570-600 GFLOPS and all the modern effects like compute shaders, but in an OpenGL format of course... the GPU is not the cause for concern, though, right?
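For what it's worth, the 570-600 GFLOPS figure matches the usual peak-throughput arithmetic if you assume e6760-like specs (480 stream processors at 600 MHz, counting a fused multiply-add as 2 FLOPs per ALU per cycle):

```python
# Peak single-precision throughput = ALUs x clock (Hz) x FLOPs/ALU/cycle.
# An FMA (a*b + c) counts as 2 floating-point operations.
alus = 480           # e6760 stream processors (the comparison point above)
clock_hz = 600e6     # 600 MHz
flops_per_cycle = 2  # one fused multiply-add per ALU per cycle

peak_gflops = alus * clock_hz * flops_per_cycle / 1e9
print(peak_gflops)  # 576.0 -- inside the quoted 570-600 GFLOPS range
```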
The Wii U CPU is a new and extremely efficient processor that does Out-of-Order Execution, which has not been shown at all in these ports. The Wii U CPU is most likely running these ports as if for in-order execution, thus only relying on raw data transfer in parallel like an older-style CPU, similar to what is in the Xbox 360. Since it's clocked lower than those older in-order CPUs, developers are having performance issues with the CPU. Understandable. All modern CPUs are capable of Out-of-Order Execution, including Sandy Bridge and Bulldozer: http://www.lostcircuits.com/mambo//index.php?option=com_content&task=view&id=98&Itemid=1&limit=1&limitstart=9
Take a look at the Benefits of OoOE vs IoE: http://www.cs.washington.edu/education/courses/csep548/06au/lectures/introOOO.pdf
The Wii U CPU, when developed for with its intended way of operation (OoOE), would run circles around the last-gen CPU in the Xbox 360. A game like Black Ops 2, which was in development for 2+ years and made from the ground up to take the most advantage of the 360 architecture first, was never going to run just as smoothly on the Wii U without a lot of optimizing and re-coding. The tiny Wii U Treyarch team (6-7 guys) should be congratulated that they got the game running "good enough" for the launch of the console with probably less than a year of development.
Basically we are seeing Xbox 360 games being cut and pasted onto the Wii U with very little optimizing and without taking advantage of the very important OoOE CPU, which would deliver a lot better results if developers wanted to take the time to use it. Which they don't, and I totally understand why, since it's just not cost effective for them and the game industry to spend a lot of money on a new console with a small user base.
Points to take from this:
- The Wii U CPU is modern, not 7-year-old tech
- The Wii U CPU is not being used in these ports like it was designed to be used
- Third Parties are happy porting over Xbox 360 games with minimal optimization and no improvements to graphics since it requires more money and work, which they can't afford.
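The OoOE argument above is really about latency hiding: an in-order core stalls behind a long-latency instruction even when later, independent instructions are ready, while an out-of-order core keeps issuing. A toy single-issue scheduler (entirely made-up instruction stream and latencies, not real Espresso or Xenon behaviour) makes the cycle difference concrete:

```python
# Each instruction: (dest, sources, latency in cycles).
prog = [
    ("a", [],         10),  # long-latency load
    ("b", ["a"],       1),  # depends on the load
    ("c", [],          1),  # independent work
    ("d", [],          1),  # independent work
    ("e", ["c", "d"],  1),
]

def run(program, out_of_order):
    ready_at = {}  # dest -> cycle its result becomes available
    pending = list(program)
    cycle = 0
    while pending:
        # Instructions whose sources are all available this cycle.
        issuable = [i for i in pending
                    if all(ready_at.get(s, float("inf")) <= cycle for s in i[1])]
        if out_of_order:
            pick = issuable[0] if issuable else None
        else:
            # In-order: may only ever issue the oldest pending instruction.
            pick = pending[0] if issuable and pending[0] in issuable else None
        if pick:
            dest, _, latency = pick
            ready_at[dest] = cycle + latency
            pending.remove(pick)
        cycle += 1
    return max(ready_at.values())  # cycle when the last result lands

in_order_cycles = run(prog, out_of_order=False)  # stalls behind the load
ooo_cycles = run(prog, out_of_order=True)        # fills the stall with c, d, e
```

With these made-up latencies the out-of-order run finishes in 11 cycles against 14 in order; the real-world gain depends entirely on the code and the memory system.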
The Wii U is capable of displaying the Trine 2 expansion Goblin Menace with its graphics as intended, something developer Frozenbyte believes isn't possible on the Xbox 360 and PS3.
..."Basically that does require... well, not huge amounts more graphics processing power, but still considerable. If we would publish that on the other consoles, then I believe that there would be some small downscaling of what it is right now."
I like looking into this thread just to see hilarious posts like this
Just gonna add some gasoline to this inferno: the Trine 2 devs did say the Goblin Menace DLC included in the Wii U director's cut wouldn't technically be graphically possible on the PS360...
I'm aware this could really be referring to anything, not necessarily a specific graphical effect. But... I like to watch the world burn, so there it is.
Source: http://www.videogamer.com/wiiu/trine_2/news/wii_u_graphics_would_need_to_be_scaled_back_on_xbox_360_and_ps3.html
Someone should tell them they are wrong because Black Ops 2 and GAF told me so.
So, someone actually speaking about the technical side of the Wii U in a thread about specs, and you laugh it off...how very nice of you.
What I find unfortunate is that people will read this (or not) and just post useless, uninformative responses along the lines of "lol". Or they will respond with equally useless posts like mine :/
I guess it just doesn't make sense if you're looking at it through turd-tinted glasses to begin with...
I think that already happened... but I know what you mean. I do think I made a pretty good case about the CPU, though, but people will believe what they want.
It might have issues running current-gen games, but how would the Wii U fare against the Xbox 360 when running next-gen games? That is what the system was designed to run... since it has modern technology. It's not that hard to understand when you think about it.
Sorry, but the Wii U CPU is "horrible and slow".
Transparencies seem to be causing the bulk of issues in ports, in addition to AI. That's an issue with the GPU and RAM, not the CPU. How would you solve that on the Wii U?
Normally, this really would be a case where the answer is "eDRAM". You'd expect that the massive bandwidth provided for the FB by storing it in the eDRAM would be perfect for rendering lots of transparencies.
PS3 has a similar problem, and in 6 years there's never really been a solution other than to reduce their resolution.
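The reason transparencies are the pain point is framebuffer bandwidth: alpha blending reads the destination pixel back and writes it again for every layer, so overdraw multiplies traffic. A back-of-the-envelope estimate (assumed numbers only: 720p, RGBA8, 4 layers of transparent overdraw, 60 fps) shows the kind of load an on-die eDRAM framebuffer would absorb:

```python
width, height = 1280, 720
bytes_per_pixel = 4          # RGBA8 colour
layers = 4                   # average transparent overdraw (assumed)
fps = 60

# Alpha blending reads the destination pixel and writes it back,
# so every blended layer costs one read + one write of the framebuffer.
traffic_per_frame = width * height * bytes_per_pixel * layers * 2
gb_per_second = traffic_per_frame * fps / 1e9
print(round(gb_per_second, 2))  # 1.77
```

Roughly 1.8 GB/s for the blending alone under these assumptions: a sizeable slice of a slow main-memory bus, but trivial for embedded framebuffer memory, which is why "eDRAM" keeps coming up as the answer.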
Well... if you read what he's saying, you'll see his reasoning as to why that might not be accurate in every scenario.
What he's saying is nonsense. Does he have proof that every single one of those ports was a cut-and-paste job?
I don't know what is up with some of the people on here regarding the Wii U. Is it a personal grudge? I have no idea, nor do I care much at this point. I'm still waiting for a game that shows off the system's capabilities. I gave up on trying to defend the system a while ago. Now they laugh every time the term "GPGPU" or "modern feature set" is mentioned, with no basis at all. Oh well :/
The PS3 hardware was "slow and broken" yet it's fine now.
http://www.theinquirer.net/inquirer/news/1007286/ps3-hardware-slow-broken
Do you have proof they aren't?
Sorry, but can an OoOE CPU run like an in-order one? Is this something that devs can enable/disable at runtime?
*Looks at port performance*
Really? I heard that OoOE CPUs can run IoE code just as well as non-OoOE CPUs (dunno what to call them, lol).
It'll run IoE instructions very crappily--especially since it doesn't have the brute force to overcome the issue without major re-coding.
A CPU can be horrible and slow and still be modern, not 7-year-old tech. See the entire Tegra line.
Even if he doesn't, does that render his post nonsense?
No, and I'm not making any claims that they were or weren't. But I'm not basing any arguments around a dumb assumption.