Vinci said:Comparing it to the PS3? No.
Why would it be compared to the PS3? It's a handheld.
Vinci said:Comparing it to the PS3? No.
Plinko said:How so?
Zombie James said:Is that just for the card or total system consumption?
HomerSimpson-Man said:At that resolution you're really stretching the limits of your 4850, especially with the huge 1080p frame buffer and only 512MB of RAM, running Witcher 2 at near-max settings. You're better suited to lower resolutions and easing off the settings some more.
Are you sure about that?
HomerSimpson-Man said:I looked at the 480GTX and 580GTX, and the wattage and heat output is scary at over 350 watts. The 480GTX was released in the spring of last year and the 580GTX in fall/winter, and the difference is like 10 watts. It's no wonder Dennis doubts it in a next-gen console.
The 480GTX is now a year old, but that thing is still a monster of a card that goes for like $350-400.
Benchmarks I've seen put the 580GTX at 280-320 watts under load, and most computers using that card are paired with a 750 watt+ power supply.
Zombie James said:Is that just for the card or total system consumption?
Best wait for confirmation.
AceBandage said:So, about a 4850, just like we've been hearing.
That's far more than just a "50% increase."
Ohhh StuBuuuuuurns.
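For context on why "about a 4850" would be far more than a 50% jump over current consoles, here's a rough theoretical-peak comparison. This is only a back-of-the-envelope sketch using commonly cited peak-FLOPS figures (Radeon HD 4850: 800 stream processors at 625 MHz; Xbox 360 Xenos: roughly 240 ALU lanes at 500 MHz), not measured game performance:

```python
# Back-of-the-envelope peak-FLOPS comparison (theoretical peaks only,
# not measured game performance).

def peak_gflops(alu_lanes: int, clock_mhz: float, flops_per_lane: int = 2) -> float:
    """Peak GFLOPS assuming one multiply-add (2 FLOPs) per lane per clock."""
    return alu_lanes * flops_per_lane * clock_mhz / 1000.0

hd4850 = peak_gflops(800, 625)  # Radeon HD 4850: 800 SPs @ 625 MHz, ~1000 GFLOPS
xenos = peak_gflops(240, 500)   # Xbox 360 Xenos: 48 units x 5 lanes @ 500 MHz, ~240 GFLOPS

print(f"HD 4850 ~{hd4850:.0f} GFLOPS vs Xenos ~{xenos:.0f} GFLOPS ({hd4850 / xenos:.1f}x)")
```

Peak FLOPS isn't the whole story (bandwidth, ROPs, and the 360's eDRAM all matter), but it shows why "about a 4850" goes well beyond a simple 50% bump.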
Yes, because consoles have never done that before. Do you choose to ignore the IBM CPU news too? Or is 2010 too old?
nib95 said:Disappointing. Old tech in new hardware. I hope they don't plan to charge much for it...
Read the thread. This has been standard practice for the gaming industry since the beginning. The PS3 and 360 are in fact anomalies to this.
nib95 said:Disappointing. Old tech in new hardware. I hope they don't plan to charge much for it...
SolarPowered said:Best wait for confirmation.
HomerSimpson-Man said:Apparently just the GPU.
150 idle, 360+ under full load. @_@
Of course load varies from game to game, and your card won't constantly get slammed running a game, but still, the ground level and ceiling for the card...wow.
http://www.bit-tech.net/hardware/graphics/2010/11/09/nvidia-geforce-gtx-580-review/8
The Wii was also an anomaly going in the opposite direction. I'm starting to think that the system will be underpowered compared to the current rumors to accommodate the crazy new controller, but I hope that Nintendo surprises me.
doomed1 said:Read the thread. This has been standard practice for the gaming industry since the beginning. The PS3 and 360 are in fact anomalies to this.
Also, it shouldn't cost more than $350 (at least that's my current limit for a day-one purchase).
And the increase from idle to peak can be mostly attributed to the GPU.
Zombie James said:See, this is why I asked. Some sites don't label their charts correctly for power consumption. Others, like Anandtech, do: http://www.anandtech.com/show/4008/nvidias-geforce-gtx-580/17
Total SYSTEM power consumption = 173W idle, 389W Crysis, 452W Furmark.
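To make the card-versus-system distinction concrete, here's a minimal sketch of the back-of-the-envelope math behind this exchange. It assumes the Anandtech figures above are measured at the wall (total system draw) and assumes roughly 85% PSU efficiency; both the efficiency figure and the attribution of the load delta to the GPU are assumptions, not measurements:

```python
# Rough estimate of the load-induced (mostly GPU) power increase, derived from
# total-system "at the wall" numbers. Assumptions: ~85% PSU efficiency and a
# load delta dominated by the GPU. The delta also excludes the GPU's own idle
# draw, so the card's absolute consumption under load is higher than this.

PSU_EFFICIENCY = 0.85  # assumed, typical for a decent 750W unit at these loads

def load_delta_dc_watts(system_idle_w: float, system_load_w: float) -> float:
    """Extra DC-side power drawn under load, attributed mostly to the GPU."""
    wall_delta = system_load_w - system_idle_w
    return wall_delta * PSU_EFFICIENCY

for workload, load_w in (("Crysis", 389), ("Furmark", 452)):
    print(f"{workload}: ~{load_delta_dc_watts(173, load_w):.0f} W above idle")
```

That works out to roughly 180 W (Crysis) and 240 W (Furmark) above idle, which lines up reasonably well with the 280-320 W card-only figures quoted earlier once the GPU's idle draw is added back in.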
doomed1 said:Read the thread.
Sho_Nuff82 said:This is assuming it's a 4870.
GoldenEye 007 said:Yes, because consoles have never done that before. Do you choose to ignore the IBM CPU news too? Or is 2010 too old?
Furmark's designed to stress GPUs, so yeah.
wsippel said:And the increase from idle to peak can be mostly attributed to the GPU.
nib95 said:I don't care about what's happened in the past. I'm comparing it to NOW. With the current hardware climate and current console specs, this is a sorry excuse for hardware from Nintendo. Come the new Playstation and Microsoft consoles, once again Nintendo will be far away in the distance charging silly money for dated hardware and visuals/tech.
Not really. The NES and SNES were both running off of decade-old hardware. Now, we know for certain that the WiiU is more powerful than the PS3 and 360, so claiming it's "underpowered" is foolish, silly, and nothing but empty confirmation bias. Until the PS4 and Xbox 720 or whatever are announced, there is neither proof nor reason to believe that the WiiU will be underpowered in any way, shape, or form. This "problem" is one entirely made up in your head and doesn't exist.
Saint Gregory said:The Wii was also an anomaly going in the opposite direction. I'm starting to think that the system will be underpowered compared to the current rumors to accommodate the crazy new controller, but I hope that Nintendo surprises me.
You mean the ones that don't exist yet and only live in your imagination?
nib95 said:I don't care about what's happened in the past. I'm comparing it to NOW. With the current hardware climate and current console specs, this is a sorry excuse for hardware from Nintendo. Come the new Playstation and Microsoft consoles, once again Nintendo will be far away in the distance charging silly money for dated hardware and visuals/tech.
nib95 said:I don't care about what's happened in the past. I'm comparing it to NOW. With the current hardware climate and current console specs, this is a sorry excuse for hardware from Nintendo. Come the new Playstation and Microsoft consoles, once again Nintendo will be far away in the distance charging silly money for dated hardware and visuals/tech.
Willy105 said:Why would it be compared to the PS3? It's a handheld.
nib95 said:I don't care about what's happened in the past. I'm comparing it to NOW. With the current hardware climate and current console specs, this is a sorry excuse for hardware from Nintendo. Come the new Playstation and Microsoft consoles, once again Nintendo will be far away in the distance charging silly money for dated hardware and visuals/tech.
Don't you have to be at least 13 years old to post on GAF?
nib95 said:I don't care about what's happened in the past. I'm comparing it to NOW. With the current hardware climate and current console specs, this is a sorry excuse for hardware from Nintendo. Come the new Playstation and Microsoft consoles, once again Nintendo will be far away in the distance charging silly money for dated hardware and visuals/tech.
DragonKnight said:You really have very little hardware knowledge. Ace, we could use that thread you were talking about the other day
Seriously.
DragonKnight said:You really have very little hardware knowledge. Ace, we could use that thread you were talking about the other day
What do you mean "nooooo"? Make the damn thread and put the work into convincing the mods it's of value.
AceBandage said:I told you guys. Mandatory thread about how consoles and game development works.
But noooo.
That is seriously fucking mindblowing. I think Mr_Brit is fucking insane at this point. I was willing to entertain the possibility before... but that thing is a goddamned monster that should not exist on this planet.
Zombie James said:See, this is why I asked. Some sites don't label their charts correctly for power consumption. Others, like Anandtech, do: http://www.anandtech.com/show/4008/nvidias-geforce-gtx-580/17
Total SYSTEM power consumption = 173W idle, 389W Crysis, 452W Furmark.
You are in the extreme minority now. Anyone who actually expects the WiiU to pack more than a 4870 is either insane, ignorant of PC hardware, or trolling.
nib95 said:I don't care about what's happened in the past. I'm comparing it to NOW. With the current hardware climate and current console specs, this is a sorry excuse for hardware from Nintendo. Come the new Playstation and Microsoft consoles, once again Nintendo will be far away in the distance charging silly money for dated hardware and visuals/tech.
shadyspace said:What do you mean "nooooo"? Make the damn thread and put the work into convincing the mods it's of value.
DragonKnight said:You really have very little hardware knowledge. Ace, we could use that thread you were talking about the other day
nib95 said:Sorry, but with current hardware prices on the PC front, minimum spec in my personal opinion should have been R900. It's not only more feature rich and better performing, but far more efficient as well in terms of heat output and energy consumption.
I guess I'll form a proper opinion once we know the full specs of this custom GPU, because if it's adapted enough, it could technically still have room for a good margin of added prowess.
"Based on R700" is what we hear all the time. All AMD GPUs are based on some older AMD GPUs. That tells us nothing. Whatever ends up in WiiU most certainly isn't any R700 as you know it. Nintendo has a quite talented chip design team in the US. What probably happened was that two or three years ago, Nintendo contracted AMD to design a chip for their next system. Nintendo sent their design guys from NTD over, AMD took whatever was bleeding edge at the time, and they started working on it together for the next few years - the result being whatever powers WiiU. It's probably not even finished right now - doesn't get any more "bleeding edge", right?nib95 said:I don't care about what's happened in the past. I'm comparing it to NOW. With the current hardware climate and current console specs, this is a sorry excuse for hardware from Nintendo. Come the new Playstation and Microsoft consoles, once again Nintendo will be far away in the distance charging silly money for dated hardware and visuals/tech.
Actually, I wouldn't take that as confirmation of anything. While I love Engadget, sometimes they jump to conclusions without knowing the full picture, or post news in a way that invites erroneous implications because they haven't given the full details.
Plinko said:That's what I took from this article. This absolutely confirms the hardware is capable of pumping out data to multiple controllers.
This is huge positive news.
AceBandage said:There are far more qualified people who would have better insight (particularly in game development). I'd be glad to help throw a topic together, but without prior mod consent, it would just become a trolling ground and a place for people to point out where I'm wrong.
It needs to be a collaboration between a few different knowledgeable people.
DragonKnight said:I mean, let's think about the scenario here: top-of-the-line gaming rigs cost several hundred dollars, shrinking and customizing always adds even more cost, and console RAM is expensive. You think Sony and Micro are going to sell at losses upwards of 200-300 dollars? Both are going to maximize performance and minimize loss. We can assume that both will sell at a loss, but certainly not one in the $100+ arena anymore. Hell, with the NGP not being profitable until 2015, I don't expect the PS4 to be the PS3 of the next gen. Unless the entire PlayStation division wants to be in the red until 2018 lol.
AceBandage said:You really can't compare consoles to PCs though.
First off, PCs can deal with heat and power a lot better than consoles. An R900 would melt a console.
Second, consoles are closed-function machines. Games are programmed specifically for them and don't have to worry about different specs or overhead like PCs do.
Nintendo-4Life said:I have but one question: how will this compare to the next-generation PS4 and X720 (assuming they go all out in power like they did this gen)? Will it be a generation behind like the Wii was? Will it be the "Dreamcast" of next gen? Or more like the PS2 in comparison to the GC and Xbox?
I'm in the PS2 comparison camp, but it depends on Sony and Microsoft. It all comes down to their timing and investment. If they take a hit on hardware again they could easily outgun Nintendo; the question is whether that's the wisest thing to do.
Nintendo-4Life said:I have but one question: how will this compare to the next-generation PS4 and X720 (assuming they go all out in power like they did this gen)? Will it be a generation behind like the Wii was? Will it be the "Dreamcast" of next gen? Or more like the PS2 in comparison to the GC and Xbox?
Eiji said:Too much Dragon Ball in this thread
Well, it's not. The manufacturing process has nothing to do with the GPU family, and the main difference, efficiency-wise, is the difference between VLIW5 (R700) and VLIW4 (recent Radeons) from what I can tell. Raw performance per watt is pretty much identical. Depending on the workload, VLIW4 can be more or less efficient. Neither is a silver bullet, and neither is inherently more efficient.
nib95 said:Firstly, it was to my understanding that R900 was more efficient than RV770. Smaller nm processes and better efficiency in both heat and power.
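To illustrate the workload-dependence point above with a toy model: what matters isn't the slot count by itself but how many independent operations the shader compiler can pack into each VLIW bundle. The ILP values below are made-up examples for illustration, not measured shader statistics:

```python
# Toy model of VLIW slot utilization: a wider issue unit only helps if the
# compiler finds enough independent ops (ILP) to fill the slots each clock.
# The ILP values are illustrative, not measurements of real shaders.

def ops_per_clock(avg_independent_ops: float, vliw_width: int) -> float:
    """Operations issued per clock by a single VLIW unit in this toy model."""
    return min(avg_independent_ops, vliw_width)

for ilp in (2.5, 4.0, 5.0):
    for width in (5, 4):  # VLIW5 (R700-style) vs VLIW4 (Cayman-style)
        ops = ops_per_clock(ilp, width)
        print(f"ILP {ilp}: VLIW{width} issues {ops:.1f} ops/clk per unit "
              f"({ops / width:.0%} of its slots)")
```

Low-ILP code leaves more of a 5-wide unit idle (favoring VLIW4 for the same transistor budget), while code that can keep all five slots busy gets more work per unit out of VLIW5, which is why neither layout is inherently more efficient.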