
Wii U clock speeds found by marcan

I think I've changed my mind guys. A triple core processor running at 1.2 ghz compares favorably to 7 year old technology. This is very good news. And seeing as how the only problem anyone has with the CPU is the clock speed and not the actual architecture from nineteen ninety nine, which is just before the turn of the century, it should also compare very favorably to the Durango and Orbis' CPU, seeing as how it will be clocked below 2 ghz and also feature GPGPU integration just like the Wii U. I am very happy.
I know that you're being funny, but considering the fab shrink, the CPU is based on turn-of-the-century architecture just as much as IBM's other low-wattage CPUs in this line.

I'm sure that the Orbis and Durango CPUs will outperform it handily, but it's probably not as bad a CPU as most are thinking.
 
Agni is 60 fps, so I don't think the possibility of getting those graphics at 30fps with lower IQ is too far-fetched. The engine is also still in development, so further optimizations are probably coming.
Do you think lower IQ at 720p or 1080p? I personally think it's going to be the same as this gen, with 720p as the standard.
 

Log4Girlz

Member
I know that you're being funny, but considering the fab shrink, the CPU is based on turn-of-the-century architecture just as much as IBM's other low-wattage CPUs in this line.

I'm sure that the Orbis and Durango CPUs will outperform it handily, but it's probably not as bad a CPU as most are thinking.

I for one am confident that this processor is far superior to anything featured in any smartphone or tablet. So it can't be that bad.
 

Easy_D

never left the stone age
But those demos started out running on PCs. Who knows how they are going to run on those systems.

Now I'm wondering if the 360 outclassed the top-of-the-line PCs back when the 360 launched. Because if the 360 was better, who's to say the Durango won't be? Other than hardware costs.
 

QaaQer

Member
Now I'm wondering if the 360 outclassed the top-of-the-line PCs back when the 360 launched. Because if the 360 was better, who's to say the Durango won't be? Other than hardware costs.

If MS has poured some money into R&D, the 720 might be pretty cool. I wish there were more leaks. Do we even know if it's IBM or AMD yet?
 
Now I'm wondering if the 360 outclassed the top-of-the-line PCs back when the 360 launched. Because if the 360 was better, who's to say the Durango won't be? Other than hardware costs.
I had a fairly powerful PC when the 360 launched, and the video card I had at the time was better than Xenos, but the triple-core Xenon was better than my dual-core CPU.

With game-specific optimizations, games also looked better on the 360 than on my PC. The same will probably happen when the 720 launches, with games looking better due to optimization, but it won't take long for PCs to gain the upper hand again.
 

jmc1987

Neo Member
Looking at the next-gen demo videos like Agni or Unreal 4, how can you say he's wrong? Until something shows the contrary, I think he's right.

He's not certainly right, because neither system has been proven capable of running either tech demo at that level. He may eventually be proven right, but I doubt it until they actually demonstrate it.
 

Rolf NB

Member
Now I'm wondering if the 360 outclassed the top-of-the-line PCs back when the 360 launched. Because if the 360 was better, who's to say the Durango won't be? Other than hardware costs.
Never going to happen again. "Top of the line PCs" are now beyond 400W. No one will make a console with such extreme power consumption, and as a consequence no one will match bleeding-edge PC specs ever again.
 

Rolf NB

Member
Did Intel's R&D go on vacation from 2003-2010 and then decide to make the core tech significantly faster in a short span of time?
The Pentium 4 architecture was a colossal dud, but they also released the Pentium M (Banias/Dothan under the "Centrino" branding) in 2003, which formed the foundation for a still-running streak of great Intel architectures.
 

OryoN

Member
Interesting if true...

https://twitter.com/marcan42/status/274856397915697152

Hector Martin
‏@marcan42
If you want more evidence that MHz isn't everything, a little birdie points out that Durango (Xbox 720) is specc'ed to have a 1.6GHz CPU.

So what does this mean?

It means that - according to the logic that many GAFers apply to Wii U - Xbox Next will have a sub-current-gen CPU.

If true, it probably means that a large focus for next-gen consoles will be getting the best bang for your buck on a performance-per-watt basis, especially for the CPU. Sounds like a "duh, that's always the case" response, but that wasn't actually the trend. Large, hot CPUs that didn't come anywhere close to their theoretical max probably weren't the best way to achieve that goal.
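The performance-per-watt point can be sketched with a toy calculation. Every number below (ops per cycle, clocks, core counts, wattages) is invented purely for illustration; none of these are real console specs:

```python
# Illustrative only: why perf-per-watt, not raw clock, can drive a console
# CPU choice. Every number here is invented for the example.
def perf_per_watt(ops_per_cycle, clock_ghz, cores, watts):
    """Rough throughput (billions of ops/s) divided by power draw."""
    return ops_per_cycle * clock_ghz * cores / watts

# A big, hot, high-clock design vs. a small, cool, modest-clock design.
hot_chip = perf_per_watt(ops_per_cycle=2, clock_ghz=3.2, cores=3, watts=80)
cool_chip = perf_per_watt(ops_per_cycle=4, clock_ghz=1.6, cores=4, watts=30)

print(f"hot chip:  {hot_chip:.2f} GOPS/W")   # hot chip:  0.24 GOPS/W
print(f"cool chip: {cool_chip:.2f} GOPS/W")  # cool chip: 0.85 GOPS/W
```

On these made-up figures the slower-clocked chip delivers over three times the work per watt, which is the trade-off the post describes.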
 

Meelow

Banned
It means that - according to the logic that many GAFers apply to Wii U - Xbox Next will have a sub-current-gen CPU.

If true, it probably means that a large focus for next-gen consoles will be getting the best bang for your buck on a performance-per-watt basis, especially for the CPU. Sounds like a "duh, that's always the case" response, but that wasn't actually the trend. Large, hot CPUs that didn't come anywhere close to their theoretical max probably weren't the best way to achieve that goal.

Aren't there rumors about the Xbox 720 having a "painfully low" processor?

Yep.

http://www.vg247.com/2012/09/06/xbox-720-release-delayed-due-to-manufacturing-issue-report/

So if it is true, all of the current/next-gen consoles will focus much more on the GPU than the CPU, like others said.

build quality

build quality

build quality

"The site’s source went so far as to say the build quality wasn’t even up to “horrid” yet."

Missed that word, sorry.
 

Spiegel

Member
I think I've changed my mind guys. A triple core processor running at 1.2 ghz compares favorably to 7 year old technology. This is very good news. And seeing as how the only problem anyone has with the CPU is the clock speed and not the actual architecture from nineteen ninety nine, which is just before the turn of the century, it should also compare very favorably to the Durango and Orbis' CPU, seeing as how it will be clocked below 2 ghz and also feature GPGPU integration just like the Wii U. I am very happy.

(image: Professor Frink / Comic Book Guy "sarcasm" meme)
 

gofreak

GAF's Bob Woodward
Another core. Who knows what else. We barely know what's in the Wii U CPU, and that's out.


Or several more, with threading. I very much doubt MS will put 3 or 4 single-threaded older POWER cores in their machine at 1.6GHz, though. But if that clock is true, it will be interesting to see the design.
 

DSN2K

Member
You just have to look at AMD's 8-"core" Bulldozers/Piledrivers compared to Intel's i5s, and even i3s, to see that clock-for-clock isn't the whole answer.
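That clock-for-clock gap comes down to instructions per cycle (IPC). A toy model makes the point; the IPC figures below are invented for illustration, not measured Bulldozer or i5 numbers:

```python
# Toy model: effective throughput = clock x IPC x cores.
# IPC values are invented for illustration, not real measurements.
def throughput_gips(clock_ghz, ipc, cores):
    """Billions of instructions per second across all cores."""
    return clock_ghz * ipc * cores

many_weak_cores = throughput_gips(clock_ghz=4.0, ipc=0.8, cores=8)
few_strong_cores = throughput_gips(clock_ghz=3.4, ipc=2.0, cores=4)

print(f"{many_weak_cores:g} vs {few_strong_cores:g} GIPS")  # 25.6 vs 27.2 GIPS
```

Even with twice the cores and a higher clock, the low-IPC design loses on these numbers, which is the Bulldozer-versus-i5 point in a nutshell.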
 

Ryoku

Member
Another core. Who knows what else. We barely know what's in the Wii U CPU, and that's out.

More like 3-5 more cores. If the rumor is correct (that an ARM multi-core processor handles the OS on Wii U), then I wouldn't be surprised to see a couple of cores in Durango (or PS4, even) locked for the OS. We've already seen something similar on PS3. In the end, though, it seems that Durango/PS4 will have more cores available for development than Wii U, regardless.
 
More like 3-5 more cores. If the rumor is correct (that an ARM multi-core processor handles the OS on Wii U), then I wouldn't be surprised to see a couple of cores in Durango (or PS4, even) locked for the OS. We've already seen something similar on PS3. In the end, though, it seems that Durango/PS4 will have more cores available for development than Wii U, regardless.
Speaking of the multi-core ARM in the Wii U, did we get any more info about that? If it's like Starlet, I would think it would be clocked at the same frequency as the GPU, which is 550MHz. Starlet was an ARM9, so if it's 550MHz, would it be an ARM11/Cortex-A class core?
 
Question for people who know these things:

How powerful would XB3/PS4 have to be to completely shut the Wii U out of the picture? Like, if you had PC-style graphics settings on all multiplats, how powerful would they need to be so that the Wii U couldn't even handle "lowest settings"?
 

Meelow

Banned
Question: the Wii U GPU is 550MHz, correct? Is that above expectations, below, or about right?

If it turns out to be true that the Xbox 720 is clocked at 1.6GHz, there are going to be a lot of people jumping out of tenth-floor windows with their 360s strapped to their backs before even waiting for the first media of actual games to appear.

IF this is true, it makes me wonder if people will still attack the Wii U CPU.
 

Kai Dracon

Writing a dinosaur space opera symphony
If it turns out to be true that the Xbox 720 is clocked at 1.6GHz, there are going to be a lot of people jumping out of tenth-floor windows with their 360s strapped to their backs before even waiting for the first media of actual games to appear.
 

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
If it turns out to be true that the Xbox 720 is clocked at 1.6GHz, there are going to be a lot of people jumping out of tenth-floor windows with their 360s strapped to their backs before even waiting for the first media of actual games to appear.

I would expect they'll wait until they know the number of cores and the architecture before doing anything rash, though, given that most of the ire drawn by the Wii U CPU has been based on a combination of its clockspeed, core count, architecture and developer comments.

Of course it's more fun to ignore that and retrospectively claim that all criticism was due to clock speed alone.
 

LeleSocho

Banned
Question for people who know these things:

How powerful would XB3/PS4 have to be to completely shut the Wii U out of the picture? Like, if you had PC-style graphics settings on all multiplats, how powerful would they need to be so that the Wii U couldn't even handle "lowest settings"?

Let me see if I understand what you're saying... how much more powerful do the next consoles have to be so that even the worst of their games looks significantly better than the Wii U's best game? I guess twice as powerful is enough, and from what we know the PS4 and X720 are already beyond that.
 

wsippel

Banned
Question: the Wii U GPU is 550MHz, correct? Is that above expectations, below, or about right?
If Marcan says it's 550MHz, it's 550MHz. lherre and Arkam basically confirmed it. It's a bit higher than I expected at least, and apparently quite a bit higher than Nintendo initially planned. The Zelda and Japanese Garden techdemos were seemingly running on lower clocked machines, so that's certainly promising.
 

ArynCrinn

Banned
If it turns out to be true that the Xbox 720 is clocked at 1.6GHz, there are going to be a lot of people jumping out of tenth-floor windows with their 360s strapped to their backs before even waiting for the first media of actual games to appear.

As long as the performance of the system and its games is up to par or monstrously better, there shouldn't be any issues (not the case with Wii U, obviously). Sad to see so many people up in arms because they can't potentially have a psychological circle jerk over amazing poly counts, crisp textures and IQ. It's PERFORMANCE and overall smoothness that has the most bearing on gameplay, potential gameplay and expanded features; graphics in and of themselves are vastly overrated as a selling point and sticking point. But that's the story of the "gamer" (from critic to consumer), and really of this gen as a whole.

Ironically, the "casual" is the least exploited by this type of thinking...
 

Meelow

Banned
If Marcan says it's 550MHz, it's 550MHz. lherre and Arkam basically confirmed it. It's a bit higher than I expected at least, and apparently quite a bit higher than Nintendo initially planned. The Zelda and Japanese Garden techdemos were seemingly running on lower clocked machines, so that's certainly promising.

Interesting. I'm not going to lie that gets me really hyped for built from the ground up Wii U games.

The Wii's GPU was 243MHz, so imagine what Nintendo can do with a modern GPU at 550MHz.
 
Question: the Wii U GPU is 550MHz, correct? Is that above expectations, below, or about right?



IF this is true, it makes me wonder if people will still attack the Wii U CPU.

People will attack the Wii U CPU regardless, because it will undoubtedly be ass compared to the other current-gen consoles and only around the same level as the CPUs of last gen's consoles.
 
If Marcan says it's 550MHz, it's 550MHz. lherre and Arkam basically confirmed it. It's a bit higher than I expected at least, and apparently quite a bit higher than Nintendo initially planned. The Zelda and Japanese Garden techdemos were seemingly running on lower clocked machines, so that's certainly promising.
That is a good point. Until we get some Wii U games that display its power, expect to see those Zelda demo gifs for a while longer. :)

At least now we know for sure that the system was weaker back then. An almost 40% increase for the GPU and 25% for the CPU is not insignificant.
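For what it's worth, those percentages line up with a quick check, assuming the earlier devkit clocks were roughly 400MHz (GPU) and 1.0GHz (CPU); those baselines are rumoured figures from the time, not confirmed numbers, while the final clocks are the ones marcan reported:

```python
# Sanity check of the "almost 40% GPU / 25% CPU" increase claim.
# The old devkit clocks (400 MHz GPU, 1.0 GHz CPU) are rumoured, not confirmed;
# the new clocks are marcan's reported figures (~550 MHz GPU, ~1.243 GHz CPU).
def pct_increase(old, new):
    return (new - old) / old * 100.0

gpu_gain = pct_increase(old=400.0, new=550.0)    # MHz
cpu_gain = pct_increase(old=1000.0, new=1243.0)  # MHz

print(f"GPU: +{gpu_gain:.1f}%, CPU: +{cpu_gain:.1f}%")  # GPU: +37.5%, CPU: +24.3%
```

Which rounds to the "almost 40% / 25%" figures quoted in the post.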
 
Let me see if i understand what are you saying... how much powerful the next consoles have to be to have something where even the worst of their games looks significantly better than WiiU best game? I guess twice as powerful is enough and from what we know ps4 and x720 are already better than that.

No, not just a looks thing; a "can't even run at lowest settings" thing. If you took Black Ops 4, turned off all the effects, reduced the draw distance, etc. (the equivalent of turning all the settings to low on the PC version), would it be able to run on Wii U? If not, then Nintendo under-powered the Wii U out of next-gen multiplats. If yes, then there's no real reason for the games not to show up besides publisher apathy/antipathy.
 

LeleSocho

Banned
No, not just a looks thing; a "can't even run at lowest settings" thing. If you took Black Ops 4, turned off all the effects, reduced the draw distance, etc. (the equivalent of turning all the settings to low on the PC version), would it be able to run on Wii U? If not, then Nintendo under-powered the Wii U out of next-gen multiplats. If yes, then there's no real reason for the games not to show up besides publisher apathy/antipathy.

If you remove enough stuff you can make anything run, even on an N64; in fact CoD Black Ops 1 exists on the Wii, even if it's terribly outdated... There's no such thing as "like the PC games", since every game is its own story. You're not being clear enough.
In the end, if they have problems porting games to the Wii U now, they'll probably have even more problems once the others are out.
 

netBuff

Member
If yes, then there's no real reason for the games not to show up besides publisher apathy/antipathy.

Or maybe, just maybe, they won't show up because the cost of porting the game to the Wii U will be too high in relation to the expected return on investment and available audience for a port.

Publishers really hate Nintendo, that's what it is!

As long as the performance of the system and its games is up to par or monstrously better, there shouldn't be any issues (not the case with Wii U, obviously). Sad to see so many people up in arms because they can't potentially have a psychological circle jerk over amazing poly counts, crisp textures and IQ. It's PERFORMANCE and overall smoothness that has the most bearing on gameplay, potential gameplay and expanded features; graphics in and of themselves are vastly overrated as a selling point and sticking point. But that's the story of the "gamer" (from critic to consumer), and really of this gen as a whole.

Ironically, the "casual" is the least exploited by this type of thinking...

As a gamer, I want the best possible experience. While Wii U downports will certainly be possible (maybe even likely), I'm not going to buy them when better versions of the same games are available. Graphics do matter to the experience - framerate, pop-in and tearing are all very detrimental. They are what suffers the most on current consoles, and future downports certainly won't fare better on that front.
 

remnant

Banned
If it turns out to be true that the Xbox 720 is clocked at 1.6GHz, there are going to be a lot of people jumping out of tenth-floor windows with their 360s strapped to their backs before even waiting for the first media of actual games to appear.
No, people will just admit that the CPU is no longer the big driver in a console and that the GPU is more important.

Basically the exact opposite of what they say in WiiU threads
 
Of course it's more fun to ignore that and retrospectively claim that all criticism was due to clock speed alone.

Yeah, I said earlier that I don't understand the push to concentrate on dispelling the myth that clock speed is an important factor in performance. I feel like it's akin to me pushing "it's not the size that counts; it's how you use it!" rhetoric in response to accusations that I have a small penis, when more damaging testimonials like "he lacks stamina" and "he has no idea how to please a woman" are being tossed out there about me by past lovers.
 

LeleSocho

Banned
No, people will just admit that the CPU is no longer the big driver in a console and that the GPU is more important.

Basically the exact opposite of what they say in WiiU threads

No, they probably will understand that clock isn't everything, but they will still (rightly) shit on the Espresso.
 

Meelow

Banned
No, people will just admit that the CPU is no longer the big driver in a console and that the GPU is more important.

Basically the exact opposite of what they say in WiiU threads

If that happens I might have to look at their post history, find the posts saying "the CPU is better", and show them, lol.
 
Or maybe, just maybe, they won't show up because the cost of porting the game to the Wii U will be too high in relation to the expected return on investment and available audience for a port.

Publishers really hate Nintendo, that's what it is!



As a gamer, I want the best possible experience. While Wii U downports will certainly be possible (maybe even likely), I'm not going to buy them when better versions of the same games are available. Graphics do matter to the experience - framerate, pop-in and tearing are all very detrimental. They are what suffers the most on current consoles, and future downports certainly won't fare better on that front.

Which matters more to a gaming experience: graphical things like framerate, pop-in, tearing, etc., or functional things like off-screen play, persistent maps, and new inputs (touch, gyro)?

If I had the choice between GTA VI on Wii U with off-screen play and a mini-map versus GTA VI on XB3 with neither of those but 50% better graphical performance, I know which one I'd buy every time. Yes, Wii U versions of multiplats are unlikely to have the best graphics. No, this doesn't make them the inferior versions.
 

Log4Girlz

Member
So you finally believe that the PS4/720's CPU won't be clocked anywhere near the '7 YEAR OLD TECH!!!' of the PS360? ;)

Only took you a week of denial in EC's WiiU spec thread lol...

Feels strange now the shoe's on the other foot, eh :D

I was pretty sure the new CPUs would be clocked at 10 GHz and would totally run counter to modern chip design, where more cores with modern architectures and threads at a lower clock are more common. I mean totally.
 

ArynCrinn

Banned
As a gamer, I want the best possible experience. While Wii U downports are certainly possible (maybe even likely), I'm not going to buy them when better versions of the same games are available. Graphics do matter to the experience - framerate, pop-in and tearing are all very detrimental.

That's not "graphics" in the sole aesthetic sense though, that's performance! That was my point.

And I agree 100% with buying the superior version of a game, be it for better visuals or whatever. I was just pointing out that it's not the visuals in and of themselves that are most important; it's the performance. That's what enables better gameplay and a smoother experience. But it could well be the case that millions are spent on big-budget titles that do literally nothing new for gameplay or performance (sticking to 30fps, etc.) and just try to push the visual/cinematic bar. Right now, that sells copies and systems, and that's sad.
 

Easy_D

never left the stone age
Never going to happen again. "Top of the line PCs" are now beyond 400W. No one will make a console with such extreme power consumption, and as a consequence no one will match bleeding-edge PC specs ever again.

Oh yes, I forgot about power consumption, size and heat. Fair point.

If Marcan says it's 550MHz, it's 550MHz. lherre and Arkam basically confirmed it. It's a bit higher than I expected at least, and apparently quite a bit higher than Nintendo initially planned. The Zelda and Japanese Garden techdemos were seemingly running on lower clocked machines, so that's certainly promising.

Wait. So the CPU speed was set in stone when those demos were shown, but not the GPU, which has seen a significant increase in power? Well shit, games looking as good as the Zelda demo are fine by me, to be honest.
 
Yeah, I said earlier that I don't understand the push to concentrate on dispelling the myth that clock speed is an important factor in performance. I feel like it's akin to me pushing "it's not the size that counts; it's how you use it!" rhetoric in response to accusations that I have a small penis, when more damaging testimonials like "he lacks stamina" and "he has no idea how to please a woman" are being tossed out there about me by past lovers.

The trolls in this thread have already made up their minds that everyone bashing the CPU is a dumb dumb who thinks clock speed equals power and doesn't understand "architecture" and "OoOE".

Hilarious how ps4/720 CPUs keep getting brought up as a counterpoint.

They won't.

Looking at how developers are basing their next gen games around ps4/720, they already have.
 

Log4Girlz

Member
The trolls in this thread have already made up their minds that everyone bashing the CPU is a dumb dumb who thinks clock speed equals power and doesn't understand "architecture" and "OoOE".

Hilarious how ps4/720 CPUs keep getting brought up as a counterpoint.

If the Orbis and Durango CPUs were clocked at 1.25 GHz, they would claim a huge victory and try to shove it in their doubters' faces.
 