
Wii U Community Thread


JordanN

Banned
Yeah, that I understand. What I'm wondering about is more the quality: is that good or bad on a 1080p screen, and what type of hardware would you need to produce such textures?

Edit: ...how much RAM?
A single 8192x8192 texture uses 96 MB.
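
For a rough sense of the numbers, here's a back-of-the-envelope sketch (illustrative only; the exact figure depends on the pixel format, compression and whether mipmaps are counted):

# Rough texture memory math -- illustrative, not tied to any particular engine.
def texture_size_mb(width, height, bytes_per_pixel, mipmaps=True):
    base = width * height * bytes_per_pixel
    total = base * 4 / 3 if mipmaps else base  # a full mip chain adds ~33%
    return total / (1024 * 1024)

for name, bpp in [("RGBA8 uncompressed", 4), ("DXT5/BC3", 1), ("DXT1/BC1", 0.5)]:
    print(name, round(texture_size_mb(8192, 8192, bpp), 1), "MB")
# -> ~341.3, ~85.3 and ~42.7 MB with mips; 256, 64 and 32 MB without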
 

Meelow

Banned
As BG has said many times, Wii U will be capable of 'scaled down' PC / PS4 / 720 multiplatform games; the issue is whether the developers will want to spend the time and money on developing them for the console.

The best analogy he ever wrote on here was something along the lines of 'PS360 were 20 times more powerful than the Wii, PS4 / 720 will be 4 times more powerful at most'.

That tells you everything you really need to know about its power imo, pretty impressive, and don't judge it by NSMB U and Nintendo Land.

Aw, I love that Majora's Mask video, such a creepy soundtrack. It really didn't help that I watched it about 4 mins before Nintendo's E3 2012 conference lol! :'(

It is awesome how the Wii U is 24x more powerful than the Wii, and I actually thought Nintendo Land in the main area looked good graphically.

Someone want to link me to that Majora's Mask video?

http://www.youtube.com/watch?v=cyazYYev7Nw
 

USC-fan

Banned
You're taking what the man said out of context.

Optimization on a closed platform is higher, especially as the generation advances and the hardware starts being outdated. That doesn't mean it does more GFLOPS or is in parity with more powerful hardware, just that a lot of the time developers start doing things in roundabout ways in order to preserve performance; kind of like how Shadow of the Colossus was able to fake HDR on the PS2, or how PC games can often afford to spend resources on dynamic shadows whereas, if you're really tight on a console and the game allows it, you might as well pre-bake the whole level's lighting to save resources. GFLOPS don't increase, and 1080p won't be any easier to do on a platform with fewer gigaflops than a PC just because it's a closed platform; it's not suddenly the same, because a lot of those tasks rely on power alone.
That was not his reason:
Ryan Shrout: Focusing back on the hardware side of things, in previous years’ Quakecons we've had debates about what GPU was better for certain game engines, certain titles and what features AMD and NVIDIA do better. You've said previously that CPUs now, you don't worry about what features they have as they do what you want them to do. Are we at that point with GPUs? Is the hardware race over (or almost over)?

John Carmack: I don't worry about the GPU hardware at all. I worry about the drivers a lot because there is a huge difference between what the hardware can do and what we can actually get out of it if we have to control it at a fine grain level. That's really been driven home by this past project by working at a very low level of the hardware on consoles and comparing that to these PCs that are true orders of magnitude more powerful than the PS3 or something, but struggle in many cases to keep up the same minimum latency. They have tons of bandwidth, they can render at many more multi-samples, multiple megapixels per screen, but to be able to go through the cycle and get feedback... “fence here, update this here, and draw them there...” it struggles to get that done in 16ms, and that is frustrating.

Ryan Shrout: That's an API issue, API software overhead. Have you seen any improvements in that with DX 11 and multi-threaded drivers? Are those improving that or is it still not keeping up?

John Carmack: So we don't work directly with DX 11 but from the people that I talk with that are working with that, they (say) it might [have] some improvements, but it is still quite a thick layer of stuff between you and the hardware. NVIDIA has done some direct hardware address implementations where you can bypass most of the OpenGL overhead, and other ways to bypass some of the hidden state of OpenGL. Those things are good and useful, but what I most want to see is direct surfacing of the memory. It’s all memory there at some point, and the worst thing that kills Rage on the PC is texture updates. Where on the consoles we just say “we are going to update this one pixel here,” we just store it there as a pointer. On the PC it has to go through the massive texture update routine, and it takes tens of thousands of times [longer] if you just want to update one little piece. You start to amortize that overhead when you start to update larger blocks of textures, and AMD actually went and implemented a multi-texture update specifically for id Tech 5 so you can batch up and eliminate some of the overhead by saying “I need to update these 50 small things here,” but still it’s very inefficient. So I’m hoping that as we look forward, especially with Intel integrated graphics [where] it is the main memory, there is no reason we shouldn't be looking at that. With AMD and NVIDIA there's still issues of different memory banking arrangements and complicated things that they hide in their drivers, but we are moving towards integrated memory on a lot of things. I hope we wind up being able to say “give me a pointer, give me a pitch, give me a swizzle format,” and let me do things managing it with fences myself and we'll be able to do a better job.
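
To make the texture-update point above concrete, here's a toy cost model (the constants are made-up assumptions, not driver measurements): a thick API path charges a large fixed cost per update call, so one tiny update costs almost as much as a big one, while a console-style direct write is dominated only by the bytes actually touched.

# Toy model of texture-update cost: fixed per-call API overhead vs direct writes.
# All constants are illustrative assumptions, not benchmarks.
API_CALL_OVERHEAD_US = 100.0   # assumed driver/validation cost per update call
COST_PER_BYTE_US     = 0.001   # assumed cost of actually copying the data

def api_update(num_calls, bytes_per_call):
    return num_calls * (API_CALL_OVERHEAD_US + bytes_per_call * COST_PER_BYTE_US)

def direct_update(num_calls, bytes_per_call):
    return num_calls * bytes_per_call * COST_PER_BYTE_US  # just the copy

print(api_update(50, 4))     # ~5000 us: 50 one-texel updates, overhead dominates
print(api_update(1, 200))    # ~100 us: one batched update amortizes the overhead
print(direct_update(50, 4))  # ~0.2 us: "we just store it there as a pointer"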

He has said the same thing in dozens of interviews:

it is unhappily true that we have these consoles here running at sixty frames per second, and we could have these massively more powerful PC systems that struggle sometimes to hold the same framerate because of unnecessary overheads. If we were programming that hardware directly on the metal the same way we do consoles, it would be significantly more powerful.

great watch
http://www.computerandvideogames.com/306236/id-softwares-john-carmack-20-minute-video-interview/

Of course GFLOPS don't get created, but I was making it easy for non-tech people to understand. So when they make these statements that something is running on a high-end PC system and can never run on a console, that is not true. If the leaked specs for the PS4/720 are accurate, they will be able to match any high-end single-GPU PC today. Not on paper, but it doesn't have to... it's a closed box.


Do what now?
?????

Good thread on GAF about this: http://www.neogaf.com/forum/showthread.php?t=434165
 
Next gen is going to be soooo fun.

We're going to see this,
enginedemo.gif
and this,
nonplayabledemo.gif
and this,
cutscene.gif


Personally, you would have to be a graphics whore to hate any of them.

Or you'd have to not be a graphics whore. The above are just movies. There's nothing actually in them that shows how they'd increase the actual fun in a game. Maaaaaaybe the Zelda demo could represent actual playable content, but that's a stretch, and the camera angles it displays kind of lend towards rather un-fun gameplay, so I imagine it won't be quite like that.
 

Ryoku

Member
It is awesome how the Wii U is 24x more powerful than the Wii, and I actually thought Nintendo Land in the main area looked good graphically.

PS360 = 20x more powerful than Wii.
Wii U = 3x as powerful as PS360.
Wii U = 60x as powerful as Wii.

Take it as you will. Just trying to correct some math. I don't like the overwhelmingly vague nature of multipliers, but oh well.
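
Spelled out, taking the quoted multipliers at face value (they're rough marketing-style figures, not measurements):

ps360_vs_wii = 20   # figure quoted above: PS360 ~20x Wii
wiiu_vs_ps360 = 3   # figure quoted above: Wii U ~3x PS360
print(ps360_vs_wii * wiiu_vs_ps360)  # -> 60, not 24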
 

AJSousuke

Member
Or you'd have to not be a graphics whore. The above are just movies. There's nothing actually in them that shows how they'd increase the actual fun in a game. Maaaaaaybe the Zelda demo could represent actual playable content, but that's a stretch, and the camera angles it displays kind of lend towards rather un-fun gameplay, so I imagine it won't be quite like that.
you'd have to be a gameplay whore to hate those ;)

A game looking like that would either have less gameplay or be shorter.

I hope Nintendo can keep up with the others and offer us games based more on gameplay immersion than simple graphics immersion.
 

Terrell

Member
Why do you think despite reaching price parity (and actually selling for less now on average) the PS3 failed to gain any major traction over the 360 in North America?

Not only that, but as mentioned before the account is tied to other investments of time and money - XBLA purchases and achievements.

2 things here:

1) The reason the PS3 isn't selling is because all it has to differentiate itself from the 360 are Sony exclusives, EVERYTHING else is available on 360. THAT, combined with the major marketing fuck-ups of its first 3 years, sealed the fate of the PS3 in Western territories, not some mythical tie to XBL. Anyone could tell you that.

2) So your XBLA purchases stop working when you stop paying for XBL? That's.... kind of illegal, dude.

I don't think Iwata is clueless about tech or anything, but I do wonder if he is coming at this primarily from a Japanese perspective and without a lot of knowledge of what the big Western developers are doing and will be doing.

You mean driving themselves out of business? Yeah, pretty sure he's got a firm grasp on that.

Can't wait to see studios go bankrupt over another ridiculous budget increase.

It's what this industry gets for letting publishers hold all the keys to all the doors. The movie industry is looking at the games industry and LAUGHING about the total surrender of IP control and the creative bankruptcy that not even they can match.
 

ASIS

Member
you'd have to be a gameplay whore to hate those ;)

A game looking like that would either have less gameplay or be shorter.

I hope Nintendo can keep up with the others and offer us games based more on gameplay immersion than simple graphics immersion.

It can be both you know.
 
Of course GFLOPS don't get created, but I was making it easy for non-tech people to understand. So when they make these statements that something is running on a high-end PC system and can never run on a console, that is not true. If the leaked specs for the PS4/720 are accurate, they will be able to match any high-end PC today. Not on paper, but it doesn't have to... it's a closed box.
No, certainly they can't match it at first.

Like I said, you're taking him out of context and without understanding the intricacies of what he's saying: namely, what latency in outputting something actually means.

He's referring to how PCs have bottlenecks due to their nature. The reasons for those bottlenecks vary a lot: the complex OS running in the background; the fact that developers can't possibly have time to optimize for every single GPU in particular, so they have to keep the game optimized in abstract terms (and build optimizations on top); the GPU not being soldered right next to the CPU; ports, drivers and darned APIs, namely DirectX, and software getting in the way; and memory banks that tend to be bigger but often don't consist of RAM that's as good (at least at the time a generation starts). Also, commercial GPUs lack a dedicated framebuffer, meaning the main video RAM bank gets tapped for outputting the image (which also happens on X360, since its 10 MB of eDRAM isn't always enough, and PS3 doesn't have an embedded eDRAM framebuffer at all), and that RAM certainly isn't as fast as one could hope, or dedicated.

That doesn't mean the console will have, in practical terms, double the performance, though, just that there'll be less latency to it. How to explain this... The original Xbox didn't have a dedicated framebuffer; it used RAM from the main bank for the z-buffer and framebuffer, which you could consider a bottleneck against the PS2, which had 4 MB of 2560-bit eDRAM. The Xbox is still regarded as the more powerful machine despite the fact that this hindered its efficiency (and made 60 fps harder to obtain), because the PS2 couldn't match what it was doing even with its reduced latency on output.

What you do before outputting, with raw power, has less to do with latency (the latency comes in when outputting or storing the result to output). Meaning you can't possibly match the detail a 3 TFLOP GPU is pulling with a 1.5 TFLOP machine, only with tricks; the same goes for resolution and stuff like effects and AA, because that depends on computing power, not latency.

Saying it's equivalent to more FLOPS is a big fallacy and shouldn't even be implied, given what it means in practical terms. Consoles are more efficient as closed boxes, yes, but the extent to which they are is very limited; they can still only output what their computation power allows them to (just like PC GPUs, actually). For the rest, they can try to reduce the difference through optimization and roundabout ways.


EDIT: Need an example? Look no further: Crysis 1 on PC came out in 2007, when PC hardware was just one generation past the consoles' (PS3 has a custom GeForce 7800; Crysis recommended a state-of-the-art GeForce 8800 and would take every single drop of extra power you gave it), and it was running on CryEngine 2. Years passed, the CryEngine 3-powered Crysis 2 was made with consoles in mind (and greeted by PC gamers as a step back from Crysis's graphics), and Crysis 1 was ported to CryEngine 3 and released on X360/PS3, yet it's regarded as looking worse than Crysis on PC; lighting, effects and the like suffered, even though they were aimed at only one GPU generation ahead.
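
To put the latency-vs-throughput distinction in rough numbers (a toy model with made-up constants, not a benchmark of any real GPU): shaving API overhead off a frame helps, but it can't make a lower-FLOP part match a higher-FLOP one on work that is compute-bound.

# Toy frame-time model: time = compute work / throughput + API/driver overhead.
# All numbers are illustrative assumptions, not measurements.
def frame_time_ms(work_gflop, gpu_tflops, overhead_ms):
    return work_gflop / (gpu_tflops * 1000.0) * 1000.0 + overhead_ms

WORK = 40.0  # assumed GFLOPs of shading/geometry work per frame

console = frame_time_ms(WORK, 1.5, overhead_ms=1.0)  # thin layer, low overhead
pc      = frame_time_ms(WORK, 3.0, overhead_ms=6.0)  # thick API/driver path
print(round(console, 1), round(pc, 1))  # ~27.7 vs ~19.3 ms
# Even carrying 6x the overhead, the 3 TFLOP part still finishes the frame first;
# cutting overhead narrows the gap, it doesn't turn 1.5 TFLOPs into 3.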
 
2 things here:
1) The reason the PS3 isn't selling is because all it has to differentiate itself from the 360 are Sony exclusives, EVERYTHING else is available on 360. THAT, combined with the major marketing fuck-ups of its first 3 years, sealed the fate of the PS3 in Western territories, not some mythical tie to XBL. Anyone could tell you that.

I think one of the biggest problems PS4 will have, especially in North America, is that a large percentage of people bought PS3 with the 'free Blu-ray player' in mind.

It's going to be tough to get them to shell out another $400 when they already have a Blu-ray player, admittedly a really, really slow one ;).

Also, if you look at exclusive Sony games, especially this gen, the first in a series sells amazingly and then each sequel sells 50% less, and then 50% less again (LBP, Motorstorm, Killzone, Resistance, Infamous etc.).

If Sony do build the rumoured powerhouse (AMD 4-core 3 GHz CPU, 2-4 GB of GDDR5 RAM and a 1.8 TFLOP GPU) and sell it at $400 against a more powerful and probably cheaper Xbox 720, then I really think PS4 could be Sony's last console.

If Wii U is at least twice as powerful as the PS360, they let customers see that with a graphical showcase at the August Nintendo Direct, Nintendo Land is bundled free with the system, Pikmin 3 & NSMB U are out before Xmas and they can get Blops 2, FIFA 13, Madden 13 and WWE 13, then I think it could have as much as a 15 million unit head start on both Sony and MS by Xmas 2013.
 
2 things here:

1) The reason the PS3 isn't selling is because all it has to differentiate itself from the 360 are Sony exclusives, EVERYTHING else is available on 360. THAT, combined with the major marketing fuck-ups of its first 3 years, sealed the fate of the PS3 in Western territories, not some mythical tie to XBL. Anyone could tell you that.

2) So your XBLA purchases stop working when you stop paying for XBL? That's.... kind of illegal, dude.
It has a BluRay player, it has a lower average selling price, it has free online multiplayer. As you note it largely has the same games as the 360, despite the advantage of these USPs - yet it cannot sell as well in the US market, where multiplayer first person shooters dominate.

It's obviously not just some magical network service brand loyalty effect, as I already alluded to it's more than that - it's a combination of having built a critical mass in their network service and the major social aspects of this generation's games, like CoD, that you seem to want to dismiss entirely.

Again, since you omitted and ignored it:
You obviously cannot sell a person a console on the prospect of paying to play for P2P gaming.
You can sell a person a console on the prospect of playing CoD with their friends, when all their friends are on that platform's online service.

An equivalently priced or cheaper console, with the same games, providing free online multiplayer and a bonus Blu-ray player didn't cause a mass migration. I'm not sure what exactly you're predicting will happen in this generational transition that will topple XBL's userbase like a house of cards.

As for your second point, I'm not sure if you're being deliberately obtuse. I thought it was clear I was referring to account carryover with associated purchases tied to said account.

Anyway, I don't want to keep clogging up this thread with somewhat unrelated discussion. There's a thread on the Gaming Discussion side if you want to continue the discussion. http://www.neogaf.com/forum/showthread.php?t=478816
 

USC-fan

Banned
No, certainly they can't match it at first.

Like I said, you're taking him out of context and without understanding the intricacies of what he's saying: namely, what latency in outputting something actually means.

He's referring to how PCs have bottlenecks due to their nature. The reasons for those bottlenecks vary a lot: the complex OS running in the background; the fact that developers can't possibly have time to optimize for every single GPU in particular, so they have to keep the game optimized in abstract terms (and build optimizations on top); the GPU not being soldered right next to the CPU; ports, drivers and darned APIs, namely DirectX, and software getting in the way; and memory banks that tend to be bigger but often don't consist of RAM that's as good (at least at the time a generation starts). Also, commercial GPUs lack a dedicated framebuffer, meaning the main video RAM bank gets tapped for outputting the image (which also happens on X360, since its 10 MB of eDRAM isn't always enough, and PS3 doesn't have an embedded eDRAM framebuffer at all), and that RAM certainly isn't as fast as one could hope, or dedicated.

That doesn't mean the console will have, in practical terms, double the performance, though, just that there'll be less latency to it. How to explain this... The original Xbox didn't have a dedicated framebuffer; it used RAM from the main bank for the z-buffer and framebuffer, which you could consider a bottleneck against the PS2, which had 4 MB of 2560-bit eDRAM. The Xbox is still regarded as the more powerful machine despite the fact that this hindered its efficiency (and made 60 fps harder to obtain), because the PS2 couldn't match what it was doing even with its reduced latency on output.

What you do before outputting, with raw power, has less to do with latency (the latency comes in when outputting or storing the result to output). Meaning you can't possibly match the detail a 3 TFLOP GPU is pulling with a 1.5 TFLOP machine, only with tricks; the same goes for resolution and stuff like effects and AA, because that depends on computing power, not latency.

Saying it's equivalent to more FLOPS is a big fallacy and shouldn't even be implied, given what it means in practical terms. Consoles are more efficient as closed boxes, yes, but the extent to which they are is very limited; they can still only output what their computation power allows them to (just like PC GPUs, actually). For the rest, they can try to reduce the difference through optimization and roundabout ways.
lol really guy. You don't have to explain anything to me.

What it does mean in practical terms is what I said. This is not something I dreamed up, buddy. You can twist it whatever way you want, but that doesn't change anything. There are no "tricks", it's coding to the metal, no APIs that slow down performance.
 

AJSousuke

Member
It can be both you know.

Yeah, I know; the point is that the emphasis should be on the gameplay. I'm referring to aspects like not focusing on the amount of cinematics or background scenes (like in the God of War games, where you see a lot of crazy stuff going on in the background) but on what the player can do.
Putting it shortly, I hope games don't become Michael Bay movies. That doesn't mean I don't want them to be pretty.
 

Ryoku

Member
There are not "tricks" its coding to the metal, no apis that slow down preformance .

Are you kidding me?
They get more power out of the GPU because they can take shortcuts to produce similar results as the actual thing itself. By definition, this is optimizing the game.

These are your tricks: Baked versus dynamic shadows. Linear corridors versus open-world. Small FoV versus large. Crappy/lack of shadows on "unimportant" objects. Less AA. Lower targeted framerates, lower resolution. This saves resources that can be used on more important aspects of the game, such as significantly higher detail in things directly in focus, like character/weapon models.

On top of that, you have to code for one thing only and not, as lostinblue said, arbitrarily (due to many different GPUs), so yes, it is coding to the metal. That still doesn't increase the performance or power of the hardware. The lack of an API only does so much in freeing up resources. The majority of the freed-up resources comes from the shortcuts, or "tricks". You're just utilizing more of it. Sucking out all the juice. Use whatever metaphors you want.

Do the same on a 3.7TFLOP GPU, and you'll get a significantly better-looking game. It's just that it's not cost-effective, as the consoles make the most moneys (how many people even have a 7970?).
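
As a rough illustration of how much headroom those "tricks" buy back (simple pixel-count arithmetic, a simplification that ignores everything that doesn't scale with resolution):

# Rough arithmetic: per-frame shading cost scales roughly with
# resolution x MSAA samples x framerate. A simplification, not a benchmark.
def relative_cost(width, height, msaa, fps):
    return width * height * msaa * fps

console = relative_cost(1280, 720, 1, 30)   # 720p, no MSAA, 30 fps target
pc      = relative_cost(1920, 1080, 4, 60)  # 1080p, 4x MSAA, 60 fps
print(round(pc / console, 1))  # -> 18.0: roughly 18x the raw pixel work for the same scene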
 
lol really guy. You don't have to explain anything to me.

What it does mean in practical terms is what I said. This is not something I dreamed up, buddy. You can twist it whatever way you want, but that doesn't change anything. There are no "tricks", it's coding to the metal, no APIs that slow down performance.
You made up the whole "1.86 TFLOPS on console = 3.72 TFLOPS on PC" statistic, though, and seemed to claim Carmack had said it, then gave a snippet where he didn't name a formula that applies to overall console/PC efficiency, because there isn't one; for one thing, according to him it's the API that adds the most latency, not the speed at which the GPU answers those calls once they reach it.

53-percent-of-statistics-are-made-up.jpg


You could say it's actually a tenth:

AMD's Richard Huddy: the PC can actually show you only a tenth of the performance if you need a separate batch for each draw call.
Source: http://techreport.com/discussions.x/20628

3720 PC GFLOPS = 372 console GFLOPS? Well, no, it's just temperamental; you have to work around it (that example is a very specific one too, and probably exaggerated because the dude despises DirectX; I know it has some bearing, but he's probably guessing numbers rather than citing some benchmark). Then again, you have to work around consoles too; an experienced multiplatform developer for PS3 and X360 knows perfectly well that the two architectures have different strengths, and at this point they know how to tap them. It's the same thing, except that instead of optimizing for one piece of hardware they're often optimizing to make the game run abstracted from the hardware it's on, so future GPUs don't break it, and they have to deal with the API on top of that.
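
The Huddy quote is about per-draw-call CPU overhead. A crude way to see why lots of small batches hurt (the per-call costs below are assumptions for illustration, not real driver numbers):

# Crude model of draw-call submission cost on the CPU side.
# Per-call costs are assumed figures, for illustration only.
FRAME_BUDGET_MS = 16.7                # 60 fps frame budget
COST_PER_CALL_THICK_API_MS = 0.005    # assumed cost per draw call through a thick API
COST_PER_CALL_NEAR_METAL_MS = 0.0005  # assumed cost when submitting close to the metal

def submission_time_ms(draw_calls, cost_per_call_ms):
    return draw_calls * cost_per_call_ms

for calls in (500, 2000, 5000):
    print(calls,
          round(submission_time_ms(calls, COST_PER_CALL_THICK_API_MS), 1),
          round(submission_time_ms(calls, COST_PER_CALL_NEAR_METAL_MS), 2))
# With these assumed costs, 5000 tiny batches already take 25 ms on the thick-API
# path (blowing the 16.7 ms budget) vs 2.5 ms near the metal; hence "batch your draws".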


Carmack's issue with latency is all the rage now because he's working with 3D glasses that have latency of their own, and it all adds up to the point where there's noticeable lag between your input and the game moving, which breaks the illusion (he actually talked with the manufacturers about improved firmware for them). That's why he talks so much about milliseconds lately (and built his own tools to measure millisecond lag).
 

USC-fan

Banned
Are you kidding me?
They get more power out of the GPU because they can take shortcuts to produce similar results as the actual thing itself. By definition, this is optimizing the game.

These are your tricks: Baked versus dynamic shadows. Linear corridors versus open-world. Small FoV versus large. Crappy/lack of shadows on "unimportant" objects. Less AA. Lower targeted framerates, lower resolution. This saves resources that can be used on more important aspects of the game, such as significantly higher detail in things directly in focus, like character/weapon models.
So you're saying you can't do this with a PC? Lol okay...

On top of that, you have to code for one thing only and not, as lostinblue said, arbitrarily (due to many different GPUs), so yes, it is coding to the metal. That still doesn't increase the performance or power of the hardware. The lack of an API only does so much in freeing up resources. The majority of the freed-up resources comes from the shortcuts, or "tricks". You're just utilizing more of it. Sucking out all the juice. Use whatever metaphors you want.

Do the same on a 3.7TFLOP GPU, and you'll get a significantly better-looking game. It's just that it's not cost-effective, as the consoles make the most moneys (how many people even have a 7970?).
It increases performance, as greater minds than mine say, by 2x. So yes, coding to the metal does increase performance. Of course, if you place a 3.7 TFLOP GPU in a closed box you can do the same.


There is no debate here, seems people just want to argue about nothing.
You made up the whole "1.86 TFLOPS on console = 3.72 TFLOPS on PC" statistic, though, and seemed to claim Carmack had said it, then gave a snippet where he didn't.

You could say it's actually a tenth:

Source: http://techreport.com/discussions.x/20628

3720 PC GFLOPS = 372 console GFLOPS? Well, no, it's just temperamental; you have to work around it (that example is a very specific one too, and probably exaggerated because the dude despises DirectX). Then again, you have to work around consoles too; an experienced multiplatform developer for PS3 and X360 knows perfectly well that the two architectures have different strengths, and at this point they know how to tap them. It's the same thing, except that instead of optimizing for one piece of hardware they're often optimizing to make the game run abstracted from the hardware it's on, so future GPUs don't break it.

http://twitter.com/ID_AA_Carmack/statuses/50277106856370176
"Consoles run 2x or so better than equal PC hardware, but it isn’t just API in the way, focus a single spec also matters."


Winner, winner chicken dinner.... carmack ftw...
 
So you're saying you can't do this with a PC? Lol okay...
You can, but you're not prone to; consoles do it because most of the time they're chasing after the PC, or after the kind of thing the PC is promising to do now.

Also bear in mind that the PC has the burden of being the platform that can render at any resolution; pre-baking a level that on an X360 will always run at 1280x720 can mean huge textures on a PC. And the truth is, PCs surpassed this console generation's GPUs long ago; chances are doing those dynamic shadows is not that big a deal for them.

Being future-proof means it has to be done that way; it also suggests the whole subjective parity thing is a fallacy. You'd be better off suggesting that rendering an X360 game without AA at 720p is "kind of" in parity with some GPU out there running everything at 1080p with 4x AA; that makes more sense.
There is no debate here, seems people just want to argue about nothing.
Not really. I don't have an issue with the whole added-efficiency point, or with being able to make up for the lack of something on a closed platform better than on a PC; coding "on the metal" has its advantages too. But it doesn't get past the fact that if you have, say, 400 stream processors, the thing will never answer a call you make as if it had 800 stream processors.

There's no multiplication miracle here.


And thus I'm against institutionalizing it as a formula that doubles a console's performance (because it really varies with the situation). It doesn't.


Note: Nothing personal, I just find the way you insist on quantifying the difference to be too misleading to let it be.
 
Well, there is all this "info" around suggesting that they were doing something like a < 2 GHz processor and possibly no SMT, which would be a huge step down from what was expected.

And if that's the case, as long as they made it more efficient it will be better off. Though we still don't know how much poor optimization affects that.

Has it what! I'm having an aneurysm every time I hear more crappy news about third parties not committing, the uber-beast PS4/720s, or read some bollocks by Iwata or Reggie.

I feel completely shafted by the Wii generation; however, I am a Nintendo fan (believe it or not) and love my 3D Mario, Zelda and Metroid, so I'd prefer to have those games next generation. However, I also really want those third party titles. I want to play Tomb Raider et al, but I don't see myself being able to afford two systems. To not own a Nintendo system next generation would really bum me out, but to own one and not have many games might do so more :)

I'm just hoping that we start to hear some really good news soon wrt third party "core" games.

I think it's time to pop open a bottle of Jacks and ride it until launch day :)

Well the way Iwata made it sound we'll get better info about the future this fall. I don't know how much it will be, but hopefully it's something nice.

The best analogy he ever wrote on here was something along the lines of 'PS360 were 20 times more powerful than the Wii, PS4 / 720 will be 4 times more powerful at most'.

Can't say I said that one.

Of course GFLOPS don't get created, but I was making it easy for non-tech people to understand. So when they make these statements that something is running on a high-end PC system and can never run on a console, that is not true. If the leaked specs for the PS4/720 are accurate, they will be able to match any high-end single-GPU PC today. Not on paper, but it doesn't have to... it's a closed box.

I'm sorry, but that was a bad, bad analogy for non-tech people. Lostinblue said in more detail what I would have said. I don't want people being confused into thinking a console's closed environment allows a GPU to exceed its theoretical limit.

After all, we don't know how "API-heavy" those demos are while running on a GTX 680. I doubt it was to the point of cutting the GPU's performance in half or worse.
 

Ryoku

Member
So you're saying you can't do this with a PC? Lol okay...

Ahem.....

Do the same on a 3.7TFLOP GPU, and you'll get a significantly better-looking game. It's just that it's not cost-effective, as the consoles make the most moneys (how many people even have a 7970?).



It increases performance, as greater minds than mine say, by 2x. So yes, coding to the metal does increase performance. Of course, if you place a 3.7 TFLOP GPU in a closed box you can do the same.

Err..... How do I put this..... Oh yeah:

Do the same on a 3.7TFLOP GPU, and you'll get a significantly better-looking game. It's just that it's not cost-effective, as the consoles make the most moneys (how many people even have a 7970?).

Again, the majority of the performance optimization comes from said shortcuts, which you conveniently ignored. Even with the "coding to the metal", you're not getting performance that is twice the capability of the GPU itself. This is the fundamental flaw in your argument.
 
Lostinblue said in more detail what I would have said. I don't want people being confused into thinking a console's closed environment allows a GPU to exceed its theoretical limit.
Just now you've said it better than I could, and without technical jargon. I was having problems conveying the issue in so few words.
 

Terrell

Member
It has a BluRay player, it has a lower average selling price, it has free online multiplayer. As you note it largely has the same games as the 360, despite the advantage of these USPs - yet it cannot sell as well in the US market, where multiplayer first person shooters dominate.

Games sell consoles, dude. If multi-plat didn't exist this gen, we'd see a different market situation than what we see now. Playing up XBL as being the pillar that suspends the 360's advantage is a fallacy when it has more to do with all that Sony did wrong that Microsoft took early advantage of.

It's obviously not just some magical network service brand loyalty effect, as I already alluded to it's more than that - it's a combination of having built a critical mass in their network service and the major social aspects of this generation's games, like CoD, that you seem to want to dismiss entirely.

Again, since you omitted and ignored it:
You obviously cannot sell a person a console on the prospect of paying to play for P2P gaming.
You can sell a person a console on the prospect of playing CoD with their friends, when all their friends are on that platform's online service.

An equivalently priced or cheaper console, with the same games, providing free online multiplayer and a bonus Blu-ray player didn't cause a mass migration. I'm not sure what exactly you're predicting will happen in this generational transition that will topple XBL's userbase like a house of cards.

Yeah, it's "equivalently priced" now, but that's a RECENT phenomenon. As Nintendo has said before, first impressions mean a lot in this industry, and Sony absolutely ruined theirs. A mid-gen turnaround has NEVER happened and I never predicted one.

At the beginning of a new generation, the whole playing field resets. The advantages of the 360 being first, cheaper and with equal software support to its main competitor isn't a guarantee.

Your whole argument hinges on "all your friends" being on XBL, but the fact of the matter is that is entirely contingent on the console selling to these people and that's not guaranteed at this point. People thought it was a guarantee for Sony to carry over its user base from the PS2, remember what happened there? XBL only has value if the next Xbox console sells on ALL of its other possible virtues, and a selling point that is only valuable if people already own the console isn't a selling point at all.

As for your second point, I'm not sure if you're being deliberately obtuse. I thought it was clear I was referring to account carryover with associated purchases tied to said account.

This is a new form of the "backwards compatibility" argument, essentially. Unless they're going to immediately sell their 360s, all that content still works on their pre-existing hardware, just like PS2 owners who bought a 360 can still play their PS2 games on the PS2 they still own. Unless MS kills your ability to play this content on your pre-existing hardware if you stop paying the monthly XBL fee, which IS illegal, then... well, we've been through this before, and backwards compatibility isn't quite the selling feature people thought it would be, and neither is this.

Anyway, I don't want to keep clogging up this thread with somewhat unrelated discussion. There's a thread on the Gaming Discussion side if you want to continue the discussion. http://www.neogaf.com/forum/showthread.php?t=478816

After calling me "deliberately obtuse" and being "dismissive", you didn't really sell me on how classy that thread is. I'm fine where I am, thanks.
 

USC-fan

Banned
I'm sorry, but that was a bad, bad analogy for non-tech people. Lostinblue said in more detail what I would have said. I don't want people being confused into thinking a console's closed environment allows a GPU to exceed its theoretical limit.

After all, we don't know how "API-heavy" those demos are while running on a GTX 680. I doubt it was to the point of cutting the GPU's performance in half or worse.

http://twitter.com/ID_AA_Carmack/statuses/50277106856370176

"Consoles run 2x or so better than equal PC hardware, but it isn&#8217;t just APIin the way, focus a single spec also matters."

It doesn't get more simple than this. Exactly want I said....you get double or more compared to PC hardware.
 
http://twitter.com/ID_AA_Carmack/sta...77106856370176 "Consoles run 2x or so better than equal PC hardware, but it isn't just API in the way, focus a single spec also matters."

It doesn't get simpler than this. Exactly what I said... you get double or more compared to PC hardware.
Your link is malformed, and no, it's not the same as what you were saying; you were talking as if it were "free", as if a 1.86 TFLOP GPU would equal a 3.72 TFLOP one for most devs.

Anyway, I guess even Carmack can't evade the Twitter 140-character limit, but just because he can't write a thesis over the equivalent of an internet SMS (and he would) you shouldn't assume it's that simple, because it certainly isn't. We know, for one, that optimizing Rage/id Tech 5 for consoles was no easy feat (it took years and required them to learn the platforms' intricacies), and they only did it because they had to, because otherwise there wouldn't be enough juice. That surely increases "performance", painstakingly, and yet certainly not past the 2x mark; just like how the game on PC takes a whopping 25 GB but they had to make it fit on 3 DVDs without mandatory installation for X360. It's all down to optimization, and yet they're not surpassing their power; they're trying to use it fully without leaving any unused for checkbox extras (AA, extra resolution, various shadow detail options, etc.), which is the way the PC deals with leftover power: blow it on options that soak up the overhead. My point being that a 2x increase in efficiency certainly doesn't come for free, otherwise with further optimization you'd go even past it, several times over; he's including thousands of man-hours, a good budget and talent in the mix.

Carmack is an engineer before being a game designer, and a major geek at that; that's why we like him, he's hooked on mastering things and gets his fun out of doing so. But let's put it like this: pushing a platform costs money, and that 2x gain in efficiency he talks about costs lots of it. Next-gen devs are probably better off not investing in it if dev costs are going to increase again. See, in a perfect world, if they had the overhead they'd optimize nothing; that's also part of the reason why most of the time they don't on PC (along with having to run on different hardware).

See, any number you throw in will be very finicky and dependent on lots of variables.
 
Proper link.

https://twitter.com/ID_AA_Carmack/statuses/50277106856370176

http://twitter.com/ID_AA_Carmack/sta...77106856370176 "Consoles run 2x or so better than equal PC hardware, but it isn't just API in the way, focus a single spec also matters."

It doesn't get simpler than this. Exactly what I said... you get double or more compared to PC hardware.

But did you ever stop to think that the reason he says that is that the API, being able to focus on one spec (as he said in this tweet and LiB said in his previous post), and other things affecting optimization cut the power of the PC, rather than doubling the console's power? In other words, and just focusing on the GPU for the sake of discussion, if both the PC and the console have a 1.8 TFLOP GPU, it's not that the closed environment "makes" the console's GPU run at 3.6 TFLOPS. It's all the bloat and lack of optimization on the PC that "makes" its GPU run at 900 GFLOPS. And that's how a console with the same hardware as a PC can run "2x or so better", as Carmack put it.
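
A small sketch of that same reframing (the efficiency factors are assumptions chosen to mirror the "2x or so" figure, not measured values): the peak doesn't move, only the fraction of it you actually reach.

# Effective throughput = peak throughput x achieved efficiency.
# Efficiency values are assumptions picked to mirror the "2x or so" claim.
PEAK_TFLOPS = 1.8

def effective_tflops(peak, efficiency):
    return peak * efficiency

console = effective_tflops(PEAK_TFLOPS, 0.8)  # thin layer, single fixed spec
pc      = effective_tflops(PEAK_TFLOPS, 0.4)  # API/driver overhead, many configs
print(console, pc, round(console / pc, 1))    # 1.44 vs 0.72 -> ratio 2.0
# The console never exceeds its 1.8 TFLOP peak; the PC just leaves more of it unused.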
 

japtor

Member
Look, it's not so much hardware reasons that some publishers aren't putting top-tier efforts onto the system. It's demographics (i.e. pigeonholing the "Wii U audience" or more specifically the "Nintendo audience") and ROI assessments in those circumstances. Whether it's successful as Wii and blows away the sales of the competition or a Gamecube-style flop, it's going to be a "fourth/alternate system" for content in the eyes of most pubs.

The system has the hardware necessary to accept up ports and down ports, just the same way one of my laptops with an integrated graphics card can play most of today's games with shit quality in comparison to one of my gaming rigs. It's not ideal hardware, but it's not going to technically stop someone from putting content on it. It takes time and money and the willingness to invest it.
Yeah, the technical hurdle that was there with the Wii doesn't appear to be there with the Wii U (at least not remotely as close; I'm guessing there'll still be issues at the upper end of the "games that will never break even" spectrum). My hope is that early next gen, publishers will notice that 1) ports are technically and financially feasible, and 2) Wii U has a year's head start in user base, which could mean however many million units out there vs 0 up until the others are on sale.

As far as Wii U owners go, I think they'll buy uglier games just as PS2 owners did even when prettier versions came out on other consoles. Compare that to the Wii situation, where the gap was pretty clear since most of the time there was no port at all. Hell, if I could've gotten gameplay-equivalent ugly-ass ports for the Wii I probably never would have gotten a 360.
Well, there is all this "info" around suggesting that they were doing something like a < 2 GHz processor and possibly no SMT, which would be a huge step down from what was expected.
Those specs alone don't mean anything without knowing more details (compare Intel's modern low-clocked CPUs to their old high-clocked ones); the legitimate concern would be the reports about devs having issues with it... which supposedly is or isn't addressed now?
Your link is malformed.

Anyway, I guess even Carmack can't evade the Twitter 140-character limit; just because he can't write a thesis (and he would) for something on Twitter, you shouldn't assume it's that simple.
How about a car analogy? Everyone likes those!
even when they're not really applicable or accurate!

Sounds like PCs are powerful cars but have the burden of a lot of weight, while consoles are more like stripped down tuner cars (B-spec race cars might be a better comparison). Not necessarily powerful but quick and nimble cause there's less weight to deal with. Or something like raw power representing straight line speed vs low latency relating to cornering speed.
 

USC-fan

Banned
Proper link.

https://twitter.com/ID_AA_Carmack/statuses/50277106856370176



But did you ever stop to think that the reason he says that is that the API, being able to focus on one spec (as he said in this tweet and LiB said in his previous post), and other things affecting optimization cut the power of the PC, rather than doubling the console's power? In other words, and just focusing on the GPU for the sake of discussion, if both the PC and the console have a 1.8 TFLOP GPU, it's not that the closed environment "makes" the console's GPU run at 3.6 TFLOPS. It's all the bloat and lack of optimization on the PC that "makes" its GPU run at 900 GFLOPS. And that's how a console with the same hardware as a PC can run "2x or so better", as Carmack put it.

Same difference. What I said was that a console GPU is equal to a top-of-the-line single GPU today.
We are saying the same things...

Six of one, half a dozen of the other.
Your link is malformed, and no, it's not the same as what you were saying; you were talking as if it were "free", as if a 1.86 TFLOP GPU would equal a 3.72 TFLOP one for most devs.

Anyway, I guess even Carmack can't evade the Twitter 140-character limit, but just because he can't write a thesis (and he would) for something on Twitter you shouldn't assume it's that simple, because it certainly isn't. We know, for one, that optimizing Rage/id Tech 5 for consoles was no easy feat (it took years and required them to learn the platforms' intricacies), and they only did it because they had to, because otherwise there wouldn't be enough juice. That surely increases "performance", painstakingly, and yet certainly not past the 2x mark; just like how the game on PC takes a whopping 25 GB but they had to make it fit on 3 DVDs without mandatory installation for X360. It's all down to optimization, and yet they're not surpassing their power; they're trying to use it fully without leaving any unused for checkbox extras (AA, extra resolution, various shadow detail options, etc.). My point being that a 2x increase in efficiency certainly doesn't come for free, otherwise with further optimization you'd go even past it, several times over; he's including thousands of man-hours and a good budget (and talent) in the mix.

See, any number you throw in will be very finicky and dependent on lots of variables.
Funny you say this about Rage, because its lead platform was the consoles.

"We do not see the PC as the leading platform for games. That statement will enrage some people, but it is hard to characterize it otherwise; both console versions will have larger audiences than the PC version."

Going by Carmack's other stated problems with PC drivers for Rage, you have it backwards. The painstaking one was the PC version.
 

NBtoaster

Member
In the leaked SDK, there was a line:



What does this actually mean, how does it compare to current consoles, and what would be the difference between games at 720p vs 1080p?

PS3 supports textures up to 4096x4096, the 360 up to 8192x8192, and modern GPUs go higher.

Doubt you will see any conventional renderers using many 8kx8k textures, but it does matter for megatextures.
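
A rough illustration of why the maximum texture size matters for megatextures (the page size and virtual-texture size below are generic virtual-texturing assumptions, not any engine's actual values): the physical page cache is itself one big texture, so its capacity is capped by the hardware limit.

# Virtual texturing sketch: a huge virtual texture is split into small pages,
# and only the needed pages live in a physical cache texture bounded by the HW limit.
# Page size and virtual size are generic assumptions, not real engine values.
PAGE = 128                 # page side length in texels
VIRTUAL_SIDE = 128 * 1024  # a 128k x 128k virtual texture

def cache_capacity(max_texture_side, page=PAGE):
    per_side = max_texture_side // page
    return per_side * per_side  # pages resident at once

total_pages = (VIRTUAL_SIDE // PAGE) ** 2
print(total_pages)           # 1,048,576 pages in the whole virtual texture
print(cache_capacity(4096))  # 1,024 resident pages with a 4k x 4k cache
print(cache_capacity(8192))  # 4,096 resident pages with an 8k x 8k cache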
 
Same difference. What I said was that a console GPU is equal to a top-of-the-line single GPU today.
We are saying the same things...

Six of one, half a dozen of the other.

Nah, that's not the same difference in this case. We both know that devs need time to gain a proper understanding of a console's hardware. Keeping with the same scenario, devs aren't going to be maxing out PS4's 1.8 TFLOP GPU at launch. Carmack also said "or so"; normally when I see or hear people add that, they mean "less than", so in other words for Carmack that would suggest less than or equal to 2x. And on the PC side, we can't assume they will always have the same level of optimization issues, and in this case not to the point where a high-end PC's power is always cut in half, bringing it down to the level of the next consoles.
 

USC-fan

Banned
Nah, that's not the same difference in this case. We both know that devs need time to gain a proper understanding of a console's hardware. Keeping with the same scenario, devs aren't going to be maxing out PS4's 1.8 TFLOP GPU at launch. Carmack also said "or so"; normally when I see or hear people add that, they mean "less than", so in other words for Carmack that would suggest less than or equal to 2x. And on the PC side, we can't assume they will always have the same level of optimization issues, and in this case not to the point where a high-end PC's power is always cut in half, bringing it down to the level of the next consoles.

You're going to have to let me know where you got your secret decoder ring! J/k lol

Now you also have to remember that no PC game is made to take advantage of the card I posted. You just get more fps and higher res. On consoles we are really limited in res and fps, so they don't have to max out the GPU at launch to match PC games.
 
How about a car analogy? Everyone likes those!
even when they're not really applicable or accurate!

Sounds like PCs are powerful cars but have the burden of a lot of weight, while consoles are more like stripped down tuner cars (B-spec race cars might be a better comparison). Not necessarily powerful but quick and nimble cause there's less weight to deal with. Or something like raw power representing straight line speed vs low latency relating to cornering speed.
Heh, the good old car analogy; I'll bite.

If top-range PCs are the Lamborghini Aventador (700 hp) as a single card, or the Koenigsegg Agera R (1100 hp) in SLI, then consoles are the utilitarian GT cars. Now, Microsoft and Sony probably want to stay in the small-family-car segment with a Volkswagen Golf R (266 hp) kind of hatchback, or a Subaru Impreza WRX (265 hp) kind of car.

Nintendo seems more focused on being an AE86 (good efficiency without bottlenecks; unique behaviour in certain situations) with a reasonably powered engine (112 hp), or a Volkswagen Polo hatchback 1.2 or 1.4 TSI (105 or 180 hp), which are small-displacement forced-induction engines (low cc's, to fit in small engine bays, with a turbo).


Came out better than I imagined, but I'm still throwing mud at the wall and seeing if it sticks, I guess.
Funny you say this about Rage, because its lead platform was the consoles.

"We do not see the PC as the leading platform for games. That statement will enrage some people, but it is hard to characterize it otherwise; both console versions will have larger audiences than the PC version."

Going by Carmack's other stated problems with PC drivers for Rage, you have it backwards. The painstaking one was the PC version.
Of course it was; they would have had to spend years downporting it to PS3/X360 afterwards if that wasn't the case. It's a technical showcase, after all.

Everything is using consoles as lead platforms these days, it's simpler that way; it's the same phenomenon that makes some devs actually use the PS3 version as the lead platform instead of the X360. The PS3 is harder and reaches milestones later, meaning that if the X360 is the lead platform the PS3 will always be in a chasing position, and that used to mean a sour port. They don't want consoles to be in a chasing position; that's where the money is.

Downporting afterwards is more expensive than doing it alongside, and means more roadblocks. If you made a game for PC at this point, trying to push the boundaries of available GPUs with a real budget, you'd probably have a game that you couldn't easily port anywhere right now, not without demaking it (murdering it), that is.

As for the PC version, it went wrong due to drivers and probably some lack of optimization on their part. You have to see that id has limited resources, and the publisher's allocated budget for PC releases has surely dropped, with CoD being console-centric now (and its PC version losing features like servers because of it) and Crysis 2 not having been designed with PCs in mind either. They were probably running it on target hardware for too long and only tried to optimize it near the tail end of development, because the consoles (as the lead and more profitable platforms) were so much work too.

Shit happens.
 
You're going to have to let me know where you got your secret decoder ring! J/k lol

Now you also have to remember that no PC game is made to take advantage of the card I posted. You just get more fps and higher res. On consoles we are really limited in res and fps, so they don't have to max out the GPU at launch to match PC games.

Haha. As an assassin I can't tell you that. ;)

I agree with you. But at the same time, the raw power of PC GPUs is opening a larger and larger gap between them and the consoles. We could very well see both AMD and nVidia near or surpassing 4 TFLOPS with their next GPUs, which will be out around when the other two consoles launch and are in their first year. And we can only expect those PC GPUs to continue to push beyond that as time passes next gen.
 

Ryoku

Member
You're going to have to let me know where you got your secret decoder ring! J/k lol

Now you also have to remember that no PC game is made to take advantage of the card I posted. You just get more fps and higher res. On consoles we are really limited in res and fps, so they don't have to max out the GPU at launch to match PC games.

This is a matter of what the devs have made available in the settings. Not really representative of the total power of the hardware.

EDIT: Didn't read the entire post carefully. You said essentially what I said. Sorry about that. I'm going to grab a bite, holy shit.
 
Games sell consoles, dude. If multi-plat didn't exist this gen, we'd see a different market situation than what we see now. Playing up XBL as being the pillar that suspends the 360's advantage is a fallacy when it has more to do with all that Sony did wrong that Microsoft took early advantage of.

Yeah, it's "equivalently priced" now, but that's a RECENT phenomenon. As Nintendo has said before, first impressions mean a lot in this industry, and Sony absolutely ruined theirs. A mid-gen turnaround has NEVER happened and I never predicted one.

At the beginning of a new generation, the whole playing field resets. The advantages of the 360 being first, cheaper and with equal software support to its main competitor isn't a guarantee.

Your whole argument hinges on "all your friends" being on XBL, but the fact of the matter is that is entirely contingent on the console selling to these people and that's not guaranteed at this point. People thought it was a guarantee for Sony to carry over its user base from the PS2, remember what happened there? XBL only has value if the next Xbox console sells on ALL of its other possible virtues, and a selling point that is only valuable if people already own the console isn't a selling point at all.

This is a new form of the "backwards compatibility" argument, essentially. Unless they're going to immediately sell their 360s, all that content still works on their pre-existing hardware, just like PS2 owners who bought a 360 can still play their PS2 games on the PS2 they still own. Unless MS kills your ability to play this content on your pre-existing hardware if you stop paying the monthly XBL fee, which IS illegal, then... well, we've been through this before, and backwards compatibility isn't quite the selling feature people thought it would be, and neither is this.

After calling me "deliberately obtuse" and being "dismissive", you didn't really sell me on how classy that thread is. I'm fine where I am, thanks.
I didn't mean it as an insult, I actually wasn't sure if you were deliberately missing my point... and I'm not sure how you're not being dismissive of online service effects on the generational transition...

Anyway, I'm aware games sell consoles - ergo, the online services aren't a benefit in a vacuum. They're a benefit in the context of games that rely heavily on online multiplayer communities.

Yes, Sony fucked up royally at the start of this generation. I'm sure I've said that I don't think that the online network is the sole factor. But it's a compounding one.

2009 isn't particularly recent in terms of price parity, and as for mid-gen turnarounds, the 360 essentially supplanted the Wii around the launch of Kinect.

The 720 will not be first obviously, since the Wii U launches this year, cheaper is an unknown; but as for software support, it is essentially a given that the 720 will receive third party support. I'm not sure on what basis one would dispute that.

Early in a generation, backwards compatibility is somewhat important. But I'm not just referring to monetary investment - investment of time into building a persona around stupid avatars and arbitrary achievements - there's an aversion to losing that intangible asset.

Yes, the argument is based around all one's friends being on XBL - about a critical mass having been built. When the transition occurs, do all one's friends suddenly cease to be on XBL - and if so - why?

Presumably 20M XBL Gold members migrate en masse to the Miiverse to play CoD?

The expectations of a transition from PlayStation to PlayStation relied entirely on a brand effect, and less a social/community effect.

The world as it is now, is not the same as that of 2005 or 2000 or 1994 - where the complete lack of comprehensive online services didn't particularly hurt. In the past, the playing field has reset, but I think social aspects have changed the game.
 

AzaK

Member
Those specs alone don't mean anything without knowing more details (compare Intel's modern low-clocked CPUs to their old high-clocked ones); the legitimate concern would be the reports about devs having issues with it... which supposedly is or isn't addressed now?

I realise that GHz isn't necessarily relevant, but are there any sub-2 GHz tri-core CPUs without SMT that could outclass a 3.2 GHz tri-core Xenon with 2-way SMT? That's sort of what the rumours were pointing to.
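
One way to frame the question (a sketch with hypothetical IPC and SMT figures, since the real numbers for the rumoured chip aren't public): per-core throughput scales roughly with clock x instructions-per-clock, and SMT only recovers a fraction of a core.

# Rough CPU throughput comparison: cores x clock x IPC, with SMT as a fractional uplift.
# IPC and SMT factors are hypothetical, for illustration only.
def throughput(cores, ghz, ipc, smt_uplift=0.0):
    return cores * ghz * ipc * (1.0 + smt_uplift)

xenon = throughput(3, 3.2, 1.0, smt_uplift=0.3)  # in-order core, 2-way SMT assumed worth ~30%
other = throughput(3, 1.6, 2.5)                  # hypothetical out-of-order core, no SMT
print(round(xenon, 1), round(other, 1))          # 12.5 vs 12.0 -- roughly comparable
# i.e. a sub-2 GHz core could keep up if its per-clock throughput is high enough.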
 

ohlawd

Member
New guy here.

First off, I'd like to thank everyone who has made the old Speculation Threads fun to visit. There's no point in me mentioning specific people because I'm bound to forget some and I don't want anyone to feel left out.

I'm definitely looking forward to the Wii U's launch. Pikmin 3, Project P-100, ZombiU and Rayman Legends are the games I'm really looking forward to for the Wii U.

Lots of things can change in the upcoming months, but I hope we can all pull through until the very end.
 
Haha. We have a lord and an ohlawd posting here. Awesome!

I realise that GHz isn't necessarily relevant, but are there any sub-2 GHz tri-core CPUs without SMT that could outclass a 3.2 GHz tri-core Xenon with 2-way SMT? That's sort of what the rumours were pointing to.

Another thing worth considering is that the final CPU most likely wasn't completed till around or just before the GPU was completed. So most, if not all, assessments were based on pre-final kits.
 

Terrell

Member
2009 isn't particularly recent in terms of price parity, and as for mid-gen turnarounds, the 360 essentially supplanted the Wii around the launch of Kinect.

Still got a long way to go to "supplant" the market leader. Wii has considerably slowed, but 360's momentum isn't strong enough to carry it past the Wii before the Durango launch, either.

The 720 obviously will not be first, since the Wii U launches this year, and cheaper is an unknown; but as for software support, it is essentially a given that the 720 will receive third party support. I'm not sure on what basis one would dispute that.

Because I remember 2005 and 2006 when people dismissed Microsoft and were bullish about Sony for the same reasons, as all signs pointed to PS3 having a strong exclusive software library after E3. Anything prior to launch is an illusion that fades in the presence of hard sales numbers.

Early in a generation, backwards compatibility is somewhat important. But I'm not just referring to monetary investment - investment of time into building a persona around stupid avatars and arbitrary achievements - there's an aversion to losing that intangible asset.

Once again, XBL and the 360 don't break down at the Durango launch. You're not FORCED to buy the new shiny thing. If you're invested in it, you can just as easily stay invested in it on an older platform. It's actually one of the more worrying things that MS will need to deal with in the next generation: legacy support for XBL on the 360.

Yes, the argument is based around all one's friends being on XBL - about a critical mass having been built. When the transition occurs, do all one's friends suddenly cease to be on XBL - and if so - why?

An unforeseen shift in the purchasing habits of the market. Same reason as EVERY generation that preceded this one. People's wants change over time and MS might not offer what people want in the Durango. Simple as that.


Presumably 20M XBL Gold members migrate en masse to the Miiverse to play CoD?

No, 20 million XBL Gold members migrate to another platform because it may offer the same things MS might not be offering with Durango, as mentioned above.

The world as it is now is not the same as that of 2005 or 2000 or 1994, where the complete lack of comprehensive online services didn't particularly hurt. In the past, the playing field has reset, but I think social aspects have changed the game.

You're not buying Xbox Live. You're buying a console; XBL is a mere value add to said console. Yes, there is a social component, but you aren't buying the social component itself, as it has a transient benefit. And history shows that people are willing to sacrifice social aspects of their lifestyles for things they find more appealing... just look at the rise of the cell phone as proof of that. Ba-dum-tish.
 

Meelow

Banned
I went to that topic about Microsoft saying the Wii U is a 360 and I can't believe people are taking that seriously.
 

USC-fan

Banned
I went to that topic about Microsoft saying the Wii U is a 360 and I can't believe people are taking that seriously.

Well, is it that hard to see why? Nothing on the Wii U looks better than X360 games.

Nintendo is the problem; they just want to show launch games. They need to show a game above what the PS360 can do... just show some video or something, anything. Until then people are going to think it, and it might be close to the truth.
 

Meelow

Banned
Well, is it that hard to see why? Nothing on the Wii U looks better than X360 games.

Nintendo is the problem; they just want to show launch games. They need to show a game above what the PS360 can do... just show some video or something, anything. Until then people are going to think it, and it might be close to the truth.

Fall Conference 2012.
 
Well, is it that hard to see why? Nothing on the Wii U looks better than X360 games.

Nintendo is the problem; they just want to show launch games. They need to show a game above what the PS360 can do... just show some video or something, anything. Until then people are going to think it, and it might be close to the truth.

This is true. Some of us can talk all day about the Wii U being more powerful, but it's still on Nintendo to prove it.
 
I realise that GHz isn't necessarily relevant, but are there any sub-2 GHz tri-core CPUs without SMT that could outclass a 3.2 GHz tri-core Xenon with 2-way SMT? That's sort of what the rumours were looking at.
Don't focus so much on the CPU. The PS3's CPU sucked, and most multiplatform games only use one core on the X360 because of it; most console CPUs weren't that hot either, and it really doesn't matter all that much, especially now that you don't need the CPU to track vertex coordinates.

Why sub-2 GHz?


Anyway, SMT and Hyper-Threading on the P4 are pretty much the same thing, so they should be comparable. Intel claimed a "15-30% improvement" from it, probably closer to 20-25% in most real-world scenarios, as suggested here:

The result is an average 24.7% performance improvement over an aggressive baseline synchronous architecture with fixed cache designs for two-threaded workloads, and 19.2% for single-threaded applications.
Source: http://infoscience.epfl.ch/record/97167/files/hipeac07.pdf


SMT might be missing here, since ARM for one has had problems with it:

ARM (one of whose major selling points is the small die size of their solutions), state that it is considerably better to double your silicon area and stick two cores on, than it is to go for a more complex single core with SMT support, their reasoning being that a well-designed multi-core system, while bigger, will actually use less power. They claim up to 46% savings in energy over an SMT solution with four threads.

Also, moving an application to two threads on a single SMT-enabled core will increase cache-thrashing by 42%, whereas it will decrease by 37% when moving to two cores.
Source: http://www.theinquirer.net/inquirer/news/1037948/arm-fan-hyperthreading

ARM is focused on embedded solutions, though. At the same time it's strange, because Power7 seems to be built around SMT.


As for how many MHz it would need to equal Xenon, no idea; if SMT is absent it's probably no big deal, since that's at most an average ~20% hit on an updated design with the out-of-order execution that Xenon lacks. I doubt they're going under 3 GHz, though. (I actually did the math on MIPS: Xenon seems to do about 2 MIPS per MHz, while Power7 is at about 2.15, meaning it could equal and slightly surpass Xenon at 3 GHz going by that alone; a rough version of that arithmetic is sketched below.)
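
To make that back-of-the-envelope comparison concrete, here's a minimal Python sketch. The 2.0 and 2.15 MIPS/MHz figures and the ~20% SMT uplift are just the rough numbers quoted in this thread, not measured values, and per-core MIPS is a very crude proxy for real game performance:

# Rough per-core throughput comparison using the numbers quoted above.
# Assumptions (not measurements): Xenon ~2.0 MIPS/MHz, a Power7-class core
# ~2.15 MIPS/MHz, and 2-way SMT worth roughly +20% on average.

XENON_MHZ = 3200
XENON_MIPS_PER_MHZ = 2.0
SMT_UPLIFT = 0.20                  # rough midpoint of Intel's claimed 15-30%

CANDIDATE_MHZ = 3000               # hypothetical 3 GHz Power7-derived core, no SMT
CANDIDATE_MIPS_PER_MHZ = 2.15

xenon_raw = XENON_MHZ * XENON_MIPS_PER_MHZ              # 6400 MIPS
xenon_with_smt = xenon_raw * (1 + SMT_UPLIFT)           # ~7680 MIPS with SMT credited
candidate_raw = CANDIDATE_MHZ * CANDIDATE_MIPS_PER_MHZ  # 6450 MIPS

print(f"Xenon core (no SMT credit): {xenon_raw:.0f} MIPS")
print(f"Xenon core (+20% for SMT):  {xenon_with_smt:.0f} MIPS")
print(f"3 GHz candidate core:       {candidate_raw:.0f} MIPS")

Going by raw MIPS alone the 3 GHz core edges out Xenon (6450 vs 6400), but crediting Xenon's SMT with ~20% flips that, which is why out-of-order execution and other per-clock improvements matter more than this crude metric can show.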
 

Drago

Member
Well, is it that hard to see why? Nothing on the Wii U looks better than X360 games.

Nintendo is the problem; they just want to show launch games. They need to show a game above what the PS360 can do... just show some video or something, anything. Until then people are going to think it, and it might be close to the truth.

100% agreed

Hopefully the Fall Conference or an August Nintendo Direct will shed some light on what's in store. Hope it looks good too :p
 

Roo

Member
So, I was trying to catch up a little bit and I found a post with a video where there was slight lag while playing NSMBU.

This one:
http://www.youtube.com/watch?v=mM_jlMy3F3Y

Now, they slowed the shit out of that video down so you can see it, but is it noticeable in real life? Like... "oh shit, there's lag on this thing"?
I'm a little worried because if those units had lag while tethered, I can see it being worse without wires.

Maybe it's because they're still prototype/non-final units, but still, there's something fishy here.
By the way, as I claimed before, I'm not a tech guy, so this is for those who know about latency/lag/response/wireless/Bluetooth: can it get worse over time? Like... worn-out receiver and transmitter components making the lag more obvious, or do I have nothing to worry about? :D
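
Not an answer to the wear-and-tear question, but for anyone trying to judge lag from slowed-down footage, here's a minimal Python sketch of the usual frame-counting arithmetic. The 60 Hz refresh rate and the example frame delays are illustrative assumptions, not measurements of the video above:

# Convert a lag measured in display frames into milliseconds.
# Assumption: the capture/display runs at 60 Hz, so one frame is ~16.7 ms.

REFRESH_HZ = 60.0
MS_PER_FRAME = 1000.0 / REFRESH_HZ

def frames_to_ms(frames: float) -> float:
    """Return the latency in milliseconds for a delay of `frames` frames."""
    return frames * MS_PER_FRAME

# Hypothetical example: the GamePad image trails the TV by 1-4 frames.
for delay in (1, 2, 3, 4):
    print(f"{delay} frame(s) behind is roughly {frames_to_ms(delay):.1f} ms")

At 60 Hz even a few frames of delay is only tens of milliseconds, which is why it tends to jump out in frame-by-frame footage long before most people would notice it while actually playing.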
 
Still got a long way to go to "supplant" the market leader. Wii has considerably slowed, but 360's momentum isn't strong enough to carry it past the Wii before the Durango launch, either.
I wasn't referring to cumulative installed base...
Once again, XBL and the 360 don't break down at the Durango launch. You're not FORCED to buy the new shiny thing. If you're invested in it, you can just as easily stay invested in it on an older platform. It's actually one of the more worrying things that MS will need to deal with in the next generation: legacy support for XBL on the 360.
I thought we were discussing the effect on generational transition i.e. in the event a consumer intends to buy a new console, what effect will online services/accounts have.

Obviously no one's forced to buy a new shiny anything - be it a Wii U, PS4, 720 or PC.

Are you implying that XBL is actually detrimental to a generational transition - even under the assumption that the account is entirely transferable to the next generation?
An unforeseen shift in the purchasing habits of the market. Same reason as EVERY generation that preceded this one. People's wants change over time and MS might not offer what people want in the Durango. Simple as that.

No, 20 million XBL Gold members migrate to another platform because it may offer the same things MS might not be offering with Durango, as mentioned above.
Sorry, but this doesn't really answer anything. Essentially you're saying that, assuming some sort of monumental fuck-up on Microsoft's part, people will discard their online personas and friends lists and build anew on a different platform.

Unspecified unforeseen circumstances could make Microsoft's console the best- or worst-selling console of the 8th gen. Unforeseen circumstances could make consoles as a whole irrelevant.

It's really no basis for discussion, because I agree entirely, if Microsoft screws up, if they lose touch with consumers, then they'll lose marketshare. If any of the three screw up then they lose marketshare.

That doesn't make XBL irrelevant to the generational transition.
You're not buying Xbox Live. You're buying a console; XBL is a mere value add to said console. Yes, there is a social component, but you aren't buying the social component itself, as it has a transient benefit. And history shows that people are willing to sacrifice social aspects of their lifestyles for things they find more appealing... just look at the rise of the cell phone as proof of that. Ba-dum-tish.
But that's the point: although we may differ on how much of a selling point it is, it's a selling point that will affect consumer choices.

It's December 2014. Jimmy has an Xbox 360 and plays CoD etc. with his friends on XBL. He has his 100,000 or whatever gamerscore. He's spent over $100 on XBLA. He can carry all of this over to the 720.
He's going to buy a new console.
There's a $400 PS4 or X720 and a $300 Wii U. All else being relatively equal - third party support etc. - he doesn't get the 720 because...?

I'm not sure what you mean by the last part. If anything, smartphones are a prime example of service vendor lock-in. An iOS owner is more likely to purchase iOS products in the future because of the carryover of apps; likewise for Android.
 
Man, sorry. It's been quite a few days since I've been here (temporarily retired my Crow T Robot avatar in honor of the late Andy Griffith). Anything notable to report? Did Shin johnpv report back with his Wii U experience impressions? Did Rosti ever get his answers back from his NOA contact?
 

Ryoku

Member
Man, sorry. It's been quite a few days since I've been here (temporarily retired my Crow T Robot avatar in honor of the late Andy Griffith). Anything notable to report? Did Shin johnpv report back with his Wii U experience impressions? Did Rosti ever get his answers back from his NOA contact?

I don't think there's been anything substantially new. I was out of power from Friday until last night; a storm knocked out power to most of the DC metropolitan area on Friday night :/
I know people who still haven't gotten power back, and the utility companies are saying the last of them probably won't get it until next Sunday D:
 