
Timothy Lottes: "a 2011 GPU (6970) seems like a possible proxy for nextgen consoles"

Timothy Lottes is the creator of the popular FXAA anti-aliasing algorithm.

http://timothylottes.blogspot.com/2012/01/to-extremes-and-back-to-reality.html

The entertaining NeoGAF thread, Developers Discuss Benefits Of Targeting 720p/30fps Next Gen, Using Film Aesthetics, captures a lot of the concerns of vocal PC and console gamers responding to comments in the Games vs Film post.

Ok, Now Back to Reality

My prior comment, "IMO a more interesting next-generation metric is can an engine on a ultra-highend PC rendering at 720p look as real as a DVD quality movie?" is a rhetorical question asking if it is possible for a real-time engine to start to approach the lower bound of a DVD movie in realism.

To make this clear, I'm not suggesting that games should compromise interactive experience just to get visual quality. If I was going to develop a title for next generation consoles I would output 1080p and run frame locked to 60Hz with no dropped frames period. I still believe developers will be able to start to reach the quality of film for next generation consoles and current generation PCs, and I'm intending to develop or prove out some of the underlying techniques and technology which gets us there.

I've split his post in two, because each part addresses a different topic. The above was about the Games vs Film discussion, and below is his analysis of what to expect from the next-gen console GPUs:

At the same time, expectations for next generation consoles should certainly be grounded in some rough, realistic estimates of performance. Using public information found on the internet, let's nail down a realistic estimate of what next generation console performance will be, by looking at how ATI/AMD has evolved GPU performance after the Xbox 360:

(1.) Next gen console games will be outputting at 1080p. I can say this with full confidence simply because HDTV typically adds a frame of latency when it needs to convert from 720p to 1080p.

(2.) Using the HD6970 as a proxy for a high end PC version of the Xbox 360, let's compare specs. Going from the typical 720p @ 30Hz on Xbox 360 to 1080p @ 60Hz on the HD6970 with 2x the geometry would take roughly 4x the performance (2x the pixels and geometry times 2x the frame rate) just to provide a similar experience at the higher resolution and frame rate with similar average pixels/triangle.

The HD6970 has roughly another 2x over the 4x required to maintain the same look at the full HD experience:


Xbox360 = 240 Gflops : 22.4 GB/s : 8 Gtex/s : 4 Gpix/s
HD6970 = 2703 Gflops : 173 GB/s : 84.5 Gtex/s : 28.2 Gpix/s
-------------------------------------------------------------
roughly 11x Gflops : 7x GB/s : 10x Gtex/s : 7x Gpix/s
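The arithmetic above can be sketched quickly (a rough check; all figures are the ones quoted in the post, and the 4x "needed" factor is the post's 2x-pixels-and-geometry times 2x-frame-rate estimate):

```python
# Spec ratios from the comparison above (all figures as quoted in the post).
xbox360 = {"Gflops": 240.0, "GB/s": 22.4, "Gtex/s": 8.0, "Gpix/s": 4.0}
hd6970  = {"Gflops": 2703.0, "GB/s": 173.0, "Gtex/s": 84.5, "Gpix/s": 28.2}

ratios = {k: hd6970[k] / xbox360[k] for k in xbox360}
for k, r in ratios.items():
    # close to the "roughly 11x : 7x : 10x : 7x" line after rounding
    print(f"{k}: {r:.1f}x")

# ~4x is needed just to match the 360 look at 1080p @ 60Hz
# (2x pixels-and-geometry * 2x frame rate), leaving roughly 2x headroom.
needed = 4.0
headroom = ratios["Gflops"] / needed
print(f"headroom beyond the {needed:.0f}x target: ~{headroom:.1f}x")
```

Note the exact quotients land slightly above the post's rounded figures (e.g. bandwidth comes out near 7.7x rather than a flat 7x); the ~2x headroom conclusion is unaffected.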


(3.) What about process scaling? To get an idea of what future technology might offer, let's compare the HD6970 to the HD7970. It looks like AMD managed around a 1.4x on-paper spec increase, except they did not scale Gpix/s.


HD6970 = 2703 Gflops : 173 GB/s : 84.5 Gtex/s : 28.2 Gpix/s : 250 Watt
HD7970 = 3789 Gflops : 264 GB/s : 118.4 Gtex/s : 29.6 Gpix/s : 250 Watt
-------------------------------------------------------------------------
roughly 1.4x Gflops : 1.5x GB/s : 1.4x Gtex/s : 1x Gpix/s


(4.) What about power scaling? The latest shrink of the Xbox 360 hardware uses a 115 Watt power supply (for the entire system, not just the GPU). So let's assume that next generation consoles won't have huge power supplies like PC GPUs. Taking what I'm wild-guessing to be a really liberal estimate for possible GPU power in a 115 Watt system, let's compare a medium-power modern proxy for the Xbox 360, the HD6750 (which has an 86 Watt TDP on paper). These numbers suggest that if Microsoft had launched an Xbox update around last year, it would not have been able to do 1080p at 60 Hz with the same look as current 360 games (because the HD6750 isn't 4x the 360).


Xbox360 = 240 Gflops : 22.4 GB/s : 8 Gtex/s : 4 Gpix/s
HD6750 = 1008 Gflops : 73.6 GB/s : 25.2 Gtex/s : 11.2 Gpix/s
-------------------------------------------------------------
roughly 4.2x Gflops : 3.3x GB/s : 3.2x Gtex/s : 2.8x Gpix/s
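The same check makes the medium-power point concrete (a sketch; figures as quoted in the post):

```python
# Does an HD6750-class GPU clear the ~4x bar the post derives for
# 1080p @ 60Hz with the current-360 look? (Figures as quoted above.)
xbox360 = {"Gflops": 240.0, "GB/s": 22.4, "Gtex/s": 8.0, "Gpix/s": 4.0}
hd6750  = {"Gflops": 1008.0, "GB/s": 73.6, "Gtex/s": 25.2, "Gpix/s": 11.2}

ratios = {k: hd6750[k] / xbox360[k] for k in xbox360}
short = {k: r for k, r in ratios.items() if r < 4.0}  # specs below the bar
print("below the 4x bar:", {k: round(r, 1) for k, r in short.items()})
```

Only raw Gflops reach the 4.2x mark; bandwidth, texture rate, and fill rate sit around 3x, which is exactly why the post concludes an HD6750-class part can't deliver 1080p60 with the 360 look.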


(5.) Next generation console performance will be a function of how much power the machine uses and what process technology each vendor adopts. The launch date of the console is going to hint at what process is used. Process scaling is not constant, but for the sake of making this simple, let's just assume each process gets 1.4x the performance. Then let's look at estimated performance scaling from the HD6750 to keep closer to current "console" power levels. This will provide a very rough estimate of what future consoles might have. Let's estimate process technology road maps by looking at Google image search results:


2011 : 40nm : HD6750 : 4.2x Gflops : 3.3x GB/s : 3.2x Gtex/s : 2.8x Gpix/s
2012 : 28nm : ?????? : 5.8x Gflops : 4.6x GB/s : 4.4x Gtex/s : 3.9x Gpix/s
2013.5 : 20nm : ?????? : 8.2x Gflops : 6.4x GB/s : 6.2x Gtex/s : 5.5x Gpix/s
2015 : 14nm : ?????? : 11.5x Gflops : 9.0x GB/s : 8.6x Gtex/s : 7.7x Gpix/s
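The projection table can be reproduced by compounding the post's assumed 1.4x-per-node gain onto the 40nm (HD6750) baseline ratios. A sketch, with small rounding differences from the quoted figures (e.g. 4.2 × 1.4 prints as 5.9x where the post shows 5.8x):

```python
# Compound a flat 1.4x-per-process-node gain onto the HD6750 baseline
# ratios from step (4). Node labels follow the post's rough roadmap guess.
baseline = {"Gflops": 4.2, "GB/s": 3.3, "Gtex/s": 3.2, "Gpix/s": 2.8}
roadmap = [("2011", "40nm"), ("2012", "28nm"),
           ("2013.5", "20nm"), ("2015", "14nm")]

table = {}
for step, (year, node) in enumerate(roadmap):
    scale = 1.4 ** step  # one 1.4x gain per node shrink
    table[node] = {k: v * scale for k, v in baseline.items()}
    row = " : ".join(f"{r:.1f}x {k}" for k, r in table[node].items())
    print(f"{year} : {node} : {row}")
```

The flat 1.4x-per-node assumption is the post's own simplification; real node-to-node gains varied, as the HD6970-to-HD7970 comparison in step (3) already hints.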


(6.) EDIT: Given an estimate that process technology will continue to advance during the lifetime of a console, a vendor could adopt higher power for launch, with the expectation of reducing this later in the product cycle. For example, the first 360 had a 203 Watt power supply at launch and is now at 115 Watts (according to Wikipedia). This leaves a big window of possibility for performance.

Given the window of possible launch dates and power targets it would be hard to know exactly what will end up in next generation consoles; however, a 2011 high-end single-GPU card seems like a possible proxy for a next generation console, and at least a good start to understanding what could be possible.

If I'm reading this right, he estimates a ~2013.5 console GPU will be required to match the performance of a 2011 high-end PC GPU such as the HD6970, due to the requirement to keep power consumption on consoles low.
And if I'm wrong don't shoot me.
 

SkylineRKR

Member
That's what I'm expecting. The Wii U has tech from 2008 or so; if the PS4 and next Xbox release by 2013 or '14, they'll likely have the high end tech from today.

Almost three year old tech would make it possible for them to release at a decent price point without taking a loss right away. Such a thing would be impossible if they released with high end specs yet again.

It might perhaps be more feasible to go with a slightly lower spec via SLI though.
 
I would output 1080p and run frame locked to 60Hz with no dropped frames period

Actually, with good post-processing effects I really don't need 60fps. Also I think that even with next generation, most games will be locked at 30fps (1080p).



(6.) EDIT: Given an estimate that process technology will continue to advance during the lifetime of a console, a vendor could adopt higher power for launch, with the expectation of reducing this later in the product cycle. For example, the first 360 had a 203 Watt power supply at launch and is now at 115 Watts (according to Wikipedia). This leaves a big window of possibility for performance.

This is almost a given... (imho) there's no need to keep power draw (for the entire system) below 200 Watts at day one. (Or even more; I really don't worry about it. It's more important to keep an efficient cooling system than low power consumption.)

There will be plenty of time to reduce power consumption over a 4-7 year life cycle.




Other discussion about graphics power is way too early; we don't even know if AMD will be the next Xbox's graphics supplier.
 
Well, Duh.

Just like it happened with PS3/360.

Edit: about the power usage, wasn't the PS3 something like 240 Watts at launch? I agree with Hiro_Kunimi_80.
 

benny_a

extra source of jiggaflops
Interesting to see that post. In the other thread about "generational leap" the power usage was a very important point that was brought up that set my personal expectations back a bit.

I agree with the above point that a launch unit doesn't have to be below 200 Watts on peak.
 

subversus

I've done nothing with my life except eat and fap
Ok, so I was wise not to upgrade my 6950. When the next-gen hits I'll just upgrade to a top of the line card and relax for the next few years. Well, maybe one more upgrade will be required as developers start to squeeze out more power by the end of the gen.
 

McHuj

Member
Yeah, I'm expecting something in the level of performance of 77xx-78xx. Which would fall right in line with his 2012 expectation.

In late 2013, I very highly doubt we'll have 20nm GPUs so I think that's out.
 

KKRT00

Member
This has been my prediction for a long time. At the least we will get something with performance similar to a 6870/560. It can be higher, but it won't be lower.

Real-time Samaritan-like games welcome too :)
 

DGRE

Banned
That's what I'm expecting. The Wii U has tech from 2008 or so; if the PS4 and next Xbox release by 2013 or '14, they'll likely have the high end tech from today.

Almost three year old tech would make it possible for them to release at a decent price point without taking a loss right away. Such a thing would be impossible if they released with high end specs yet again.

It might perhaps be more feasible to go with a slightly lower spec via SLI though.

Is that a known fact?
 
My wild, uneducated guess: If the next round of consoles get the equivalent of a 6970/560Ti and a nice custom processor, once devs master optimization for these new platforms they will be putting out games that surpass the visuals of current maxed out PC games of today.
 
My wild, uneducated guess: If the next round of consoles get the equivalent of a 6970/560Ti and a nice custom processor, once devs master optimization for these new platforms they will be putting out games that surpass the visuals of current maxed out PC games of today.

Well, he's not saying otherwise; notice he says "I still believe developers will be able to start to reach the quality of film for next generation consoles and current generation PCs, and I'm intending to develop or prove out some of the underlying techniques and technology which gets us there.", meaning even current high end PC GPUs, and next gen console GPUs, can be mastered to output better visuals than we've currently seen.
 
Just asking a question. With all the talk of the next gen using these cards. Why are we not seeing developers take what we have right now and make something that is utilizing this tech?
 
Battlefield 3 (PC) does to an extent, and more games will follow...

I wish they would hurry the heck up. I can run BF3 maxed on my 5870. It hardly seems like we are getting anything even this year. Metro 2033 ran maxed on it as well. I really would like to see more games that make it worth building a custom PC.
 
I wish they would hurry the heck up. I can run BF3 maxed on my 5870. It hardly seems like we are getting anything even this year. Metro 2033 ran maxed on it as well. I really would like to see more games that make it worth building a custom PC.

Metro 2033 was a mid-2010 title, too few people had DX11 GPUs at the time for Metro 2033 to be designed around them. But Metro: Last Light will be released 2.5 years later, so I have high hopes for that title.

According to the Chief Technology Officer at 4A Games, "regarding the effects, around 50% of all improvements will be visible in DX9 mode, +30% in DX10 and another +20% in DX11.", so half (30+20=50%) of the graphical improvements they made will be exclusive to the PC version.
 
I'd be perfectly fine with that. The 6970 is still a beast and a gazillion times more powerful than the GPU in the PS360. Just imagine what Naughty Dog, Epic and so on could squeeze out of a beast like that.
 

Wazzim

Banned
My wild, uneducated guess: If the next round of consoles get the equivalent of a 6970/560Ti and a nice custom processor, once devs master optimization for these new platforms they will be putting out games that surpass the visuals of current maxed out PC games of today.

That's a safe assumption for sure. We'll see what kind of card they'll use on E3 (hopefully).
 
Pretty bland, obvious article. The only possibly interesting thing is it makes me wonder if he KNOWS something like a 6970 is in the next Xbox, since he references "publicly available speculation" and so forth, almost too much. Kind of a protesting-too-much type deal.

But yeah, other than that it makes sense, but I think they could shoot even quite a bit higher if we're looking at 2013-14. For instance, right now they could probably cull something very nice from the 7000 series from AMD if they were launching in 2012, and yet something even better in 2013, and better in 2014.

But anyway, a 6970 equivalent would be veddy nice indeed, and passes the smell test.

However a gross error he made was assuming 1080P @60. I think it's likely 1080P will be targeted, but as always 30 FPS will be the norm.
 

McHuj

Member
I think what he left out was the efficiency of the design. When he compares a design like the HD6750 to the Xbox GPU, yeah, the theoretical flops are only a factor of 4.2 higher, but I would say that the HD6750 has more flops available to the programmer thanks to a more efficient architecture.

Even if the next Xbox and PS4 have GPUs based on a 77xx series, I'll guess they'll be more efficient and optimized for the closed system design. It won't be just an off-the-shelf GPU.
 

itsgreen

Member
Pretty bland, obvious article. The only possibly interesting thing is it makes me wonder if he KNOWS something like a 6970 is in next xbox. Since he references "from publically available speculation" and so forth, almost too much. Kind of a protesting too much type deal.

I actually like the guess. But the chip that will power the next console will be most likely a generation in front of current hardware... It will largely resemble the 8000 series if it launches nov 2012. 9000 series even if they launch november 2013.

(if they do things the same as this generation).

But yeah, other than that it makes sense, but I think they could shoot even quite a bit higher if we're looking at 2013-14. For instance, right now they could probably cull something very nice from the 7000 series from AMD if they were launching in 2012, and yet something even better in 2013, and better in 2014.

But anyway, a 6970 equivalent would be veddy nice indeed, and passes the smell test.

However a gross error he made was assuming 1080P @60. I think it's likely 1080P will be targeted, but as always 30 FPS will be the norm.

If MS would go for the traditional route I can imagine it's even more powerful (in the long term).

Totally with you that 1080p 30 is the norm for next gen. That way you can also easily do side by side 3d at 1080p (or 720p-ish effectively per eye).

I still think though that MS could go either way though... just throttling a bit of the normal next gen curve. So it would only be as effective as 4x 360... (instead of the usual 8x per generation)
 
So there was a thread the other day that implied Battlefield 3 PC @ max settings is what we are to expect from the next generation. After reading this I am to assume that was correct?
 

SkylineRKR

Member
That would pass as good launch material. Plus, a console can usually do more than a PC with the exact same specs thanks to it being closed and more efficiently used.

To me the graphics aren't even the biggest concern, I'm more interested in getting the full 64 players into a next-gen console BF.
 
Think more 6950-6870 than 6970 imo. The high TDPs this generation caused all sorts of issues and betting your console launch on the foundries hitting their production targets is a risky business.
 
Totally with you that 1080p 30 is the norm for next gen.

ugh.

as a fan of fighting games and racing games I hope they at least aim for 60fps. That way the IQ of fighting games won't be compromised to everything else even though they might be "good" by current standards/ always room for improvement.
 

SkylineRKR

Member
ugh.

as a fan of fighting games and racing games I hope they at least aim for 60fps. That way the IQ of fighting games won't be compromised to everything else even though they might be "good" by current standards/ always room for improvement.

Tech and graphics are ever evolving. In the end, the most complex games and PC ports during the course of 2016 or so will probably be sub-HD and without AA all over again, just to run on the ageing tech of the PS4 and Xbox 3.

There is no 'standard'.
 
ugh.

as a fan of fighting games and racing games I hope they at least aim for 60fps. That way the IQ of fighting games won't be compromised to everything else even though they might be "good" by current standards/ always room for improvement.

I can see 30 fps happening for most games, but it is way more important for racers and fighting games to have 60 fps, and I expect that that will be the case again (although there were some exceptions this generation).
 
So there was a thread the other day that implied Battlefield 3 PC @ max settings is what we are to expect from the next generation. After reading this I am to assume that was correct?

From early next-generation, likely yes. But software developers and artists get better as time goes by, so they'll surpass it.

But you know how humans already look a lot like humans etc, so you can't expect a PS1->PS2 or a PS2->PS3 leap.
 
Yes, it's based on the R740 GPU

Just pointing this out here. We don't know how far this GPU has been customised. Chances are, it's been changed beyond recognition, rendering it capable of utilising modern shaders and techniques.

And this was the first dev kit issued that used this tech. Nintendo may well use a more modern component in the final unit, substituting it for a lower end GPU for development purposes early on.
 

Arucardo

Member
I wish they would hurry the heck up. I can run BF3 maxed on my 5870. It hardly seems like we are getting anything even this year. Metro 2033 ran maxed on it as well. I really would like to see more games that make it worth building a custom PC.
Sure you can, at below 30 fps most of the time (even at 720p you'd be looking at 40fps and less) and dropping into single digits ;)
 

DCKing

Member
Yes, it's based on the R740 GPU
This is not true. The devkit is/was based on the RV770LE, a faulty version of the RV770Pro/XT that is indeed roughly equivalent to the RV740. The fact that they're using a faulty chip, as well as 2008 tech in the devkit is an indication that the chip is (or was, our latest update on this is from some time ago) there as a placeholder chip for a more modern chip. It would make absolutely no sense for Nintendo to pick a 2008 design, and most definitely not a RV770LE. My hypothesis is that this is because of Nintendo trying to get their chipmaker (NEC) to get the chip done on 32nm or maybe 28nm.

A chip equivalent to the Radeon HD6970 - one that has 1536 shader processors and a comparable amount of other stuff - is probably the best possible for the PS4 and Xbox Next GPUs. It's probably the best that can be fitted on a console-sized and console-powered GPU as long as the manufacturers are bound to use 28nm for it (which is probably until 2014). Based on the RV770LE figure, Nintendo is likely to go for around 1/3 of their competitor's capacity (equivalent of 512 SPUs). This means the Wii U is ~2-3x more powerful than a 360 and the Xbox Next ~2-3x more powerful than the Wii U, if we're talking raw GPU performance.

One thing to note is that if this graphics guru at EA is so freely speculating about the next consoles' specifications, it indicates that he hasn't been working with a devkit yet. If a graphics guru at EA isn't working on new hardware, what does that tell us?
 
From early next-generation, likely yes. But software developers and artists get better as time goes by, so they'll surpass it.

But you know how humans already look a lot like humans etc, so you can't expect a PS1->PS2 or a PS2->PS3 leap.

Hope we get more polygons in level design; some games look really poly-starved in that area.
 

CSX

Member
The use of words such as "film" and "cinematic" scares me. I can't help but think that some developers would see that as "24fps it is".
 

disap.ed

Member
This is not true. The devkit is/was based on the RV770LE, a faulty version of the RV770Pro/XT that is indeed roughly equivalent to the RV740. The fact that they're using a faulty chip, as well as 2008 tech in the devkit is an indication that the chip is (or was, our latest update on this is from some time ago) there as a placeholder chip for a more modern chip. It would make absolutely no sense for Nintendo to pick a 2008 design, and most definitely not a RV770LE. My hypothesis is that this is because of Nintendo trying to get their chipmaker (NEC) to get the chip done on 32nm or maybe 28nm.

A chip equivalent to the Radeon HD6970 - one that has 1536 shader processors and a comparable amount of other stuff - is probably the best possible for the PS4 and Xbox Next GPUs. It's probably the best that can be fitted on a console-sized and console-powered GPU as long as the manufacturers are bound to use 28nm for it (which is probably until 2014). Based on the RV770LE figure, Nintendo is likely to go for around 1/3 of their competitor's capacity (equivalent of 512 SPUs). This means the Wii U is ~2-3x more powerful than a 360 and the Xbox Next ~2-3x more powerful than the Wii U, if we're talking raw GPU performance.

One thing to note is that if this graphics guru at EA is so freely speculating about the next consoles' specifications, it indicates that he hasn't been working with a devkit yet. If a graphics guru at EA isn't working on new hardware, what does that tell us?

Very good and informative post.
 

McHuj

Member
One thing to note is that if this graphics guru at EA is so freely speculating about the next consoles' specifications, it indicates that he hasn't been working with a devkit yet. If a graphics guru at EA isn't working on new hardware, what does that tell us?

I must have missed something. Who's the EA guy? Lottes is an Nvidia guy, and the rumors point to ATI parts in the next consoles.
 

longdi

Banned
This is not true. The devkit is/was based on the RV770LE, a faulty version of the RV770Pro/XT that is indeed roughly equivalent to the RV740. The fact that they're using a faulty chip, as well as 2008 tech in the devkit is an indication that the chip is (or was, our latest update on this is from some time ago) there as a placeholder chip for a more modern chip. It would make absolutely no sense for Nintendo to pick a 2008 design, and most definitely not a RV770LE. My hypothesis is that this is because of Nintendo trying to get their chipmaker (NEC) to get the chip done on 32nm or maybe 28nm.

A chip equivalent to the Radeon HD6970 - one that has 1536 shader processors and a comparable amount of other stuff - is probably the best possible for the PS4 and Xbox Next GPUs. It's probably the best that can be fitted on a console-sized and console-powered GPU as long as the manufacturers are bound to use 28nm for it (which is probably until 2014). Based on the RV770LE figure, Nintendo is likely to go for around 1/3 of their competitor's capacity (equivalent of 512 SPUs). This means the Wii U is ~2-3x more powerful than a 360 and the Xbox Next ~2-3x more powerful than the Wii U, if we're talking raw GPU performance.

One thing to note is that if this graphics guru at EA is so freely speculating about the next consoles' specifications, it indicates that he hasn't been working with a devkit yet. If a graphics guru at EA isn't working on new hardware, what does that tell us?

I think Nintendo expects the Wii U's power to be around the RV770LE; more customized, yes, but the power levels will not exceed it by too far. It makes no sense to seed developers with broken devkits if they are rushing out games for launch. The high-end R700 parts are not a cutting-edge design, and are fairly easy to build.

480-640 SPs is actually quite good for the size of the Wii U; such specs are currently seen in midrange laptop GPUs, which can run console ports at better performance than the PS3/360 at 720p. So I think the Wii U will be 1.5x faster than the PS3/360, with 1GB RAM and maybe 12MB eDRAM, while the PS4 and next Xbox will be a lot more powerful, with 4GB RAM and a lot more memory bandwidth for 1080p.
 

Orayn

Member
Anyone remember that big, hysterical OP from Steven Colbert where he talked about how Sony HAS to use something that's a generation ahead of NVidia's unreleased successors to the GTX 500 series? This news makes me laugh just thinking about it.
 

disap.ed

Member
I think Nintendo expects the Wii U's power to be around the RV770LE; more customized, yes, but the power levels will not exceed it by too far. It makes no sense to seed developers with broken devkits if they are rushing out games for launch. The high-end R700 parts are not a cutting-edge design, and are fairly easy to build.

480-640 SPs is actually quite good for the size of the Wii U; such specs are currently seen in midrange laptop GPUs, which can run console ports at better performance than the PS3/360 at 720p. So I think the Wii U will be 1.5x faster than the PS3/360, with 1GB RAM and maybe 12MB eDRAM, while the PS4 and next Xbox will be a lot more powerful, with 4GB RAM and a lot more memory bandwidth for 1080p.

I think you are a bit too conservative here.
 
I think Nintendo expects the Wii U's power to be around the RV770LE; more customized, yes, but the power levels will not exceed it by too far. It makes no sense to seed developers with broken devkits if they are rushing out games for launch. The high-end R700 parts are not a cutting-edge design, and are fairly easy to build.

480-640 SPs is actually quite good for the size of the Wii U; such specs are currently seen in midrange laptop GPUs, which can run console ports at better performance than the PS3/360 at 720p. So I think the Wii U will be 1.5x faster than the PS3/360, with 1GB RAM and maybe 12MB eDRAM, while the PS4 and next Xbox will be a lot more powerful, with 4GB RAM and a lot more memory bandwidth for 1080p.

undershooting the wiiu edram and overshooting next xbox and ps memory pool :p
 

Elios83

Member
Anyone remember that big, hysterical OP from Steven Colbert where he talked about how Sony HAS to use something that's a generation ahead of NVidia's unreleased successors to the GTX 500 series? This news makes me laugh just thinking about it.

Power-wise it's laughable, especially if next gen consoles will be based on system-on-chip solutions. But architecture- and feature-wise it's the opposite: it would make sense to use an architecture which offers a much better flops/W ratio and a more modern feature set.
So I expect next gen consoles to have raw power similar to top 2011 GPUs, but with a 2012/2013 architecture and feature set.
 

chaosblade

Unconfirmed Member
I guess I'm the only one expecting a relatively small increase with performance somewhere in the ballpark between a GTX260-GTX460.

Maybe that just means I'll be surprised if we see something better.
 