
IGN Rumor: Xbox 3 GPU ~= AMD 6670, Wii U ~5x current gen, Xbox 3 ~6x, Dev Kits August


TAJ

Darkness cannot drive out darkness; only light can do that. Hate cannot drive out hate; only love can do that.
I'm not sure what the problem is. Consoles have always had a lower power PC graphics card or equivalent, it's on the same power curve as the original xbox and the 360.

If 720 were actually on the same power curve, I'd be ecstatic. Unfortunately, you have no idea what you're talking about.
Both NV2A and Xenos were actually tweeners... with roughly the same raw power as top GPUs and some features from the next gen GPUs.
 
Haha!

No matter how many times I read it and typed it, I thought this thread was talking about a 6770!

Well ignore what I was saying earlier. That GPU would be weaker than I expected even on a SoC... I really should stop posting at 4:00am if that's the end result.
 

StevieP

Banned
Wow, I never thought I'd say these words: StevieP was right, at least with regard to the next Xbox. I can't believe the 720 GPU might be my 3-year-old 5770.

Please save us from this technological ghetto, Sony.

And if Sony said no, too? lol
A billion+ on Kinect, all first party studios producing Kinect content, and the new Metro UI should have given you every indication of Microsoft's intentions, at the very least.

Say the new Kinect is priced at $100; they could still price the rest of the system at $350 at a loss. I mean, nobody (except Nintendo with the Wii) has ever come out with a system that's profitable from day one. If they want to change gens every 3-4 years then it's fine, but I don't think they want to and I don't think they will, and that's the problem. If they change it every 4 years and price it reasonably, then it's all good. But if they keep it for another 8 years, then if you thought consoles held this gen back, think again.

So another ~8 years with high-end performance at the beginning would be better IMO. When you have, say, 4GB of RAM and something along the lines of a mid-to-high 7-series in a closed system, games are going to look damn good 3-4 years from now. We would really get closer to CGI graphics than ever, but if we get something like what's rumored here, 8 years will be a miserable wait IMO.


You're not getting 4GB of fast RAM. 2GB is the practical limit there due to chip densities. If you're talking about a split pool of slow + fast memory, maybe, but it would still be a more complex board than MS likely wants.
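For what it's worth, a back-of-envelope sketch of where that 2GB figure comes from, assuming the 2Gbit GDDR5 chips that are common right now and an illustrative 256-bit bus (neither is a confirmed spec):

```python
# Rough memory-capacity math from chip density (all figures are assumptions,
# not leaked specs): 2Gbit GDDR5 chips, each with a 32-bit interface.
CHIP_DENSITY_GBIT = 2      # common GDDR5 density at the moment
BUS_WIDTH_BITS = 256       # illustrative console-class bus
CHIP_IO_BITS = 32          # one GDDR5 chip per 32 bits of bus

chips = BUS_WIDTH_BITS // CHIP_IO_BITS            # 8 chips
capacity_gbytes = chips * CHIP_DENSITY_GBIT / 8   # Gbit -> GB
print(chips, "chips ->", capacity_gbytes, "GB")   # 8 chips -> 2.0 GB
# 4GB of fast RAM would mean 16 chips in clamshell mode or a wider bus,
# i.e. the more complex (and pricier) board mentioned above.
```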

A 6670-based GPU wouldn't provide a 6x leap (try half of that). I fully expect Sony to target the ~2 teraflop/8x leap ballpark, and it's completely achievable on 28nm in a sub-200W package.

But this is probably going to be a 32nm SoC. For all intents and purposes, the arbitrary "6x" figure quoted here could be referring to the system as a whole and not just the GPU. Whether they're going with the Bulldozer-based Trinity-like APU I speculated last year (which is becoming more in doubt, sorry BBoyDubC) or an SoC based on an IBM CPU + AMD GPU, it will still be a mid-range part.
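For reference, here's the rough math behind those multipliers, using the peak FLOPS figures usually quoted for Xenos and the 6670 (ballpark theoretical numbers, nothing official):

```python
# Peak theoretical FLOPS, commonly quoted figures -- treat as ballpark only.
xenos_gflops = 48 * 10 * 0.5    # 48 ALUs x (vec4 + scalar MADD = 10 FLOPs) x 0.5 GHz = 240
hd6670_gflops = 480 * 2 * 0.8   # 480 SPs x 2 FLOPs (MADD) x 0.8 GHz = 768

print(hd6670_gflops / xenos_gflops)   # ~3.2x Xenos -- roughly "half of 6x"
print(2000 / xenos_gflops)            # a ~2 TFLOP part would be ~8x Xenos
```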

This. Also, graphics still matter to me. With better graphics (and everything that goes with them) I can get more immersed in games (Witcher 2, I'm looking at you). If 3D takes off it will be even more important.

If graphics are this important to you, you should own and maintain a gaming PC. I know I do.

SNES and PS2 were massive leaps visually. This next-gen at a 6970 level of performance can be amazing. I really hope this shit ain't true. MS has the money :(.

You're not getting a 6970 or anything similarly higher-spec into a console case.

This is a tech demo; I can't believe we're all gonna fall for this. Remember how everyone drooled over Sony's PS3 tech demo of Final Fantasy 7??? I am not gonna get fooled again.

YES I saw the Day & Night button... When other factors such as AI and multiple characters/polygons, and ACTUAL Gameplay come into the mix, get ready to watch the framerate drop...

Tech-Demos are there for the superficial gamers. I just want more honesty, i.e. less FMVs, more actual GAMEPLAY!!! If you think it's too early to see gameplay, you're barking up the wrong tree.

The day all of us stop dropping our jaws at FMVs or tech-demos aimed to fool us, is the day we'll see AAA titles which deliver the "What You See Is What You Get" relationship we are always looking for.

MEH Everything I see and hear so far about next-gen...

The difference between the Sony FF7 demo and the Zelda Wii U demo is that the Zelda Wii U demo was running in real time on the alpha Wii U kits, and was not CG.

Whoa, disappointing as hell. Will this even be able to make future console Battlefields look like what BF3 looks like today on high-end PCs, keeping in mind these consoles will go on for 6-7 years?

If the focus is on Kinect, then I really hope MS fails hard and at least Sony sticks to their roots and goes for a big GPU upgrade. The next GT, KZ and UC will need it.

If these rumour indications are anything to go by, next gen will be a more normal cycle (i.e. 5-ish years rather than 7-8-ish). Which is a good thing for everyone, including graphics whores. And why would anyone want a console maker to fail? More importantly, what is this "cater to MEEEEE only" attitude on here? The more people that play games, the better for everyone involved.

Haha!

No matter how many times I read it and typed it, I thought this thread was talking about a 6770!

Well ignore what I was saying earlier. That GPU would be weaker than I expected even on a SoC... I really should stop posting at 4:00am if that's the end result.

It may be a similar situation to the Wii U. They started with an R770LE equivalent in their kits, and have upgraded whatever that was to something newer and/or more powerful in their current near-final kits. MS could be throwing a 6670-level equivalent into the current kits that are going out and will ramp up slightly as they progress. IGN probably got wind of said early kits. Those expecting anything more than midrange GPUs, even at 32nm or 28nm, are expecting too much, however.
 
It may be a similar situation to the Wii U. They started with an R770LE equivalent in their kits, and have upgraded whatever that was to something newer and/or more powerful in their current near-final kits. MS could be throwing a 6670-level equivalent into the current kits that are going out and will ramp up slightly as they progress. IGN probably got wind of said early kits. Those expecting anything more than midrange GPUs, even at 32nm or 28nm, are expecting too much, however.
I was just laughing at myself, man. Going through the thread I can count at least six times I typed 6770 without realizing I was talking about a 6670.
 
Sometimes it's OK for Monkeys to laugh at themselves. It's a healthy attitude.

Still makes me feel like an ass for ranting at TAJ last night when I don't even think a variant of that card is likely. It might give us a general idea of what they're shooting for... but yikes.
 

TAJ

Darkness cannot drive out darkness; only light can do that. Hate cannot drive out hate; only love can do that.
You're not getting a 6970 or anything similarly higher-spec into a console case.

A 6970 on 28nm would be smaller than Xenos or RSX started out at. Some people are talking about the end of 2013. It could be waaaaaaaay smaller by then.
 

Dipswitch

Member
Going with a last gen part probably makes sense if MS is looking to make a profit or break even on the hardware from the get-go. As a result, it will also more than likely result in refresh cycles similar to last generation.

The reasons being that not only will this lower-spec hardware look "tired" after 4-5 years, but there also won't be an impetus to drag the gen out by 2-3 years just to recoup costs like there was this gen.

Can't say I fault them for that stance if that's the case. And if it results in lower cost hardware upfront, that's probably a good thing.
 

Donnie

Member
A 6970 on 28nm would be smaller than Xenos or RSX started out at. Some people are talking about the end of 2013. It could be waaaaaaaay smaller by then.

The 6970 is very similar to the 7890 (same number of shaders, ROPs and TUs) and it's 365mm² on 28nm. The 6970 on 28nm may be a bit smaller, but small enough to be under 240mm²? I doubt that.
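A quick sketch of the die-size scaling being argued here, assuming the ~389mm² usually quoted for Cayman on 40nm (ideal scaling is never achieved in practice, which is exactly why the under-240mm² question is open):

```python
# Ideal optical shrink of a 6970-class (Cayman) die from 40nm to 28nm.
cayman_40nm_mm2 = 389                 # commonly quoted 6970 die size (assumption)
ideal_scale = (28 / 40) ** 2          # ~0.49 if everything shrank perfectly
print(cayman_40nm_mm2 * ideal_scale)  # ~191 mm^2 best case

# Real chips never hit ideal scaling (I/O and analog barely shrink, plus layout
# overhead), so the practical result lands somewhere between ~191 and 389 mm^2.
```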
 

StevieP

Banned
Crazy to think the iPad could surpass Microsoft's next generation console within a few years of launch.

Sure, but iPad games aren't getting $20-50 million invested in them - not even with Epic, who know what's up. The market isn't there for it. Not to mention batteries and controllers and all that.
 

DarkChild

Banned
Mind you, they are going to get rid of some stuff from the desktop GPU that they don't need, and maybe won't include eDRAM. The eDRAM took 70mm², and the actual chip took 170mm².
 

danwarb

Member
Crazy to think the iPad could surpass Microsoft's next generation console within a few years of launch.

I think we'll have a long wait for the iPad to pass this PS3/360 generation. Imagine the size of the battery that'd be required to run something like that for 10 hours.
 

beje

Banned
I think we'll have a long wait for the iPad to pass this PS3/360 generation. Imagine the size of the battery that'd be required to run something like that for 10 hours.

Yep, that's a good point. There's no use in an iPhone 4 or iPad delivering graphics almost on par with a Vita if they can barely hold their charge for more than an hour and a half while doing so (and taking into account that they cost more than twice as much). Dedicated console/handheld gaming will live on as long as smartphone makers don't figure this out, along with physical controls.
 
If 720 were actually on the same power curve, I'd be ecstatic. Unfortunately, you have no idea what you're talking about.
Both NV2A and Xenos were actually tweeners... with roughly the same raw power as top GPUs and some features from the next gen GPUs.

You're comparing the NV2A and Xenos to the ATI 1xxx and Nvidia 7xxx generations, which isn't really right. There was an enormous jump to the ATI 2xxx and Nvidia 8xxx series with unified shaders; the ATI 6-vs-7 series doesn't have nearly the same change. The 360 and PS3 had GPUs comparable to midrange desktop models when they came out.
 

Reallink

Member
Sure, but iPad games aren't getting $20-50 million invested in them - not even with Epic, who know what's up. The market isn't there for it. Not to mention batteries and controllers and all that.

If there's ever some standardized or official controller support, and these consoles actually go to market with garbage ass hardware that will be surpassed by tablets within the first year or two, I think that could change very very fast.
 

StevieP

Banned
If there's ever some standardized or official controller support, and these consoles actually go to market with garbage ass hardware that will be surpassed by tablets within the first year or two, I think that could change very very fast.

It won't. At least not in terms of games. A tablet's still a tablet. Hardware means nothing - software means everything.

Tablet sales will continue to skyrocket, and software support will increase (as mentioned, even tech companies like Epic are throwing their weight behind it) but it's not the same kind of software support you're going to get with consoles. The market isn't there for that on tablets, and won't be while Apple and Android lead the way there.
 

beje

Banned
If there's ever some standardized or official controller support, and these consoles actually go to market with garbage ass hardware that will be surpassed by tablets within the first year or two, I think that could change very very fast.

As I already said, you have to take into account the initial costs of $300-$400 for a dedicated console against $600-$700 (+ controllers) for a capable tablet.
 
The 6970 is very similar to the 7890 (same number of shaders, ROPs and TUs) and it's 365mm² on 28nm. The 6970 on 28nm may be a bit smaller, but small enough to be under 240mm²? I doubt that.

Has a 7890 even been announced? Even if it has, based on previous precedent you are talking about a salvage part based on the 7970 which has over 4 billion transistors, compared to 2.5B in the 6970.
 

beje

Banned
Also, people complaining about underpowered hardware compared to the 360-vs-PCs situation in 2005 are overlooking a very important detail.

Back in 2005, a mid-range GPU was something like this (ATI X1300 pictured) and didn't need additional power lines directly from the PSU:

[image: ATI X1300]

Nowadays, a mid-range GPU is more or less like this and requires one or even two 12V lines from the PSU:

[image: modern mid-range GPU]

Compare the size and cooling needs relative to the PCI Express connector (and note that the whole left part on modern ones is a huge-ass heatsink), and then please think twice. I know it's a completely unscientific comparison, but I think the examples serve their purpose in showing the evolution of GPUs and how integrating a mid-range GPU into a console isn't as easy as it was a few years ago.
 

Salacious Crumb

Junior Member
Also, people complaining about underpowered hardware compared to the 360-vs-PCs situation in 2005 are overlooking a very important detail.

Back in 2005, a mid-range GPU was something like this (ATI X1300 pictured) and didn't need additional power lines directly from the PSU:

[image: ATI X1300]

Nowadays, a mid-range GPU is more or less like this and requires one or even two 12V lines from the PSU:

[image: modern mid-range GPU]

Compare the size and cooling needs relative to the PCI Express connector (and note that the whole left part on modern ones is a huge-ass heatsink), and then please think twice. I know it's a completely unscientific comparison, but I think the examples serve their purpose in showing the evolution of GPUs and how integrating a mid-range GPU into a console isn't as easy as it was a few years ago.

Yep, people really seem to struggle to grasp this point.
 

Doc Holliday

SPOILER: Columbus finds America
Also, people complaining about underpowered hardware compared to the 360-vs-PCs situation in 2005 are overlooking a very important detail.

Back in 2005, a mid-range GPU was something like this (ATI X1300 pictured) and didn't need additional power lines directly from the PSU:

Compare the size and cooling needs relative to the PCI Express connector (and note that the whole left part on modern ones is a huge-ass heatsink), and then please think twice. I know it's a completely unscientific comparison, but I think the examples serve their purpose in showing the evolution of GPUs and how integrating a mid-range GPU into a console isn't as easy as it was a few years ago.

Well, I don't think bringing up desktop GPUs is exactly a good comparison either. Shouldn't we be looking at laptop sizes and cooling needs? Shit, you can get a pretty tiny laptop with an AMD 7500M or Nvidia 580M chip. That's without the benefit of a custom CPU, custom GPU and the other stuff that comes with console design.
 

TUROK

Member
Also, people complaining about underpowered hardware compared to the 360-vs-PCs situation in 2005 are overlooking a very important detail.

Back in 2005, a mid-range GPU was something like this (ATI X1300 pictured) and didn't need additional power lines directly from the PSU:

[image: ATI X1300]

Nowadays, a mid-range GPU is more or less like this and requires one or even two 12V lines from the PSU:

[image: modern mid-range GPU]

Compare the size and cooling needs relative to the PCI Express connector (and note that the whole left part on modern ones is a huge-ass heatsink), and then please think twice. I know it's a completely unscientific comparison, but I think the examples serve their purpose in showing the evolution of GPUs and how integrating a mid-range GPU into a console isn't as easy as it was a few years ago.

This is such a fucked up comparison.

[image: ATI Radeon X1800 XT 512MB]


This is the closest model to the Xbox 360 GPU.

[image: AMD Radeon HD 6670]


This is what the GPU the rumor actually refers to looks like.
 

LCGeek

formerly sane
If 720 were actually on the same power curve, I'd be ecstatic. Unfortunately, you have no idea what you're talking about.
Both NV2A and Xenos were actually tweeners... with roughly the same raw power as top GPUs and some features from the next gen GPUs.

When the Xbox was out, it couldn't take certain PC ports and run them with the same fidelity. Having the same features is not the same as having the same power. Xenos was advanced, but by the time the next major ATI generation came around it was eclipsed.

Those extra features only helped in console-vs-console comparisons. Within a year of those chips being out, no one would have wanted them over the top-of-the-line or even the new series of GeForce or ATI cards that came out.
 

Salacious Crumb

Junior Member

Jay Sosa

Member
I honestly don't care as long as it's more powerful than what we have now. Just gimme a classic controller/make motion controls optional and I'm good.
 
Also, people complaining about underpowered hardware compared to the 360-vs-PCs situation in 2005 are overlooking a very important detail.

Back in 2005, a mid-range GPU was something like this (ATI X1300 pictured) and didn't need additional power lines directly from the PSU:

[image: ATI X1300]

Nowadays, a mid-range GPU is more or less like this and requires one or even two 12V lines from the PSU:

[image: modern mid-range GPU]

Compare the size and cooling needs relative to the PCI Express connector (and note that the whole left part on modern ones is a huge-ass heatsink), and then please think twice. I know it's a completely unscientific comparison, but I think the examples serve their purpose in showing the evolution of GPUs and how integrating a mid-range GPU into a console isn't as easy as it was a few years ago.

The x1300 was never mid range. Those were budget cards. The midrange was the x16XX line.
 
You're comparing the NV2A and Xenos to the ATI 1xxx and Nvidia 7xxx generations, which isn't really right. There was an enormous jump to the ATI 2xxx and Nvidia 8xxx series with unified shaders; the ATI 6-vs-7 series doesn't have nearly the same change. The 360 and PS3 had GPUs comparable to midrange desktop models when they came out.

NV2A was between a Geforce 3 and Geforce 4. It has nothing to do with 7xxx cards. It had the clock speed and architecture of a high end GF3 GPU except with an extra vertex shader that came with Geforce 4. GF4 wasn't released until a few months after Xbox shipped. Xbox had a high end GPU. Better than what you could get in a PC card at the time it was released.

Xbox 360 was the same deal. It had a high end GPU. Xenos had unified shaders over a year before you could even buy an ATI PC card with unified shaders.

Even PS3's RSX was based on the high-end G70 design. Although because of delays, by the time it launched, Nvidia had already released G80.

Now I don't expect them to have GPU designs based on the tippy top GPU again due to thermal and power requirements, but a mid-range GPU is not what they historically have used.

The x1300 was never mid range. Those were budget cards. The midrange was the x16XX line.

This.

And Xenos was more powerful than an X1600.
 

DarkChild

Banned
NV2A was between a Geforce 3 and Geforce 4. It has nothing to do with 7xxx cards. It had the clock speed and architecture of a high end GF3 GPU except with an extra vertex shader that came with Geforce 4. GF4 wasn't released until a few months after Xbox shipped. Xbox had a high end GPU. Better than what you could get in a PC card at the time it was released.

Xbox 360 was the same deal. It had a high end GPU. Xenos had unified shaders over a year before you could even buy an ATI PC card with unified shaders.

Even PS3's RSX was based on the high-end G70 design. Although because of delays, by the time it launched, Nvidia had already released G80.

Now I don't expect them to have GPU designs based on the tippy top GPU again due to thermal and power requirements, but a mid-range GPU is not what they historically have used.



This.

And Xenos was more powerful than an X1600.
And that mid range GPU would be more than 2 years old by then. Hmm...Weird situation indeed.
 
NV2A was between a Geforce 3 and Geforce 4. It has nothing to do with 7xxx cards. It had the clock speed and architecture of a high end GF3 GPU except with an extra vertex shader that came with Geforce 4. GF4 wasn't released until a few months after Xbox shipped. Xbox had a high end GPU. Better than what you could get in a PC card at the time it was released.


Thank you.
 
Well, I don't think bringing up desktop GPUs is exactly a good comparison either. Shouldn't we be looking at laptop sizes and cooling needs? Shit, you can get a pretty tiny laptop with an AMD 7500M or Nvidia 580M chip. That's without the benefit of a custom CPU, custom GPU and the other stuff that comes with console design.

Mobile GPUs are extremely neutered. They're not comparable to their desktop cousins.

A 6670 does look to be comparable to the 7500M.
 

teh_pwn

"Saturated fat causes heart disease as much as Brawndo is what plants crave."
I'll believe it when I see it. A 6670 is very weak for next gen. I have two 6950s, which are about 5x faster than a single 6670. When last gen hit I had an Nvidia 6800 Ultra, and the 360 had something a bit faster. This gen I'm supposed to believe that it will be 5x slower than current PC tech (7970)? That's just irrational cost-cutting from Microsoft, who are trying to make inroads in the consumer market. Something like a 6950 on 28nm would cost something like $20 more per GPU chip and get nearly 3x the performance. Seems as irrational as Congress when it cost-cuts.
 

teh_pwn

"Saturated fat causes heart disease as much as Brawndo is what plants crave."
And that mid range GPU would be more than 2 years old by then. Hmm...Weird situation indeed.

This isn't mid-range. This is lower-mid-range... from early 2011. These days it's low-end.
 
NV2A was between a Geforce 3 and Geforce 4. It has nothing to do with 7xxx cards. It had the clock speed and architecture of a high end GF3 GPU except with an extra vertex shader that came with Geforce 4. GF4 wasn't released until a few months after Xbox shipped. Xbox had a high end GPU. Better than what you could get in a PC card at the time it was released.

Xbox 360 was the same deal. It had a high end GPU. Xenos had unified shaders over a year before you could even buy an ATI PC card with unified shaders.

Even PS3's RSX was based on the high-end G70 design. Although because of delays, by the time it launched, Nvidia had already released G80.

Now I don't expect them to have GPU designs based on the tippy top GPU again due to thermal and power requirements, but a mid-range GPU is not what they historically have used.



This.

And Xenos was more powerful than an X1600.

Historically, high-end GPUs did not pull 150+ watts at max. Now our midrange cards run cooler and at a better performance-per-watt ratio, and they still pull power in the high 70s.
 

DarkChild

Banned
Xenos was better than the X1800 XT, closer to the X1900 XT (actually faster because of its architecture), and that card was ~$450 in 2006, when the 360 had already been on the market for more than half a year. So a cheap card like the 6670 would be extremely cheap and "cold" in the next Xbox. Seems unlikely MS is going so low...
 

DarkChild

Banned
I'll believe it when I see it. A 6670 is very weak for next gen. I have two 6950s, which are about 5x faster than a single 6670. When last gen hit I had an Nvidia 6800 Ultra, and the 360 had something a bit faster. This gen I'm supposed to believe that it will be 5x slower than current PC tech (7970)? That's just irrational cost-cutting from Microsoft, who are trying to make inroads in the consumer market. Something like a 6950 on 28nm would cost something like $20 more per GPU chip and get nearly 3x the performance. Seems as irrational as Congress when it cost-cuts.
The 360 had something, say, ~3-4 times faster than a 6800 Ultra.
 

-COOLIO-

The Everyman
Sure, but iPad games aren't getting $20-50 million invested in them - not even with Epic, who know what's up. The market isn't there for it. Not to mention batteries and controllers and all that.

The market is there. If the power is there, games will exploit it. Casual gamers are impressed by good 3D graphics too, anyway.
 
The next Xbox will support DX11.1; Microsoft's new mantra for Windows 8 is parity across the board. They also want development across the board: http://blog.ruffiangames.com/?p=683

The best connection to the PC is to port your Xbox 360 game across and have it connect to the Xbox 360 game via the Xbox Live/Windows Live integration. This will allow your players to be connected and have a Live shared gameplay experience. This is a great connection between the screens, but the dedication to the PC shouldn’t end there, we still have the social networking phenomena that is Facebook to utilise.


The easiest way to ensure seamless development is to provide a very similar development platform.

In all likelihood we'll see a repeat of the 360, where they had a GPU that was moderate in hardware but more advanced in architecture. It's completely possible that they're using a card that reads like a 6670 with regards to transistors, heat, power, number of stream processors, number of shader pipelines etc., but will have the additional feature set of the 7000-series cards, meaning the GPU will actually sit somewhere between the two.

What I want is for it to be a 7000-series card that has some DX12 features, but that's a pipe dream.

We also don't know where Microsoft's putting their focus. If they really wanted, they could throw in a really powerful hardware tessellation unit, for example, and offload geometry to it, plus some cloth simulation or what have you.
 
I'll believe it when I see it. A 6670 is very weak for next gen. I have two 6950s, which are about 5x faster than a single 6670. When last gen hit I had an Nvidia 6800 Ultra, and the 360 had something a bit faster. This gen I'm supposed to believe that it will be 5x slower than current PC tech (7970)? That's just irrational cost-cutting from Microsoft, who are trying to make inroads in the consumer market. Something like a 6950 on 28nm would cost something like $20 more per GPU chip and get nearly 3x the performance. Seems as irrational as Congress when it cost-cuts.

Those cards are not even 3x a single 6670. They also pull a huge amount of power and are loud as fuck. They each pull almost as much power as the entire 360 did at launch.
 

Donnie

Member
Well, I don't think bringing up desktop GPUs is exactly a good comparison either. Shouldn't we be looking at laptop sizes and cooling needs? Shit, you can get a pretty tiny laptop with an AMD 7500M or Nvidia 580M chip. That's without the benefit of a custom CPU, custom GPU and the other stuff that comes with console design.

A 7500M is slower than the chip rumoured to be in the 720; the 6670 is around 25-30% faster.
 

teh_pwn

"Saturated fat causes heart disease as much as Brawndo is what plants crave."
Those cards are not even 3x a single 6670. They also pull a huge amount of power and are loud as fuck. They each pull almost as much power as the entire 360 did at launch.

Each one IS about 3x. These are facts; don't make stuff up. Leave that to IGN, which is after all part of News Corp, who are masters at making stuff up.
6950: 3050
6670: 1226
Note: These benches don't scale with multi-GPU or CrossFire, but are pretty accurate for single-GPU comparisons.

Total system power at stress load with a 6990 (which is two 6950s on a single board) is 545 watts. They can hire a team of electrical and computer engineers to design a custom board that leverages 28nm, downclocks 20%, hits 350W for launch, and revs down to <200W just like the 360 did.

Here's Battlefield 3 at high settings and real 1080p. The 6990 gets 65 fps; the 6670 isn't even listed because it's shit. The 5850 is quite a bit faster than a 6670 and it gets 25.6 fps. If you interpolate based on videocardbenchmarks having the 5850 at 2x the 6670, then the 6670 would render at about 12.5 fps. Welcome to next gen.

CrossFire gets about 0.9x efficiency. Thus, ~5x faster than a single 6670.
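Redoing that arithmetic with the numbers quoted in this post (the benchmark scores, the 0.9x CrossFire scaling and the 2x 5850-vs-6670 ratio are all figures from above; the 6670 fps number is an interpolation, not a measurement):

```python
# Re-running the interpolation above using the figures quoted in this post.
score_6950, score_6670 = 3050, 1226
per_card = score_6950 / score_6670       # ~2.5x per card
pair = per_card * 2 * 0.9                # two cards at ~0.9x CrossFire scaling ~= 4.5x

fps_5850_bf3 = 25.6                      # quoted BF3 high / 1080p result for a 5850
fps_6670_est = fps_5850_bf3 / 2          # assuming the 5850 is ~2x a 6670
print(per_card, pair, fps_6670_est)      # ~2.5, ~4.5, ~12.8 fps (the ~12.5 ballpark above)
```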
 

AlStrong

Member
Also, you keep throwing around the "5-6X is not enough" but don't go into WHY. WHY won't it be enough?

It comes down to what you expect for graphics next generation. Just consider that moving from this gen's 720p/slight sub-HD to 1080p will require more than double the pixel processing and rendering bandwidth. Now factor in that current gen consoles are stuck with 32-bit per pixel formats. FP16 (64bpp) is more than just "HDR" - the precision for shader effects is just that much more important once you start piling on the post-processing and what not.

So even before we can begin to drool about higher quality graphics, we're already worrying about a rather significant jump in resource requirements just to maintain the current gen quality of shading/effects etc.

And since we are talking about just pixel processing, that number is fairly simple to derive, even if it is rather naive and simplistic. TBH I hate these multipliers being thrown around recently just because they have no real context. But here: 1080p is 2.25x 720p (bigger if you want to consider sub-720p current gen titles), and FP16 simply requires double the speed (or higher cost in transistors for full speed vs 32-bpp formats) and double the memory and memory bandwidth (who knows what the edram or main memory situation will be, that's not the point).

In short: if people want 1080p and high precision pixel processing as starting points, then we already require at least 4.5x the processing power of current gen. Then you'll obviously need much more if you're going to expect leaps in lighting and shadowing quality.
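The 4.5x in that last paragraph falls straight out of the pixel counts; a minimal sketch under the same naive assumptions as above (cost scales linearly with pixel count, and FP16 doubles it):

```python
# Naive pixel-processing multiplier: 720p/32bpp -> 1080p/FP16 (64bpp).
pixels_720p = 1280 * 720
pixels_1080p = 1920 * 1080

resolution_factor = pixels_1080p / pixels_720p   # 2.25x more pixels
fp16_factor = 2                                  # 64bpp vs 32bpp: double bandwidth/storage
print(resolution_factor * fp16_factor)           # 4.5x just to match current-gen quality
```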

Of course, ALUs are fairly inexpensive. We can never get enough Z/Stencil performance as well as memory for those more complex shadow buffers (we are soooo far from having decent shadows in the general case).

Anyways. That's all moot if devs end up targeting 720p, which really isn't an awful idea since they can also more easily accommodate MSAA, and it's not like the GPU hardware scalers have been stale for the last 7 years. Devs will just naturally be able to do a shit ton more with less than half the shading or fillrate requirements.


----------------
FWIW, if we take a look at the geometry side of things, the base performance considerations haven't really seen any significant leap. And by that, I mean that the setup rates are still 1 triangle per clock, and so we're ultimately limited by the clock speed of the GPU. And I think I've heard enough about people trying to cut down on power consumption of a high end GPU by gimping clock rates. There's always a trade-off somewhere, and it's perhaps not so simple to gimp the GPU in such a manner if we are to expect more from the geometry side. Certainly, I don't expect to see integer multiples of Xenos' clock speed, but it's just something to ponder (shadows, particles, decals, multipass rendering and other things will eat up the setup rate).

The multiple geometry engines being added to GPUs now are only useful when tessellating, and we're quite a ways away from it being adopted universally in the artist asset pipelines. Tessellating just puts an extra load on the setup and shading anyway.

Fortunately, ALUs are cheap, so at least we'll have considerably more vertex/geometry shading power than this gen to actually do some more complex manipulations of vertices.
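To put the setup-rate point in numbers, a tiny sketch (the clocks are illustrative, not rumoured specs):

```python
# Peak triangle setup at 1 triangle/clock is just the core clock.
xenos_clock = 500e6          # Xenos core clock
nextgen_clock = 800e6        # purely illustrative next-gen clock (assumption)

print(xenos_clock / 1e6, "Mtri/s peak setup")        # 500 Mtri/s
print(nextgen_clock / xenos_clock, "x improvement")  # only 1.6x, vs much bigger ALU gains
```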
 

thirty

Banned
I think this is all FUD, but either way, could this system handle GTA with the ICE mod at 1080p/60fps? 3D at 720p/60fps? All I'd really like to see is that, with maybe double the polygon count and a HUGE jump in textures and lighting. If a system can do that at $300, why do we really need more? I'm actually still very impressed by games this gen. I think we'll see more full downloadable smaller games anyway.
 

TAJ

Darkness cannot drive out darkness; only light can do that. Hate cannot drive out hate; only love can do that.
Anyways. That's all moot if devs end up targeting 720p, which really isn't an awful idea since they can also more easily accommodate MSAA, and it's not like the GPU hardware scalers have been stale for the last 7 years. Devs will just naturally be able to do a shit ton more with less than half the shading or fillrate requirements.

PC gamers should be praying that console developers target 720p.
 

teh_pwn

"Saturated fat causes heart disease as much as Brawndo is what plants crave."
PC gamers should be praying that console developers target 720p.

You're right. Because then I could use PC hardware to get better effects at 1080p. However, it would be much, much more plausible for the next Xbox to perform about as fast as a 6950 but with 28nm 7000-series tech to reduce power/heat... Regardless, 720p DX11 console games with more effects, plus a PC to up the IQ and resolution, is a good idea.
 