
Timothy Lottes: "a 2011 GPU (6970) seems like a possible proxy for nextgen consoles"

DCKing

Member
A 28nm 6990M would probably cut its TDP by half, or achieve 2x the performance with the same TDP, meaning something along the lines of

3.2 teraflops with a TDP of 100W.

Which is what I think is the least that will go into the next generation of consoles.
Nonono. A shrink to 28nm definitely does not halve the power usage. It theoretically halves the chip area, and it reduces power usage, but power usage is not halved by any means.

So 3.2 teraflops isn't happening, and I don't think a 100W GPU is either...
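To put rough numbers on why a shrink doesn't halve power: dynamic power scales roughly as C·V²·f, and while capacitance shrinks with feature size, supply voltage barely drops between modern nodes. A minimal sketch, with illustrative scaling factors that are assumptions rather than TSMC data:

```python
# Why a 40nm -> 28nm shrink doesn't halve dynamic power.
# P ~ C * V^2 * f. Capacitance scales with the linear shrink,
# but voltage scaling has mostly stalled (Dennard scaling breakdown).

def dynamic_power(capacitance, voltage, frequency):
    """Classic switching-power approximation: P ~ C * V^2 * f."""
    return capacitance * voltage ** 2 * frequency

# Normalized 40nm baseline.
p_40 = dynamic_power(capacitance=1.0, voltage=1.0, frequency=1.0)

# Hypothetical 28nm part at the same clocks: capacitance drops roughly
# with the linear shrink (28/40 = 0.7); voltage drops only a few percent.
p_28 = dynamic_power(capacitance=0.7, voltage=0.95, frequency=1.0)

print(f"28nm power vs 40nm: {p_28 / p_40:.2f}x")  # ~0.63x, not 0.5x
```

Even with generous assumptions, the same chip at the same clocks lands around two-thirds of the power, not half.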
 
A little tidbit I found on the next round of mobile GPUs:
(A couple months old, but quite relevant to the discussion we're having. There are various articles on this all citing the same source, and some of my cliff notes might be from other articles which I can't seem to find at the moment, since I'm posting from my iPhone.)

http://www.bestgaminglaptop.net/news/nvidia-28nm-mobile-gpu-plans-revealed/

Cliff notes: 

- nVidia's Q1 28nm mobile GPUs will be based on Fermi and offer improvements in both performance and power consumption. (The next top-end Fermi chip will reduce the 580M's already impressive 100W to 75W while increasing the 3DMark score from the 15,000s to the 20,000s.)

- 28nm mobile Kepler chips should be shipping sometime this year, probably in Q3. These promise a real next-gen improvement as far as mobile chips are concerned, while maintaining low power consumption.

I think that whatever top-end 28nm notebook GPUs are in production by the end of this year should give us the best basis from which to gauge the potential of console GPUs in 2013.
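For what it's worth, the article's own rough figures already imply a sizable perf-per-watt jump. A quick back-of-the-envelope calculation (using the article's approximate numbers, not measurements):

```python
# Implied efficiency gain from the article's rough figures:
# 580M: ~15,000 3DMark at 100W; rumored 28nm successor: ~20,000 at 75W.

def perf_per_watt(score, tdp_watts):
    """3DMark points per watt of TDP."""
    return score / tdp_watts

old = perf_per_watt(15000, 100)  # 150 points/W
new = perf_per_watt(20000, 75)   # ~266.7 points/W

print(f"Implied efficiency gain: {new / old:.2f}x")  # ~1.78x
```

So even the conservative Fermi-based refresh would be close to doubling efficiency, which is why the Kepler parts later in the year look like the more interesting console proxy.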

Also, since Tim Lottes is a tech developer for nVidia and is suggesting the 6970, which is an AMD card, as the proxy for next-gen consoles, I suppose we can confirm the obvious: AMD has locked down both Sony and MS, and nVidia is out? I'm assuming he would be one of the first people to know if nVidia was going to be involved.
 

Proelite

Member
Nonono. A shrink to 28nm definitely does not halve the power usage. It theoretically halves the chip area, and it reduces power usage, but power usage is not halved by any means.

So 3.2 teraflops isn't happening, and I don't think a 100W GPU is either...

It's more about the process than node size. TSMC's 28nm is going to improve in power efficiency. Comparing AMD's 40nm and 28nm cards yields some interesting results.

There is no concrete evidence against 100W or even 150W. But there is definitely evidence that desktop GPUs have fat that can be trimmed.
 
Fine. Look at the GameCube for a less outlandish example. It didn't sell nearly as well as the PS2, but was more profitable for Nintendo in the end than the PS2 was for Sony. It was also far less risky.

Proof? Nintendo's earnings reports were heavily influenced by portable sales at that time, and as far as I know there's no breakdown available to tell us exactly how much they earned on the GCN alone.
 

Zaptruder

Banned
Some gamers don't care, but many do. The original DS is seen as an ugly hulking monstrosity by most gamers here. You can't deny that a better, sleeker-looking console has an impact on perception.

I won't deny that form factor matters.

But so do other factors, and for a home console those other factors can carry more weight than the form factor itself.

Although the success of the Wii might seem to indicate that form factor matters strongly, I'd argue it's difficult to judge that weighting given the confounding factor of the Wii creating a gaming zeitgeist, latching onto the imagination of many with the introduction of motion controls.
 

KageMaru

Member
The yields can get drastically worse as chip size increases. You can probably produce many more functional small chips than you can of a single big chip.

You're right that in the current manufacturing scheme it doesn't make sense to use multiple chips. However, there are some companies out there doing work with silicon interposers to put multiple small chips in a single package. They claim this reduces cost, can reduce power, and provides the performance of a larger single chip.

Perhaps this is something that will be employed in a future console. AMD is rumored to be prototyping with interposers.

Yeah, I know that a chip size increase could affect yields, but I don't think they would allow the chip to get too large considering it's being designed for a console.
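The yield argument is easy to see with the classic Poisson yield model, Y = e^(-A·D), where A is die area and D is defect density. A toy sketch with made-up illustrative numbers (not foundry data):

```python
import math

# Toy Poisson yield model: Y = exp(-A * D).
# A = die area in mm^2, D = defect density in defects/mm^2.
# The defect density and areas below are illustrative assumptions.

def die_yield(area_mm2, defect_density):
    """Probability a die of the given area has zero killer defects."""
    return math.exp(-area_mm2 * defect_density)

D = 0.005                    # assumed defects per mm^2

big = die_yield(400, D)      # one monolithic 400 mm^2 GPU
small = die_yield(100, D)    # one 100 mm^2 chiplet

print(f"400mm^2 die yield: {big:.1%}")    # ~13.5%
print(f"100mm^2 die yield: {small:.1%}")  # ~60.7%
```

Since interposer packages are assembled from dies that already tested good, the usable fraction of silicon per wafer is ~60.7% with the small dies versus ~13.5% monolithic in this toy example, which is where the claimed cost reduction comes from.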
 

DCKing

Member
It's more about the process than node size. TSMC's 28nm is going to improve in power efficiency. Comparing AMD's 40nm and 28nm cards yields some interesting results.
Okay. Expecting half the power usage is quite a stretch though. No process node has ever done that.
There is no concrete evidence against 100W or even 150W. But there is definitely evidence that desktop GPUs have fat that can be trimmed.
What reasoning is this? There's no definite evidence against 2W GPUs either...

Given that the PS3 and 360 had sub-100W GPUs at launch, that the consoles were huge and had many technical problems, and that Nintendo has validated the business model of releasing a lower-power console, I think it is very unlikely MS and Sony will go for 100W GPUs. 150W GPUs are ridiculous.
 

TheExodu5

Banned
Given that the PS3 and 360 had sub-100W GPUs at launch, that the consoles were huge and had many technical problems, and that Nintendo has validated the business model of releasing a lower-power console, I think it is very unlikely MS and Sony will go for 100W GPUs. 150W GPUs are ridiculous.

That's my line of thinking. I don't think MS or Sony are going to go with cutting edge loss-leading hardware this time around. Doing that proved to be disastrous for both Sony and Microsoft this gen.
 
A little tidbit I found on the next round of mobile GPUs:
(A couple months old, but quite relevant to the discussion we're having. There are various articles on this all citing the same source, and some of my cliff notes might be from other articles which I can't seem to find at the moment, since I'm posting from my iPhone.)

http://www.bestgaminglaptop.net/news/nvidia-28nm-mobile-gpu-plans-revealed/

Cliff notes: 

- nVidia's Q1 28nm mobile GPUs will be based on Fermi and offer improvements in both performance and power consumption. (The next top-end Fermi chip will reduce the 580M's already impressive 100W to 75W while increasing the 3DMark score from the 15,000s to the 20,000s.)

- 28nm mobile Kepler chips should be shipping sometime this year, probably in Q3. These promise a real next-gen improvement as far as mobile chips are concerned, while maintaining low power consumption.

I think that whatever top-end 28nm notebook GPUs are in production by the end of this year should give us the best basis from which to gauge the potential of console GPUs in 2013.

Also, since Tim Lottes is a tech developer for nVidia and is suggesting the 6970, which is an AMD card, as the proxy for next-gen consoles, I suppose we can confirm the obvious: AMD has locked down both Sony and MS, and nVidia is out? I'm assuming he would be one of the first people to know if nVidia was going to be involved.

Nah, Nvidia for Sony, AMD for MS.
 

Sutanreyu

Member
Sony and Microsoft should just turn to PC gaming and offer standardized, branded, and additionally subsidized hardware approved for their respective games. Add a nice software suite on top of that and say hello to your one-console future.
 
Yes, it's based on the R740 GPU

That doesn't mean it won't have any modern advancements added to it. All the R700 cards are built around the same base structure; that doesn't mean each one functions the same. You iterate advancements off of a base structure. That is how you engineer new hardware.
 

chaosblade

Unconfirmed Member
Nah, Nvidia for Sony, AMD for MS.

Why would Sony go with NVidia again? Not to say they won't, but they got burned last time when Microsoft got a better GPU, and AMD currently offers more performance per mm² (i.e., cheaper).

NVidia doesn't really have anything to persuade any console manufacturer to go their way next gen. I expect all three to use AMD GPUs, but I agree that if any one of them was going to go with someone else, it would be Sony. I'm not sure ImgTec has anything PowerVR-based that would be worthwhile in a console, so NVidia is probably the only other option.
 