
More hints that AMD is building Nintendo NX’s processor (VentureBeat)


jaosobno

Member
So what happened to that Microsoft buying AMD rumor? Lol

Even if that were true, why would Microsoft miss the opportunity to be a hardware vendor for Sony and Nintendo?

Look at Samsung and Apple. They are two big competitors and yet Samsung is Apple's main hardware vendor. Being competitors does not prevent you from doing business with each other.
 

LewieP

Member
I doubt MS, were they to acquire AMD, would just pull support from either Nintendo or Sony, and I imagine they have contracts in place that would prevent this anyway, but long term they could make their competitors' lives more difficult for subsequent consoles.

I assume that one of the primary reasons they'd be interested in such an acquisition would be long term cost savings.
 

jaosobno

Member
I doubt MS, were they to acquire AMD, would just pull support from either Nintendo or Sony, and I imagine they have contracts in place that would prevent this anyway, but long term they could make their competitors' lives more difficult for subsequent consoles.

I assume that one of the primary reasons they'd be interested in such an acquisition would be long term cost savings.

Again, if you can make money off Sony and Nintendo, why miss the opportunity? Microsoft makes billions from Android patent royalties and Android is (along with iOS) their main mobile OS competitor.

They would be idiots if they refused to sell hardware to Nintendo in the future. And if that were to happen, Nintendo could always go to Samsung, Qualcomm, Imagination, etc. and completely bypass the "undesirable vendors" like nVidia and Intel (due to their pricing policies, not to mention that Intel's GPU solutions are anemic at best).
 

LewieP

Member
Again, if you can make money off Sony and Nintendo, why miss the opportunity? Microsoft makes billions from Android patent royalties and Android is (along with iOS) their main mobile OS competitor.

They would be idiots if they refused to sell hardware to Nintendo in the future. And if that were to happen, Nintendo could always go to Samsung, Qualcomm, Imagination, etc. and completely bypass the "undesirable vendors" like nVidia and Intel (due to their pricing policies, not to mention that Intel's GPU solutions are anemic at best).

Indeed. I doubt they would outright refuse, but they would be in a position to make the contracts a little bit more favourable to themselves. Things like the timeframe they'll deliver chips on, or the scale of availability.

Sony, and Nintendo (assuming they go with AMD), will probably want to stick with AMD for future consoles because it's the easiest way to maintain backwards compatibility, which is one of the big advantages of Sony going with x86 this time around.

Certainly MS would be unwise to refuse to sell hardware to their competitors, but being the supplier of their chips would grant them a degree of market power, power which I doubt they would use entirely benevolently.
 

Panajev2001a

GAF's Pleasant Genius
Indeed. I doubt they would outright refuse, but they would be in a position to make the contracts a little bit more favourable to themselves. Things like the timeframe they'll deliver chips on, or the scale of availability.

Sony, and Nintendo (assuming they go with AMD), will probably want to stick with AMD for future consoles because it's the easiest way to maintain backwards compatibility, which is one of the big advantages of Sony going with x86 this time around.

Certainly MS would be unwise to refuse to sell hardware to their competitors, but being the supplier of their chips would grant them a degree of market power, power which I doubt they would use entirely benevolently.

What you described is against the law though; hopefully they would be stupid enough to give the DOJ something to hit them with in another antitrust lawsuit.
 

Litri

Member
Coming back to the HW discussion, would it make sense for Nintendo to go Mali MP or Mali T route for the GPU? Is there anything AMD could put in there not coming from the Mali IP?

I would rule out multi-core GPUs as I don't think Nintendo would like that so I'm leaning towards the Mali T series. Are all Mali GPUs suitable for all ARM CPUs?
 
Nintendo went through quite a process to come up with partners for what would become GameCube.

[image]

Ultimately though:

[image]
+
[image]

You mean Ati? Like it says on the sticker on the GameCube
 

jaosobno

Member
Coming back to the HW discussion, would it make sense for Nintendo to go Mali MP or Mali T route for the GPU? Is there anything AMD could put in there not coming from the Mali IP?

I would rule out multi-core GPUs as I don't think Nintendo would like that so I'm leaning towards the Mali T series. Are all Mali GPUs suitable for all ARM CPUs?

If AMD is the vendor, GCN is practically guaranteed. I highly doubt there will be anything Mali inside NX. I'm thinking we'll see an ARM+GCN combo.
 

Terrell

Member
You mean Ati? Like it says on the sticker on the GameCube

Wikipedia said:
ArtX was contracted in May 1998 to create the system logic and the graphics processor (code named Flipper) for Nintendo's fourth game console (code named "Dolphin"), which would eventually be launched as the GameCube.[1][3] Nintendo's Howard Lincoln said, "This company is headed up by Dr. Wei Yen, -- the man who was primarily responsible for the N64 graphics chip. Dr. Yen has assembled at ArtX one of the best teams of 3D graphics engineers on the planet."

ArtX was bought out by ATi in 2000 before the GameCube was released, making ATi Nintendo's hardware partner by default.

EDIT: DAMN, beaten.
 

AmyS

Member
You mean Ati? Like it says on the sticker on the GameCube

ATI had nothing to do with the actual design of GameCube's Flipper GPU; it was done by ArtX. Flipper also had nothing to do with ATI's Radeon 8500 of roughly the same time frame. By the time ATI bought ArtX, the GameCube GPU's design was already complete.

Also:

https://www-03.ibm.com/press/us/en/pressrelease/2181.wss
The IBM copper processor will be paired with a revolutionary graphics chip designed by ArtX Inc., one of the world's leading 3D graphics technologists located in Palo Alto, California. The ArtX team, led by chairman, Dr. Wei Yen, includes a number of well known 3D graphics designers.
http://www.ign.com/articles/1999/05/14/nintendo-press-conference-transcript
[Howard Lincoln]
The graphics chip is being developed by ArtX of Palo Alto, California. This company is headed up by Dr. Wei Yen, -- the man who was primarily responsible for the N64 graphics chip.

Dr. Yen has assembled at ArtX one of the best teams of 3D graphics engineers on the planet.

We are absolutely confident that Dolphin's graphics will equal or exceed anything our friends at Sony can come up with for Playstation 2.

Dr. Wei Yen is here today in the front row and I'd like him to stand and be recognized.
 

usmanusb

Member
Nintendo has been facing a drought of third-party titles for a few generations. Is there any possibility that they come up with a way (e.g. virtualization) of running PC games on the console and Android games on the handheld?
 

Rodin

Member
Nintendo has been facing a drought of third-party titles for a few generations. Is there any possibility that they come up with a way (e.g. virtualization) of running PC games on the console and Android games on the handheld?

The latter is possible and very likely to happen, in the sense that it should be extremely easy to port Android games to the handheld and Nintendo may want to encourage that (in some cases, they will probably hit the home console too). The former is impossible.


Anyway, interesting tweet from tamaki about NX

Tamaki said:
For anyone concerned about NX power levels, don't be. It's early days and subject to change. Wii was originally at XB360 levels of power!

Hopefully he'll say more about what he heard, he said he wanted to give Nintendo fans this "good news" about NX a couple of weeks ago and we're still waiting.
 

Neoxon

Junior Member
The latter is possible and very likely to happen, in the sense that it should be extremely easy to port Android games to the handheld and Nintendo may want to encourage that (in some cases, they will probably hit the home console too). The former is impossible.


Anyway, interesting tweet from tamaki about NX



Hopefully he'll say more about what he heard, he said he wanted to give Nintendo fans this "good news" about NX a couple of weeks ago and we're still waiting.
I assume he means the console version... or both. That being said, if both form factors have good power, then that's only Step 1 in Nintendo's road to recovery. The NX likely isn't gonna sell that much based on one form factor alone, but at least both form factors together could produce a decent install base for shared software. Plus it'll make third parties realize that Nintendo can play ball (assuming they release some mature games to build up the audience needed for third party games). They won't come back for the NX, but if Nintendo does this right, they may come back for the platform after the NX... or for the next wave of NX hardware, whatever approach Nintendo takes with future hardware.
 

Plinko

Wildcard berths that can't beat teams without a winning record should have homefield advantage
He makes it sound like the rumors about power being low are true but that since it's early, it could change.

Has a console ever increased in power over original plans? I thought they always end up lowering, if anything.
 

Rodin

Member
He makes it sound like the rumors about power being low are true but that since it's early, it could change.

Has a console ever increased in power over original plans? I thought they always end up lowering, if anything.

Xbox 360 went from 256MB to 512MB RAM, and PS4 from 4 to 8GB GDDR5. Don't remember anything about CPU/GPU though, excluding maybe some slightly higher clocks (Wii U itself went from 1GHz CPU/400MHz GPU to 1.24GHz CPU/550MHz GPU iirc).
 

Mpl90

Two copies sold? That's not a bomb guys, stop trolling!!!
Xbox 360 went from 256MB to 512MB RAM, and PS4 from 4 to 8GB GDDR5. Don't remember anything about CPU/GPU though, excluding maybe some slightly higher clocks (Wii U itself went from 1GHz CPU/400MHz GPU to 1.24GHz CPU/550MHz GPU iirc).

IIRC, PS4 went from 2GB to 8GB GDDR5. Also, I think 3DS went from 64MB to 128MB of RAM.
 

usmanusb

Member
The latter is possible and very likely to happen, in the sense that it should be extremely easy to port Android games to the handheld and Nintendo may want to encourage that (in some cases, they will probably hit the home console too). The former is impossible.


Anyway, interesting tweet from tamaki about NX



Hopefully he'll say more about what he heard, he said he wanted to give Nintendo fans this "good news" about NX a couple of weeks ago and we're still waiting.

I think if Android somehow works on Nintendo's handheld or console, it could bring tons of games at cheaper prices, with a better controller SKU. But what if Nintendo also sorts out a way of scaling PC/PS4/XBO titles?

NX software = Nintendo software + Virtual Console + Android software from an Amazon partnership + scaled titles from PC/PS4/XBO = would it still be successful?
 

Plinko

Wildcard berths that can't beat teams without a winning record should have homefield advantage
Hmmm...thanks for the replies.

It just seems too out of the ordinary for Nintendo, a company seemingly hellbent on prioritizing small console size and low power consumption, to even contemplate increasing power on a system.

Then again, we have no idea if the rumors are true. Matt said they weren't, I guess.
 

AlStrong

Member
Xbox 360 went from 256MB to 512MB RAM, and PS4 from 4 to 8GB GDDR5. Don't remember anything about CPU/GPU though, excluding maybe some slightly higher clocks (Wii U itself went from 1GHz CPU/400MHz GPU to 1.24GHz CPU/550MHz GPU iirc).

360 was supposed to be 3.5GHz+ in the leaked doc, so it was technically cut back. RSX saw a clock reduction to 500MHz core/650MHz mem (instead of the originally announced 550/700).


NV2A (xbox) was supposed to be 250MHz as well, but they had to roll with 233.

And of course, xbox one had its clocks increased.
 
Gamecube was originally supposed to have a 405MHz CPU and 202.5MHz GPU, but that got changed to 485MHz and 162MHz.

3DS was, as everyone knows, originally going to be Tegra powered.

The PS3 was going to have a 2nd Cell processor in place of the RSX, but that got changed because sanity prevailed.
 

usmanusb

Member
If Nintendo go for a CPU and GPU from AMD, which is highly likely, which generation of CPU and GPU is most probable?

Best case scenario (better than PS4/XBO) = ?
Worst case scenario (worse than PS4/XBO) = ?
Most likely scenario (equivalent to PS4/XBO) = ?
 

big_z

Member
If Nintendo go for a CPU and GPU from AMD, which is highly likely, which generation of CPU and GPU is most probable?

Best case scenario (better than PS4/XBO) = ?
Worst case scenario (worse than PS4/XBO) = ?
Most likely scenario (equivalent to PS4/XBO) = ?


nx < 1tflop
 
Another bump? Sure, what the hell. Anyway, don't get excited about NX specs being remotely close to Xbone. It would take a small miracle. Let's look at this practically. Nintendo need a system they can mass produce (20 million? I don't know, but a lot), because they are hoping for (expecting) it to be a huge hit. So they need a mature process and they need it cheap. 28nm it is.

On 28nm, we can look at the Xbone CPU and start making cuts.

Well, Nintendo is going to want that SRAM. Having a small pool of low-latency memory is "in their DNA" according to Takeda, and I'd assume they've optimized their graphics libraries to play well with it. So that huge chunk of SRAM will be on the chip, and at about the same die area.

If Nintendo want to safely undercut MS on price (say $199 vs $299), they need to shave costs by about a third. Start cutting out shaders. They probably aren't going lower than Wii U, so 256 ALUs (4 graphics cores) sounds about right. Remember, there is also the south bridge to integrate (Xbone doesn't have it on chip. Carrizo does). If our upper die area limit is 2/3 Xbone's, that's about ~240 mm^2--the size of Carrizo. They'll also presumably save die area from having smaller memory controllers (cut those fat dual 128-bit controllers in half probably) and by using ARM for the CPU instead of Jaguar.

That's my preliminary analysis anyway. Should work out to about ~400 GFLOPS, or a little more if they clock aggressively. There's just no getting around restrictions on die size. And it's not a heat issue. It's a cost issue.
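The arithmetic behind that estimate can be sketched quickly. The 2-FLOPs-per-ALU-per-clock figure is the usual fused-multiply-add convention for GCN; the clock speeds are assumed round numbers, not anything confirmed about NX.

```python
# Back-of-envelope peak-throughput estimate for a GCN-style GPU, as in the
# post above. Assumes 2 FLOPs per ALU per clock (one fused multiply-add);
# the clocks are illustrative guesses, not confirmed specs.
def gcn_gflops(alus: int, clock_ghz: float) -> float:
    """Peak single-precision throughput in GFLOPS."""
    return alus * 2 * clock_ghz

# 4 graphics cores x 64 ALUs = 256 ALUs at a conservative 800 MHz:
print(gcn_gflops(256, 0.8))   # ~410 GFLOPS, i.e. the "~400" above

# The same part clocked aggressively at 1 GHz:
print(gcn_gflops(256, 1.0))   # 512 GFLOPS
```

The same formula with the real Xbox One numbers (768 ALUs at 853 MHz) gives its well-known ~1.31 TFLOPS, which is why "less than half an Xbone" keeps coming up in this thread.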
 

Rodin

Member
What about eDRAM? They can find a way to use that instead of eSRAM. Also, I find it hard to believe that the home console will cost less than 250 bucks.

I agree they'll go with 28nm, but 400GFLOPs would be a trainwreck. That in 2016-17 would be worse than the Wii was in its day (aside from the API, assuming they go with Vulkan) AND without the Wiimote. They'll definitely use another gimmick, but it won't be that successful; it's simply impossible for reasons that should be obvious right now. They really can't be that stupid, something like this would make the Wii U look like the most desirable piece of hardware in history.

Sad to know I'll probably never buy another Nintendo console in my life; the last one I owned was the N64. Nintendo hardware reminds me of their games: it never seems to mature.
Not a good parallel since their games mature basically at every single new iteration, with very few exceptions.
 

Freiya

Member
Sad to know I'll probably never buy another Nintendo console in my life; the last one I owned was the N64. Nintendo hardware reminds me of their games: it never seems to mature.
 
What about eDRAM? They can find a way to use that instead of eSRAM. Also, I find it hard to believe that the home console will cost less than 250 bucks.

I agree they'll go with 28nm, but 400GFLOPs would be a trainwreck. That in 2016-17 would be worse than the Wii was in its day (aside from the API, assuming they go with Vulkan) AND without the Wiimote. They'll definitely use another gimmick, but it won't be that successful; it's simply impossible for reasons that should be obvious right now. They really can't be that stupid, something like this would make the Wii U look like the most desirable piece of hardware in history.


Not a good parallel since their games mature basically at every single new iteration, with very few exceptions.


It's simple: they will position their home console relative to the handheld. Say the handheld is a 64GFLOPs part... then we can expect a 384 to 640GFLOPs part for the home console.


Sad to know I'll probably never buy another Nintendo console in my life; the last one I owned was the N64. Nintendo hardware reminds me of their games: it never seems to mature.



Their hardware could be a new Wii U, it doesn't matter to me... what worries me though is their game output, which seems to be lacking what made their previous games so special.
 
What about eDRAM? They can find a way to use that instead of eSRAM. Also, I find it hard to believe that the home console will cost less than 250 bucks.

I agree they'll go with 28nm, but 400GFLOPs would be a trainwreck. That in 2016-17 would be worse than the Wii was in its day (aside from the API, assuming they go with Vulkan) AND without the Wiimote. They'll definitely use another gimmick, but it won't be that successful; it's simply impossible for reasons that should be obvious right now. They really can't be that stupid, something like this would make the Wii U look like the most desirable piece of hardware in history.


Not a good parallel since their games mature basically at every single new iteration, with very few exceptions.

If they push clocks to 1 GHz on the graphics cores (probably not likely, but maybe devs will complain enough), then they can get to 512 GFLOPs. $250 is probably the highest they'll aim for, given that they are jumping in mid-generation and competing against substantial software libraries. By holiday 2016, Xbox One will likely be down to $300. If they think they are going to outsell Xbox in the U.S. at the same price, they better have another Wii Sports up their sleeves.
 

JordanN

Banned
Any truth to Wii originally being PS360 level?

The closest thing was when Miyamoto said a 360 level Wii would have cost $450.

But I really doubt that confirms they actually had plans of doing it, as in the same interview, Miyamoto said he feared backlash and downplayed graphics talk.

Miyamoto said:
Ultimately, it came down to whether power should be a key element of the console or not. We didn't think it was possible to build a powerful machine for less than 50,000 yen ($450). Not only would it use a lot of electricity, it would need a fan, which meant it would be noisy. Moms would rise up against it. Plus, it would take too long to boot up, like a PC, which isn't an ideal toy.

Lastly, Takamoto said the Wii was always supposed to be the size of 3 DVD cases put together. It was never going to be 360 level.
 

Magwik

Banned
Gamecube was originally supposed to have a 405MHz CPU and 202.5MHz GPU, but that got changed to 485MHz and 162MHz.

3DS was, as everyone knows, originally going to be Tegra powered.

The PS3 was going to have a 2nd Cell processor in place of the RSX, but that got changed because sanity prevailed.
Holy shit there were almost two cell processors in the PS3?
 
What about eDRAM? They can find a way to use that instead of eSRAM. Also, i find hard to believe that the home console will cost less than 250 bucks.

I agree they'll go with 28nm, but 400GFLOPs would be a trainwreck. That in 2016-17 would be worse than the Wii (aside from API, assuming they'll go with Vulkan) AND without the Wiimote. They'll definitely use another gimmick, but it won't be that successful, it's simply impossible for reasons that should be obvious right now. They really can't be that stupid, something like this would make the Wii U look like the most desirable piece of hardware in history.


Not a good parallel since their games mature basically at every single new iteration, with very few exceptions.

Also, I didn't really reply to parts of this, so I'll try again. eDRAM I've changed my mind on. It would be nice, but doesn't look feasible. IBM (or GlobalFoundries, since they bought the plants) is the only partner that could do it, but I don't think Nintendo will want to take the chance of using a specialty process. That locks them into one manufacturer, and if something happens (like when the factory that produced Wii U's GPU/eDRAM got sold to Sony), it makes finding another supplier much more difficult. It also raises costs, and even AMD have said they have no interest in specialty nodes.

I would be very intrigued to find out how efficient a 22nm SOI design would be, but then again, if Nintendo stick with a bulk 28nm process and then migrate down to FinFET in a couple of years when that becomes feasible, that would probably be a smarter approach.

I don't know what the gimmick will be, but I'm sure they have one planned. Regardless, it seems like they are embracing standard controls again. Also, the talk of this new membership system, account-based purchases, and unified platform is a pretty big change/upgrade regardless. Are those theoretical FLOPs low compared to the competition? Yeah, they are. But people will see Mario Kart in 1080p and won't care. Some will probably even argue once again that the console is twice as powerful as it really is.
 

sörine

Banned
The closest thing was when Miyamoto said a 360 level Wii would have cost $450.

But I really doubt that confirms they actually had plans of doing it, as in the same interview, Miyamoto said he was scared of moms.

Pretty sure even the first Wii devkit was actually a Gamecube (ex: Mario Galaxy).
I believe the Wii remote was actually planned as a GameCube accessory at first. If you go back further, to around when he assumed the presidency, Iwata used to make comments about reversing low GC sales with innovative peripherals and controllers. People assumed that meant the DK Bongos, but in retrospect I suspect things like the Wii remote and Balance Board were what he was really talking about.
 

Rodin

Member
It's simple: they will position their home console relative to the handheld. Say the handheld is a 64GFLOPs part... then we can expect a 384 to 640GFLOPs part for the home console.

PS Vita is 51GFLOPs and is barely capable of 540p, at a lower graphical fidelity than what should be expected from NX if we look at previous generations of Nintendo handhelds. Nintendo games would aim for that resolution and 60fps, and there's no way a 64GFLOPs GPU would be enough for that task. That kind of horsepower is a joke even for Android ports. It would be the worst portable they've ever made hardware-wise, and it would be pretty stupid for them to double down on shitty hardware after all the mud they're still getting after the Wii U/3DS generation. Not saying they should aim for high-end specs, but making something that's even worse (way worse) than Wii U and 3DS when they originally came out doesn't make sense at all.

I don't think the portable will hold back the home console, but let's say that's the case and they'll do a 64GFLOPs handheld: at that point, why don't they just go with Imagination? They could have a 100GFLOPs part for the handheld and an 819GFLOPs GPU for the home console. Both would be modern, cheap, power efficient and perfect as a base architecture for several generations of consoles.

I know Nintendo is stubborn and probably doesn't care at all that even I wouldn't be able to spend money on that kind of hardware, but I think it's pretty safe to assume that something like this would hurt them in many more significant ways, with little to no advantages compared to going with something that's still cheap and underpowered by all means, but at least acceptable.

If they push clocks to 1 Ghz on the graphics cores (probably not likely, but maybe devs will complain enough), then they can get to 512 GFLOPs. $250 is probably the highest they'll aim for given that they are jumping in mid generation and competing against substantial software libraries. By holiday 2016, Xbox One will likely be down to $300. If they think they are going to outsell Xbox in the U.S. at the same price, they better have another Wii Sports up their sleeves.
512GFLOPs is more likely than ~400 imho, but it would still be pretty bad.

Also, I didn't really reply to parts of this, so I'll try again. eDRAM I've changed my mind on. It would be nice, but doesn't look feasible. IBM (or globalfoundries since they bought the plants) is the only partner that could do it, but I don't think Nintendo will want to take the chance of using a specialty process. That locks them into one manufacturer, and if something happens (like when the factory that produced Wii U's GPU/eDRAM got sold to Sony), it makes finding another supplier much more difficult. It also raises costs and even AMD have said they have no interest in specialty nodes.

I would be very intrigued in finding out how efficient a 22nm SOI design would be, but then again, if Nintendo stick with a bulk 28nm process and then migrate down to finFET in a couple years when that becomes feasible, that would probably be a smarter approach.
I see, but eSRAM doesn't look like a good answer to this problem for them. At that point, 1GB of HBM as a "small" memory pool would be a much smarter (and more forward-thinking) move. Not likely, but a better solution for sure. eSRAM has been a problem for Xbone since day one, and I really hope they don't go that route.

The 22nm scenario you described would be a great "middle way" solution, but who knows what they're doing.
 
Ok, thanks. If that's the case I can see it happening, but I still think a dual-core A57 would be a better fit, so I still hope they considered it.


It's around 230GFLOPs for the Air 2 vs 512 for X1. The only viable PowerVR option for the home console is the GT7900, an 819GFLOPs GPU that is the perfect fit for an "affordable gaming console" according to Imagination. But we're pretty sure they went with AMD, and I'll be a little pissed if they used something less powerful than that.

Just remembered that the new Apple TV is on the way, though I'm guessing it won't use anything like the GT7900. But I'd imagine that box will be more of a threat to the NX than the PS4/XBO, given what Nintendo is planning on doing next generation.
 
I see, but eSRAM doesn't look like a good answer to this problem for them. At that point, 1GB of HBM as a "small" memory pool would be a much smarter (and more forward-thinking) move. Not likely, but a better solution for sure. eSRAM has been a problem for Xbone since day one, and I really hope they don't go that route.

HBM is made exclusively by SK Hynix and also requires that interposer/MCM setup, which increases complexity. It's another specialty part, and while nice, I don't think it's optimal for Nintendo's goals. How would prices work out if they used HBM1 and then the higher-density HBM2 becomes the norm in 2 years? They would likely be paying higher prices for older tech. With eSRAM, you get that somewhat guaranteed improvement in price, size, etc. with each shrink.

I think eSRAM is only a problem on Xbone for devs looking to push 1080p AAA graphics. It's out of balance for the system, but for a system with less than half the floating op capabilities it's probably great. It will (hopefully) allow Nintendo's own games to push 1080p in ways they haven't this gen.
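The 1080p point is easy to sanity-check with quick arithmetic: a forward renderer's buffers fit comfortably in a 32 MB pool, while a fat deferred G-buffer does not. The buffer layouts below are generic examples, not any actual game's configuration.

```python
# Rough check of whether render targets fit in a small fast memory pool
# such as the Xbox One's 32 MB eSRAM. Buffer formats are illustrative.
def target_mb(width: int, height: int, bytes_per_pixel: int) -> float:
    """Size of one render target in MiB."""
    return width * height * bytes_per_pixel / (1024 * 1024)

POOL_MB = 32  # eSRAM capacity

# Forward rendering at 1080p: one RGBA8 color buffer + one 32-bit depth buffer.
forward = target_mb(1920, 1080, 4) + target_mb(1920, 1080, 4)
print(forward, forward <= POOL_MB)    # ~15.8 MiB -> fits

# Deferred rendering: four 1080p G-buffer targets plus depth.
deferred = 5 * target_mb(1920, 1080, 4)
print(deferred, deferred <= POOL_MB)  # ~39.6 MiB -> does not fit
```

Which is exactly the "out of balance for AAA, fine for a lighter workload" argument above: it's the multi-target pipelines that blow the budget, not 1080p itself.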
 

Vena

Member
For those asking about the PS3/X360 Wii, that isn't what existed. What existed was a development chain that was aimed at HD throughput like the aforementioned competitors. The Wii was on a different pipeline.

You can listen to Julian (from Factor 5) discuss it on NVC here. He knows what he's talking about as he's been part of their hardware creation or aware of it since the N64.

It starts at 53 minutes.
 

Rodin

Member
HBM is made exclusively by SK Hynix and also requires that interposer/MCM setup, which increases complexity. It's another specialty part, and while nice, I don't think it's optimal for Nintendo's goals. How would prices work out if they used HBM1 and then the higher-density HBM2 becomes the norm in 2 years? They would likely be paying higher prices for older tech.
I was thinking about the "shorter cycle before releasing new hardware" argument. In that case, it would make sense for them, because they would simply use HBM2 starting with NX gen 2, then switch to HBM3 when it's mature, etc.

Maybe yields and costs convinced them to not even consider going that route. Not that eSRAM is cheap though.

With eSRAM, you get that somewhat guaranteed improvement in price, size, etc. with each shrink.

I think eSRAM is only a problem on Xbone for devs looking to push 1080p AAA graphics. It's out of balance for the system, but for a system with less than half the floating op capabilities it's probably great. It will (hopefully) allow Nintendo's own games to push 1080p in ways they haven't this gen.
Probably, but that would imply Nintendo basically sticking with Wii U graphics in 1080p. Now, I'm all for gameplay>>>>>>>>>>graphics and I always have been, but even I wouldn't be comfortable with something like this. I don't know how many people would be thrilled to jump on something like that, no matter how low the price is.

Like I said in my previous post, the 22nm SOI (with eDRAM etc.) you suggested would be a great "middle way" solution, better than the eSRAM route imho, especially if it doesn't prevent Nintendo from switching to HBM in the future, when Takeda is retired and nobody still thinks a small pool of memory is a good idea for a modern gaming machine. Not when there are workarounds for the latency issue, as Cerny proved with the PS4 memory architecture, and HBM has way lower latency than GDDR5 iirc.
 
PS Vita is 51GFLOPs and is barely capable of 540p, at a lower graphical fidelity than what should be expected from NX if we look at previous generations of Nintendo handhelds. Nintendo games would aim for that resolution and 60fps, and there's no way a 64GFLOPs GPU would be enough for that task. That kind of horsepower is a joke even for Android ports. It would be the worst portable they've ever made hardware-wise, and it would be pretty stupid for them to double down on shitty hardware after all the mud they're still getting after the Wii U/3DS generation. Not saying they should aim for high-end specs, but making something that's even worse (way worse) than Wii U and 3DS when they originally came out doesn't make sense at all.

I don't think the portable will hold back the home console, but let's say that's the case and they'll do a 64GFLOPs handheld: at that point, why don't they just go with Imagination? They could have a 100GFLOPs part for the handheld and an 819GFLOPs GPU for the home console. Both would be modern, cheap, power efficient and perfect as a base architecture for several generations of consoles.

I know Nintendo is stubborn and probably doesn't care at all that even I wouldn't be able to spend money on that kind of hardware, but I think it's pretty safe to assume that something like this would hurt them in many more significant ways, with little to no advantages compared to going with something that's still cheap and underpowered by all means, but at least acceptable.


512GFLOPs is more likely than ~400 imho, but it would still be pretty bad.


I see, but eSRAM doesn't look like a good answer to this problem for them. At that point, 1GB of HBM as a "small" memory pool would be a much smarter (and more forward-thinking) move. Not likely, but a better solution for sure. eSRAM has been a problem for Xbone since day one, and I really hope they don't go that route.

The 22nm scenario you described would be a great "middle way" solution, but who knows what they're doing.



I'm not sure the Vita is 51 GFLOPS, and not all GFLOPS are equal. Also, 64 GFLOPS at 540p would be pretty decent for gaming.

The Samsung Galaxy S6 has a 192 GFLOPS part, although it's not able to sustain the max frequency and drops to the 400 MHz range, which brings it close to 120 GFLOPS. Keep in mind it's driving a 2K screen.
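The throttling claim above is just linear clock scaling: for a fixed shader configuration, peak FLOPS scale with clock. A quick sketch, assuming the 192 GFLOPS figure from the post and a ~772 MHz peak clock (the commonly reported Mali-T760 MP8 clock in the S6; treat both numbers as unverified):

```python
# Linear clock-scaling sketch: peak FLOPS at a reduced (throttled) clock,
# assuming the same ALU configuration. Numbers are the thread's claims.

def scaled_gflops(peak_gflops, peak_clock_mhz, actual_clock_mhz):
    """Peak throughput at a lower clock, same shader core count."""
    return peak_gflops * actual_clock_mhz / peak_clock_mhz

# Sustained clocks in the 400-480 MHz band would give roughly:
low = scaled_gflops(192, 772, 400)   # ~99 GFLOPS
high = scaled_gflops(192, 772, 480)  # ~119 GFLOPS, close to the ~120 quoted
```

So the "close to 120 GFLOPS" figure is consistent with sustained clocks near the top of that 400 MHz range.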
 
sörine said:
I believe the Wii Remote was actually planned as a GameCube accessory at first. If you go back further, to around when he assumed the presidency, Iwata used to make comments about reversing low GC sales with innovative peripherals and controllers. People assumed that meant the DK Bongos, but in retrospect I suspect things like the Wii Remote and Balance Board were what he was really talking about.

I once heard scuttlebutt that one reason they didn't go with the GameCube version of the Wii Remote (besides the fact that new console hardware helps encourage mass adoption) was too much latency. IIRC, the receivers connected through the memory card slots, which are actually quite high-speed on the 'Cube.
 

Vena

Member
Like i said in my previous post, the 22nm SOI (with eDRAM etc) you suggested would be a great "middle way" solution, better than the eSRAM route imho, and especially if this doesn't prevent Nintendo from switching to HBM in the future, when Takeda is retired and nobody will think that a small pool of memory is still a good idea for a modern gaming machine. Not when there are workarounds for the latency issue like Cerny proved with the PS4 memory architecture, and HBM has way lower latency than GDDR5 iirc.

Pretty sure Redmond's in charge of most hardware and SoC decisions this time around. That's why they hired a new head engineer last year. (Also, certain people with knowledge have said that NoA is much more important now in hardware development, which all adds up.)

Takeda is about as involved as Miyamoto is at this point.

It's simple: they will position their home console relative to the handheld. Say the handheld is a 64 GFLOPS part; then we can expect a 384-640 GFLOPS part for the home console.

They want Unreal support; those throughputs (especially on the low end) are unrealistic for the console. I don't even think 64 GFLOPS is realistic when we effectively know they want to run mobile games/apps through Android emulation.
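The positioning arithmetic from the quote above is a simple multiplier window. All figures are thread speculation, not known specs:

```python
# "Position the console relative to the handheld": a 64 GFLOPS handheld
# with a 6x-10x multiplier implies the 384-640 GFLOPS console window quoted.

def console_range(handheld_gflops, low_mult=6, high_mult=10):
    """Home-console GFLOPS window implied by a positioning multiplier range."""
    return handheld_gflops * low_mult, handheld_gflops * high_mult

low, high = console_range(64)
print(low, high)  # 384 640
```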
 
I was thinking about the "shorter cycle before releasing new hardware" argument. In that case, it would make sense for them because they would simply use HBM2 starting with NX gen 2, then switch to HBM3 when it's mature, etc.

Maybe yields and costs convinced them not to even consider going that route. Not that eSRAM is cheap, though.


Probably, but that would imply Nintendo basically sticking with Wii U graphics at 1080p. Now, I'm all for gameplay>>>>>>>>>>graphics and always have been, but even I wouldn't be comfortable with something like this. I don't know how many people would be thrilled to jump on something like that, no matter how low the price is.

Like i said in my previous post, the 22nm SOI (with eDRAM etc) you suggested would be a great "middle way" solution, better than the eSRAM route imho, and especially if this doesn't prevent Nintendo from switching to HBM in the future, when Takeda is retired and nobody will think that a small pool of memory is still a good idea for a modern gaming machine. Not when there are workarounds for the latency issue like Cerny proved with the PS4 memory architecture, and HBM has way lower latency than GDDR5 iirc.

They might use HBM down the road. With all the talk of making future console transitions seamless, the OS they come up with should be flexible enough. For this transition, though, it's probably a pain with the CPU architecture change and needing to basically rebuild their often-maligned OS. That being the case, they probably want to keep everything else as close to Wii U as possible. Hence "absorb the Wii U architecture": keep the graphics pipeline, build off the GX2 API rather than throw it out, and probably use very similar dev tools/kits, just with the terrible PPC compiler switched out for ARM. Takeda already mentioned they were gradually transitioning their dev tools toward "industry standards" (read: Visual Studio). So on that front (dev support, tools), we should expect continued gradual improvement rather than a complete and sudden turnaround.
 

AmyS

Member
The 360 was supposed to be 3.5+ GHz in the leaked doc, so it was technically cut back. RSX saw a clock reduction to 500 MHz core / 650 MHz memory (instead of the originally announced 550/700).


NV2A (Xbox) was supposed to be 250 MHz as well, but they had to roll with 233.

The GPU for the original Xbox was supposed to be 300 MHz. The first cut was down to 250 MHz, and finally 233 MHz. From the GDC 2000 Xbox console announcement:

[GDC 2000 Xbox announcement slides]


You are absolutely right about Xbox 360 and the leaked document, though. CPU clockspeed was listed as 3.5+ GHz.

 