Rumour: PS5 Devkits have released (UPDATE 4th April: Rumoured Specs)

Not really. The PS3 was better than the 360, but the Blu-ray drive pushed the cost so high that it was not perceived as a good bargain. Add to that that developers had to work harder to get the PS3 performing fully; it was a real struggle.

The XB1 was indeed inferior, but what really drove the cost higher was the Kinect: people didn't need or want that contraption, and didn't want to spend more money just to look silly in front of the TV (PS: I used to do that kind of stuff on the Amiga back in the day).
The PS3 was theoretically a little bit better a year after the Xbox 360 launched; that is not a significant enough difference in power to justify the price.

And the XB1, even at the same price as the PS4, was still clearly the worse console in the hardware department.
 
I thought that was just an unconfirmed rumor? Have you seen these supposed PS5 devkits?

And backward compatibility should be no large hurdle; these are both just GPUs, not video cards. I doubt the PS5 will have any slots at all for any kind of card; it's all just integrated components.
Yeah, this is all just rumors, but this is the first I have heard of Sony ditching AMD for nVidia for the PS5.

The PS3 had nVidia GPU tech in it, so it isn't without precedent. I would just like to hear more if you have any information.
 
I thought that was just an unconfirmed rumor? Have you seen these supposed PS5 devkits?

And backward compatibility should be no large hurdle; these are both just GPUs, not video cards. I doubt the PS5 will have any slots at all for any kind of card; it's all just integrated components.
It would create some problems CPU-wise too, as AMD is unlikely to deliver a customised integration with an nVIDIA-powered GPU, and neither nVIDIA nor Intel is really much into doing lots of semi-custom design work for consoles... so it would be either a switch to ARM (a Denver derivative) or a switch to Intel. I do not think this is likely for these reasons, but I also hope that the partnership between AMD and Sony continues: we can still expect semi-custom designs (console-specific innovation, yeah :)), AMD gets more developer mindshare (people have an extra incentive to optimise for them in the PC space too), and an injection of cash which they both need and IMHO deserve.

I do not want to see nVIDIA and Intel as the only two players in town; technology would stagnate and prices would rise. See Intel's mobile chips, and how even Apple may be forced to accelerate the switch to their own chips in their laptops too.
 
I thought that was just an unconfirmed rumor? Have you seen these supposed PS5 devkits?

And backward compatibility should be no large hurdle; these are both just GPUs, not video cards. I doubt the PS5 will have any slots at all for any kind of card; it's all just integrated components.
They are not as abstracted as they are on PC, so there could be additional emulation hurdles raising the cost of the solution, and it would also somewhat increase the investment developers need to start spinning up optimised PS5 titles.

I understand nVIDIA's raytracing push is making quite a few people excited, thinking it is something only nVIDIA GPUs can do (DXR is a cross-vendor standard), and they are perceived as the top dog in town ("why wouldn't you want the top dog's card everywhere?"). I still see AMD as the best partner for console makers, with very competitive CPUs (see Ryzen and Threadripper vs Intel CPUs) and GPUs (see Vega in DX12/Vulkan-optimised titles vs nVIDIA GPUs), as well as being very open to doing semi-custom designs with the console vendors.
 
Last edited:
On paper it is debatable whether the PS3 is better than the 360, since they work in very different ways and have similar maximum potential; but since they are "dead" consoles now, we just need to compare the games.
I don't see any 360 game looking better than Uncharted 3, TLOU, or God of War: Ascension.
Could the 360 have bested these games? We will never know, since MS never invested enough in first-party studios to push the console to the limit.
 
I thought that was just an unconfirmed rumor? Have you seen these supposed PS5 devkits?

And backward compatibility should be no large hurdle; these are both just GPUs, not video cards. I doubt the PS5 will have any slots at all for any kind of card; it's all just integrated components.
Nvidia screwed them during the PS3 era; why the hell would they go back? Plus, AMD's APUs are cheaper anyway.
 
On paper it is debatable whether the PS3 is better than the 360, since they work in very different ways and have similar maximum potential; but since they are "dead" consoles now, we just need to compare the games.
I don't see any 360 game looking better than Uncharted 3, TLOU, or God of War: Ascension.
Could the 360 have bested these games? We will never know, since MS never invested enough in first-party studios to push the console to the limit.

https://www.eurogamer.net/articles/digitalfoundry-2015-rise-of-the-tomb-raider-face-off Here's an example of a graphics-pushing 360 game... or Titanfall. Xbox has never invested in the type of games that you mentioned.

The 360 won nearly every face-off, often by a significant margin, usually far more than the PS4/Xbox One comparisons.

In reality we can be sure that none of those PS3 exclusives would have been significantly different on 360. Third-party developers such as EA and DICE often spent a lot of time using the SPUs, and they would reach parity at best. The games were often remastered and upgraded on PS4, so obviously it wasn't something amazing or unheard of; they converted the SPU code to GPU code.
 
Oh, there is one other possibility besides AMD, and that is Nvidia. Nvidia already provides the GPU used in the Nintendo Switch. I don't have any inside information or anything, just Nvidia moving into consoles with the Switch; that is two big gaming markets they've got now, Nintendo and PC. I don't know, but I'm certain they would like to expand their market penetration even further, just like any business. I think it's a 50/50 chance Nvidia would be used in either the next-generation Xbox or PlayStation, or even both. Actually, it's a bit more complex than a 50/50 coin toss, but hopefully you get what I'm saying. Component suppliers can change, and although I'm sure AMD would like to be in next-gen consoles as well, Nvidia has the fab expertise to handle anything Microsoft and Sony throw at them. In fact, I would say they have even better relationships with the fabs than AMD does!
The biggest thing working against Nvidia is that they can't use an x86 core unless Intel decided to play nice (i.e., they can't use an x86 core). Eight Zen cores could most likely run the OS, SDK, and APIs with minimal retooling for efficiency.


Plus, Denver was a dud, and off-the-shelf ARM cores aren't at Zen scale yet; only Apple and a few server players have large ARM cores comparable with low-wattage x86 so far. Denver 2 has gone quiet, so it would be a lot of R&D for Nvidia to make a custom core for a low-margin console.


An ISA change isn't a world-ender either, but it would still mean every dev has to comb through the compiler flags after an attempted recompile, etc. I think the switch to x86 was largely intended to keep consoles on the x86 roadmap, at least for another generation; if in another decade ARM development has far outstripped x86, we'll see then.

In reality we can be sure that none of those ps3 exclusives would have been significantly different on 360. Third party developers often spent a lot of time to use the spu’s such as EA and DICE they would reach parity at best. The games were often remastered on PS 4 and upgraded so obviously it wasnt something amazing or unheard of .. the converted the spu code to gpu code.
Exactly. Sony invested heavily in a GPU-like CPU right at the inflection point when the world was about to do the opposite with GPU compute, and GPUs do it far better. Remember they also launched within a month of the GeForce 8800 GTX. Late-stage PS3 games looked good, but all that Cell uniqueness was spent making up for the RSX being lacklustre compared to the Xenos. Multiplats at best got to par by the end of its life; very few did a little better, and the 360 mostly won there, while we'll never know what first-party studios with their budgets could have made on 360 hardware, or whether the PS3 enabled anything otherwise undoable.


All the late-stage lookers' interviews had the same thing in common: move work off the RSX to the SPEs so it could do more of something else. Maybe at best the PS3 was a bit more capable than the 360, but being so much harder to take advantage of tilted things towards the latter; at best I'd call it a wash if we ignore developer pain and budget.

One wonders about a world where the Cell was paired with the Xenos, so it didn't have to fill the latter's duty and could enable truly different experiences for the time.
 
Last edited:
It's too early to release in 2019. Release in 2021.

The target for a new PS5, imo, should be 4K@60fps for all games.

Specs:
Zen/Ryzen-based 8-core CPU @ 3.2 GHz
Navi/next-gen GPU architecture
Memory: 16 GB (fast RAM for games, slower RAM for applications)
Storage: 7200 rpm HDD (maybe with intelligent caching)

Also, the PS5 needs to be silent under full load
 
It's too early to release in 2019. Release in 2021.

The target for a new PS5, imo, should be 4K@60fps for all games.

Specs:
Zen/Ryzen-based 8-core CPU @ 3.2 GHz
Navi/next-gen GPU architecture
Memory: 16 GB (fast RAM for games, slower RAM for applications)
Storage: 7200 rpm HDD (maybe with intelligent caching)

Also, the PS5 needs to be silent under full load

I actually think 2019 is a good inflection point that both they and Microsoft are probably interested in.

7nm ships in bulk, with a few quarters of buffer to make sure yields are good
Navi
Zen+
Nextgen Memory

If they wait another two years, they do it for modest gains, while their main competitor could spring on 2019 and gain a market foothold first. Sure, Zen 2 will be a bit better, but will that matter if the competitor is over ten million units ahead? They won't have another die shrink by 2021, so the GPU would not be able to differentiate itself significantly.

The next big gain would be 5nm, which would mean waiting all the way to 2022, and I don't think it's worth that; we don't want a repeat of the 7th gen, where the consoles were aging incredibly poorly by the end and holding PC games back.
 
Last edited:
It's too early to release in 2019. Release in 2021.

The target for a new PS5, imo, should be 4K@60fps for all games.

Specs:
Zen/Ryzen-based 8-core CPU @ 3.2 GHz
Navi/next-gen GPU architecture
Memory: 16 GB (fast RAM for games, slower RAM for applications)
Storage: 7200 rpm HDD (maybe with intelligent caching)

Also, the PS5 needs to be silent under full load
You want to release in 2021 with those specs and expect 4K/60fps gaming...??
 
Don't really have any spec predictions; I just hope they release a beast in 2019-2020 for $399. That will force MS to keep their price down too. I personally wouldn't mind these systems hitting $499 if that created a much larger generational leap; I just worry about how the casual market would respond. Sony would certainly have an easier time than MS if their next "base" system released at that price. Kind of torn.
 
https://www.eurogamer.net/articles/digitalfoundry-2015-rise-of-the-tomb-raider-face-off Here's an example of a graphics-pushing 360 game... or Titanfall. Xbox has never invested in the type of games that you mentioned.

The 360 won nearly every face-off, often by a significant margin, usually far more than the PS4/Xbox One comparisons.

In reality we can be sure that none of those PS3 exclusives would have been significantly different on 360. Third-party developers such as EA and DICE often spent a lot of time using the SPUs, and they would reach parity at best. The games were often remastered and upgraded on PS4, so obviously it wasn't something amazing or unheard of; they converted the SPU code to GPU code.

That's kind of missing the point. The complexity of Cell meant cross-platform games would almost always look better on Xbox, because it wasn't worth the time to fully optimise for Cell. However, PS3 exclusives could be optimised for Cell, which resulted in standout graphical showcases.

You can't point to cross-platform games being better on Xbox as proof that PS3 exclusives would be better on Xbox, because that absolutely ignores the reasons why this was so.
 
Meanwhile Microsoft had to pay licence fees to Sony for using Blu-ray in their Xbox 360 console :D
360s don't have Blu-ray; each iteration used DVD.
The target for a new PS5, imo, should be 4K@60fps for all games.
Just like each new gen before it: even with 4K/60fps as a hypothetical ceiling, not all devs will aim for that. Even with the X and the Pro we've seen mostly 1080p-2160p titles at 30fps.

This is a developer choice, not necessarily a hardware limitation.
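To put a number on why devs make that choice, here is a quick back-of-envelope sketch; it uses nothing beyond the standard resolutions, and ignores per-pixel shading costs entirely.

```python
# Pixels pushed per second at each render target.
uhd_60 = 3840 * 2160 * 60   # 4K at 60 fps
fhd_30 = 1920 * 1080 * 30   # 1080p at 30 fps

# 4K/60 moves 8x the pixels of 1080p/30, before any increase
# in per-pixel work is even considered.
print(uhd_60 / fhd_30)  # -> 8.0
```

That 8x raw pixel throughput gap is why, given a fixed hardware budget, many studios spend it on effects at 30fps instead.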
 
https://www.eurogamer.net/articles/digitalfoundry-2015-rise-of-the-tomb-raider-face-off Here's an example of a graphics-pushing 360 game... or Titanfall. Xbox has never invested in the type of games that you mentioned.

The 360 won nearly every face-off, often by a significant margin, usually far more than the PS4/Xbox One comparisons.

In reality we can be sure that none of those PS3 exclusives would have been significantly different on 360. Third-party developers such as EA and DICE often spent a lot of time using the SPUs, and they would reach parity at best. The games were often remastered and upgraded on PS4, so obviously it wasn't something amazing or unheard of; they converted the SPU code to GPU code.
Why? They recently worked with Intel, another competitor on the same level as Nvidia but in different spaces.

https://www.pcgamer.com/intels-team-up-with-amd-produces-the-fastest-integrated-graphics-ever/
I think Intel was the more eager one to integrate the AMD GPU, and AMD was likely doing a big deal of the work on a multi-die, single-package solution that has a fraction of the benefits of an APU in terms of power consumption and performance (where the vendor designs the CPU, GPU, and interconnect and can change each to best suit the design).

Still, the problem would be getting nVIDIA to offer a semi-custom design at all (see how little they did on the Switch, which uses an essentially untouched Tegra chip), not AMD failing to do integration work on the CPU side. Especially so if they needed to build a single-die APU design with AMD, their rival.

I also do not see AMD dropping the ball on GPU specs and performance, and I do not see valid data on the PC side supporting that thesis (see DX12/Vulkan performance). They are also very willing to let Sony and MS have lots of input on the design and can contribute end to end, which saves plenty of time.
 
Most are interested in specs, and while I am too, my interest is in seeing Sony grow their usage of BSD, the network stack, and the CPU scheduler. There are various improvements that exist in BSD for both, and based on testing, they either don't use them or the implementation is bad.

So I'm putting it here, Sony: will this be another generation where not a single console can claim to have a debloated network stack? I don't expect MS or Nintendo to fix or address this, yet I see no reason why the biggest provider of third-party games is walking away from the subject.

The movement has proven itself, so Sony, will you be enabling this tech in your PS5?
DOCSIS 3.1 cable standards have AQM using an evolved form of PIE.
Any router with good firmware on Linux kernel 3.18 or higher has access to AQM that can use fq_codel or CAKE, which are infinitely better than traditional methods of managing your network queue.

More CPU cycles (you might have a very good CPU, but nothing will ever change the fact that devs will max out what they have)
Saves power
Better throughput
Better latency
Saves on packet overhead: bloated links waste 10-30% of all packets on useless, redundant, or stale data that should have been dropped

Please adopt it and debloat your online gaming networks; the advantages are exceptionally clear. I would love to play one console in the next ten years with this tech.

I will say it again: the first company of the big 3 to do this will reap rewards that the other two can't claim without actually implementing it themselves in some fashion.

Bufferbloat has been solved, and we will only get better at addressing it.
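For anyone wondering what the post is actually asking for, here is a minimal sketch of enabling modern AQM on a Linux box with the standard `tc` tool; the interface name `eth0` and the 18 Mbit uplink rate are placeholder assumptions you would replace with your own.

```shell
# Replace the default FIFO queue with fq_codel, which keeps
# per-flow queues short and drops stale packets early.
tc qdisc replace dev eth0 root fq_codel

# Or use CAKE, shaping slightly below the uplink rate so the
# bottleneck queue sits here, where the AQM can manage it.
tc qdisc replace dev eth0 root cake bandwidth 18mbit

# Inspect the active qdisc and its drop/latency statistics.
tc -s qdisc show dev eth0
```

These commands need root and the relevant qdisc modules (fq_codel has been in mainline since kernel 3.5, CAKE since 4.19); the point of the post is that console vendors could ship the equivalent in their own network stacks.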
 
Last edited:
That's kind of missing the point. The complexity of Cell meant cross-platform games would almost always look better on Xbox, because it wasn't worth the time to fully optimise for Cell. However, PS3 exclusives could be optimised for Cell, which resulted in standout graphical showcases.

You can't point to cross-platform games being better on Xbox as proof that PS3 exclusives would be better on Xbox, because that absolutely ignores the reasons why this was so.
The SPU code was put onto the GPU of the PS4 for the remasters, and they ran better. The Xbox 360 GPU was so capable that it wasn't far off current-gen games; therefore the Xbox 360 could have run the GPU-converted SPU code just fine.
 
The SPU code was put onto the GPU of the PS4 for the remasters, and they ran better. The Xbox 360 GPU was so capable that it wasn't far off current-gen games; therefore the Xbox 360 could have run the GPU-converted SPU code just fine.
I think you are overestimating the power of the Xenos GPU a tad here ;).
 
Last edited:
Why? They recently worked with Intel. Another competitor on the same level as NVIDIA but in different spaces.

https://www.pcgamer.com/intels-team-up-with-amd-produces-the-fastest-integrated-graphics-ever/

Jen-Hsun; the friction there is more on Nvidia's side than AMD's. Intel has a GPU gap from now until they come out with their own dedicated GPUs in another few years, and AMD gained a new revenue stream that doesn't really hurt Raven Ridge, so it was a win/win. Nvidia is a lot more cocky about working with others and retains more control over its own IP and chips.

The G package is also a multi-chip module, not an APU, which would need more collaboration to integrate. Right now it's like a dedicated GPU that just happens to share a package with the CPU.

The Switch chip looks identical to a stock Tegra X1; Nvidia might not be into putting lots of R&D into low-margin console revenue streams.



Ultimately it's easiest and most cost-effective to order different cuts of AMD APUs for your R&D bang for the buck (with enhancements like Sony adopting 8 ACEs early from a future product). I 80% expect both companies to stick with this for at least the 9th gen.

The dark horse would be Intel's dedicated GPU being out in time, so someone could pair their still-better CPUs with it, but it's unlikely; Intel likes its margins.
 
Last edited:
I think you are overestimating the power of the Xenos GPU a tad here ;).
I provided evidence and you didn't.
Which bit do you deny: that the 360 won almost all face-offs, that the PS4 remasters, where the SPU code of PS3 games was executed on the GPU, ran at better resolutions and frame rates, or that Tomb Raider on 360 isn't similar to the PS4/Xbox One versions?
Ooh, you had no specifics AND no evidence ;) 🔥
 
Last edited:
I provided evidence and you didn't.
Which bit do you deny: that the 360 won almost all face-offs, that the PS4 remasters, where the SPU code of PS3 games was executed on the GPU, ran at better resolutions and frame rates, or that Tomb Raider on 360 isn't similar to the PS4/Xbox One versions?
Ooh, you had no specifics AND no evidence ;) 🔥
Lol, you said that Xenos could run all the SPU code on it, no problem... I did not see much evidence of that ;). Then you proceed to talk about running that code on a 1.84 TFLOPS GPU that Sony customised to handle SPU-like code better, versus a CPU that pushed about 0.2 TFLOPS at best; even combined with its GPU, it was a far, far cry from the performance of the PS4 GPU.

You were doing that, btw, while talking about games that pushed the SPUs hard, like the Sony first-party games (which were key for Sony to basically evaporate about 1.5 years of lead time alone in the market).

You presented several pieces of info, some correct, some stretched, and some unsupported, and implied a connection between them.
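Those peak-throughput figures roughly check out with the standard peak-FLOPS formula (execution units x ops per cycle x clock). The unit counts and clocks below are commonly cited public specs, used here only as illustrative assumptions, not numbers from this thread.

```python
def peak_gflops(units, ops_per_cycle, clock_ghz):
    """Theoretical peak = execution units * ops/cycle * clock (GHz)."""
    return units * ops_per_cycle * clock_ghz

# PS4 GPU: 18 CUs x 64 lanes, FMA counted as 2 ops, ~0.8 GHz
ps4_gpu = peak_gflops(18 * 64, 2, 0.8)   # ~1843 GFLOPS (~1.84 TFLOPS)

# PS3 Cell SPUs: 6 game-available SPEs, 4-wide single-precision
# SIMD with FMA (8 ops per cycle), 3.2 GHz
cell_spus = peak_gflops(6, 4 * 2, 3.2)   # ~154 GFLOPS

print(ps4_gpu / cell_spus)  # roughly a 12x gap on paper
```

Even granting the post's more generous ~0.2 TFLOPS for the whole Cell, the PS4 GPU sits about an order of magnitude above it on paper.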
 
Last edited:
The SPU code was put onto the GPU of the PS4 for the remasters, and they ran better. The Xbox 360 GPU was so capable that it wasn't far off current-gen games; therefore the Xbox 360 could have run the GPU-converted SPU code just fine.
GPUs can't run all CPU code, and there is latency to consider too.
I said that no 360 first-party game tried to push the console to the limit; I really don't see what talking about PS4 remasters and multiplatform face-offs has to do with it.
 
Yeah, this is all just rumors, but this is the first I have heard of Sony ditching AMD for Nvidia for the PS5.

The PS3 had Nvidia GPU tech in it, so it isn't without precedent. I would just like to hear more if you have any information.
I have none, but I don't think it is an absolute MUST that Sony go with AMD-derivative products. Nvidia has an even more impressive repertoire of experience designing custom silicon than AMD does. For example, Nvidia is currently valued at ~$224 billion and AMD at ~$9 billion. They (Nvidia) have a vast array of experience designing silicon for their customers, as proven by those numbers and the fact that the respective companies' value graphs look like this: / for Nvidia (increasing) and this: \ for AMD (decreasing).

Nvidia doesn't just make GeForce GTX cards for PCs and that's it. Obviously, you don't get to be a company of Nvidia's size by being selectively picky and choosy about who you work with. They are on a particular high right now due to bitcoin miners voraciously snagging up all their cards at inflated prices. But I think everyone agrees that a PS5 won't be releasing anytime soon anyway. Speaking of the latest releases, Nintendo and their Switch console are using Nvidia parts, perhaps a sign of things to come.

I would just like to remind everyone that an actual objective reality exists outside of the neogaf.com forums.
 
360s don't have Blu-ray; each iteration used DVD.

Just like each new gen before it: even with 4K/60fps as a hypothetical ceiling, not all devs will aim for that. Even with the X and the Pro we've seen mostly 1080p-2160p titles at 30fps.

This is a developer choice, not necessarily a hardware limitation.
Pardon me, I meant the Xbox One.
 
GPUs can't run all CPU code, and there is latency to consider too.
I said that no 360 first-party game tried to push the console to the limit; I really don't see what talking about PS4 remasters and multiplatform face-offs has to do with it.
The PS3 was nothing special; that's why they could convert the games to the PS4, with its Jaguar CPU and mid-level GPU, and still raise the resolution and frame rate. To say the PS3's Cell does things impossible any other way is ridiculous and has been proven wrong time and time again.

https://www.eurogamer.net/articles/digitalfoundry-2015-the-challenge-of-remastering-uncharted
 

I have none, but I don't think it is an absolute MUST that Sony go with AMD-derivative products. Nvidia has an even more impressive repertoire of experience designing custom silicon than AMD does. For example, Nvidia is currently valued at ~$224 billion and AMD at ~$9 billion. They (Nvidia) have a vast array of experience designing silicon for their customers, as proven by those numbers and the fact that the respective companies' value graphs look like this: / for Nvidia (increasing) and this: \ for AMD (decreasing).

Nvidia doesn't just make GeForce GTX cards for PCs and that's it. Obviously, you don't get to be a company of Nvidia's size by being selectively picky and choosy about who you work with. They are on a particular high right now due to bitcoin miners voraciously snagging up all their cards at inflated prices. But I think everyone agrees that a PS5 won't be releasing anytime soon anyway. Speaking of the latest releases, Nintendo and their Switch console are using Nvidia parts, perhaps a sign of things to come.

I would just like to remind everyone that an actual objective reality exists outside of the neogaf.com forums.


nVidia has been larger and richer than AMD since the beginning.

That hasn't stopped AMD from developing console hardware for:

The GameCube
The Wii
The Wii U
The Xbox 360
The Xbox One
The Xbox One X
The PS4
The PS4 Pro

There is no guarantee nVidia cannot win the PS5 contract, but having more resources hasn't gotten them many console design wins in the past. They have 3; AMD has 8.
 
nVidia has been larger and richer than AMD since the beginning.

That hasn't stopped AMD from developing console hardware for:

The GameCube
The Wii
The Wii U
The Xbox 360
The Xbox One
The Xbox One X
The PS4
The PS4 Pro

There is no guarantee nVidia cannot win the PS5 contract, but having more resources hasn't gotten them many console design wins in the past. They have 3; AMD has 8.
Yes, because corporations typically have dollar signs for eyeballs. ATI could offer them a cheaper deal, so that is typically the offer that is preferred. Technological leadership belongs to Nvidia, not to AMD; if you want proof, just look at the PC market: Nvidia leads, AMD follows like a three-legged puppy.

Still, that doesn't mean Nvidia will 100% win the contract for the PS5; remember what I said about dollar signs for eyeballs. I just want the best graphics I can have, so I want the leaders in that field in all of my consoles as well as my PCs. I'm heartened to see that Nvidia won the contract for the Nintendo Switch; let's hope they can continue that streak with Sony (crosses fingers!).
 
The PS3 was nothing special; that's why they could convert the games to the PS4, with its Jaguar CPU and mid-level GPU, and still raise the resolution and frame rate. To say the PS3's Cell does things impossible any other way is ridiculous and has been proven wrong time and time again.

https://www.eurogamer.net/articles/digitalfoundry-2015-the-challenge-of-remastering-uncharted
That still has nothing to do with what I said... I never said that the PS3 was special. I said that the 360 was never used to the max, and the later first-party PS3 games looked prettier than the later first-party 360 games, because MS devs never took the time, never had the time or enough funding, or simply didn't bother.
PS4 remasters being hard or easy has nothing to do with what happened last gen. If there is a first-party game on the 360 that looks better than God of War: Ascension, TLOU, or Uncharted 3, then please say which, because that is what I'm asking.
Stop rambling about unrelated things.
 
Last edited:
I have none, but I don't think it is an absolute MUST that Sony go with AMD-derivative products. Nvidia has an even more impressive repertoire of experience designing custom silicon than AMD does. For example, Nvidia is currently valued at ~$224 billion and AMD at ~$9 billion. They (Nvidia) have a vast array of experience designing silicon for their customers, as proven by those numbers and the fact that the respective companies' value graphs look like this: / for Nvidia (increasing) and this: \ for AMD (decreasing).

Nvidia doesn't just make GeForce GTX cards for PCs and that's it. Obviously, you don't get to be a company of Nvidia's size by being selectively picky and choosy about who you work with.

I would just like to remind everyone that an actual objective reality exists outside of the neogaf.com forums.


Odd reasoning. In my experience, the larger a company is, the more comfortable it is turning down low-margin deals, and consoles are low-margin by Nvidia's standards. The shrinking or small company is the one that will bend over backwards to please partners with custom solutions that eat development cost for low margins. I mean, have you worked for both a large and a small company?

Heck, let's get away from the theoreticals: Nvidia very much likes making turnkey solutions with finalized hardware, while AMD made semi-custom a big part of its business model. They're large because they make good hardware, and they're getting a lot of investor money for deep learning and automotive right now, but they're not selling as much custom hardware as you theorize. Your example isn't an example; a stock valuation doesn't prove your point. The Switch TX1 is literally a TX1, see the gif above.

Semi-custom is literally part of AMD's business model; they have to do it for survival. Nvidia will do it, but there would have to be big margins in it for them.
https://www.amd.com/en-us/solutions/semi-custom

This isn't to say that's the only reason we think they're going AMD; rather, it's the closely compatible APU solution, staying on x86 (which Nvidia can't do without a miracle collaboration with Intel), and the leaks from people who have broken previous hardware before, pointing to AMD in super_secret GPU code. Nvidia would mean ARM, which would mean a clean break, and there are currently not many ARM cores outside of Apple and high-end server makers that compete with Zen.

This isn't the NeoGAF reality... it's just what most evidence points to so far. If it's Nvidia, colour me surprised, as I haven't seen any proof of it outside of an odd argument from its stock value and NeoGAF maybe not being right (and it's not us pointing to this).
 
Last edited:
Odd reasoning. In my experience, the larger a company is, the more comfortable it is turning down low-margin deals, and consoles are low-margin by Nvidia's standards. The shrinking or small company is the one that will bend over backwards to please partners with custom solutions that eat development cost for low margins. I mean, have you worked for both a large and a small company?

Heck, let's get away from the theoreticals: Nvidia very much likes making turnkey solutions with finalized hardware, while AMD made semi-custom a big part of its business model. They're large because they make good hardware, and they're getting a lot of investor money for deep learning and automotive right now, but they're not selling as much custom hardware as you theorize. Your example isn't an example; a stock valuation doesn't prove your point. The Switch TX1 is literally a TX1, see the gif above.

Semi-custom is literally part of AMD's business model; they have to do it for survival. Nvidia will do it, but there would have to be big margins in it for them.
https://www.amd.com/en-us/solutions/semi-custom

This isn't to say that's the only reason we think they're going AMD; rather, it's the closely compatible APU solution, staying on x86 (which Nvidia can't do without a miracle collaboration with Intel), and the leaks from people who have broken previous hardware before, pointing to AMD in super_secret GPU code. Nvidia would mean ARM, which would mean a clean break, and there are currently not many ARM cores outside of Apple and high-end server makers that compete with Zen.

This isn't the NeoGAF reality... it's just what most evidence points to so far. If it's Nvidia, colour me surprised, as I haven't seen any proof of it outside of an odd argument from its stock value and NeoGAF maybe not being right (and it's not us pointing to this).
Consoles may be "low margin by Nvidia standards", but they just provided the GPU for the Nintendo Switch, so I guess it's not too low margin after all? And yes, I have worked for both big and small companies (I did IT work for a large banking company called USA Payday and was the projectionist for small theater companies called Starplex and AMC, just to give some examples). Going with Nvidia doesn't preclude some form of backward compatibility; that is just a matter of software, which Nvidia can handle without problem. x86 compatibility may have been an issue back in the day, but it is nothing new, and it has been achieved in hardware by many companies: Transmeta, Rise Technology, SiS, DM&P, IDT, Cyrix, National Semiconductor, NexGen, Chips and Technologies, UMC, NEC; even IBM has processors with x86 compatibility. Although Nvidia is most widely known for their GPUs, they had a legal dispute with Intel back in the day that ended with Intel paying Nvidia 1.5 billion dollars.

An "APU" is just a marketing term used by AMD for their microprocessors with an integrated GPU. It is not a patent owned by AMD; anyone can produce an "APU", although they would have to come up with their own snazzy marketing term for it. AMD previously referred to this type of processor as "Fusion". I propose "Zoomy zoom-zoom whoosh!", although they can call it anything they want.

Just to be clear, I'm not bashing Neogaf; I have been a member of this forum for years :)
 
I don't think going with Nvidia is realistic; it essentially locks them into an ultra-cheap processor.
The technical requirements for a system like the Switch were completely different from a home console's.

The Switch needed a high-performance, low-power solution to run on battery, hence ARM processors are perfect for the task. The CPU runs at around 1GHz, so it's still not great compared to something we might see in a PS5, and I'd argue below even the original Xbox One in performance.

In order for a home console to get the best price and performance, an APU is basically a necessity, and currently the only company capable of delivering such a solution at a reasonable price really is AMD.

If you think about it, they could throw a modified 2400G in there and be competitive now. Its graphics use the same Vega architecture as a Vega 64; they could just add more compute units and pair it with GDDR6 to have a crazy fast, crazy cheap system.
 

ISA translation performant enough for extremely time-sensitive console rendering budgets, where many titles make use of every last millisecond in a 16/33ms frame... If this were feasible without Intel's legal team gunning for you, why wouldn't ARM chips be running full x86 code everywhere already? Microsoft's own attempt only manages 32-bit x86 on ARMv8, not the 64-bit x86 the PS4 uses. Again, I'm talking probabilities here, and the only evidence I've seen for Nvidia so far is "it's not technically impossible".
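To make the translation-cost argument concrete, here's a toy sketch of the idea behind dynamic binary translation: the first time a guest basic block runs, you pay the translation cost; every run after that hits a code cache and dispatches near-natively. This is a deliberately simplified model with invented opcodes, not real x86 and not how any shipping translator is implemented; the point is only that cold translations are the expensive part, and in a console they still have to fit inside a 16.7ms or 33.3ms frame.

```python
# Toy model of a dynamic binary translator's code cache.
# "Guest" basic blocks are translated once into host callables,
# then reused on every subsequent execution.

def translate_block(block):
    """Pretend-translate a list of (opcode, arg) guest instructions
    into a single host callable. This is the expensive, cold step."""
    ops = {
        "ADD": lambda state, n: state + n,
        "MUL": lambda state, n: state * n,
    }
    steps = [(ops[op], arg) for op, arg in block]

    def compiled(state):
        for fn, arg in steps:   # warm path: plain host-level dispatch
            state = fn(state, arg)
        return state
    return compiled

class ToyDBT:
    def __init__(self):
        self.cache = {}         # block id -> compiled host code
        self.translations = 0   # count of cold translations

    def run(self, block_id, block, state):
        if block_id not in self.cache:          # cold: pay translation cost
            self.cache[block_id] = translate_block(block)
            self.translations += 1
        return self.cache[block_id](state)      # warm: cached code only

dbt = ToyDBT()
block = [("ADD", 3), ("MUL", 2)]
state = 0
for _ in range(1000):           # hot loop: translated once, executed 1000x
    state = dbt.run("loop_body", block, state)
print(dbt.translations)         # 1
```

The cache is why steady-state translated code can be fast; the frame-budget problem in the post above is about the cold path, where a burst of never-before-seen blocks lands mid-frame and the translation cost can't be amortized in time.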

Intel Legal is also breathing down the neck of everyone who tries:
https://www.theregister.co.uk/2017/06/09/intel_sends_arm_a_shot_across_bow/


You also know that the outcome of the very lawsuit you mention was "don't use x86", right?
https://www.anandtech.com/show/4122/intel-settles-with-nvidia-more-money-fewer-problems-no-x86

If one chip were unabashedly better than the other, sure, but again, there aren't really any Zen-scale ARM cores, and a custom core that large and wide is another whack of R&D again. Funnily enough, Apple would be suited to sell in that space, but they have no interest in it.

I talked about the Switch already. It may be low margin, but they also didn't customize the hardware, seemingly at all: it's a Tegra X1 at tweaked clocks and voltages, with an adapted API and library built for it. Heck, the RSX wasn't even all that customized: half the ROPs and memory bus, FlexIO, and some larger caches to compensate. Low effort for low margins, sure, Nvidia bites; but AMD does extensive customization for those console margins.

You also miss my point on the APU... I don't care what they call it either, but Intel's Kaby Lake G package is a Radeon fabbed in one place and an Intel Core fabbed in another, on one package. An APU by any name would be a single chip, all fabbed in the same foundry, which would need a lot more budget to port one part or the other over to another foundry's transistor libraries, plus testing, validation, etc. This was brought up regarding the possibility of a joint venture with Intel for x86 cores.


So we have: IF Nvidia wants to spend the development cost, and IF they build a worthy ARM core, and IF they have binary translation that works well, OR they partner with another unlikely console partner, and IF and IF and IF...

Versus: AMD has ready-made packages, suspicious GPU source code lines, previous leakers backing the idea, and would broadly be compatible with minimal retooling.

I'll grant it's not impossible; there's just nothing backing it, and it makes the least sense.
 
Yes, because corporations typically have dollar signs for eyeballs. AMD can offer them a cheaper deal, so that is typically the offer that is preferred. Technological leadership belongs to Nvidia, not to AMD; if you want proof, just look at the PC market: Nvidia leads, AMD follows like a three-legged puppy.

Still, that doesn't mean Nvidia will 100% win the contract for the PS5; remember what I said about dollar signs for eyeballs. I just want the best graphics I can have, so I want the leaders in that field in all of my consoles as well as my PCs. I'm heartened to see that Nvidia won the contract for the Nintendo Switch; let's hope they can continue that streak with Sony (crosses fingers!)
Before I get into it, I would just like to point something out: one moment you state you would like nVidia to win console hardware contracts because they can deliver you the best graphics possible, but then in the next sentence you use the GPU in the Switch to drive the point home?

The Switch is a great console with great games but the Tegra X1 isn't blowing my mind in terms of graphics by any means.

All that does is underscore that the needs of the console manufacturer (in this case, Nintendo) are the most important element shaping the design of a console.


For sure, just about everyone agrees nVidia has the technology edge when it comes to designing PC desktop GPUs...

Where the stumbling block starts is when the other elements of console design come into play, like an X86 CPU for backwards compatibility and game development migration, for instance.

Nintendo doesn't have a ton of customers with massive digital game libraries the way Sony does. That being the case, a backwards-compatible ecosystem whereby customers could seamlessly "bring forward" their expensive accumulated digital purchases to a theoretical PlayStation 5 might be of more interest to Sony than to Nintendo, for example.

Further complicating things is rival Xbox, which is more aggressive than anyone at delivering a seamless backwards-compatible experience to its customers, who are probably worrying the least about their digital purchases at the moment.

If you think this isn't a big deal, just ask a Nintendo fan how happy they feel about re-buying some of the same Virtual Console games they already own once a new console like the Super NES Classic arrives!

So although nVidia is capable of designing the best, most powerful GPU on the market, I still expect Sony to go AMD next generation. What AMD brings to the table seems to be more in sync with the needs of a customer like Sony. And don't forget, AMD can also bring some heat to the graphics table when their customers demand it. The Xbox One X is pretty damn good gaming hardware for $500 in my eyes.

Can nVidia win the PS5 contract? Of course!

That said, I believe there would be a much more complicated path toward that design, since they would somehow also have to provide a CPU competitive with a forthcoming Xbox One successor. Would Sony really be willing to potentially cede the CPU power and backward compatibility crowns to rival Microsoft just to have the prettiest graphics?

I really doubt that and I hope they are not that dumb.