
AMD's Lisa Su confirms next-gen PlayStation to have Zen 2 and Navi

Delicious news.

But:

The drive found in the current dev kit cuts load times considerably, and Cerny confirmed that it has raw bandwidth higher than any SSD available.
To demonstrate, Cerny fires up a PS4 Pro playing Spider-Man, a 2018 PS4 exclusive that he worked on alongside Insomniac Games. On the TV, Spidey stands in a small plaza. Cerny presses a button on the controller, initiating a fast-travel interstitial screen. When Spidey reappears in a totally different spot in Manhattan, 15 seconds have elapsed. Then Cerny does the same thing on a next-gen devkit connected to a different TV. What took 15 seconds now takes less than one: 0.8 seconds, to be exact.

How is that even possible on a console? WTF? I guess we can expect an SSD in the final console after all.
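For a rough sense of scale, here's a back-of-envelope sketch; the asset size and drive speeds below are made-up placeholders (the article only gives the 15 s and 0.8 s figures), but they show how raw read bandwidth alone can collapse a load screen like that:

```python
# Rough sketch of how raw read bandwidth maps to load time. Every number here
# is a hypothetical placeholder; the article only gives the 15 s and 0.8 s
# figures, not data sizes or drive speeds.

assets_gb = 4.0      # hypothetical data pulled in by one fast-travel load
hdd_mb_s = 80        # ballpark effective read speed of a stock PS4 HDD
ssd_mb_s = 5000      # ballpark raw bandwidth of a fast NVMe-class SSD

hdd_seconds = assets_gb * 1024 / hdd_mb_s   # ~51 s of pure I/O
ssd_seconds = assets_gb * 1024 / ssd_mb_s   # ~0.8 s of pure I/O

print(f"HDD: ~{hdd_seconds:.0f} s just reading the data")
print(f"SSD: ~{ssd_seconds:.1f} s just reading the data")
# Decompression, CPU work and level setup eat into both in practice, which is
# one reason the real PS4 demo takes 15 s rather than 50+ s.
```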
 
They'll have the same reason they always have, spending the horsepower budget on something else. What you need to understand about software is that it grows to fit its environment. If you give a developer a 5-pound bag he'll try to put 10 pounds of shit into it. Giving him a 10-pound bag isn't going to help, he'll just try to put 20 pounds of shit into it.

It isn't a matter of hardware holding people back; it's a matter of them having different priorities than you do.
If they go for a 30 fps blurry mode, then the Zen 2 CPU will be idling in most games while the GPU is maxed out, apart from those few huge open-world games with nice physics.
 

Shin

Banned
The fact that they said 8K graphics leads me to believe 4K/60 might become mandatory starting next-gen,
because the CPUs run circles around Jaguar and games have been shifting more and more towards the GPU side of things.
That's just the impression I'm getting, and the specs and hardware they've given a glimpse of so far are pretty convincing.
 

Armorian

Banned
The fact that they said 8K graphics leads me to believe 4K/60 might become mandatory starting next-gen,
because the CPUs run circles around Jaguar and games have been shifting more and more towards the GPU side of things.
That's just the impression I'm getting, and the specs and hardware they've given a glimpse of so far are pretty convincing.

1080p or 1440p/60 maybe; I don't expect many 4K/60 games. This 8K talk is just marketing BS, there won't be any native 8K games that look better than PS2 titles graphically.
 

Shin

Banned
1080p or 1440p/60 maybe; I don't expect many 4K/60 games. This 8K talk is just marketing BS, there won't be any native 8K games that look better than PS2 titles graphically.
That's OK, I'll eat their marketing bullshit until we can prove it's just that; till then the dream lives on.
 

Armorian

Banned
That's OK, I'll eat their marketing bullshit until we can prove it's just that; till then the dream lives on.

I don't think 8K will get major appeal (unless producers start putting 8K screens in everything, including microwaves); most people will probably have a hard time seeing any difference on normal-sized (~50") TVs.
 

Pagusas

Elden Member
I’d love to give up PC gaming if 4K/60 became standard (I’d even settle for checkerboarding). But I still don’t think it’s going to happen. 4K/30 with crazy effects will be too tempting to devs.
 

Pagusas

Elden Member
They'll have the same reason they always have, spending the horsepower budget on something else. What you need to understand about software is that it grows to fit its environment. If you give a developer a 5-pound bag he'll try to put 10 pounds of shit into it. Giving him a 10-pound bag isn't going to help, he'll just try to put 20 pounds of shit into it.

It isn't a matter of hardware holding people back; it's a matter of them having different priorities than you do.

Especially with ray tracing about to become the big graphical buzzword. We're about to be overwhelmed with 30 fps games utilizing it.
 

dolabla

Member


Invites are out


 
Not during, but a good 10 minutes right before the MS conference starts. Would be bold, but a bit of a dick move.
Wouldn't that conflict with the press that needs to be at both events? Some Xbox and PlayStation fans, such as myself, like to watch both shows, so neither would have full attention if Sony went that route. While it would be funny to see, it wouldn't be wise for Sony to do this: all eyes wouldn't be on them, and they'd get better results from their own event at their own separate time.
 

GoldenEye98

posts news as their odd job
So we can expect more articles and publications since he’s not from Wired.

It's possible that Sony invited some people from the industry to this meeting and just gave Wired the exclusive scoop. Not familiar with who the guy tweeting is...but he is from Walmart Canada according to his bio. So I'm not really expecting much in the way of articles from them.
 

DeepEnigma

Gold Member
It's possible that Sony invited some people from the industry to this meeting and just gave Wired the exclusive scoop. Not familiar with who the guy tweeting is...but he is from Walmart Canada according to his bio. So I'm not really expecting much in the way of articles from them.

Not saying him specifically, but it does make sense having people from the retail chains there as well.

Time will tell I guess.
 

Shin

Banned
Pretty sure this is just referring to whatever secret media event that was already held (a la Wired article). Original tweet is from Feb 27th, not today.
That was about PSVR or something along those lines, wasn't it?
I think some French guy was teasing it as PS4 stuff back then too, but his wristband read 'PlayStation PR event' or something like that.
 

Shin

Banned
8K is a waste of resources. You would need an 80-inch TV to see the difference between 4K and 8K. I have a 65-inch 4K TV and I won't be able to fit a bigger one on my wall. Maybe I need to move to a bigger home?
I don't care about 8K, but I do care that they're willing to push towards it, because it will likely get me to 4K/60/HDR faster.
More so than when they were only going for a 4K-ish console; I want all the eye candy, but in a console.
 

PocoJoe

Banned
10 / 12 / 14 AMD TFLOPS doesn't matter, it's all relatively weak for next gen anyway. If they honestly wanted to push things further they should have moved more towards Nvidia at this point and got their future flagship card rammed into a bigger box.

If this thing launches at the end of next year, the GPU performance is already going to be four years old by that time.



Just throw an SSD in your PlayStation right now and have the same experience.

Lol, don't know if you are serious or trolling...

There aren't AMD flops and Nvidia flops; a flop is a flop. Only people with limited knowledge repeat this year after year.

You can't compare PC to console directly, as drivers, optimization and partnerships affect how games run on PC with AMD/Nvidia.

On console, flops are used "100%", so I bet that with 10 TFLOPS of Nvidia there would be no difference vs 10 TFLOPS of AMD.

If this were a thing, the weak-ass Switch would be stronger than it is in reality.

And good luck having a future flagship card in a console; that would make it way too expensive.

I don't even know why I bother to answer this bullshit, I guess I am bored.
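For reference, the headline TFLOPS figure is just arithmetic: ALU count times clock times two FLOPs per clock (one fused multiply-add). A minimal sketch using the published PS4 Pro and Xbox One X numbers, purely as an illustration of the formula:

```python
# The headline TFLOPS number is just arithmetic: shader (ALU) count x 2 FLOPs
# per clock (one fused multiply-add) x clock speed. Figures below are the
# published PS4 Pro and Xbox One X specs, used only to illustrate the formula.

def fp32_tflops(shader_count: int, clock_ghz: float) -> float:
    """Theoretical FP32 throughput in TFLOPS (2 FLOPs per ALU per clock)."""
    return shader_count * 2 * clock_ghz / 1000.0

print(f"PS4 Pro:    {fp32_tflops(2304, 0.911):.1f} TFLOPS")   # ~4.2
print(f"Xbox One X: {fp32_tflops(2560, 1.172):.1f} TFLOPS")   # ~6.0
```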
 

Panajev2001a

GAF's Pleasant Genius
It's good knowing this. Now we have to see if Navi has some juice. The RT support is interesting too; likely a GCN-based variant and not an RTX equivalent, though. But who knows.

Go back a few years and ask yourself “would you like HW accelerated vertex shaders or many more unified shaders ;)?”... :).
 

Shin

Banned
Since we now know for a fact which CPU it will be, we can discuss the possibilities, though keep in mind the list below is unofficial desktop data.
I believe the CPU alone will draw around 15 to 25 W at best (about the same as Jaguar IIRC, but with 16 threads; the wattage is from a very old statement AMD gave about Ryzen's future).
There isn't much to go on in terms of the GPU, as AMD has been very silent.
https://www.extremetech.com/computing/289299-amd-zen-2-architecture-ryzen-3-cpu-improvements




Performance of mobile Zen 2 should be on par with first-gen desktop Ryzen 1600/1600X parts.



 
AMD > Nvidia
I wonder how you came to that conclusion?

We don't know yet how Navi will fare against Turing, but what we do know is how AMD's current Vega fares against Nvidia's Turing.

For example, a 6.5 TF RTX 2060 performs around the same as a 12.7 TF Vega 64.

6.5 TF (2060) ≈ 12.7 TF (V64). You get the picture.
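To put that comparison in code form, a minimal sketch; the TFLOPS figures are the ones quoted above, and "performs around the same" is the claim being made in this thread, not a measured benchmark:

```python
# Back-of-envelope version of the comparison above. The TFLOPS figures are
# the ones quoted in this thread; "performs around the same" is the poster's
# claim, not a measured benchmark.

rtx_2060_tflops = 6.5    # FP32 TFLOPS, RTX 2060 (Turing)
vega_64_tflops = 12.7    # FP32 TFLOPS, Vega 64 (GCN)

# If both cards deliver roughly the same frame rates, the implied
# game-performance-per-TFLOP ratio between the architectures is:
ratio = vega_64_tflops / rtx_2060_tflops
print(f"Turing delivers ~{ratio:.1f}x the game performance per paper TFLOP of Vega")
# ~2.0x, i.e. a paper TFLOPS figure alone doesn't predict game performance
# across different architectures.
```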
 

GenericUser

Member
Isn't Sony revealing a little too much too soon? They must be very confident that Microsoft is unable to change their plans for next gen by now.
 

Kenpachii

Member
Lol, don't know if you are serious or trolling...

There aren't AMD flops and Nvidia flops; a flop is a flop. Only people with limited knowledge repeat this year after year.

You can't compare PC to console directly, as drivers, optimization and partnerships affect how games run on PC with AMD/Nvidia.

On console, flops are used "100%", so I bet that with 10 TFLOPS of Nvidia there would be no difference vs 10 TFLOPS of AMD.

If this were a thing, the weak-ass Switch would be stronger than it is in reality.

And good luck having a future flagship card in a console; that would make it way too expensive.

I don't even know why I bother to answer this bullshit, I guess I am bored.

Sadly that bullshit is the reality even if you don't like it. AMD and Nvidia TFLOPS are not the same because they are different architectures. Anybody who told you they are is either pushing an agenda or is a fanboy with an agenda.

AMD drivers on PC have a lot of CPU overhead for multiple reasons. However, if you've got a solid CPU and are not going for maximum framerates, an AMD GPU will suffice.

Even budget users should avoid AMD GPUs at all costs, especially in CPU-heavy games and especially at lower, less GPU-bound resolutions.

For example on PC, a 10 TFLOPS AMD GPU solution can give you lower minimum framerates than even a 3 TFLOPS Nvidia GPU because of driver overhead, but that doesn't affect maximums.
Example:
The Witcher 3:
i7-870
GTX 970 vs 2x R9 290X OC'd (3.5+ vs 11+ TFLOPS)

1080p
Minimums in cities: 61 fps on the GTX 970
Minimums in cities: 45 fps on the 2x 290X

Reason: pure driver overhead that jams the CPU.
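To make that concrete, here's a toy frame-time model (all numbers invented, purely illustrative): a frame can't finish faster than the slower of the CPU side and the GPU side, and driver overhead only adds to the CPU side.

```python
# Toy model of why driver overhead hits minimum fps at 1080p but mostly
# disappears at 4K. All numbers are invented for illustration.

def fps(cpu_ms: float, gpu_ms: float, driver_overhead_ms: float) -> float:
    """A frame is gated by whichever side is slower: CPU (game + driver) or GPU."""
    frame_ms = max(cpu_ms + driver_overhead_ms, gpu_ms)
    return 1000.0 / frame_ms

game_cpu_ms = 10.0                     # game logic cost per frame (hypothetical)
gpu_ms = {"1080p": 8.0, "4K": 30.0}    # GPU render cost per frame (hypothetical)

for overhead, label in [(2.0, "thin driver"), (8.0, "heavy driver")]:
    for res in ("1080p", "4K"):
        print(f"{label:12s} @ {res}: {fps(game_cpu_ms, gpu_ms[res], overhead):5.1f} fps")
# At 1080p the heavy driver drops ~83 fps to ~56 fps; at 4K both land at ~33 fps
# because the GPU is the bottleneck either way.
```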

However, the benchmarks they push and test GPUs with mostly pair them with top-of-the-line CPUs (where bottlenecks are no longer a thing), which gives you a solid idea of what an AMD card can do and what an Nvidia card can output without much driver interference. Both will always output less than a console of equal specs, because console games get designed specifically for that hardware, but that doesn't really matter in an Nvidia vs AMD comparison, as it applies to both at that point.

The CPU becomes far less of a factor as resolution goes up. Guess why resolution was the main selling point for the Pro and the X: it's not like they could opt for 144 Hz gaming, as the CPU would simply not allow it, especially not with an AMD GPU, while somebody with a 6 TFLOPS Nvidia GPU sits at 1080p playing Fortnite at 200+ fps.

That's why you see a massive difference between AMD and Nvidia going from 1080p to 4K.
That's why anybody who aims for the highest possible Hz/fps at 1080p will never opt for an AMD card to start with (there are more reasons).

Now that I've schooled you a little bit on PC hardware, you start to realize how things probably work. You can fairly compare an AMD card and an Nvidia card on PC by pushing the resolution up and seeing what they output.

And frankly, like the guy above already mentioned, a 2060 with half the TFLOPS of Vega outputs about the same performance at 4K. You will realize real quick how those TFLOPS are not even remotely the same.

Obviously you think somebody is trolling or bullshitting when you've been indoctrinated by forums of so-called "tech experts" who do nothing but copy-paste other people's junk all day long without thinking critically themselves.

Same as all those incredible tech sites that, for example, started advising people on low-end GPUs while testing them on high-end CPUs, which results in completely different performance for people who actually pair those cards with low-end CPUs because, shocker, they don't have $3k systems to power their $100 GPU.

I can go on and on about this all day long but will quit here.

Then about your Switch: what does the Switch have to do with anything I said?

And about flagship cards in consoles: maybe you should google the original Xbox and the Xbox 360.

Then on the financial level, with so much competition in gaming it could make sense to take another hit on hardware just to push out of the gate ahead of the competition, which they are getting more and more of. Or even push a premium model with it.

That's why I said 10-14 AMD TFLOPS isn't that impressive to begin with. It's not going to make much of a difference; the thing is going to be massively crippled by the GPU anyway.

Honestly, if I were Microsoft I would switch to Nvidia for the next flagship model.
 

DanielsM

Banned
8K is a waste of resources. You would need an 80-inch TV to see the difference between 4K and 8K. I have a 65-inch 4K TV and I won't be able to fit a bigger one on my wall. Maybe I need to move to a bigger home?

Bigger than that; you'll probably need a front projector at about 120 inches, and to sit very close. For 99+% of the population 4K is going to be plenty, as very few people use front projectors or VR. Developers are going to want to focus on other things like framerate, maybe RT, etc. It's baby steps from here on out in the graphics department, generally speaking.
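A quick sketch of the viewing-distance math behind that, using the usual ~1 arcminute (20/20 acuity) rule of thumb; real perception varies, so treat the numbers as ballpark:

```python
import math

# Sanity check on the "you need a huge screen / front projector for 8K" claim.
# Uses the common ~1 arcminute (20/20 acuity) rule of thumb, so treat the
# output as ballpark, not gospel.

def acuity_distance_ft(diagonal_in: float, h_pixels: int) -> float:
    """Viewing distance (feet) at which one pixel of a 16:9 panel subtends
    1 arcminute; farther than this, extra resolution is invisible."""
    width_in = diagonal_in * 16 / math.sqrt(16**2 + 9**2)
    pixel_in = width_in / h_pixels
    return pixel_in / math.tan(math.radians(1 / 60)) / 12

for size in (55, 65, 120):
    d4k = acuity_distance_ft(size, 3840)
    d8k = acuity_distance_ft(size, 7680)
    print(f'{size}": 4K maxes out at ~{d4k:.1f} ft, full 8K benefit needs ~{d8k:.1f} ft or closer')
# 65": 4K maxes out at ~4.2 ft, full 8K benefit needs ~2.1 ft or closer
```

On a 65-inch panel you'd have to sit within roughly two feet before 8K's extra detail fully resolves, which is why it only really pays off on very large screens, front projectors or VR.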
 
I just wanted to emphasize a few things:

1) Although the GPU is the bottleneck when it comes to graphics performance, the jump from the Jaguar CPU to 7nm Ryzen is significant and huge due to the microarchitecture change. Also, the Jaguar CPUs in the Xbox One and PS4 had hUMA partially implemented, but not fully; with the Ryzen CPU I'm sure they have made significant advancements to hUMA as well. I don't think we are only going to see mild-to-moderate improvements over the previous generation, nor are we going to see something merely '4K PC-ish'. The CPU is not being used for Photoshop, the Microsoft Office suite, or other desktop/laptop applications; it's customized to assist the GPU in displaying the best graphics possible. It's not just about hitting 60 frames per second, but also about more lifelike, fluid, cinematic character movement.
2) Speaking of the GPU, yes it will be significantly better and it has ray tracing, but it will do so with a combination of software and hardware that won't impact GPU and CPU utilization too much (my prediction).
3) I hope they build Wi-Fi 6 and 5G into the system, along with USB 3.2 Gen 2x2.

Can't wait to see what Xbox Scarlett and PS5 can do!
 
Sadly that bullshit is the reality even if you don't like it. AMD and Nvidia TFLOPS are not the same because they are different architectures. Anybody who told you they are is either pushing an agenda or is a fanboy with an agenda.

AMD drivers on PC have a lot of CPU overhead for multiple reasons. However, if you've got a solid CPU and are not going for maximum framerates, an AMD GPU will suffice.

Even budget users should avoid AMD GPUs at all costs, especially in CPU-heavy games and especially at lower, less GPU-bound resolutions.

For example on PC, a 10 TFLOPS AMD GPU solution can give you lower minimum framerates than even a 3 TFLOPS Nvidia GPU because of driver overhead, but that doesn't affect maximums.
Example:
The Witcher 3:
i7-870
GTX 970 vs 2x R9 290X OC'd (3.5+ vs 11+ TFLOPS)

1080p
Minimums in cities: 61 fps on the GTX 970
Minimums in cities: 45 fps on the 2x 290X

This is the point where every sane person should have stopped reading.
 

Shin

Banned
3) I hope they build Wi-Fi 6 and 5G into the system, along with USB 3.2 Gen 2x2.
Can't wait to see what Xbox Scarlett and PS5 can do!
I think it's too late for those, given where those technologies are right now.
Hopefully they don't misuse the power on RT/GI; I'd prefer more on-screen destruction through physics over a tech that potentially butchers everything and puts us back at, e.g., 4K/30.
 