
Pachter: PS5 to be a half step, release in 2019 with PS4 BC

THE:MILKMAN

Member
What would be really helpful to know is which year PS5 is targeted to launch in. If that isn't known or hasn't been decided yet, then it is probably too early to even start discussing things. EDIT: Detailed things, that is.

Souldestroyer Reborn said:
Firstly, Power != Current.

Power (P) is related to Current (I) through Voltage (V): P = IV

There are also numerous other equations for Power, which we won't go in to as it isn't really necessary for this subject.

Bear with me on the next bit as I'm going to go on a bit of a tangent to try and help explain the bigger picture.

As for the power supply, nothing is 1:1 transferable. There are losses due to various influences, heat being the main one. We also have related measures called efficiency and Power Factor (PF) when talking about power generation.

Think of it like a beer with a nice frothy top.

The actual beer itself is our "Real Power" (W), the Watts that you see being used.

The frothy bit of the beer is a bit of a waste, isn't it? Doesn't do much to get us drunk! This is the wasted power, or "Reactive Power", kilo-Volt-Amperes-reactive (kVAr).

The whole of the beer (the frothy top plus the liquid itself) is our "Apparent Power" kilo-Volt-Amperes (kVA)

Where I'm going with this is that nothing is able to make use of every bit of electricity we give it. Also, everything has different rates of efficiency (More/less beer, or more/less Froth)

This leads us to the conclusion that, in order to drive stronger hardware in the Xbox with less power, there will have been major gains in efficiency (how much is used rather than wasted), in how much voltage is actually required, and in how well the thermal losses are dealt with (cooling).

On top of which, to add further complexity, every PSU will have its own efficiency rating and will hit its peak efficiency at a different load.

Also, just for clarity's sake, you should/would never run a PSU at 100% load. Not only is it highly inefficient to do so, it would also give off more heat and cause issues.
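To put some purely illustrative numbers on the above (none of these figures are measured from any console; they're just assumptions to show how the terms relate):

```python
import math

# Illustrative assumptions only, not measurements of any real console.
voltage = 230.0          # mains voltage (V)
current = 1.2            # current drawn from the wall (A)
power_factor = 0.9       # fraction of apparent power that is "real" (the liquid, not the froth)
psu_efficiency = 0.85    # fraction of real input power the PSU delivers as usable DC

apparent_power = voltage * current                              # the whole beer (VA)
real_power = apparent_power * power_factor                      # the liquid: watts actually consumed
reactive_power = math.sqrt(apparent_power**2 - real_power**2)   # the froth (VAr)
dc_delivered = real_power * psu_efficiency                      # what the console's internals actually get

print(f"Apparent: {apparent_power:.0f} VA, Real: {real_power:.0f} W, "
      f"Reactive: {reactive_power:.0f} VAr, Delivered to hardware: {dc_delivered:.0f} W")
# Apparent: 276 VA, Real: 248 W, Reactive: 120 VAr, Delivered to hardware: 211 W
```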

While I appreciate what you did here (I do understand most of this), it actually is a bit off topic/a tangent, so I will drop it. In basic terms I agree with what DieH@rd said.
 

jdstorm

Banned
I also choose 3

Sony first parties will likely push the limits of the PS5 as far as they can afford to. They aren't tied to any arbitrary i5 limit as they are trying to sell a closed-box console. Besides, after seeing Infamous SS, Horizon Zero Dawn and Uncharted 4, I'm sure Sony's internal teams will want to keep one-upping each other.

Besides that, VR is a huge incentive for a new machine to push current-gen games at 90FPS. Any game targeting VR will likely have graphics that are able to run on the base PS4.
 

Shin

Banned
There is a problem with the next generation, though it can easily be fixed.
If they go with BDXL the PS4 cannot read it (I think), which means there would be 2 versions of the same game, PS4 (BD) and PS5 (BDXL).
Or they'll release on BD with an additional patch required when playing on PS5, though this doesn't seem a good idea because of worldwide internet speeds.
They could ride it out until developers completely kill off PS4 support or until the install base is large enough (sort of the same thing really).
 
This might be true, but when games stop working on Xbox One S due to Xbox Two being released, they will still work on Xbox One X, at least that is my expectation. To me this means that Xbox One X is half current-gen and half next-gen, which is essentially what MS has been touting with their generationless approach.
Basically you want MS "next-gen" games to be hamstrung by Jaguar, while on the other hand we know that the PS5 (Ryzen is a given) will be a clean slate when it comes to exclusives.

It makes sense for PS5 to be totally backward compatible if they push PS4 base versions of games to 720p, PS4 Pro to 1080p/1440p and PS5 to 4K. In this way we could also get a graphical improvement, but for that the PS5 GPU would need to be near 16 tflops.
So, Jaguar will be the lowest common denominator/dev target baseline? I can't see this happening, except for 3rd party titles.
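One way to sanity-check the "~16 tflops" figure quoted above: it is roughly the base PS4's 1.84 TF scaled by the pixel-count ratio between 720p and 4K, a crude rule of thumb that ignores everything except resolution:

```python
base_ps4_tf = 1.84                  # base PS4 GPU, single-precision TFLOPS
pixels_720p = 1280 * 720
pixels_1080p = 1920 * 1080
pixels_4k = 3840 * 2160

print(pixels_4k / pixels_720p, base_ps4_tf * pixels_4k / pixels_720p)
# 9.0 -> ~16.6 TF to take a 720p-baseline game to 4K
print(pixels_4k / pixels_1080p, base_ps4_tf * pixels_4k / pixels_1080p)
# 4.0 -> ~7.4 TF to take a 1080p-baseline game to 4K
```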

I believe the 60 FPS developer desire is there (Naughty Dog's UC4 for example, lots of articles); they just can't do it with Jaguar.
I hope you realize that it wasn't just a CPU bottleneck, since they also wanted to push graphics as much as they could.

The first UC4 trailer was CGI fakery... 1.84 TF just wasn't enough for it. I remember ND saying in a slide that some rendering techniques were far too computationally intensive for a PS4. Even Pro couldn't run this in real time: https://www.youtube.com/watch?v=y1Rx-Bbht5E

A Ryzen console will easily take game X from 30 to 60 FPS IMO.
Most PS4 games have a 30 fps cap though. It would take a "PS5 patch" (kinda like a PS4 Pro patch) to unlock framerates. PS5 boost mode (if that's a thing) ain't going to help fps-capped games.

How likely is it that we're gonna get a PS5 patch for the Uncharted 4 Campaign for example? 3rd party games (like Witcher 3) are even less likely to receive PS5 patches.

It's the same reason powerful PC rigs are not able to run Zelda BotW at 60 fps on Cemu. It's not possible, unless Nintendo allows it.
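A minimal sketch of why a hard 30 fps cap doesn't benefit from faster hardware on its own, and why a hypothetical boost mode can only help unlocked framerates (an illustrative loop, not any engine's actual code):

```python
import time

TARGET_FRAME_TIME = 1.0 / 30.0   # the game ships with a hard-coded 30 fps cap

def game_loop(update, render):
    """Faster hardware only shortens update() + render(); the sleep below still
    pads every frame out to ~33.3 ms, so the cap holds until the game itself is
    patched to raise (or remove) TARGET_FRAME_TIME."""
    while True:
        start = time.perf_counter()
        update(TARGET_FRAME_TIME)        # simulation step sized to the cap
        render()
        elapsed = time.perf_counter() - start
        if elapsed < TARGET_FRAME_TIME:
            time.sleep(TARGET_FRAME_TIME - elapsed)   # wait out the rest of the frame
```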
 

Amerzel

Neo Member
The early parts of the next generation will see a lot of games coming out for both generations. Hopefully a digital purchase gets you both versions. I'd also like to see multiplayer games be cross-generation compatible. If people can play CoD/Madden/FIFA etc. with their friends on both platforms, I think that would go a long way.

Are you guys ready for all of the 4K remasters?
 

Kambing

Member
The early parts of the next generation will see a lot of games coming out for both generations. Hopefully a digital purchase gets you both versions. I'd also like to see multiplayer games be cross-generation compatible. If people can play CoD/Madden/FIFA etc. with their friends on both platforms, I think that would go a long way.

Are you guys ready for all of the 4K remasters?

I am going to be pretty sour, considering the same architecture on PS4/PS5, if they re-release 4K remasters from this gen. Xbox is leading the way with Play Anywhere titles, and for Christ's sake, Sony released the PS4 Pro to play PS4 games in 4K(ish). In fact, at this stage, I can't imagine any upgrade to PS4 games warranting a re-release on PS5. Games need to be forward compatible now.
 

RoboPlato

I'd be in the dick
The first UC4 trailer was CGI fakery... 1.84 TF just wasn't enough for it. I remember ND saying in a slide that some rendering techniques were far too computationally intensive for a PS4. Even Pro couldn't run this in real time: https://www.youtube.com/watch?v=y1Rx-Bbht5E

This trailer is what I'm hoping for from the PS5. 60fps with that level of visual fidelity. I think they could probably get fairly close when utilizing something like temporal injection.

I know 30fps games will still exist but I think we'll see another increase in 60fps titles, just like we did this gen compared to last. I expect open world/RPGs to remain at 30 for the most part.
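For anyone curious what "temporal injection"-style reconstruction boils down to, here's a very rough sketch under stated assumptions (the function and parameter names are made up; real implementations add jittered sample offsets, neighbourhood clamping and disocclusion handling):

```python
import numpy as np

def temporal_upsample(new_half_res, history_full, motion_vectors, blend=0.9):
    """Reproject last frame's full-res image along per-pixel motion vectors,
    then blend in this frame's lower-res samples. Over a few frames this
    converges toward full-resolution detail at a fraction of the render cost."""
    h, w = history_full.shape[:2]
    # Cheap upscale of the new frame to target resolution (nearest neighbour).
    upscaled = np.repeat(np.repeat(new_half_res, 2, axis=0), 2, axis=1)[:h, :w]
    # Fetch each pixel from where it was last frame.
    ys, xs = np.indices((h, w))
    src_y = np.clip(ys - motion_vectors[..., 1].astype(int), 0, h - 1)
    src_x = np.clip(xs - motion_vectors[..., 0].astype(int), 0, w - 1)
    reprojected = history_full[src_y, src_x]
    # Exponential history blend: mostly reprojected history, refreshed by new samples.
    return blend * reprojected + (1.0 - blend) * upscaled
```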
 
I can't imagine 4k remasters of current gen games being a big thing.

Not when all those games should work fine anyway.

Maybe a small fee for a patch?
 

madquills

Neo Member
Pachter aka Captain Obvious chiming in again with the most obvious obviousness. Thanks for the insight Pachter, the one and only gaming industry analyst to whom we all owe so much.
 

autoduelist

Member
Well, there is me for example.

This might be true, but when games stop working on Xbox One S due to Xbox Two being released, they will still work on Xbox One X, at least that is my expectation. To me this means that Xbox One X is half current-gen and half next-gen, which is essentially what MS has been touting with their generationless approach.

It would only be problematic if Xbox One X was indeed not next-gen. So from your point of view of course it is. It will be interesting to see in the coming years which concept Sony and MS decide to implement.

We're on a gaming enthusiast website. How many here do you truly think believe we've started a new gen? We've got two mid-gen upgrades; that's even how they are marketed. If everyone here [and on other gaming forums] doesn't think it's a new gen, do you seriously think the general public at large thinks it's a new gen?

Do you think MS would be happy with sales if they were marketing/positioning this as a new gen?

It's very odd to me... you seem to want to sell the Xbox One X as a new gen because you think it puts MS in a strategically superior position, but not only does MS not seem to position themselves that way, but if they did, they'd be horrified at their launch sales.


So, which is more likely:

1. All games have super AI and enemy density on PS5 at 30 FPS, and all Jaguar consoles can't run them... at all, or at 10 FPS.

2. All games have normal AI and density, run at 60 FPS on PS5, and can still run on PS4 Pro / XB1X at 30.

Pick one and reason why!

I pick 2; no way games will be so complex that only an i5-class CPU can run them at 30 FPS. That's business suicide.

Maybe the gen after the next one, or when PS5 > 60 million. Pushing to the max, devs do that now; they are stuck with 30 in many games because of Jaguar.

Again, that's just not how this works. First, you're talking about forward compatibility, which is a non-issue. Any games they want to sell cross gen will be [somewhat] optimized for the older console, just like earlier in this gen. Not to mention, they won't care if it's not perfect, because that actually helps push people to upgrade to the new gen.

You also can't just create two random alternatives, one that sounds ridiculous, and one that fits your worldview, and pretend it's 'either/or'. That's not even how debating works, let alone technology.

I understand that you want games to be 60fps. That's fine and good to want. But your reasoning ignores the actual reason we get 30fps games now.
 
Hopefully VR is a consideration this time. That might lead to a much more powerful CPU relative to the GPU.

We might end up with 60fps as the standard then, as devs will have more overhead than they need.
 
Since we keep discussing Zen a lot you guys might enjoy this video (very technical): https://www.youtube.com/watch?v=Jdi5JmRmez8


inb4 "Mobile Ryzen Pro is Jaguar v2 crap and cannot deliver 60fps across the board". :)
 

RedSwirl

Junior Member
If forwards compatibility happens it'll just be in the form of PS4 games that run with enhanced features on PS5. That way, people who already own PS4 games will have another incentive to upgrade if those games get patched for PS5.
 
If forwards compatibility happens it'll just be in the form of PS4 games that run with enhanced features on PS5. That way, people who already own PS4 games will have another incentive to upgrade if those games get patched for PS5.

That's backwards compatibility.
 
Read two papers today on real-time motion planning and continuous collision detection that only reinforced my belief that most game AI tasks can and should be done GPU-side. Both showed speedups of up to 100x over serial CPU-based state-of-the-art methods. Even heavily data dependent tasks can usually be decoupled, with data flow management done serially by the CPU and data traversal/manipulation done in parallel by the GPU.

Hopefully Sony is paying attention.
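As a flavour of why these workloads map so well to GPUs: broad-phase collision detection can be written as one big data-parallel array operation instead of a nested loop. A hedged sketch, with numpy standing in for the GPU's execution model and all names purely illustrative:

```python
import numpy as np

def aabb_overlap_pairs(mins, maxs):
    """Broad-phase AABB test over all box pairs at once.
    mins/maxs: (N, 3) arrays of box corners. Boxes i and j overlap iff
    min_i <= max_j and min_j <= max_i on every axis. On a GPU each pair
    (or each box) becomes a thread; numpy just mimics that data parallelism."""
    overlap = np.all(
        (mins[:, None, :] <= maxs[None, :, :]) &
        (mins[None, :, :] <= maxs[:, None, :]),
        axis=-1,
    )
    i, j = np.where(np.triu(overlap, k=1))   # each unordered pair once, no self-pairs
    return list(zip(i.tolist(), j.tolist()))

# Example: 1,000 random boxes tested in one batched operation.
rng = np.random.default_rng(0)
centers = rng.uniform(0.0, 100.0, size=(1000, 3))
half_extents = rng.uniform(0.5, 2.0, size=(1000, 3))
pairs = aabb_overlap_pairs(centers - half_extents, centers + half_extents)
```

The serial part the post leaves to the CPU is the "data flow management": deciding which objects to test, feeding the arrays in, and consuming the pair list afterwards.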
 

AmyS

Member
Why do all these guys like Pachter want the PS5 out so soon? This gen really doesn't feel anywhere near finished in my opinion

I'll take Pachter's prediction of 2020, not 2019.

That's still more than 3 years away assuming a Nov. 2020 launch in NA, the UK and EU.
 

THE:MILKMAN

Member
I'll take Pachter's prediction of 2020, not 2019.

That's still more than 3 years away assuming a Nov. 2020 launch in NA, the UK and EU.

It is weird he is commenting (or being asked to comment) this early about PS5. In January 2012 he went on record to say no next-gen consoles would launch in 2013, so if he can be wrong less than two years before they did launch, why commit 2.5-3.5 years out this time around?

Bear in mind the node shrinks (that Sony/MS can't control) are getting harder with longer lead times too.
 

RoboPlato

I'd be in the dick
Why do all these guys like Pachter want the PS5 out so soon? This gen really doesn't feel anywhere near finished in my opinion

People keep saying this but this is still over two years away and would be six years since the release of the PS4. We've had a ton of great games the past two years and that doesn't seem to be slowing down
 

Lady Gaia

Member
Idk about that. At the very first Scorpio reveal, MS seemed to want an Xbox One.5, something that blurs the generational line.

At the first reveal MS wanted to convince everyone that it was worth waiting another year for. That, more than anything else, explains why they were so eager to hint at more than a mid-gen refresh. In practice the One X really does have some significant advantages in terms of bandwidth and available RAM. Neither one can be used to enable a fundamentally different game, though, so it isn't likely to have as much impact as one might hope for.

"Next generation" has always been synonymous with hopes for games that weren't possible previously. Something new. Something fresh. The degree to which generations have delivered varies somewhat, of course, but generally speaking if you can't deliver something that feels new it's hard to sell people on the expense of a new round of hardware.
 

RedSwirl

Junior Member
That's backwards compatibility.

What I'm saying is that's the closest we'll get. It depends on what defines a game from one console gen versus a game from another console gen.

Earlier this gen most cross-gen games were pretty much just PS3/360 games that had been ported up to PS4 and Xbox One, given some extra graphics features or better framerates. There were very few games that were really designed for PS4/Xbox One and then broken down for the earlier consoles.

If PS5 has BC, just letting devs patch extra PS5 features into PS4 games (like with PS4 Pro) would pretty much achieve the same effect. Microsoft is almost certainly going to keep doing this with the next Xbox model after Xbox One X, even if it has a new CPU.
 

Shin

Banned
Is Masayasu Ito ranked above Cerny, since he's Senior Vice President of Engineering?
If he is, then he would be the one calling the shots on when to start development, no?
Is Cerny on some kind of contract, since he's founder and CEO of Cerny Games first and foremost?

 
Jeff Rigby would know what's going on.

I miss Jeff. He was crazy but I loved it. Did he get banned?

Is Masayasu Ito ranked above Cerny, since he's Senior Vice President of Engineering?
If he is, then he would be the one calling the shots on when to start development, no?
Is Cerny on some kind of contract, since he's founder and CEO of Cerny Games first and foremost?


IIRC Cerny is more of a "freelancer" than "staff" member.

Basically a consultant.
 
Read two papers today on real-time motion planning and continuous collision detection that only reinforced my belief that most game AI tasks can and should be done GPU-side. Both showed speedups of up to 100x over serial CPU-based state-of-the-art methods. Even heavily data dependent tasks can usually be decoupled, with data flow management done serially by the CPU and data traversal/manipulation done in parallel by the GPU.

Hopefully Sony is paying attention.

Don't you mean developers?

Is Masayasu Ito ranked above Cerny, since he's Senior Vice President of Engineering?
If he is, then he would be the one calling the shots on when to start development, no?
Is Cerny on some kind of contract, since he's founder and CEO of Cerny Games first and foremost?


As far as I understand it, yes, Mark Cerny is a contractor for Sony.
 

Theonik

Member
Read two papers today on real-time motion planning and continuous collision detection that only reinforced my belief that most game AI tasks can and should be done GPU-side. Both showed speedups of up to 100x over serial CPU-based state-of-the-art methods. Even heavily data dependent tasks can usually be decoupled, with data flow management done serially by the CPU and data traversal/manipulation done in parallel by the GPU.

Hopefully Sony is paying attention.
*shrug* A lot of the linear algebra involved in pathfinding and collision detection can be parallelised, though the main logic itself lives on the CPU. It's also important to remember that a) it's not like current systems or even future ones have huge surpluses of GPU power, and b) that approach is faster but much less efficient overall; there are a lot of considerations when implementing such systems in an actual game.
 
Don't you mean developers?

No, I mean Sony. They have to be on top of developing tech lest they get caught flat-footed with a system that doesn't facilitate the deployment of these methods.

It's also important to remember that a) it's not like current systems or even future ones have huge surpluses of GPU power, and b) that approach is faster but much less efficient overall; there are a lot of considerations when implementing such systems in an actual game.

a) We'll get surplus GPU power (or, rather, highly parallel compute) well before we get surplus CPU power (or, rather, highly serial compute).
b) Uh, yeah? There are lots of considerations when implementing anything. That's how these things go, usually.
 

mrklaw

MrArseFace
*shrug* A lot of the linear algebra involved in pathfinding and collision detection can be parallelised, though the main logic itself lives on the CPU. It's also important to remember that a) it's not like current systems or even future ones have huge surpluses of GPU power, and b) that approach is faster but much less efficient overall; there are a lot of considerations when implementing such systems in an actual game.

I thought Sony talked about PS4 having a little too much GPU considering a balanced overall system, with the aim it would encourage some use of GPU compute.
 

Theonik

Member
I thought Sony talked about PS4 having a little too much GPU considering a balanced overall system, with the aim it would encourage some use of GPU compute.
It had the best CPU and GPU they could have put in that form factor at that time at that price. The hope was that developers could cope by doing more compute to offload work from the CPU, but this doesn't mean they had a surplus of GPU power... GPU workloads are not really dependent on the CPU; you can drive up resolution at no real CPU cost, for example.
 
It had the best CPU and GPU they could have put in that form factor at that time at that price. The hope was that developers could cope by doing more compute to offload work from the CPU, but this doesn't mean they had a surplus of GPU power... GPU workloads are not really dependent on the CPU; you can drive up resolution at no real CPU cost, for example.

I don't know how you can possibly state this... You still need the CPU to drive the game simulation, then build the command list for the GPU.

Of course GPU workloads are dependent on the CPU, as a GPU merely renders on-screen what is simulated by the CPU. Constrain a system more and more with a shittier and shittier CPU and you end up with a severely underutilized GPU. Whereas, a bigger and bigger CPU will simply ensure you're not CPU bound; hence the need to balance system CPU and GPU performance for game consoles.
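A toy frame loop illustrating that dependency (every name here is made up purely for illustration, not any real engine API):

```python
# The GPU only renders what the CPU has already simulated and recorded
# into a command list this frame.
def frame(positions, velocities, dt=1.0 / 30.0):
    # CPU: game simulation (AI, physics, ...)
    positions = [p + v * dt for p, v in zip(positions, velocities)]
    # CPU: build the command list the GPU will consume
    command_list = [("draw_at", p) for p in positions]
    # GPU: would now execute command_list; it has nothing to do until it exists
    return positions, command_list

positions, cmds = frame([0.0, 1.0, 2.0], [1.0, 0.5, -0.25])
```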
 

Theonik

Member
I don't know how you can possibly state this... You still need the CPU to drive the game simulation, then build the command list for the GPU.

Of course GPU workloads are dependent on the CPU, as a GPU merely renders on-screen what is simulated by the CPU. Constrain a system more and more with a shittier and shittier CPU and you end up with a severely underutilized GPU. Whereas, a bigger and bigger CPU will simply ensure you're not CPU bound; hence the need to balance system CPU and GPU performance for game consoles.
Not true. Simulation complexity doesn't strictly drive renderer workloads. The example is that the same game running in 4K will hit the GPU predominantly. They perform different functions in a pipeline. Having said that, of course there is reason to engineer them in relation to one another. After all, if the CPU part of your pipeline is taking 33ms and your GPU is taking 16ms, you are CPU-bound and could have been running at 60fps instead of 30. Or you can add more GPU tasks to bring the GPU up to 33ms.
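The arithmetic in that last example, spelled out (a simplification that assumes the CPU and GPU stages are pipelined and one of them is always the bottleneck):

```python
def pipelined_fps(cpu_ms, gpu_ms):
    # In a pipelined renderer the slower stage sets the frame rate.
    return 1000.0 / max(cpu_ms, gpu_ms)

print(pipelined_fps(33.0, 16.0))  # ~30 fps: CPU-bound, the GPU is idle half the time
print(pipelined_fps(16.0, 16.0))  # ~62 fps: with the CPU stage down to 16 ms, 60 fps is on the table
print(pipelined_fps(16.0, 33.0))  # ~30 fps again: now GPU-bound (e.g. more GPU tasks were added)
```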
 

mrklaw

MrArseFace
It had the best CPU and GPU they could have put in that form factor at that time at that price. The hope was that developers could cope by doing more compute to offload work from the CPU, but this doesn't mean they had a surplus of GPU power... GPU workloads are not really dependent on the CPU; you can drive up resolution at no real CPU cost, for example.

I was just quoting the kinds of things Mark Cerny was talking about at the PS4 launch period. May have been 'tech PR' but they were saying it

Digital Foundry: Going back to GPU compute for a moment, I wouldn't call it a rumour - it was more than that. There was a recommendation - a suggestion? - for 14 cores [GPU compute units] allocated to visuals and four to GPU compute...

Mark Cerny: That comes from a leak and is not any form of formal evangelisation. The point is the hardware is intentionally not 100 per cent round. It has a little bit more ALU in it than it would if you were thinking strictly about graphics. As a result of that you have an opportunity, you could say an incentivisation, to use that ALU for GPGPU.

http://www.eurogamer.net/articles/digitalfoundry-face-to-face-with-mark-cerny
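For scale, the leaked 14+4 split being discussed would work out roughly like this on the base PS4 (18 CUs, 1.84 TF); as Cerny says above, it was never a formal recommendation:

```python
total_cus, total_tf = 18, 1.84          # base PS4 GPU
graphics_cus, compute_cus = 14, 4       # the leaked split Digital Foundry asked about

print(round(total_tf * graphics_cus / total_cus, 2))  # ~1.43 TF nominally for graphics
print(round(total_tf * compute_cus / total_cus, 2))   # ~0.41 TF nominally left over for GPGPU
```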
 

Theonik

Member
I was just quoting the kinds of things Mark Cerny was talking about at the PS4 launch period. May have been 'tech PR' but they were saying it
http://www.eurogamer.net/articles/digitalfoundry-face-to-face-with-mark-cerny
You have the option. It's also why specialised hardware accelerators are a really daft proposal I see thrown around in those threads. Since unified shaders we've seen a lot more power available for developers to do with as they need. Of course, having a weaker CPU incentivised doing more compute, just as the PS3 being kinda shite necessitated finding a use for the SPEs. This is spin if anything.
 
I was just quoting the kinds of things Mark Cerny was talking about at the PS4 launch period. May have been 'tech PR' but they were saying it



http://www.eurogamer.net/articles/digitalfoundry-face-to-face-with-mark-cerny

You have the option. It's also why specialised hardware accelerators are a really daft proposal I see thrown around in those threads. Since unified shaders we've seen a lot more power available for developers to do with as they need. Of course, having a weaker CPU incentivised doing more compute, just as the PS3 being kinda shite necessitated finding a use for the SPEs. This is spin if anything.

My explanation for that Cerny comment was that it was a response to Sony knowing what was in their competitor's console, and thus concluding that, with the PS4 having more GPU than its main competitor, developers could use that extra performance on GPU compute. Games would, after all, be developed for both consoles.

That's my personal speculation, however.
 

Theonik

Member
My explanation for that Cerny comment was that it was a response to Sony knowing what was in their competitor's console, and thus concluding that, with the PS4 having more GPU than its main competitor, developers could use that extra performance on GPU compute. Games would, after all, be developed for both consoles.

That's my personal speculation, however.
Microsoft bumping the CPU clock was a decision made quite late in the game, or at the very least disclosed fairly late.
Sony chose to clock their Jaguar cores at 1.6GHz, which was the standard at the time.
 