
Cerny: Devs Don’t Have to Optimize in Any Way for PS5’s Variable Clocks, It’s All Automatic

Jon Canon

Member
My takeaway is that this is not a time-limited boost mode; you are free to run the CPU and GPU at max GHz at all times.

There are plenty of little moments though, milliseconds within a frame's budget, where there is no need for the maximum clock.

Sony's SmartShift implementation enables a cooler console by throttling down whenever possible.

So 10.3 TF it is.

He also states that keeping fewer but faster CUs busy is easier for developers to optimize for. Time will tell if it will look better.
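For anyone wondering where the 10.3 figure comes from, here is a rough back-of-the-envelope sketch. It assumes the publicly stated PS5 GPU specs (36 CUs at up to 2.23 GHz) plus the usual 64 FP32 ALUs per CU and 2 ops per clock for a fused multiply-add; treat it as an illustration, not anything quoted from the talk itself:

# Rough sketch of the peak-TFLOPS arithmetic behind the 10.3 TF figure.
# Assumes the publicly stated PS5 GPU specs (36 CUs at up to 2.23 GHz) plus
# the usual 64 FP32 ALUs per CU and 2 FLOPs per ALU per clock (one FMA).
compute_units = 36
alus_per_cu = 64
flops_per_alu_per_clock = 2        # a fused multiply-add counts as two ops
max_clock_ghz = 2.23

peak_gflops = compute_units * alus_per_cu * flops_per_alu_per_clock * max_clock_ghz
print(f"{peak_gflops / 1000:.2f} TFLOPS")   # ~10.28

Plug a slightly lower clock into the same formula to see how little a small downclock costs in peak TF.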
 
Last edited:

Whitecrow

Banned
My takeaway is that this is not a time-limited boost mode; you are free to run the CPU and GPU at max GHz at all times.

There are plenty of little moments though, milliseconds within a frame's budget, where there is no need for the maximum clock.

Sony's SmartShift implementation enables a cooler console by throttling down whenever possible.

So 10.3 TF it is.

He also states that keeping fewer but faster CUs busy is easier for developers to optimize for. Time will tell if it will look better.
You got it wrong mate.

The PS5 is 10.3 TF, EXCEPT when it has a huge workload and draws too much power; then it underclocks itself.
 

darkinstinct

...lacks reading comprehension.
No one develops a game specifically targeted at a 10.28 TF part! It doesn't work like that.

You do what you can within your frame-time and memory budget; in fact memory (RAM utilisation specifically) tends to be the only truly immutable constraint, because everything else can be massaged into shape.

The whole point of it being autonomous is that every unit performs consistently irrespective of ambient temperature. So if your dev build is running stably even though it's requiring >9.2 TF of GPU resources, it's going to perform identically on retail hardware, regardless of whether the console is running in Alaska or Arizona.
Cerny made clear that's not what is happening. They aren't measuring temperatures of the APU, they are measuring load.
 

Panajev2001a

GAF's Pleasant Genius
When his console crushed Microsoft's, of course he did not need to resort to marketing. Now that he is in a dog fight spec-wise he is pulling out marketing speak 101: "mostly", "a couple", "slightly", etc. It's easy not to resort to marketing speak when you have kicked the other guy's ass. When it is not an ass kicking, it's 100% marketing speak. If it were not, we would have legitimate hard numbers for the variable clocks, instead of each side guessing at best and worst case and using that as fact. The SSD gets 100% hard numbers because it way outdoes Microsoft's; no coincidence.

I disagree; he gave plenty of hard data, and relevant data too. Until he gives reasons not to get the benefit of the doubt, as a coder/producer/designer/HW architect he has built a reputation that speaks for itself (PS4, PS Vita, and now PS5... the architecture, development environment, and OS for each is nothing people can scream about), so I will take him as a technically minded person talking technology, as he has always done.

He is not trying to win a PR war when he talks, nor trying to confuse people with tons of tech detail. There are surprisingly many interviews and talks he has given over the years, and he is quite the consistent individual.
The HW-specs-supremacy PR the Xbox brand was built on shows in your outlook; I have not seen Cerny talk differently about PS5 than he did about PS4.

You smell blood with your favourite console being faster, see only damage control and empty PR from the other one, and assume the worst... maybe this is how you saw MS act with the Xbox One, but I feel you are projecting here. Now, if his presentation had been given by Jim Ryan or others... I would be disagreeing with you a lot less, let's put it that way.
 

Clear

CliffyB's Cock Holster
Cerny made clear that's not what is happening. They aren't measuring temperatures of the APU, they are measuring load.

Reread my post: Cerny stresses repeatedly that frequency variation is (1) consistent and (2) load-based. Hence ambient temperature is not an issue, unlike with conventional boost overclocking or thermal throttling.
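To make the distinction concrete, here is a tiny illustrative sketch with made-up numbers and a made-up power model (this is not Sony's actual algorithm): a conventional booster reacts to the measured die temperature, which depends on the room, while the approach Cerny describes derives the clock from the workload's modelled power draw, so two consoles running the same scene land on the same frequency regardless of ambient conditions.

# Illustrative sketch only: contrast temperature-driven throttling with a
# deterministic, power-model-driven clock. The model, thresholds and numbers
# are invented for the example; they are NOT Sony's real algorithm.

def temp_throttled_clock(max_ghz, die_temp_c, limit_c=95.0):
    # Conventional boost: back off when the silicon gets hot, so the result
    # depends on ambient temperature and cooling.
    return max_ghz if die_temp_c < limit_c else max_ghz * 0.9

def power_model_clock(max_ghz, modelled_power_w, budget_w=200.0):
    # Deterministic approach: estimate power from the workload's activity and
    # scale frequency only if the modelled draw exceeds a fixed budget.
    # Ambient temperature never enters the calculation.
    if modelled_power_w <= budget_w:
        return max_ghz
    # Power scales roughly with f^3 (f * V^2, with V tracking f), so a small
    # frequency drop buys a large power reduction.
    return max_ghz * (budget_w / modelled_power_w) ** (1 / 3)

print(power_model_clock(2.23, modelled_power_w=180))   # 2.23 GHz, same in Alaska or Arizona
print(power_model_clock(2.23, modelled_power_w=220))   # ~2.16 GHz, still the same everywhere
print(temp_throttled_clock(2.23, die_temp_c=70))       # 2.23 in a cool room...
print(temp_throttled_clock(2.23, die_temp_c=97))       # ...but ~2.01 in a hot one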
 

Panajev2001a

GAF's Pleasant Genius
You can believe what you want, but he can say everything he needs to say to developers directly to them.

If he's doing a Digital Foundry interview, it's for marketing purposes first.

So what if he speaks to developers and gives a similar speech to a wider audience? If your experience is that the performance crown is everything, and that if you do not have it you must lie through your teeth because nothing else matters and that is what PR is for, then fine, that is your experience with your box of choice in the past... keep projecting if you must, or give me real reasons why he should not be taken at his word that are not console-warrish ones.
 
uPRZ8G6.png


The problem with the world today is that facts no longer matter, and I trace it back to the moment a famous president started freely using the term "alternative facts". You don't know anything about the performance or quality of at least two games on that list. Other than Death Stranding, the other ones are not up to par.
Was the discussion about what Gears 5 had to sacrifice to hit its performance target? Because it may be nice, but it certainly does not look as good as the best-looking PS4 games... sorry to break it to you.
 

Clear

CliffyB's Cock Holster
Consistent variation.

That's an oxymoron if I ever saw one.

No it's not. Inconsistent variation is bad; consistent variation is good because it's predictable.

Basically the implication is that a 100% boost on GPU frequency (due to load) will limit the boost on CPU frequency to less than 100%, with the ratio being consistent across all retail units regardless of location and ambient conditions.

What this means is that the full 10.3 TF is always available outside of circumstances where every Zen 2 CPU core is at full occupancy, a shortcoming that can be mitigated by other parts of the APU offloading work the CPU would normally be tasked with, in order to keep the GPU fed with data.
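A crude way to picture that trade-off, with invented wattages and clocks (the real budget and voltage/frequency curves are not public), is a single shared power budget where the GPU's demand decides how much headroom is left for the CPU:

# Illustrative sketch of a shared CPU+GPU power budget, i.e. the "GPU at 100%
# can pull the CPU below 100%" point above. Every wattage and clock here is
# made up for the example; it is not the PS5's real power table.
TOTAL_BUDGET_W = 200.0
CPU_MAX_GHZ, CPU_MAX_W = 3.5, 60.0     # hypothetical full-tilt CPU numbers
GPU_MAX_GHZ, GPU_MAX_W = 2.23, 160.0   # hypothetical full-tilt GPU numbers

def split_budget(gpu_demand_w, cpu_demand_w):
    """Give the GPU its demanded share first; the CPU clock scales with
    whatever fraction of its own demand still fits the budget."""
    gpu_w = min(gpu_demand_w, GPU_MAX_W)
    cpu_w = min(cpu_demand_w, CPU_MAX_W, max(TOTAL_BUDGET_W - gpu_w, 0.0))
    cpu_scale = cpu_w / cpu_demand_w if cpu_demand_w else 1.0
    return GPU_MAX_GHZ, CPU_MAX_GHZ * cpu_scale

print(split_budget(gpu_demand_w=160, cpu_demand_w=30))  # (2.23, 3.5): both fit the budget
print(split_budget(gpu_demand_w=160, cpu_demand_w=60))  # (2.23, ~2.33): CPU gives some clock back

The point of the sketch is only the shape of the behaviour: as long as the combined demand fits the budget, both sides sit at their maximum clocks; only a genuinely worst-case combined load pushes one of them down.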
 
Last edited:

DaMonsta

Member
So what if he speaks to developers and gives a similar speech to a wider audience? If your experience is that the performance crown is everything, and that if you do not have it you must lie through your teeth because nothing else matters and that is what PR is for, then fine, that is your experience with your box of choice in the past... keep projecting if you must, or give me real reasons why he should not be taken at his word that are not console-warrish ones.
What the hell are you even talking about?😂😂

Console wars got people acting crazy around here.
 

Dory16

Banned
Was the discussion about what Gears 5 had to sacrifice to hit its performance target? Because it may be nice, but it certainly does not look as good as the best-looking PS4 games... sorry to break it to you.
The only one that looks as good is Death Stranding. Certainly no game from 2016. And the performance level reached by GOW5 is not achievable without checkerboarding on PlayStation.
 
The only one that looks as good is Death Stranding. Certainly no game from 2016. And the performance level reached by GOW5 is not achievable without checkerboarding on PlayStation.
What performance? Gears 5 is using aggressive dynamic resolution (from 1080p to 1800p) plus temporal reconstruction to reach 4K, plus a variable framerate. Native 4K is only there at 30fps in the cutscenes:
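For context, dynamic resolution in the abstract is just a feedback loop on frame time. A minimal sketch of such a controller (the target, step size and behaviour here are invented for illustration, not taken from Gears 5's engine):

# Minimal sketch of a dynamic-resolution controller: nudge the internal render
# height up or down to chase a frame-time target, then let temporal
# reconstruction upscale the result. Step/threshold values are invented.
TARGET_MS = 16.7           # 60 fps budget
MIN_H, MAX_H = 1080, 1800  # the dynamic range reported for Gears 5 above
STEP = 36                  # lines to move per adjustment (made up)

def adjust_height(current_h, last_frame_ms):
    if last_frame_ms > TARGET_MS:         # missed budget: render fewer pixels
        return max(MIN_H, current_h - STEP)
    if last_frame_ms < TARGET_MS * 0.9:   # comfortable headroom: add pixels back
        return min(MAX_H, current_h + STEP)
    return current_h

h = 1800
for frame_ms in (15.0, 18.2, 19.0, 16.0, 14.5):   # pretend frame-time samples
    h = adjust_height(h, frame_ms)
    print(h)   # drops toward 1080 under load, climbs back toward 1800 when light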

 

Tripolygon

Banned
When I said that these consoles are already old it was not some gotcha moment on my part trying to win a conversation, I was stating factual stuff: the 3000 series from Nvidia is gonna shit on these consoles from a relatively high place even before the consoles come out. That was the whole meaning of the word "old".
Unless "old" means something else, I don't think these new consoles are old in terms of architecture; they are new architectures. What you are saying doesn't make sense. If you want to buy a GPU more powerful in terms of TF than the next-gen consoles, you don't need to wait for the 3000 series from Nvidia or the next high-end AMD GPU, you could get a:

RTX 2080 Ti - 14 TF
AMD Vega 64 - 12.7 TF or Vega II - 13.8 TF

But FYI, Xbox Series X will beat those two Vega GPUs even though they theoretically have higher TF than Series X. This is what a new architecture provides. And I can also see the PS5 trading blows with Vega II even though it has more theoretical TF than the PS5. Even though both Series X and PS5 have fewer TF, they can still beat those GPUs with better theoretical TF numbers.
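To illustrate the shape of that argument with purely hypothetical efficiency factors (placeholders, not benchmark data): effective throughput is roughly paper FLOPS multiplied by how much of them the architecture actually extracts.

# Purely illustrative: paper TFLOPS times a hypothetical per-architecture
# efficiency factor. The factors below are placeholders chosen to show the
# shape of the argument; they are NOT measured numbers for these GPUs.
cards = {
    "Vega 64 (GCN)":     (12.7, 0.75),
    "Vega II (GCN)":     (13.8, 0.75),
    "Series X (RDNA 2)": (12.1, 1.00),
    "PS5 (RDNA 2)":      (10.3, 1.00),
}

for name, (paper_tf, efficiency) in cards.items():
    print(f"{name:18s} paper {paper_tf:4.1f} TF -> effective ~{paper_tf * efficiency:4.1f}")
# With placeholder factors like these, 12.1 "new" TF lands ahead of 13.8 "old" TF
# and 10.3 "new" TF trades blows with it: the architecture behind the number
# matters as much as the number itself.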

Listen, I'm pretty noob with this technical stuff, but I saw both noobs and experts, from DF to NX Gamer, to insiders on Twitter, to people who know about this stuff on this forum or other forums, being really confused by this whole boost clock thing. We had a technical presentation from Cerny and various overviews from DF etc. and we still don't know how this thing is gonna perform in real game applications, especially with an eye on the future, when scenes that use both CPU and GPU at 100% can probably occur.
My "contrived" was more of a general sentiment: it surely is something more contrived than the fixed clocks in the Series X, we can at least agree on this, can't we?
I'm pretty "noob" about this stuff, but NX Gamer was not confused about it and neither was I; in fact I tried to explain it on this forum before NX Gamer made his video, and he used a similar explanation to mine. Digital Foundry was not confused about it either; they have tried to explain it in their videos, they just are not sure how it will actually work in real games, which is what everyone is waiting to see. This is a new design paradigm for consoles after all. Many people have tried to explain it in this very thread and been met with "lalalala 9TF lol". These are not people trying to understand how it works, these are people trying not to understand it, and they want to go with 9TF because reasons?
I don't know what the better question is; my point was always "more power is better" and any excuse to say otherwise is stupid, and also that if 9 to 10 is a noticeable upgrade worth doing something "unusual" for (if "contrived" sounds so bad to you), then the jump from 10 to 12 is even more noticeable, and it's funny when people try to devalue that.
It is the better question. Why would people go with a made-up number when the console designer has put out their specifications? More power is better, nobody disputes that; that is a strawman you have made up. You can't point to a single person that says more power is not better. What people are saying is that 10 TF is a pretty good jump from current gen and 10 TF is not far off in performance from 12 TF. To say something is contrived means it is false or not genuine. How is the PS5 being 10 TF not genuine?
(sorry for my english)
Your English is fine mate, don't worry about it.
 
Last edited:

Dory16

Banned
What performance? Gears 5 is using aggressive dynamic resolution (from 1080p to 1800p) plus temporal reconstruction to reach 4K, plus a variable framerate. Native 4K is only there at 30fps in the cutscenes:


Time to produce your own review. I’m warning you that you are way off the general consensus on the game.
 

Panajev2001a

GAF's Pleasant Genius
What the hell are you even talking about?😂😂

Console wars got people acting crazy around here.

No kidding, my fellow self-unaware poster ;).

I was watching the IGN interview and, despite an almost fanatical cult of the leader, it is impressive how much Phil avoids the console-war trolling the IGN editor wanted to bring to the interview like a forum troll, and yet the fans on the ground act so differently. Yeah, it does not really matter here, despite me gaining more respect for him and less for the legion...
 

DaMonsta

Member
No kidding, my fellow self-unaware poster ;).

I was watching the IGN interview and, despite an almost fanatical cult of the leader, it is impressive how much Phil avoids the console-war trolling the IGN editor wanted to bring to the interview like a forum troll, and yet the fans on the ground act so differently. Yeah, it does not really matter here, despite me gaining more respect for him and less for the legion...
I mean none of this has anything to do with what I said.

You really gotta step away from the console war.

It’s not me that’s lacking self awareness here.
 
The only one that looks as good is Death Stranding. Certainly no game from 2016. And the performance level reached by GOW5 is not achievable without checkerboarding on PlayStation.
Sorry, but Uncharted 4 still looks better AND has more interactive environments... It is not as if tech has changed that much on consoles since then; the lighting model in Gears 5 (on consoles) is pretty simple outside the cutscenes. I mean, as soon as I saw gameplay I could see that it felt kind of flat, which is not surprising given they reached 60fps. It's still great they made that choice, I think it is the right one for an action shooter; for example, I would have preferred Uncharted 4 to offer a performance mode.

And have you seen Resident Evil 3? Even on base PS4? Or even Doom 2016? All pretty amazing-looking games.
Time to produce your own review. I’m warning you that you are way off the general consensus on the game.

He gave you numbers; you can't really "disagree" with those... You can have a consensus that a game looks and plays great, but that does not make it 4K, nor does it make it the best-looking.
 
Last edited:

kruis

Exposing the sinister cartel of retailers who allow companies to pay for advertising space.
It does not make sense; if a dev wants consistent performance, they are not going to get it with the PS5.

The performance of the PS5 will be entirely predictable and consistent on all PS5s.
 
Sorry, but Uncharted 4 still looks better AND has more interactive environments... It is not as if tech has changed that much on consoles since then; the lighting model in Gears 5 (on consoles) is pretty simple outside the cutscenes. I mean, as soon as I saw gameplay I could see that it felt kind of flat, which is not surprising given they reached 60fps. It's still great they made that choice, I think it is the right one for an action shooter; for example, I would have preferred Uncharted 4 to offer a performance mode.

And have you seen Resident Evil 3? Even on base PS4? Or even Doom 2016? All pretty amazing-looking games.


He gave you numbers; you can't really "disagree" with those... You can have a consensus that a game looks and plays great, but that does not make it 4K, nor does it make it the best-looking.
Your posts comparing graphics are embarrassing. UC4 looks like a blurry POS on the Pro and a 4K TV, and what interactive environments are you even talking about? Gears 5 is in a different league when played on a big 4K TV, not when comparing YouTube videos: 4 main AI characters on screen, a ton of enemies, dynamic 4K, 60 fps, software ray tracing, etc. Going from Gears 5 to most Sony exclusives is like having a Vaseline filter on your 4K screen, and I actually have a Pro and an Xbox One X side by side, unlike you I would guess.
 

Panajev2001a

GAF's Pleasant Genius
Your posts comparing graphics are embarrassing. UC4 looks like a blurry POS on the Pro and a 4K TV, and what interactive environments are you even talking about?
[...]
Going from Gears 5 to most Sony exclusives is like having a Vaseline filter on your 4K screen, and I actually have a Pro and an Xbox One X side by side, unlike you I would guess.

UC4 a blurry POS on the Pro? Haha, sure. Keep showing other people the level of abuse Xbox fans get by console war trolls on SonyGAF... :rolleyes:.
 
Last edited:
UC4 a blurry POS on the Pro? Haha, sure. Keep showing other people the level of abuse Xbox fans get by console war trolls on SonyGAF... :rolleyes:.
Well, Uncharted 4 is already a decently blurry game even when matched to a proper display, because of its TAA. The game on the Pro is only 1440p, so not only are you getting blur because that's not even half the pixel count of the display's native resolution, the TAA compounds on top of that.

It doesn't look like shit or anything but it's really soft...
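The pixel arithmetic backs that up; nothing here beyond the 1440p and 2160p figures already mentioned:

# Plain pixel arithmetic for the "not even half" claim: 1440p carries less
# than half the pixels of a native 2160p (4K) display.
w1440, h1440 = 2560, 1440
w2160, h2160 = 3840, 2160

pixels_1440p = w1440 * h1440          # 3,686,400
pixels_2160p = w2160 * h2160          # 8,294,400
print(pixels_1440p / pixels_2160p)    # ~0.444 -> about 44% of the 4K pixel count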
 
Well, Uncharted 4 is already a decently blurry game even when matched to a proper display, because of its TAA. The game on the Pro is only 1440p, so not only are you getting blur because that's not even half the pixel count of the display's native resolution, the TAA compounds on top of that.

It doesn't look like shit or anything but it's really soft...

I really wish we could use sharpening filters (like through the Nvidia control panel) on consoles to cut through the ridiculous TAA Vaseline blur.
 

Hendrick's

If only my penis was as big as my GamerScore!
Faster clock speed for the GPU -> slower clock speed for the CPU, and vice versa. That is AMD's SmartShift technology that Cerny mentions, in a nutshell. People can sugarcoat it all they want and spend insane amounts of energy on damage control, but that is the way things are.
Careful, you might get thread banned for stating facts.
 
Well, Uncharted 4 is already a decently blurry game even when matched to a proper display, because of its TAA. The game on the Pro is only 1440p, so not only are you getting blur because that's not even half the pixel count of the display's native resolution, the TAA compounds on top of that.

It doesn't look like shit or anything but it's really soft...
If you go from Gears 5 to UC4, it's a world apart in terms of sharpness and image quality; no ifs and buts can change that. I usually don't bother arguing with Sony fans about specific games' graphics because it has been exhausting since the PS3 days, but I couldn't let that fanboy, albatroswhatever, try to diminish the most technically impressive game of this console generation, Gears 5 on the Xbox One X, with bullshit claims. What Gears 5 technically achieved on the X is jaw-dropping, and no «oh this Sony game running at 1440p and 30 fps with 2-3 enemies onscreen looks so much better» will change that. The Pro would explode trying to run Gears 5.

Unlike many people here, I actually have both the X and the Pro connected to the same TV (Samsung KS8000 for the Americans; it is the KS7000 in Europe, where I live) and I can actually judge the games by playing them back to back, not by using YouTube.
 
Last edited:

Deto

Banned
I used RDNA 1 figures since that's the only RDNA architecture available right now.



RDNA 2 is RDNA-based, so there could be a slight improvement, but I doubt it's a huge improvement.


RDNA 2 will be 50% more efficient
It appears that Big Navi will ditch the blower cooler. (Image source: AMD)
As a part of its Financial Analyst Day, AMD released more details about its RDNA 2 architecture coming later this year. In addition to ray tracing and variable rate shading support, AMD says RDNA 2 will deliver 50% more performance per watt than RDNA. AMD also teased Big Navi with a photo that looks an awful lot like a red and black founder's edition card.



continue with FUD
 

Deto

Banned
Careful, you might get thread banned for stating facts.

“And any man who must say 'I am king' is no true king at all.”

~“And any man who must say 'fact' states no true facts at all.”

Xbox fanboys are like Phil Spencer: they talk a lot and do little.

Continue with the FUD; when the games made by the developers who have only praised the PS5 come out, you will say "Sony paid for parity", just like Destiny 2, which could have been 60fps on the One X according to the delusions of the xbots.
 
Last edited:

Hendrick's

If only my penis was as big as my GamerScore!
“And any man who must say 'I am king' is no true king at all.”

~“And any man who must say 'fact' states no true facts at all.”

Xbox fanboys are like Phil Spencer: they talk a lot and do little.

Continue with the FUD; when the games made by the developers who have only praised the PS5 come out, you will say "Sony paid for parity", just like Destiny 2, which could have been 60fps on the One X according to the delusions of the xbots.
Well right now all we have are specs, so naturally that's what is being discussed. When the games come out, then we can talk about games.
 

Genx3

Member

RDNA 2 will be 50% more efficient
It appears that Big Navi will ditch the blower cooler. (Image source: AMD)
As a part of its Financial Analyst Day, AMD released more details about its RDNA 2 architecture coming later this year. In addition to ray tracing and variable rate shading support, AMD says RDNA 2 will deliver 50% more performance per watt than RDNA. AMD also teased Big Navi with a photo that looks an awful lot like a red and black founder's edition card.



continue with FUD

You should stop your warrior schtick.

AMD likes to exaggerate their figures.
If true that would definitely be good news for every future AMD GPU.
 
Last edited:

Deto

Banned
Well right now all we have are specs, so naturally that's what is being discussed. When the games come out, then we can talk about games.


The people who make the games have already spoken.

You should stop your warrior schtick.

AMD likes to exaggerate their figures.
If true that would definitely be good news for every future AMD GPU.

AMD and Sony exaggerate, but you don't.
 
Last edited:
Your posts comparing graphics are embarrassing. UC4 looks like a blurry POS on the Pro and a 4K TV, and what interactive environments are you even talking about? Gears 5 is in a different league when played on a big 4K TV, not when comparing YouTube videos: 4 main AI characters on screen, a ton of enemies, dynamic 4K, 60 fps, software ray tracing, etc. Going from Gears 5 to most Sony exclusives is like having a Vaseline filter on your 4K screen, and I actually have a Pro and an Xbox One X side by side, unlike you I would guess.
It still looks flat and the polygonal details are pared down; sorry, but even on the X the compromises are obvious.

I inserted two screenshots below; look at the detail in the rocks in both games (very few polygons), at the foliage in the trees and the vegetation, and more importantly at the lighting.

Gears 5 in the snow:
Gears-5-Screenshot-2019-09-02-13-04-00.png


The snow scene in TLoU (the first one):
the-last-of-us-remastered-ellie-hunting.jpg



Volumetric lighting (and overall scene detail; look at all the polygons in the sandbags).

Gears-5-Screenshot-2019-09-03-21-41-55.png


God of War (are your glasses working on PS4 games? Maybe you won't be able to see the details):
fe6RR7.png


Oh and here is some Uncharted 4 action:

20160603161941-22.jpg


Sorry, but confirmation bias is a thing, and you have this thing.
 

Sosokrates

Report me if I continue to console war
The performance of the PS5 will be entirely predictable and consistent on all PS5s.

Not if the dev wants consistent performance; it's impossible to have a locked level of performance when the clock speeds fluctuate.
 
It still looks flat and the polygonal details are pared down; sorry, but even on the X the compromises are obvious.

I inserted two screenshots below; look at the detail in the rocks in both games (very few polygons), at the foliage in the trees and the vegetation, and more importantly at the lighting.

Gears 5 in the snow:
Gears-5-Screenshot-2019-09-02-13-04-00.png


The snow scene in TLoU (the first one):
the-last-of-us-remastered-ellie-hunting.jpg



Volumetric lighting (and overall scene detail; look at all the polygons in the sandbags).

Gears-5-Screenshot-2019-09-03-21-41-55.png


God of War (are your glasses working on PS4 games? Maybe you won't be able to see the details):
fe6RR7.png


Oh and here is some Uncharted 4 action:

20160603161941-22.jpg


Sorry, but confirmation bias is a thing, and you have this thing.

Are these your own screens? Solid pics, I’m impressed.
 

rnlval

Member
What performance? Gears 5 is using aggressive dynamic resolution (from 1080p to 1800p) plus temporal reconstruction to reach 4K, plus a variable framerate. Native 4K is only there at 30fps in the cutscenes:


All hardware has limits. Some hardware limits are lower than others.
 

rnlval

Member

RDNA 2 will be 50% more efficient
It appears that Big Navi will ditch the blower cooler. (Image source: AMD)
As a part of its Financial Analyst Day, AMD released more details about its RDNA 2 architecture coming later this year. In addition to ray tracing and variable rate shading support, AMD says RDNA 2 will deliver 50% more performance per watt than RDNA. AMD also teased Big Navi with a photo that looks an awful lot like a red and black founder's edition card.


continue with FUD
For a 28nm example, the move from the R9 290X's 5.6 TFLOPS to the R9 Fury X's 8.6 TFLOPS was about a 53% perf/watt improvement. Too bad AMD didn't scale the Fury X's rasterizer hardware along with the 8.6 TFLOPS.
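Rough math behind that figure, using the commonly cited board powers (treat the TDPs as approximate):

# Back-of-the-envelope for the 290X -> Fury X perf/watt claim.
# TFLOPS as quoted above; board powers are the commonly cited TDPs (approximate).
r9_290x_tflops, r9_290x_tdp_w = 5.6, 290.0
fury_x_tflops,  fury_x_tdp_w  = 8.6, 275.0

flops_gain = fury_x_tflops / r9_290x_tflops                        # ~1.54x raw FLOPS
perf_per_watt_gain = (fury_x_tflops / fury_x_tdp_w) / (r9_290x_tflops / r9_290x_tdp_w)
print(f"{flops_gain:.2f}x FLOPS, {perf_per_watt_gain:.2f}x FLOPS/W")
# ~1.54x FLOPS and ~1.62x FLOPS/W if those TDPs hold, so "about 53%" is the
# conservative same-power reading of the FLOPS increase.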
 
Last edited:

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
My takeaway is that this is not a time-limited boost mode; you are free to run the CPU and GPU at max GHz at all times.

There are plenty of little moments though, milliseconds within a frame's budget, where there is no need for the maximum clock.

Sony's SmartShift implementation enables a cooler console by throttling down whenever possible.

So 10.3 TF it is.

He also states that keeping fewer but faster CUs busy is easier for developers to optimize for. Time will tell if it will look better.

You are like 95% correct. The only thing to add is that SmartShift isn't the same thing as the block in the PS5 that regulates power. SmartShift is for "additional" power. The power block in the PS5 is what enables the console to be cooler.
 

Panajev2001a

GAF's Pleasant Genius
Well, Uncharted 4 is already a decently blurry game even when matched to a proper display, because of its TAA. The game on the Pro is only 1440p, so not only are you getting blur because that's not even half the pixel count of the display's native resolution, the TAA compounds on top of that.

It doesn't look like shit or anything but it's really soft...

Scaling 1440p to a 4K display is not like scaling 560p to a 1080p one, unless you went for a giant 4K panel and sit super close. I have experience of Switch software running on a 4K TV, and not even the 1080p games look blurry and super soft.

Still, rendering 4K on a 4K panel is even better, yes, and yes, resolution is one of the many components making up a game's visual makeup for me. I would still hardly call that game blurry; that is an exaggeration for me.
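The scale factors make the point; this is just straight arithmetic on the two upscales being compared:

# Straight arithmetic on the two upscales being compared.
def scale_factors(src_h, dst_h, src_w=None, dst_w=None):
    # Assume 16:9 if widths aren't given.
    src_w = src_w or src_h * 16 // 9
    dst_w = dst_w or dst_h * 16 // 9
    linear = dst_h / src_h
    area = (dst_w * dst_h) / (src_w * src_h)
    return linear, area

print(scale_factors(1440, 2160))   # 1.5x linear, 2.25x in pixels
print(scale_factors(560, 1080))    # ~1.93x linear, ~3.7x in pixels
# 1440p -> 4K stretches each pixel far less than 560p -> 1080p did, which is
# why the former holds up much better at normal viewing distances.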
 

Ashoca

Banned
My takeaway is that this is not a time-limited boost mode; you are free to run the CPU and GPU at max GHz at all times.

There are plenty of little moments though, milliseconds within a frame's budget, where there is no need for the maximum clock.

Sony's SmartShift implementation enables a cooler console by throttling down whenever possible.

So 10.3 TF it is.

But what does this mean then:

if the game is doing power-intensive processing for a few frames, then it gets throttled.
and
Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core.

Why does it throttle down when there is a power-intensive scene? Why do devs have to throttle back the CPU to sustain 2.23 GHz on the GPU?

Source: https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive
 
It still looks flat and the polygonal details are pared down; sorry, but even on the X the compromises are obvious.

I inserted two screenshots below; look at the detail in the rocks in both games (very few polygons), at the foliage in the trees and the vegetation, and more importantly at the lighting.

Gears 5 in the snow:
Gears-5-Screenshot-2019-09-02-13-04-00.png


The snow scene in TLoU (the first one):
the-last-of-us-remastered-ellie-hunting.jpg



Volumetric lighting (and overall scene detail; look at all the polygons in the sandbags).

Gears-5-Screenshot-2019-09-03-21-41-55.png


God of War (are your glasses working on PS4 games? Maybe you won't be able to see the details):
fe6RR7.png


Oh and here is some Uncharted 4 action:

20160603161941-22.jpg


Sorry, but confirmation bias is a thing, and you have this thing.
Why in the hell did you post some images that basically prove absolutely nothing? I can post images of, say, Shadow of the Tomb Raider that kill every Sony exclusive you are trying to defend. Screenshot wars were a thing like 10 years ago; now, with everything that is going on in these games and the differences they have when being played in front of you (differences in image quality, frame rates, AI characters running around, HDR and its level of implementation, effects like ray tracing, etc.), the screenshot wars have died. There is a reason nobody compares games based on screenshots anymore. I mean, seriously? Should I really start posting screenshots of just about every AAA game to prove a point? You can see the polygons in the bags, and I can see the polygons even in your selected God of War screenshot, but these comparisons are ridiculous anyway.

There is a reason Sony never attempted to put out something with the enemy density, 4 friendly AI characters, and the rest of what Gears 5 has, and of course they completely neglect 60 fps. The Pro is about to start its engines and take off when trying to run these low-res, 30 fps Sony exclusives.
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
Why in the hell did you post some images that basically prove absolutely nothing? I can post images of, say, Shadow of the Tomb Raider that kill every Sony exclusive you are trying to defend. Screenshot wars were a thing like 10 years ago; now, with everything that is going on in these games and the differences they have when being played in front of you (differences in image quality, frame rates, AI characters running around, HDR and its level of implementation, effects like ray tracing, etc.), the screenshot wars have died. There is a reason nobody compares games based on screenshots anymore. I mean, seriously? Should I really start posting screenshots of just about every AAA game to prove a point? You can see the polygons in the bags, and I can see the polygons even in your selected God of War screenshot, but these comparisons are ridiculous anyway.

There is a reason Sony never attempted to put out something with the enemy density, 4 friendly AI characters, and the rest of what Gears 5 has, and of course they completely neglect 60 fps. The Pro is about to start its engines and take off when trying to run these low-res, 30 fps Sony exclusives.

You are regurgitating hyperbolic terms about a title because it is on the console you are championing (software ray tracing, like KZ: SF at the PS4's launch, huh?) while just shitting on the other console's titles unfairly... "kill every Sony exclusive", as if we were all teens at a playground :rolleyes:.
 

Kokoloko85

Member
Great post, but don't forget: the 360 had a 54% defect rate, people keep forgetting that. Had it not been for those high defect rates, the 360 would have sold the same 40 million consoles the Xbox One has. There is really not an 80 million base for Xbox outside of people rebuying their consoles at such an alarming rate. Lots of Xbox fans came into this gen thinking the Xbox One would have sold 80 million easily, but never factored in RROD.

Ah, I totally forgot about that, great valid point... I will add that next time. Seems like I'm gonna need to post it a few times for the PS5 haters lol.
 

GymWolf

Gold Member
Gears 5 is one of the most impressive, yet really not impressive, games of this gen.

Ultra-optimized for both console and PC, 60 frames even in some of the bigger sandbox areas, great HDR, high resolution and overall nice modern graphics, but it still wows me less than, let's say, games like The Order, Infamous Second Son or, as others said, Unchy 4, let alone stuff like TLOU2, Tsushima, Horizon, Metro Exodus, AC Origins, Days Gone, Spiderman, Ryse or the last Forza Horizon...

Can't even explain why, maybe it's the shitty art design, the same fucking armour that looks like children's toys, or the painfully old-feeling animations, or most of the faces...
I mean, look at the edgy problematic lead in G5 and Elena in U4: one looks almost like a person, the other looks like some space alien trying to imitate a human face with plasticine...

I don't know, maybe the fact that I'm very annoyed by the whole Gears formula has some weight in my judgement...
It's still a relatively impressive game all around; the performance is absolutely ACE.
 
Last edited:

ZywyPL

Banned
Gears 5 is one of the most impressive, yet really not impressive, games of this gen.

Ultra-optimized for both console and PC, 60 frames even in some of the bigger sandbox areas, great HDR, high resolution and overall nice modern graphics, but it still wows me less than, let's say, games like The Order, Infamous Second Son or, as others said, Unchy 4, let alone stuff like TLOU2, Tsushima, Horizon, Metro Exodus, AC Origins, Days Gone or the last Forza Horizon...

Can't even explain why, maybe it's the shitty art design, the same fucking armour that looks like children's toys, or most of the faces...
I mean, look at the lesbo problematic lead in G5 and Elena in U4: one looks almost like a person, the other looks like some space alien trying to imitate a human face with plasticine...

I don't know, maybe the fact that I'm very annoyed by the whole Gears formula has some weight in my judgement...
It's still a relatively impressive game all around.

Couldn't agree more. While technically Gears 5 and 4 are in the top league, they lack the artistic direction Sony's studios seem to have mastered to perfection. Same deal for Forza vs GT: technically they are more or less on par, but GT has always pushed for photorealism, whereas in Forza it doesn't matter whether it's steel, aluminium or carbon fibre, every car looks like it's made from cheap plastic... I can understand that MS studios have to cut corners in order to hit 60 FPS even at 4K, but still...
 

GymWolf

Gold Member
Couldn't agree more. While technically Gears 5 and 4 are in the top league, they lack the artistic direction Sony's studios seem to have mastered to perfection. Same deal for Forza vs GT: technically they are more or less on par, but GT has always pushed for photorealism, whereas in Forza it doesn't matter whether it's steel, aluminium or carbon fibre, every car looks like it's made from cheap plastic... I can understand that MS studios have to cut corners in order to hit 60 FPS even at 4K, but still...
I honestly prefer Forza and Driveclub, but I'm not really an expert on cars or driving games, and I've only seen 4K videos of them and a lot of GIFs and screenshots.
 