
Evolution of Graphics Technology

The post I replied to didn't mention any specific test lol, maybe you should learn to be a bit more specific. If your point is that a console with a low-level API will see better performance when a dev tailors their code to that hardware, compared to it running through a general API on PC, then you didn't say it in so many words. In the end, who cares? You're still going to get better performance on a good PC for most games during most of the console cycle.
These benchmarks were a huge deal, and this entire thread is based on the premise that consoles won't eventually outperform the 3080 as devs begin to fully tap 100% of console hardware, while ignoring older PC hardware once newer hardware exists. The test will become relevant again if Bethesda releases three games within a span of five years, each leapfrogging the last in terms of development cycles.

And as I mentioned here in previous replies, this was a controlled FM test meant to simulate whether or not consoles outperformed generationally better PC hardware, and whether they maintained a consistent, steady 24-27 FPS more reliably than PC, due to development costs/cycles/standards as dev time moved on.

I keep reiterating this because it will remain the relevant benchmark whenever anyone claims that console graphics cannot in fact exceed generationally better hardware as time moves forward and newer PC hardware is released. And the truth is the opposite: devs generally don't devote their time to ensuring a game is optimized across every hardware configuration on PC, whilst they know full well when console hardware is more than 50% underutilized, and it is far easier to hit that magic 99% utilization when devs decide to focus on console.
 

eot

Banned
This thread is about early-gen releases versus late-gen releases. The quality of the corresponding PC ports doesn't really say anything about that. If you want to say something about the relative performance of consoles and PC, then picking a game running on a broken ass engine, made by a company that doesn't optimize for shit, maybe isn't the best example anyway, and a single instance of a game running better on a certain platform doesn't necessarily show anything either.

For last gen, take a GPU like the GTX 980, which came out in 2014 (so fairly close to the PS4/Xbone launch); it will still run most games at 1080p just fine. The 360 was a completely different scenario, because when it came out it was arguably stronger than the PC GPUs of the time.
 

Spukc

always chasing the next thrill
TLOU2 looks amazing and breathtaking at times, but no.

[screenshots]
That is hands down the fucking worst PC example lol. That game is as shallow as its fake RTX puddles.
 

mrMUR_96

Member
He wants estimates from the people here that act like they know exactly what will happen.

You mention geometry. But geometry has to be rendered. If you have a game like Spider-Man MM that has to dumb down geometry/shading to render reflections, adding more geometry into the scene isn't going to change this, for example.
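As a side note, here's a minimal sketch of what that kind of cut can look like, assuming a renderer that swaps in a cheaper level of detail (LOD) whenever it shades a reflection pass; the names and thresholds are illustrative, not Insomniac's actual pipeline:

```python
# Minimal sketch: per-pass LOD selection, so reflection rays shade
# against cheaper geometry than the primary view. Illustrative only.

MESH_LODS = {0: "full_detail", 1: "reduced", 2: "reflection_proxy"}

def pick_lod(pass_type: str, distance: float) -> str:
    if pass_type == "reflection":
        # Ray-traced reflections pay traversal + shading cost per ray,
        # so always use the cheapest proxy regardless of distance.
        return MESH_LODS[2]
    # Primary view: standard distance-based LOD.
    return MESH_LODS[0] if distance < 20.0 else MESH_LODS[1]

# Adding more geometry raises the cost of BOTH branches; it doesn't
# remove the need to dumb things down for the reflection pass.
print(pick_lod("primary", 5.0))      # full_detail
print(pick_lod("reflection", 5.0))   # reflection_proxy
```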



"tons" more? He wants to know like what? UE5 demo showed the most advanced geometry streaming this new generation. And yet the GPU couldn't render that demo at native 4k/60FPS. The limit was reached even with the SSD. This game:

[gif]


doesn't look far from it, and it's not using the SSD. So the question is: what does "tons" more actually mean, quantitatively, for what's left to get out of the PS5? How much did Insomniac get out of the PS5 using its SSD and RT? 60%, 70%, 30%?

It's not that they are lying, it's that we aren't seeing it yet. Insomniac has done their due diligence on MM and we'd like to quantify that. If they haven't even scratched the surface of what the PS5 can do, performance-wise, why can't we see native 4K/60 FPS or better, more accurate ray tracing already?
VFXVeteran What game/demo is this gif from?
 
Slightly better? Maybe during the cutscenes. Far better? No way. TLOU2 in-game characters are way less detailed.

[TLOU2 screenshots]

Meanwhile..

[screenshots]
The thing is, the PS5 has far more computational power than the PS4, around ~6x. It can run circles around even PS4 realtime cutscenes.

Cyberpunk at max settings needs a GPU with nearly ~20x the performance of the PS4, and TLOU2's realtime cutscenes still outshine it.
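For scale, here's a quick back-of-the-envelope check of the multipliers in this thread against commonly cited peak FP32 figures; the PS3 number (used in a later reply) is the roughly ~0.23 TFLOPS usually quoted for its RSX GPU and is the fuzziest of the three:

```python
# Rough sanity check, assuming commonly cited peak FP32 figures in TFLOPS.
# These are theoretical peaks, not measured game performance.
PS3, PS4, PS5 = 0.23, 1.84, 10.28

print(f"PS5 / PS4: {PS5 / PS4:.1f}x")       # ~5.6x, close to the 'around 6x' claim
print(f"PS4 / PS3: {PS4 / PS3:.1f}x")       # ~8.0x
print(f"20x a PS4: {20 * PS4:.1f} TFLOPS")  # ~36.8 TF, above even an RTX 3090's ~35.6 TF
```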
 

regawdless

Banned
Cyberjunk is a massive open world game with tons of characters, impressive scale, architecture etc., while TLOU2 is way more limited in scale.

Difficult to really compare the two.
 
One seems to require nearly ~20x the performance to run, so one would expect it to smoke PS4 cutscene models. The PS4 is only about ~8x the performance of the PS3, yet in-game PS4 models smoke PS3 cutscene models for the most part.
 

JimboJones

Member
It doesn't really matter how good the characters look when their animations are so awful, mechanical and artificial.
TLOU2 is in another league in that respect.
That's true, although it's more of a game-specific issue vs "power".
I remember being disappointed in Mankind Divided in that respect; it looked great in screenshots, very next-gen for the time, but as soon as characters moved I was transported back to Human Revolution lol.

On the other hand, some games have great animation but take it too far; it looks great but feels like playing through sludge as each animation has to play out.
 

regawdless

Banned
Yeah, but the whole package is what counts. I don't think we should pick out only in-game models to prove a point while ignoring the rest, like dynamic time of day with realistic lighting vs baked static lighting, or very limited scale and character count vs a huge vertical open world.
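To make the baked-vs-dynamic distinction concrete, here's a minimal sketch of diffuse shading under each approach, with purely illustrative names and data (neither engine's actual code):

```python
# Toy contrast between baked and dynamic diffuse lighting.
# Names and data here are illustrative, not from any real engine.

def baked_diffuse(lightmap, u, v):
    # Static lighting: irradiance was precomputed offline, so shading is
    # just a texture lookup, however complex the bounce lighting was.
    return lightmap[v][u]

def dynamic_diffuse(normal, sun_dir, sun_color):
    # Dynamic time of day: the sun moves, so the Lambert term N.L must be
    # evaluated at runtime every frame (sun_dir: unit vector toward the sun).
    n_dot_l = max(0.0, sum(n * l for n, l in zip(normal, sun_dir)))
    return tuple(c * n_dot_l for c in sun_color)
```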

TLOU2 is great at what it does: extremely efficient thanks to an incredibly talented and experienced team, developed and designed for only one hardware architecture. It has been crafted with the limitations of the target hardware in mind, so everything has been adjusted to it.

That's what I'm saying: a comparison is pretty difficult for me because of how extremely different these games are in basically every area.
 