
Graphical Fidelity I Expect This Gen

hinch7

Member

Skimmed through it and it doesn't look all that impressive really. As in, not a big jump from Control. Plus it must be the console version, as there's lots of tearing and aliasing. Hopefully path tracing and RR in the PC version will make a big difference.

And watching him/her play was kinda painful. Why not wait and aim for better accuracy instead of just spamming and hitting nothing?
 
Last edited:

MidGenRefresh

*Refreshes biennially
I think mid gen refresh is playing the PC version in path tracing.

Correct.

Why? It's a last gen game with path tracing being compared to a next gen only game.

Calling Cyberpunk 2077 a last gen game is kind of missing the mark. The PS4/Xbox One version of Cyberpunk feels more like a demake than anything else. It's clear that the lead platform for this one was PC, not a specific console generation.
 

GymWolf

Gold Member
What game has hundreds of cars?
Or for that matter ones that are "constantly calculated"?
Like what?

The Matrix demo only did that as a flex, not as an actual use case.
I said dozens/hundreds between NPCs and cars, which is still way more than what a small game has to render.
 

CGNoire

Member
I said dozens/hundreds between NPCs and cars, which is still way more than what a small game has to render.
Even a couple dozen in game isn't really all that impressive. And they don't calculate that when you're not present. Idk, GTA 5 on PC has a shit ton of cars and NPCs on screen when modded and it doesn't seem to stress even old ass hardware.

I'm honestly not keen on this trend of hand waving away issues because "it's open world", when almost every big production has been open world for more than a generation now. Readily available streaming tech and the discipline around it have matured and been around for quite some time now. Open world is far more of a budget issue than a technical one today.
 

GymWolf

Gold Member
Even a couple dozen in game isn't really all that impressive. And they don't calculate that when you're not present. Idk, GTA 5 on PC has a shit ton of cars and NPCs on screen when modded and it doesn't seem to stress even old ass hardware.

I'm honestly not keen on this trend of hand waving away issues because "it's open world", when almost every big production has been open world for more than a generation now. Readily available streaming tech and the discipline around it have matured and been around for quite some time now. Open world is far more of a budget issue than a technical one today.
I'm not hand waving, I've asked for more physics in video games since forever.

I'm trying to understand the reason why it could be heavy.

You still have to remember that these consoles are trash by 2024 standards.
 

GymWolf

Gold Member


Alan Wake 2 was built as a 30fps experience aimed at pushing visuals and ambiance. Music to my ears.

Lol, it is not, or it would not have a 60 fps mode, dude.

That only means they are not using the 30 fps to render more physics or destruction or anything next gen, because this stuff also needs to be present in the 60 fps mode.

The 30 fps mode is gonna be the 60 fps mode with better res, lights and maybe some extra textures here and there.

You'll recognise a 30 fps only game from a big studio when we finally see one towards the end of the console life span.
 
Last edited:

Lethal01

Member
Lol, it is not, or it would not have a 60 fps mode, dude.

That only means they are not using the 30 fps to render more physics or destruction or anything next gen, because this stuff also needs to be present in the 60 fps mode.

You can have next gen effects that can be scaled back.
 

CGNoire

Member
I'm not hand waving, I've asked for more physics in video games since forever.

I'm trying to understand the reason why it could be heavy.

You still have to remember that these consoles are trash by 2024 standards.
I know, you and me are on the same page. You're my physics brother in arms :). I just get triggered by seeing open world brought up so often.

They are for sure. But it's not like they "can't" do it...they just simply refuse to. :(
 
Last edited:

GymWolf

Gold Member
You can have next gen effects that can be scaled back.
Some, not all.

If 30 fps lets you put more enemies on screen, more destruction, more geometry, more interaction etc., you can't dial this stuff down because it is part of the core gameplay and presentation, and we know this stuff is tied to framerate. Just look at the last Zelda: every time there are explosions/destructions, every time you use Ultrahand or have more companions around you, the framerate tanks.

You can scale down graphic effects like shadows, lights, textures etc., not core elements that make the game what it is.

Would you be happy with a Control 2 60 fps mode that had no destruction and half the enemies on screen? I don't think so, especially if destruction is used for gameplay purposes (I don't have to explain how fewer enemies would change the encounters because there is no need to).

It would just be a different game.
 
Last edited:

Lethal01

Member
Some, not all.

It's very game dependent. Some methods of doing graphical effects also utilize the CPU; I really wouldn't be surprised if their lighting system utilized the CPU and was scalable.

Not saying that's how it is, just that it wouldn't be anything surprising.
 

GymWolf

Gold Member
It's very game dependent. Some methods of doing graphical effects also utilize the CPU; I really wouldn't be surprised if their lighting system utilized the CPU and was scalable.

Not saying that's how it is, just that it wouldn't be anything surprising.
Like I said, you can scale down many things, but not core elements of the game that are there in the first place BECAUSE they made the game around 30 fps.

RTX is not a core gameplay element. No matter how many people are gonna tell me that seeing the reflection of an enemy in a pane of glass saved their life thousands of times during fast paced shooters, we know it's highly situational at best and fucking bullshit of the highest order at worst :lollipop_grinning_sweat:
 
Last edited:

Lethal01

Member
Like I said, you can scale down many things, but not core elements of the game that are there in the first place BECAUSE they made the game around 30 fps.

Sure, but that doesn't change the fact that they may have used the fact that they are aiming for 30fps to push the visuals rather than focusing on gameplay-affecting AI or physics. Visuals that they then find they can scale down enough to hit 60fps.
 

PeteBull

Member
You'll recognise a 30 fps only game from a big studio when we finally see one towards the end of the console life span.
U can tell right away the Matrix demo targets 30fps (and is well below it quite often), and the same way u can tell it's next gen af, so 100% agree. Or like on PS4 u had The Order: 1886, 30fps with black bars, a corridor shooter, just to make sure all the hardware power went into bells and whistles.
 

GymWolf

Gold Member
Sure, but that doesn't change the fact that they may have used the fact that they are aiming for 30fps to push the visuals rather than focusing on gameplay-affecting AI or physics. Visuals that they then find they can scale down enough to hit 60fps.
Maybe you are right, but to me AW2 doesn't look nearly good enough to be using the 30 fps just to pump the graphics; it looks like yet another great looking cross-gen game.

It doesn't look like a 30 fps only game like The Order or TLOU2 were on PS4, if that makes sense.

I need to see something that absolutely smashes stuff like HFW (especially from a small, linear game vs an open world game) and I'm absolutely not seeing that with AW2.
 
Last edited:

MidGenRefresh

*Refreshes biennially
A few more from Dogtown.

yG0acN6.jpg


OnxiHns.jpg


NX7Zq7B.jpg


CC7kNKQ.jpg


km9fsI8.jpg


VaWoAda.jpg


dwj3uKo.jpg


fsXHi0n.jpg
 

Luipadre

Gold Member
Honestly, those screenshots are quite incredible.

But just to put it in perspective, here are some Spider-Man 2 comparisons.
QNX5lCX.jpg


FW5BSH9.jpg


dnooaWv.jpg

IcYEskg.jpg


dKISUns.jpg

peterparkernewface.jpg


Insomniac needs to throw this engine in the trash. And then set that trash on fire. And finally scatter the ashes in the Mississippi River. Then nuke Mississippi just to be sure.

If Sony doesn't want to pay Epic the insane 12% cut they are asking for, then go and make Decima the go-to engine for ALL Sony studios. Do what EA did with Frostbite, but instead make it an engine that EVERYONE at Sony can modify and improve. I loved what KojiPro did with the Decima engine. They were doing their own commits instead of simply borrowing the engine; they helped develop it. And looking at how beautiful the cinematics looked in HFW, it's obvious that GG benefited from KojiPro's changes. I want ND, SSM, Insomniac, Sucker Punch and Bluepoint ALL working together to make Decima as good as, if not better than, UE5. I want ND's motion matching, Insomniac's ray traced reflections, GG's character models, and Sucker Punch's foliage all in SSM's next game.

Use the PS5 geometry engine, primitive shaders, and all that IO to maximize its potential. Get something like Nanite in there. It is obvious that after three years, Sucker Punch and ND are not even close to showing their next games because of all the backend work needed to get each engine these so-called next gen features. No, have ONE engine, so that when one team implements ray traced reflections or realtime GI or physics, everyone gets it automatically.

CD Projekt has realized that despite having a great engine, the time needed to get these features in is time that could've been spent on making the actual game. Let's hope Sony devs realize this.

Cyberpunk was made for $10k PCs. Spider-Man was made for a $400 console. Spider-Man 2 and Insomniac's engine look amazing, especially when you consider it runs at this fidelity and fps on $400 hardware.
 

alloush

Member
Skimmed through it and it doesn't look all that impressive really. As in, not a big jump from Control. Plus it must be the console version, as there's lots of tearing and aliasing. Hopefully path tracing and RR in the PC version will make a big difference.

And watching him/her play was kinda painful. Why not wait and aim for better accuracy instead of just spamming and hitting nothing?
Yeah I wasn’t that impressed by that clip either. And I think it was running on PC cuz you can clearly see the mouse cursor when they die and restart the mission. For some reason in the other clips the game looked more refined.
 

GymWolf

Gold Member
Aw2 combat look
Honestly, those screenshots are quite incredible.

But just to put it in perspective, here are some Spider-Man 2 comparisons.
QNX5lCX.jpg


FW5BSH9.jpg


dnooaWv.jpg

IcYEskg.jpg


dKISUns.jpg

peterparkernewface.jpg


Insomniac needs to throw this engine in the trash. And then set that trash on fire. And finally scatter the ashes in the Mississippi River. Then nuke Mississippi just to be sure.

If Sony doesn't want to pay Epic the insane 12% cut they are asking for, then go and make Decima the go-to engine for ALL Sony studios. Do what EA did with Frostbite, but instead make it an engine that EVERYONE at Sony can modify and improve. I loved what KojiPro did with the Decima engine. They were doing their own commits instead of simply borrowing the engine; they helped develop it. And looking at how beautiful the cinematics looked in HFW, it's obvious that GG benefited from KojiPro's changes. I want ND, SSM, Insomniac, Sucker Punch and Bluepoint ALL working together to make Decima as good as, if not better than, UE5. I want ND's motion matching, Insomniac's ray traced reflections, GG's character models, and Sucker Punch's foliage all in SSM's next game.

Use the PS5 geometry engine, primitive shaders, and all that IO to maximize its potential. Get something like Nanite in there. It is obvious that after three years, Sucker Punch and ND are not even close to showing their next games because of all the backend work needed to get each engine these so-called next gen features. No, have ONE engine, so that when one team implements ray traced reflections or realtime GI or physics, everyone gets it automatically.

CD Projekt has realized that despite having a great engine, the time needed to get these features in is time that could've been spent on making the actual game. Let's hope Sony devs realize this.
What is so special about Sucker Punch foliage? You included them, but not Bend, who made a better looking game on their first try on PS4?

Bad slimy
spraying-squirting-water.gif


Shit, you could have said Sucker Punch particles, since some effects in inf3 still look great today (more precisely, absorbing neon and the smoke shotgun).
 
Last edited:

alloush

Member
Everyone wants better animation. It goes without saying, and GymWolf and I adore Euphoria for that reason.
Yeah, I have a very soft spot for animations, being the sports gamer that I am. I feel like animations always get overlooked; when was the last time we saw improvements made to animations? EA FC (or the new FIFA), despite the advanced tech they are using, has very janky animations to the point where I cannot play the game anymore. Old futbol titles had better animations despite the old tech. That's why I freakin love RDR2.
 

SlimySnake

Flashless at the Golden Globes
Aw2 combat look

What is so special about Sucker Punch foliage? You included them, but not Bend, who made a better looking game on their first try on PS4?

Bad slimy
spraying-squirting-water.gif


Shit, you could have said Sucker Punch particles, since some effects in inf3 still look great today (more precisely, absorbing neon and the smoke shotgun).
This was pre-HFW but they had interactive foliage down in Ghost of Tsushima.

yYIqgHg.gif


benzy made this gif and it honestly blew me away back then.

They also had tall foliage with different kinds of plants and even had weather physics applied to them.

4ffa77533224e4ed906f2ac1a4d635fb9c15e7ee.gif

ianjefo5rrcludku4da1.gif


Bend's crowd tech is very impressive and it would be great if other studios like ND could leverage it for TLOU3.
 

SlimySnake

Flashless at the Golden Globes
Correct.



Calling Cyberpunk 2077 a last gen game is kind of missing the mark. The PS4/Xbox One version of Cyberpunk feels more like a demake than anything else. It's clear that the lead platform for this one was PC, not a specific console generation.
Yeah, but the PS4 Pro, X1X and Series S versions could run the game just fine. I would say it's more of a PS4.5 game than anything. You can also look at the VRAM requirements to get a good idea of what they were aiming for. Cyberpunk in 2020 used around 5 GB on PC. Only ray tracing pushed it higher, to around 7 GB. Now path tracing pushes it above 12 GB, but the game's assets in 2020 were designed with last gen hardware in mind.

The PS4 could've run this game at 720p. CD Projekt just didn't even bother optimizing those versions.

Cyberpunk was made for $10k PCs. Spider-Man was made for a $400 console. Spider-Man 2 and Insomniac's engine look amazing, especially when you consider it runs at this fidelity and fps on $400 hardware.
Nah, Cyberpunk runs just fine on $400 consoles like the PlayStation 5. It was designed with those consoles in mind; the path tracing mode is just a bonus. We have discussed this a million times in this thread. The game was designed with baked lighting in mind. It looks great even with RTGI off.

And nothing about Insomniac's engine looks amazing. The Matrix is running on the same hardware and looks amazing. Even better than Cyberpunk on $10k PCs (it's actually more like $3k, but I will allow the hyperbole since I'm a big fan of hyperbole myself).

03_gameplay.jpg

i4BRsja.jpg


92jxhif.gif

VlQchtR.gif
 

yamaci17

Member
Yeah, but the PS4 Pro, X1X and Series S versions could run the game just fine. I would say it's more of a PS4.5 game than anything. You can also look at the VRAM requirements to get a good idea of what they were aiming for. Cyberpunk in 2020 used around 5 GB on PC. Only ray tracing pushed it higher, to around 7 GB. Now path tracing pushes it above 12 GB, but the game's assets in 2020 were designed with last gen hardware in mind.

The PS4 could've run this game at 720p. CD Projekt just didn't even bother optimizing those versions.


Nah, Cyberpunk runs just fine on $400 consoles like the PlayStation 5. It was designed with those consoles in mind; the path tracing mode is just a bonus. We have discussed this a million times in this thread. The game was designed with baked lighting in mind. It looks great even with RTGI off.

And nothing about Insomniac's engine looks amazing. The Matrix is running on the same hardware and looks amazing. Even better than Cyberpunk on $10k PCs (it's actually more like $3k, but I will allow the hyperbole since I'm a big fan of hyperbole myself).

03_gameplay.jpg

i4BRsja.jpg


92jxhif.gif

VlQchtR.gif
I mean, I can get Cyberpunk with path tracing at 1440p DLSS Balanced (looks better than native 1080p) and get around 30 fps average (if not for VRAM limitations, it would likely never drop below that) on my aging 3070, which I got for 500 bucks MSRP. Got my whole rig for around 850 bucks when I sold my older parts. It is 80% there; it only lacks the crispness of higher resolutions, but it looks good. Look at how FF XVI or Forspoken made a joke out of the PS5, for example. At least DLSS produces good results at 1440p upscaling; can't say the same for FSR or other regular old upscalers.

It is really hyperbole imo. When it comes to pushing graphical fidelity, 30 FPS on consoles is considered acceptable; why is it not acceptable on something midrange like a 3070 too? I find upwards of 30 fps playable with Reflex on PC, so it will always be a win in my book. Just assume you have a console and you play in the highest fidelity mode. So I don't get the "playable framerates" argument; it depends on the user. And, at the end of the day, I paid around 850 bucks for my PC, right? So a PS5 running RT shadows at 30 fps with 1200p-1400p resolution averages is impressive, but paying only 1.7x the money and getting playable PATH tracing at around 30 fps with a better upscaler and better image quality is not? The path tracing and DLSS definitely add more than 2x to the game.

On top of that, here's what my 500 bucks investment in the 3070 has allowed me to experience over the past 3 years:

- DLDSR (makes playing old games even more worthwhile; a very good anti-aliaser by itself)
- DLSS (peak reconstruction, DLSS tweaks, being able to run at DSR 4K + DLSS Performance and getting massive visual improvements over native 1080p while paying a small price in performance. The DLDSR+DLSS combo is also amazing; I've used that combo in countless games)
- Half-Life ray tracing, Quake ray tracing, and the promise and potential of all those great old games getting the ray tracing treatment. They're games I will play to eternity. It is just something that a console won't do (not saying can't do, there's no modding support; if Half-Life were to get an official ray tracing treatment by Valve released on consoles, they'd charge 40 bucks for it)
- Cyberpunk path tracing. I can get 30+ FPS with good image quality. That's all I need. I can go in, drive a bike or walk around, then disable it and play the game; I can do both. The card allows me to experience and visit the city in that respect. Sure, it all goes down the drain in combat; it is clear there are limitations.
- Witcher 3 RTGI + RT reflections, the whole suite. It looked so gorgeous, and revisiting that game with those visuals was a boon. I was able to get away with a locked 40 fps experience. Granted, consoles also run it with similar RTGI, but it can look quite blurry, and that is where the crucial advantage of DLSS comes into play.
- Metro Exodus RTGI + PhysX + HairWorks. They all mesh together so well. I was so grateful I had other reasons for delaying Metro Exodus for so long; I lucked out and the Enhanced Edition was my first experience.
- NVIDIA Reflex, which practically makes all games less laggy by default and lets you enjoy low lag at any framerate. The INPUT LAG I get at 35 FPS GPU bound with Reflex is most likely less than in a console game running at locked 60 FPS with vsync (yet the latter doesn't get complaints but the former does. Amazing, right?)
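(Back-of-the-envelope illustration of that last point, with made-up but typical queue depths, not measurements:)
Code:
# rough, illustrative latency math -- frame times and buffering only, display lag ignored
def pipeline_latency_ms(fps, frames_buffered):
    # latency grows with how many frames sit between your input and the screen
    return (1000.0 / fps) * frames_buffered

# GPU-bound ~35 fps with Reflex keeping the render queue short (~1 frame in flight)
print(pipeline_latency_ms(35, 1.0))   # ~29 ms
# locked 60 fps with vsync and an assumed 2-3 frame buffered pipeline
print(pipeline_latency_ms(60, 2.5))   # ~42 ms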


I might sound like a shill (I really am not, and I hate NVIDIA with all my guts due to VRAM), but genuinely, I still think it was a worthwhile investment. For me, it is better to pay 500 bucks and get a great experience and something NEW within the first 3-4 years rather than paying 500 bucks, then paying subscription money for years, and getting something that is great 6+ years later. I'm not saying the consoles are bad either; they're a good investment too. In the end, when I total up the benefits I got from my 850 bucks investment, I'm just happy.

After seeing how much good stuff I can get out of PC and new tech, I will simply wait for a viable, affordable 16 GB NVIDIA card that doesn't ask for 1200 bucks. I would gladly pay 700 bucks for a 16 GB 5070 at this point. PC simply hits different, and a potential 16 GB 5070 would most likely last you upwards of 6 years (which is why NVIDIA is hesitant to give 16 GB to midrange products. But I will bide my time, and still reap the benefits of what I have).

I can even see why NVIDIA is so hesitant about giving a lot of VRAM to these cards: you already get insane value out of them, and if they had ample amounts of VRAM, we would use them till like 2030. It is sad but it is what it is. I hate their practices but I also cannot deny how much utility the 2000/3000 cards have in general. I wouldn't get or suggest a 4060 or 4070 to anyone though; I will simply wait for the 5070 at this point. If it is packed with 12 GB too, I will rest easy with my 8 GB (knowing that developers will do enough of a job to make sure their games run on 12 GB with ray tracing at 1440p, which means I will still be able to get away with 8 GB at a lower resolution with tweaked settings lol).
 
Last edited:

GermanZepp

Member
Honestly, those screenshots are quite incredible.

But just to put it in perspective, here are some Spider-Man 2 comparisons.
QNX5lCX.jpg


FW5BSH9.jpg


dnooaWv.jpg

IcYEskg.jpg


dKISUns.jpg

peterparkernewface.jpg


Insomniac needs to throw this engine in the trash. And then set that trash on fire. And finally scatter the ashes in the Mississippi River. Then nuke Mississippi just to be sure.

If Sony doesn't want to pay Epic the insane 12% cut they are asking for, then go and make Decima the go-to engine for ALL Sony studios. Do what EA did with Frostbite, but instead make it an engine that EVERYONE at Sony can modify and improve. I loved what KojiPro did with the Decima engine. They were doing their own commits instead of simply borrowing the engine; they helped develop it. And looking at how beautiful the cinematics looked in HFW, it's obvious that GG benefited from KojiPro's changes. I want ND, SSM, Insomniac, Sucker Punch and Bluepoint ALL working together to make Decima as good as, if not better than, UE5. I want ND's motion matching, Insomniac's ray traced reflections, GG's character models, and Sucker Punch's foliage all in SSM's next game.

Use the PS5 geometry engine, primitive shaders, and all that IO to maximize its potential. Get something like Nanite in there. It is obvious that after three years, Sucker Punch and ND are not even close to showing their next games because of all the backend work needed to get each engine these so-called next gen features. No, have ONE engine, so that when one team implements ray traced reflections or realtime GI or physics, everyone gets it automatically.

CD Projekt has realized that despite having a great engine, the time needed to get these features in is time that could've been spent on making the actual game. Let's hope Sony devs realize this.
CP2077 looks phenomenal. In your opinion, what are the cons of all studios working with the same engine?
 

SlimySnake

Flashless at the Golden Globes
CP2077 looks phenomenal. In your opinion, what are the cons of all studios working with the same engine?
If you mean cons, then Frostbite comes to mind. BioWare and other studios really struggled to get that first person shooter engine up and running for their third person RPG games. IIRC, it took the FIFA and Madden teams years of hard work to get Frostbite just right, so there are always pros and cons.

That said, these studios have all benefited from the move to Frostbite. Anthem, despite the downgrade, looks stunning. FIFA is now using hair tech that DICE created. And not all devs had issues with it: NFS 2015 still looks amazing and was one of the first games to use Frostbite.
 

SlimySnake

Flashless at the Golden Globes
I mean, I can get Cyberpunk with path tracing at 1440p DLSS Balanced (looks better than native 1080p) and get around 30 fps average (if not for VRAM limitations, it would likely never drop below that) on my aging 3070, which I got for 500 bucks MSRP. Got my whole rig for around 850 bucks when I sold my older parts. It is 80% there; it only lacks the crispness of higher resolutions, but it looks good. Look at how FF XVI or Forspoken made a joke out of the PS5, for example. At least DLSS produces good results at 1440p upscaling; can't say the same for FSR or other regular old upscalers.

It is really hyperbole imo. When it comes to pushing graphical fidelity, 30 FPS on consoles is considered acceptable; why is it not acceptable on something midrange like a 3070 too? I find upwards of 30 fps playable with Reflex on PC, so it will always be a win in my book. Just assume you have a console and you play in the highest fidelity mode. So I don't get the "playable framerates" argument; it depends on the user. And, at the end of the day, I paid around 850 bucks for my PC, right? So a PS5 running RT shadows at 30 fps with 1200p-1400p resolution averages is impressive, but paying only 1.7x the money and getting playable PATH tracing at around 30 fps with a better upscaler and better image quality is not? The path tracing and DLSS definitely add more than 2x to the game.

On top of that, here's what my 500 bucks investment in the 3070 has allowed me to experience over the past 3 years:

- DLDSR (makes playing old games even more worthwhile; a very good anti-aliaser by itself)
- DLSS (peak reconstruction, DLSS tweaks, being able to run at DSR 4K + DLSS Performance and getting massive visual improvements over native 1080p while paying a small price in performance. The DLDSR+DLSS combo is also amazing; I've used that combo in countless games)
- Half-Life ray tracing, Quake ray tracing, and the promise and potential of all those great old games getting the ray tracing treatment. They're games I will play to eternity. It is just something that a console won't do (not saying can't do, there's no modding support; if Half-Life were to get an official ray tracing treatment by Valve released on consoles, they'd charge 40 bucks for it)
- Cyberpunk path tracing. I can get 30+ FPS with good image quality. That's all I need. I can go in, drive a bike or walk around, then disable it and play the game; I can do both. The card allows me to experience and visit the city in that respect. Sure, it all goes down the drain in combat; it is clear there are limitations.
- Witcher 3 RTGI + RT reflections, the whole suite. It looked so gorgeous, and revisiting that game with those visuals was a boon. I was able to get away with a locked 40 fps experience. Granted, consoles also run it with similar RTGI, but it can look quite blurry, and that is where the crucial advantage of DLSS comes into play.
- Metro Exodus RTGI + PhysX + HairWorks. They all mesh together so well. I was so grateful I had other reasons for delaying Metro Exodus for so long; I lucked out and the Enhanced Edition was my first experience.
- NVIDIA Reflex, which practically makes all games less laggy by default and lets you enjoy low lag at any framerate. The INPUT LAG I get at 35 FPS GPU bound with Reflex is most likely less than in a console game running at locked 60 FPS with vsync (yet the latter doesn't get complaints but the former does. Amazing, right?)


I might sound like a shill (I really am not, and I hate NVIDIA with all my guts due to VRAM), but genuinely, I still think it was a worthwhile investment. For me, it is better to pay 500 bucks and get a great experience and something NEW within the first 3-4 years rather than paying 500 bucks, then paying subscription money for years, and getting something that is great 6+ years later. I'm not saying the consoles are bad either; they're a good investment too. In the end, when I total up the benefits I got from my 850 bucks investment, I'm just happy.

After seeing how much good stuff I can get out of PC and new tech, I will simply wait for a viable, affordable 16 GB NVIDIA card that doesn't ask for 1200 bucks. I would gladly pay 700 bucks for a 16 GB 5070 at this point. PC simply hits different, and a potential 16 GB 5070 would most likely last you upwards of 6 years (which is why NVIDIA is hesitant to give 16 GB to midrange products. But I will bide my time, and still reap the benefits of what I have).

I can even see why NVIDIA is so hesitant about giving a lot of VRAM to these cards: you already get insane value out of them, and if they had ample amounts of VRAM, we would use them till like 2030. It is sad but it is what it is. I hate their practices but I also cannot deny how much utility the 2000/3000 cards have in general. I wouldn't get or suggest a 4060 or 4070 to anyone though; I will simply wait for the 5070 at this point. If it is packed with 12 GB too, I will rest easy with my 8 GB (knowing that developers will do enough of a job to make sure their games run on 12 GB with ray tracing at 1440p, which means I will still be able to get away with 8 GB at a lower resolution with tweaked settings lol).
I think your ability to settle for 30 fps on PC does put things into perspective. I just couldn't. I tried playing Star Wars at 40 fps but the constant stuttering ruined that too.

I do agree that you don't necessarily need $3k PCs for PC gaming. In fact, I got a lot of shit for posting a link to the 6600 XT selling for $189 last month. You could actually build a ten tflops PC with a decent CPU for under $500 now. I think path tracing is basically a slideshow at 30 fps. Especially in Cyberpunk, which just doesn't feel smooth at 30 or 40 fps. While driving in third person it's fine; I go into FPS mode and it's a disaster. Starfield feels much smoother with variable framerates.

I think Alan Wake 2 will be a better test. That's a game that should be perfectly playable at 30 fps on our 3070s and 3080s.
 

GymWolf

Gold Member
So a mod cancelled my message where I said that I was an animation W word?

Is this globally used term not allowed here anymore?

Wtf?

Ok...I'm an animation woman that takes money in exchange for sex.
 
Last edited:

GymWolf

Gold Member
This was pre-HFW but they had interactive foliage down in Ghost of Tsushima.

yYIqgHg.gif


benzy made this gif and it honestly blew me away back then.

They also had tall foliage with different kinds of plants and even had weather physics applied to them.

4ffa77533224e4ed906f2ac1a4d635fb9c15e7ee.gif

ianjefo5rrcludku4da1.gif


Bend's crowd tech is very impressive and it would be great if other studios like ND could leverage it for TLOU3.
I think you had the movable grass effect in Crysis already, and they debunked the physics thing for the windy grass; it's just an algorithm, not really physics based.

That game just had incredible art design and use of colors; tech wise it was ok.
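(For anyone curious, that kind of "algorithm, not physics" sway usually boils down to something like this little sketch: made-up parameter names, obviously not Sucker Punch's actual code.)
Code:
import math

def grass_sway_offset(pos_x, pos_z, time_s, wind_dir=(1.0, 0.0), strength=0.15, speed=1.2):
    # phase comes from world position so neighbouring blades don't move in lockstep
    phase = pos_x * wind_dir[0] + pos_z * wind_dir[1]
    # one sine per blade: no forces, no collisions, which is why it's basically free
    sway = math.sin(time_s * speed + phase) * strength
    return (sway * wind_dir[0], sway * wind_dir[1])  # offset applied to the blade tip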
 

SlimySnake

Flashless at the Golden Globes
I think you had the movable grass effect in Crysis already, and they debunked the physics thing for the windy grass; it's just an algorithm, not really physics based.

That game just had incredible art design and use of colors; tech wise it was ok.
I see. Well then I want Sony studios using that algorithm instead of coming up with their own tech. That's what I was trying to say.

Imagine Driveclub weather in all Sony games.
 

yamaci17

Member
I think your ability to settle for 30 fps on PC does put things into perspective. I just couldn't. I tried playing Star Wars at 40 fps but the constant stuttering ruined that too.

I do agree that you don't necessarily need $3k PCs for PC gaming. In fact, I got a lot of shit for posting a link to the 6600 XT selling for $189 last month. You could actually build a ten tflops PC with a decent CPU for under $500 now. I think path tracing is basically a slideshow at 30 fps. Especially in Cyberpunk, which just doesn't feel smooth at 30 or 40 fps. While driving in third person it's fine; I go into FPS mode and it's a disaster. Starfield feels much smoother with variable framerates.

I think Alan Wake 2 will be a better test. That's a game that should be perfectly playable at 30 fps on our 3070s and 3080s.
That's rather interesting. What do your vsync settings look like? Did you force vsync through NVCP, for example (some guides recommend doing this but it might cause frametime disruptions in certain titles), or in game? I let the 3D application decide for Cyberpunk in NVCP, vsync off in game, Reflex ON (not Boost), and just VRR on top of it. Could some of the generally recommended settings be causing disruption with this game?

Jedi Survivor, that one is definitely something else. With those stutters you can get 130 fps from a 60 fps baseline with DLSS 3 and still get a bad experience. That actually puts things into perspective too.
 

GymWolf

Gold Member
I see. Well then I want Sony studios using that algorithm instead of coming up with their own tech. That's what I was trying to say.

Imagine Driveclub weather in all Sony games.
I made a similar post some days ago but then I scratched the whole thing because it sounded too absurd.

A reunion between all the big studios to share their best tech:

Digital acting from ND
Ubisoft motion matching
Rockstar Euphoria
Cyberpunk RTX
Etc.

Sony has the ICE Team but they don't share jack shit, I swear...
 
Last edited:

SlimySnake

Flashless at the Golden Globes
You know, the B word...
lol you can’t call women bitches on an online forum.

Only Jim Ryan and Phil Spencer.

That's rather interesting. What do your vsync settings look like? Did you force vsync through NVCP, for example (some guides recommend doing this but it might cause frametime disruptions in certain titles), or in game? I let the 3D application decide for Cyberpunk in NVCP, vsync off in game, Reflex ON (not Boost), and just VRR on top of it. Could some of the generally recommended settings be causing disruption with this game?

Jedi Survivor, that one is definitely something else. With those stutters you can get 130 fps from a 60 fps baseline with DLSS 3 and still get a bad experience. That actually puts things into perspective too.
Yeah, vsync off in game and turned on in NVCP with fps capped to 60. It just feels choppy in first person view.
 

PeteBull

Member
Anthem, despite the downgrade, looks stunning.
Gotta disagree here, it's solid looking for a last gen game, but even Spiderman 2, so disliked by u, looks visibly better, and even Starfield does at points too xD

Here, for fun, is DF analy(s)in(g) the supposed E3 demo running on Xbox One X. Ofc later we found out it was a full-on CGI trailer made by a completely different team just for the E3 show, and the actual game only started being made to match what was in the bullshot (full CGI, not even in-engine aka cutscenes) trailer =D
 

ChiefDada

Gold Member
Tech integration really isn't that easy, folks. You can't just pick up code from Sucker Punch/GoT and plop it into a Forbidden West, etc. Different systems can break each other.
 

GymWolf

Gold Member
lol you can’t call women bitches on an online forum.

Only Jim Ryan and Phil Spencer.


Yeah, vsync off in game and turned on in NVCP with fps capped to 60. It just feels choppy in first person view.
But I was calling myself a birch.

Saying that you are a something-whore has been a thing since forever.

You never heard the term graphics W?
 

GymWolf

Gold Member
Tech integration really isn't that easy, folks. You can't just pick up code from Sucker Punch/GoT and plop it into a Forbidden West, etc. Different systems can break each other.
That's why I scratched the post.

But they don't share anything at all...

Like Days Gone had legit worse melee animations than TLOU1 on PS3; maybe someone could have helped them fix the jerky animations.

Tsushima's face animations were meh, why not get some help from ND etc.

Why not help Insomniac with their human models...

Why do they even have an ICE Team to begin with?
 
Last edited:

yamaci17

Member
lol you can’t call women bitches on an online forum.

Only Jim Ryan and Phil Spencer.


Yeah, vsync off in game and turned on in NVCP with fps capped to 60. It just feels choppy in first person view.
Try disabling both in game and external vsync, don't use a frame limiter, and see if it improves.
 

GymWolf

Gold Member
Try disabling both in game and external vsync, don't use a frame limiter, and see if it improves.
If he has a G-Sync panel he needs vsync on.

Boy do I love PC gaming, where NOTHING is a certainty and noobs like me get confused by people saying everything and the opposite of everything :lollipop_grinning_sweat:
 
Last edited:

yamaci17

Member
If he has a G-Sync panel he needs vsync on.

Boy do I love PC gaming, where NOTHING is a certainty and noobs like me get confused by people saying everything and the opposite of everything :lollipop_grinning_sweat:
Yes, that is true, that's the general recommendation.

But some games have broken internal timing when you enforce vsync from outside, even though the NV driver is vsync aware.

99% of the time the VRR + vsync combo will work like it is supposed to; there will be outliers.

Also, there really is no hard requirement for G-Sync/FreeSync/VRR to have vsync alongside them. Vsync handles:

1) rogue frames that go over the VRR range
2) practically stopping the renderer from going over the VRR range

Now, yes, 1) is unpredictable and vsync is a surefire way to eliminate that. In my case, I get 35-45 fps in Cyberpunk, and rogue frames have no chance of breaching my 0-144 Hz VRR range (LFC), so in the case of Cyberpunk I don't use vsync.

Actually my take is even weirder: I never globally force vsync on. I never use vsync.

I only enable vsync if I see tearing. If I don't, even if it is there, I don't care :messenger_tears_of_joy: That's one approach. I usually don't see tearing because I'm never near the upper end of the VRR range in the games I play.

I only enable vsync in games where I feel I can get upwards of 100 FPS.

I'm not saying the recommendation is wrong, it is perfectly fine: you need vsync, and G-Sync/VRR alone is never enough to handle tearing all by itself.

But in my case, I never saw tearing in Cyberpunk.
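(Rough sketch of what "skip vsync but stay inside the VRR window" amounts to in practice: purely illustrative, no driver or game actually works like this little Python loop.)
Code:
import time

CAP_HZ = 141                      # a few fps under a 144 Hz panel keeps every frame inside the VRR window
FRAME_BUDGET = 1.0 / CAP_HZ

def frame_loop(render_frame):
    while True:
        start = time.perf_counter()
        render_frame()            # if this regularly takes ~25-30 ms (35-45 fps), the cap never kicks in
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            # without vsync, this cap is the only thing stopping a "rogue" fast frame
            # from finishing above the VRR range and tearing
            time.sleep(FRAME_BUDGET - elapsed)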
 

GymWolf

Gold Member
I was actually posting for the thread in general and addressing a recurring ask where tech from a prior gen game gets brought up and people complain about why a more recent game can't implement it.

But since I have your attention, here are a couple of gifs to remind you that Spider-Man 2 is the greatest looking game of all time.




Dude, you don't have to sell me on that 7/10 game, I preordered the thing when I returned FF16 to GameStop months ago.
 

SlimySnake

Flashless at the Golden Globes
Tech integration really isn't that easy, folks. You can't just pick up code from Sucker Punch/GoT and plop it into a Forbidden West, etc. Different systems can break each other.
That's why we are saying that ALL teams should work on ONE engine. So when one team develops something like realistic hair, another team can just leverage it because they are working within the same engine.

If ND is currently working on adding RT support for their next game, wouldn't it be great if they were using the same engine as Insomniac so they don't have to do any of the RT R&D?

Everyone works on the same engine, making commits as they go along like KojiPro did with Decima. If KojiPro and GG can do it, so can other Sony studios. Especially now that R&D on these engines is more challenging than ever, with primitive and mesh shaders, machine learning support, RT support and all kinds of different destruction and physics simulations being added. It's a gigantic waste of everyone's time if each Sony studio is doing its own R&D on each of these features.
 

yamaci17

Member
That's why we are saying that ALL teams should work on ONE engine. So when one team develops something like realistic hair, another team can just leverage it because they are working within the same engine.

If ND is currently working on adding RT support for their next game, wouldn't it be great if they were using the same engine as Insomniac so they don't have to do any of the RT R&D?

Everyone works on the same engine, making commits as they go along like KojiPro did with Decima. If KojiPro and GG can do it, so can other Sony studios. Especially now that R&D on these engines is more challenging than ever, with primitive and mesh shaders, machine learning support, RT support and all kinds of different destruction and physics simulations being added. It's a gigantic waste of everyone's time if each Sony studio is doing its own R&D on each of these features.
Just to make sure, can you confirm VRR engages as it should? It most likely does, but I just wanted to cover all bases.


The flag you can enable to show VRR engagement status:

7j37Uuj.png



4byRhi1.jpg
 

alloush

Member
So a mod cancelled my message where I said that I was an animation W word?

Is this globally used term not allowed here anymore?

Wtf?

Ok...I'm an animation woman that takes money in exchange for sex.
I had four messages removed today that were aimed at that creature who shall remain nameless, while his initial message which started this whole thing remained. Too bad though, I was having fun until he put me on the ignore list :messenger_weary:
 

alloush

Member
I was actually posting for the thread in general and addressing a recurring ask where tech from a prior gen game gets brought up and people complain about why a more recent game can't implement it.

But since I have your attention, here are a couple of gifs to remind you that Spider-Man 2 is the greatest looking game of all time.




Not gonna lie, those gifs look good. Honestly speaking, yes, the game has kinda disappointed me GRAPHICALLY, but I cannot lie, I can't wait to play the game. I loved the first two so goddamn much.
 

SlimySnake

Flashless at the Golden Globes
Gotta disagree here, it's solid looking for a last gen game, but even Spiderman 2, so disliked by u, looks visibly better, and even Starfield does at points too xD

Here, for fun, is DF analy(s)in(g) the supposed E3 demo running on Xbox One X. Ofc later we found out it was a full-on CGI trailer made by a completely different team just for the E3 show, and the actual game only started being made to match what was in the bullshot (full CGI, not even in-engine aka cutscenes) trailer =D

Oh, Anthem is not next gen whatsoever. I was just saying that it was a very good looking game by last gen standards, and a big part of that was the Frostbite engine. So at the end of the day switching to Frostbite did help them, despite the early adjustment period.

And yeah, I am aware of the bullshit demo they created. But that makes it all the more impressive that they came as close as they did. I played it day one on PC and it's one of the best looking games of last gen. It's underrated because most people passed on it due to it being a GaaS. I would easily rank it top 10 of last gen.

anthemgif4.gif


Everyone loves to trash explosions in this thread, but this game did them right.

bnn1rsyamyb51.gif

8PlsKq0.gif


VPZW31Y.gif
 

GymWolf

Gold Member
Oh, Anthem is not next gen whatsoever. I was just saying that it was a very good looking game by last gen standards, and a big part of that was the Frostbite engine. So at the end of the day switching to Frostbite did help them, despite the early adjustment period.

And yeah, I am aware of the bullshit demo they created. But that makes it all the more impressive that they came as close as they did. I played it day one on PC and it's one of the best looking games of last gen. It's underrated because most people passed on it due to it being a GaaS. I would easily rank it top 10 of last gen.

anthemgif4.gif


Everyone loves to trash explosions in this thread, but this game did them right.

bnn1rsyamyb51.gif

8PlsKq0.gif


VPZW31Y.gif
Still worse explosions than Mad Max.

I played the game and it was fun with friends, but it was really a 6/10 at best.
The game was released in an alpha state.
 