Now this is one fat lie. I didn't really play at all before the patch that disabled VRS, but I played some last night and it looks really, really low res on console. When you move the camera, anything with specular highlights just crawls. DLSS definitely has issues and looks awful in some games, but it's still absolutely light years ahead of FSR. Anyone claiming the console version has a clean image in performance mode needs to hit the opticians ASAP.
Not blind. No, I didn't bother to include the Xbox Series X in the screenshot; maybe I should? Same showcase of filtering. PC also didn't have the option to disable VRS, although it didn't have the PS5's problem.
Checkerboard pattern with RT mode too
Stop spending days on pcgamingwiki then, you're blind
Protect the brand????.....move sales?????.......it's a frikken 4060....even the true trash RTX 3050 sold.
The lower tier cards are almost entirely price sensitive, not actually performance sensitive; there is no logical reason to buy an RTX 3050....but people still bought that piece of shit.
The RTX 3060 8GB.....yup, it sold too....not because of its performance, but because it's an Nvidia card and is cheap.
No one with even a modicum of hardware knowledge would get or recommend the 3050 or the 3060 8GB.
Nvidia jumping to 16GB for the RTX4060 on a 128bit bus?
Whatever drugs you are on are hectic my guy.
I'm pointing out that Nvidia already cheaped out on the bus width; it's illogical to think they would then suddenly pony up for more VRAM.
So even wishful thinking for more VRAM is borderline delusional.
The 6600GT wasn't an entry level GPU.
The 6200GS was an entry level GPU.
The GTX 960 would be the analog to the RTX 4060.
Not the GTX 1060.
If you are buying hardware that can't do the job you want it to do, yes, you deserve all the shit.
If you are buying an RTX 4060 to play at 1080p balanced settings......do you, bro.
The 4070 Ti is already 800 dollars with 12GB of VRAM.
You expect them to release a 16GB variant for 200 dollars less????
That would be a bigger bus and more VRAM for a 200 dollar discount.
If they drop down a bus width to accommodate the VRAM, the card still chokes at higher resolutions because of how slow the memory is.
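The bus-width point is just multiplication. A rough sketch; the 21 Gbps and 18 Gbps data rates below are illustrative assumptions, not official specs for any particular card:

```python
# Effective VRAM bandwidth = (bus width in bits / 8) * data rate in Gbps.
def vram_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and data rate."""
    return bus_width_bits / 8 * data_rate_gbps

print(vram_bandwidth_gbs(192, 21))  # hypothetical 192-bit card: 504.0 GB/s
print(vram_bandwidth_gbs(128, 18))  # hypothetical 128-bit card: 288.0 GB/s
```

Dropping from 192-bit to 128-bit cuts peak bandwidth by over 40% at those assumed speeds, which is why a narrower bus chokes at higher resolutions even with more VRAM bolted on.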
> Definitely agree, the game manages to impress visually on multiple occasions.

Based on YT videos I had the impression that this game looks like some typical PS4 game, but after finally playing it I'm still impressed by how good the remake looks. The developers of the Dead Space remake have said in one of their videos that they are using real-time GI, and it really shows, because the lighting in this game looks simply amazing. In the DS remake, character models during gameplay FINALLY look like they belong to the scene, because their entire body is lit correctly. That wasn't the case in the PS4 era, where you could see beautiful lighting on character models during cutscenes, but during gameplay it looked extremely flat. The beautiful lighting in the remake also takes the atmosphere and realism to the next level, especially in HDR. Dead Space Remake is an extremely dark game, and in SDR there's black crush pretty much everywhere, but in HDR you don't even need to use the flashlight, because you can see amazing shadow detail and depth. This game looked more like a real film to me.
And that's how lighting looks in Uncharted 4, the best looking PS4 game. During cutscenes, character models are lit manually, so they look very realistic, but when gameplay starts, the lighting on character models looks just flat.
> Definitely agree, the game manages to impress visually on multiple occasions.

I am up for another UC4 replay this year. The last version I replayed was the PS5 port at 4K30, and the game is really still amazing.
SSS is probably the best I've ever seen (the organic stuff in general looked so life-like), and the fire effects and related lighting are much, much more impressive than in TLOU Part I, for instance. Even particles like sparks reacted realistically to Isaac's body. Great stuff, made even more pleasant by the basically non-existent downgrade between Fidelity and Performance.
People complained about the facial expressions, and it's certainly true, but it's definitely worth mentioning that, unlike many other games, whatever language you select, the game is lip-synced accordingly!
I only wish the aforementioned GI applied to the flashlight were even remotely close to ND games, especially considering the dark nature of the game. Motive's solution is unfortunately far from being on par with the PS3 TLOU implementation from 2013, or even the simplified Alien: Isolation solution.
Naughty Dog's implementation, seen in Part II, Uncharted 4, Lost Legacy (the PS4 Pro versions especially, as later ports including PC have been downgraded in this regard) and Part I, is literal generations away, and something like this would have done wonders in this Dead Space setting.
Maybe for the sequel, which I hope is currently planned despite the game not setting the world on fire.
I need more Dead Space; I don't think I have ever done four playthroughs one after the other before this remake.
Have you tried disabling your integrated GPU in Device Manager?
I can't disable it. I'm on a laptop, and it uses both.

While not quite as good as a maxed out PC, it's good enough. Very sharp. The only problem is Fidelity + FPS. That is where I break away from console and prefer PC.
I think, had the availability of the PS5 been more in line with traditional launches (I tried for over a year), I would lean more into the conversation.
But, anyway, this looks solid.
Try going into your BIOS and turning it off.
> RTX 2060S and R5 3600 still putting in work.

6 core/12 threads still kicks ass. It will be interesting to see when the high end stuff of today (20-core/32-thread, 16-core/32-thread) starts not being enough in the future. I remember when a 2500K was massively overkill as a 4c/4t CPU, but now 4 threads is terrible for gaming and 8 threads is bad for consistent fps and high 1% lows.
Imagine someone who built the exact system in 2019?
They'd be swimming in it right now.
Yet people told me way back when that 6 cores was gonna mean death.
Could probably upgrade to an R5 5600/5800X3D and float the generation.
Yet people will come and say you need a 2000 dollar computer to play PC games.
I'm glad DF still uses this old PC as their low end/mid range machine.
Yes, some games will beat it up, but man, if you've had this CPU/GPU since 2019, they have earned their keep.
Even a small bump up to the 3060 Ti would make 1440p DLSS easy work going forward.
Odd that the game has loading stutter even on PS5.
> The 3050 is literally failed 3060s that Nvidia produced simply to fill out demand.

The 3050 had what used to be the minimum recommendable amount of VRAM, so it's a flawed analogy.
You are contradicting yourself: it's a price sensitive segment, but it was overpriced AF and still sold. Maybe people saw that it was an actual gaming card with the recommended minimum amount of VRAM at the time... maybe they saw it could run Cyberpunk at 60fps by turning RT off and using DLSS. So why is that illogical?... If it was choking at 10fps in Cyberpunk no matter what and still sold, I would say that was stupid.
> Making it work in the sense that it is doable... yes, it's totally doable; with 2GB GDDR modules you can easily make a 16GB 128-bit card.

If they can make 8GB at 14Gbps work on 128-bit just fine, maybe they can do the same with 16GB at 24Gbps on 128-bit? What is so crazy about that exactly? The last 30% of the memory could be sluggish, but we are talking about Nvidia here: sales first.
It's not wishful thinking; I already pointed out how that's problematic for Nvidia and how they can patch their thing and keep it going... it's that, or release a legit lemon, or cancel it and rename it a 4050... which would still have worse lasting value than a 3050, DOA out of the box.
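For what it's worth, the capacity arithmetic above does check out, assuming GDDR6's 32-bit per-module interface and 2GB (16Gb) modules; pairing modules in clamshell mode is the usual way to hit 16GB on 128-bit. A sketch (the figures are general assumptions, not a leaked spec):

```python
# A GDDR6 module occupies a 32-bit slice of the bus; clamshell mode pairs
# two modules per 32-bit slice, doubling capacity without widening the bus.
def vram_capacity_gb(bus_width_bits: int, module_gb: int, clamshell: bool = False) -> int:
    modules = bus_width_bits // 32
    if clamshell:
        modules *= 2
    return modules * module_gb

print(vram_capacity_gb(128, 2))                  # 8 GB: the standard 128-bit config
print(vram_capacity_gb(128, 2, clamshell=True))  # 16 GB on the same 128-bit bus
```

Capacity is the easy part; the catch is that bandwidth stays pinned to whatever the 128-bit bus can deliver.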
> Absolutely not.

Not if you want to get rigorous, but we used to call them entry level in the context of gaming; the 6200GS, 8400 and GT 710 were the stuff you would get for your HTPC or a second monitor. Even you yourself called the 4060 entry level in this thread.
No you don't; you have been shitting on the 3050 but think a 4060 is fine for 1080p with only 8GB?... Obviously you have not used the 3050 or seen the 3060 Ti trying to play Hogwarts Legacy and Dead Space? 8GB is dead, bro; don't screw people over with that recommendation.
> Did you read what I was quoting?

We have talked about this; this is not the first time manufacturers sell lower end cards with more VRAM. It's a selling point.
I've seen like 4 or so posts online from others that sound like they have my issue - the game refusing to use the main GPU and using the integrated GPU instead. Not sure if this is really the case, but either way I'm super sad right now.
So is the DualSense implementation worth an extra 10% over just going with the XSX version?
You can manually choose which GPU an app runs on, in the Nvidia control panel at least. I'm sure there's gotta be an equivalent for AMD as well.
PS5 and SX are priced the same, no?
> You can manually choose which GPU an app runs on, in the Nvidia control panel at least. I'm sure there's gotta be an equivalent for AMD as well.

Already tried it. It's set to use my 3050 Ti and still not working.
Not as much struggle as Harry Potter DF, eh, EH?
The 3050 is literally failed 3060s that Nvidia produced simply to fill out demand.
It actually wasn't originally supposed to come out.
The 3060 was supposed to be the lowest tier RTX30 card.
To put things into context, the RTX 3050 uses a mobile configuration....no one should have bought it, but people who only know Nvidia as "the" GPU maker and were desperate during the shortage ended up buying it, cuz it was in stock and wasn't scalped to death, so the prices were "good"....the performance, not so much.
A bad purchase, and I would never recommend it to anyone.
Making it work in the sense that it is doable...yes, it's totally doable; with 2GB GDDR modules you can easily make a 16GB 128-bit card.
But in reality Nvidia has no incentive to do so when they are already cutting costs as is.
The 6200GS was a straight up gaming card, entry level but still fully touted as a gaming card.
The 6600GT and co would be analogues to the xx70 class cards, which I doubt you would ever call entry level.
The RTX 3050 is a horrible card through and through; it's vastly underpowered, starved of bandwidth, and came way too late to be worth its own weight in shit.
A750s and RX 6600s do everything better than it....that's why I shit on the RTX 3050.
Its VRAM count isn't something I'm too worried about, cuz that chip belongs in a low end gaming laptop anyway.
Did you read what I was quoting?
How does it make sense to sell a 4070 Ti 12GB for 800 dollars, then sell a 4070 Ti 16GB for 600 dollars?
Now understand that I'm not condoning Nvidia's practices this generation; hell, I'm very much against them, every card is overpriced.
But seeing the landscape as it is, I'm not sitting in some dreamland thinking Nvidia is going to throw customers a bone and make the 4060 16GB.
Not in the climate we are in.
I'm being a realist here.
The best, best, best case scenario is it gets 12GB on a small bus, and that's pretty damn wishful thinking..........there was the engineering sample 160-bit bus 10GB card, but I don't know if Nvidia is actually gonna send that to market......so most likely the 4060 is going to be 8GB, and people will still buy it, cuz it will be priced in the range that people can actually afford, pretty much regardless of its actual performance.
> I am up for another UC4 replay this year. The last version I replayed was the PS5 port at 4K30, and the game is really still amazing.

UC4 at 4K/40 on PS5 is so much nicer than 1440p/60. I don't understand why ND got rid of the 4K/40 mode in the TLOU remake. It seems much less stable now that they uncapped it. I finished it last night, and the final section in the hospital CHUGS now, probably due to the dynamic lighting that is so intense in that section.
I've seen of course all your comparisons, and I still think the higher resolution is probably worth the weird downgrades.... but maybe instead of playing 4K/40 this time.... I'd replay the PS4 version again.
> UC4 at 4K/40 on PS5 is so much nicer than 1440p/60. I don't understand why ND got rid of the 4K/40 mode in the TLOU remake. It seems much less stable now that they uncapped it.

1440p looks shit; both Rofif and I strongly agree on this one. But people are mostly in denial, since 1440p is a safe spot for hardware requirements and practically a must for 60fps.
Look, I fully agree Nvidia is fucking us over this generation. If people buy that, they are gonna be as screwed as the poor saps that paid a premium for a shitty 4GB 6500 XT not long ago. At least we agree they deserve all the stuttering they are gonna get (after being warned, of course).
> The 4060 is a hard sell to anyone in the know.

Those 4060 and 4050 Ti are going to be a disaster.
They will probably cost around $500, but only bring 8GB of VRAM, which will be short for games being released this year and onwards, especially when using RT.
And to make it worse, only 8 PCIe lanes. So the VRAM will be used up rapidly, because it's only 8GB; then the GPU will need to fetch data from system RAM, but it will be bottlenecked by the lower PCIe bandwidth.
And to make things even worse, only a 128-bit bus for the VRAM.
So people will have to lower texture detail, ray tracing and some effects to be able to play games smoothly on a brand new GPU.
This is the kind of crap one would expect from a sub-$200 GPU, not from something costing more than double.
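The PCIe spillover point can be put in rough numbers. Both figures below are ballpark assumptions (~2 GB/s per PCIe 4.0 lane, 18 Gbps GDDR6 on a 128-bit bus), not specs for any confirmed card:

```python
# Once the 8 GB of local VRAM overflows, assets stream over PCIe instead.
pcie4_x8_gbs = 8 * 2.0          # 8 lanes x ~2 GB/s per PCIe 4.0 lane = ~16 GB/s
vram_128bit_gbs = 128 / 8 * 18  # 128-bit bus at an assumed 18 Gbps = 288 GB/s

print(vram_128bit_gbs / pcie4_x8_gbs)  # local VRAM is ~18x faster than the spillover path
```

So any working set that doesn't fit in the 8GB gets served roughly an order of magnitude slower, which is exactly where the stutter comes from.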
> 1440p looks shit; both Rofif and I strongly agree on this one. But people are mostly in denial, since 1440p is a safe spot for hardware requirements and practically a must for 60fps.

4K displays are overrated. The thing is, such an insanely high resolution is not very universal, so for example 1920x1080 BD movies, or even older games (games with low quality assets and textures), will look much worse when displayed at 4K. What's more, even if you want to play 4K content on your 4K TV/monitor, you still need to take into account the size of the display and the viewing distance, because if your eyes can't resolve more than, let's say, 1920x1080 pixels from where you are sitting, why even bother with a 4K display?
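The viewing-distance argument can be made concrete with the usual pixels-per-degree estimate; ~60 px/deg is a commonly cited limit for 20/20 vision (an assumption, and eyesight varies):

```python
import math

def pixels_per_degree(h_pixels: int, screen_width_m: float, distance_m: float) -> float:
    """Horizontal pixels packed into each degree of the viewer's field of view."""
    angle_deg = 2 * math.degrees(math.atan(screen_width_m / (2 * distance_m)))
    return h_pixels / angle_deg

# A 55" 16:9 TV is about 1.21 m wide; viewed from 3 m away:
print(pixels_per_degree(3840, 1.21, 3.0))  # 4K
print(pixels_per_degree(1920, 1.21, 3.0))  # 1080p
```

At that distance even 1080p already lands above the ~60 px/deg threshold, which is the "why even bother" case; sit closer or go bigger and 4K starts to earn its keep.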
But 4K upscaled is also better than 1440p. A 1200p internal resolution being upscaled to 4K will destroy and demolish so-called "native 1440p" rendering. So few people are aware of the difference. Those who do see what 4K LODs/assets do to a game, and can discern it like me and Rofif do, are unable to accept or be happy with "1440p" image quality.
It is both a blessing and a curse.
Even at 1200p internal, 4K upscaled is much, much heavier to run than native 1440p in most cases, because it still loads 4K hints, LODs and assets. 1440p, however, is plain 1440p: 1440p LODs and assets, which makes games look weird/blurry compared to 4K upscaled.
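The pixel counts behind that claim, with an assumed 16:9 resolution of 2133x1200 for "1200p":

```python
# Shaded pixels per frame (millions) for the resolutions discussed above.
res = {
    "1200p internal (upscaled to 4K)": 2133 * 1200,  # ~2.6 MP shaded
    "native 1440p": 2560 * 1440,                     # ~3.7 MP shaded
    "native 4K": 3840 * 2160,                        # ~8.3 MP shaded
}
for name, px in res.items():
    print(f"{name}: {px / 1e6:.1f} MP")
```

So the upscaled path actually shades fewer pixels than native 1440p; the extra cost comes from the 4K output target pulling in 4K LODs, mips and assets, which is the trade-off described above.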