
The sad state of PC gaming (according to Tech Of Tomorrow)

why you're holding so much hatred in your heart

yamaci17 instead of digging yourself a deeper hole, just tell us that you are jealous of PC gamers for taking all of your exclusives away, and for touching you when you were younger. 'Cause I can't understand where you come up with some of the most nefarious lies and blatant bullshit. What honestly entices you to write all this bullshit that just about anyone with a single-digit IQ can refute easily? There's a reason why so many people jumped on your post and quoted you: you have the most blatant bullshit posted on GAF today.
 

Guilty_AI

Member
yamaci17 instead of digging yourself a deeper hole, just tell us that you are jealous of PC gamers for taking all of your exclusives away, and for touching you when you were younger. 'Cause I can't understand where you come up with some of the most nefarious lies and blatant bullshit. What honestly entices you to write all this bullshit that just about anyone with a single-digit IQ can refute easily? There's a reason why so many people jumped on your post and quoted you: you have the most blatant bullshit posted on GAF today.
Leave him alone. I'm pretty sure a pc killed his dog once and now he's out for revenge.
 

yamaci17

Member
And where does the Series S fit into this equation of yours, genius?
you can find a lot of criticism coming from me regarding series s

but it at least has a respectable 8-core/16-thread zen 2 cpu accompanied by a 500 gb/s ssd

its ssd and cpu specs are still better than 90% of the pc userbase's XD

by omitting ray tracing, its memory can survive

for its low tflops, series s users are always welcoming of 540p, so there's no problem on that front

but developers will have to accommodate gimped cards like the 2070, 2080, 2080s, 3070 and 3080 for a while now, until nvidia graces "pc" gamers with 16 gb mainstream gpus (the 3060 has 12 gb rofl)

then we shall see truly generational ray-tracing-enabled games with actual high quality textures, unlike the hideosities that are cyberpunk (extreme texture and lod culling) and re village (textures breaking down 2 meters from the camera)
 

Corgi1985

Banned
The problem is devs with bland, derivative, bad games. One of the best games this year is a pretty version of Mass Effect that originally came out 14 years ago. We don't get years like 2007 or even 2008 anymore where we get new IPs; instead we get the same TIRED 3rd-person cinematic games or boring as fuck hero shooters. For example, Fallen Order is a very bad game that does nothing new and sucks on any system, not the fault of the PC.
 

Md Ray

Member
I only played the demo but I never saw any of that. Is it related to ray tracing or specific resolutions or graphics cards? I only have a 1080 Ti, played at 1080p without RT. I once noticed a drop to 130fps but otherwise it was 144fps with occasional dips to 143fps, only noticeable with an fps counter.
I finished it on a 3080/3700x. Did not have 24fps drops. Ran well with gsync. Some minor stutters here and there. Nothing bad.
DF highlighting perf issues (timestamped):





I too only played the demo, never saw dips to 24fps but I did notice stutters when shooting.
 

GHG

Gold Member
you can find a lot of criticism coming from me regarding series s

but it at least has a respectable 8-core/16-thread zen 2 cpu accompanied by a 500 gb/s ssd

by omitting ray tracing, its memory can survive
for its low tflops, series s users are always welcoming of 540p, so there's no problem on that front

So let me get this straight, the Series S is fine but an equally powerful 1060 and a more powerful 1070 are a problem?

So do you not think owners of 1060/1070 graphics cards can pair them with a modern 6-8 core desktop CPU and an SSD? Do you not think 1060/1070 owners will likely be gaming at resolutions of 1080p or 1440p?

Sorry, but I'm not understanding your logic here. Unlike consoles, PCs are not set in stone; the parts are easily interchangeable and can be upgraded.
 

rofif

Banned
And where does the Series S fit into this equation of yours, genius?
It should not have been released, but IMO it is still way easier to adjust a game to run on the S, if you release it on the X, than it is on pc.
The S and X are very similar. Most of the time you can probably lower the resolution and/or the framerate.
 
i hope it dies

imagine being forced to optimize games to run on decrepit 4-6 core cpus and gtx 1060s/1070s (the sx, ps5 and even the series s have 8 cores/16 threads rofl)
imagine being forced to design games around slow hdds or slow sata ssds (even the sx can reach 5 gb/s bandwidth. most "pcmr" users still have 400-550 mb/s sata ssds, which is inferior to what the series x/s/ps5 have)
imagine being forced to design textures and data streaming around the majority of 8 gb vram gpus (the sx will be able to provide 13.5 gb of total memory to games)

i hope microsoft changes their stance on pc and makes xbox-exclusive games. imagine a game taking the full power of 4.8 gb/s ssd bandwidth, 13.5 gb of vram + sampler feedback streaming (which no PC hardware has yet. no, sampler feedback alone does not count, because sampler feedback streaming is one step beyond it) and of course, 12 tflops.

the sx by its gpu power alone is stronger than maybe 70-80% of pc users' machines

decrepit pc hardware will hold back multiplatform games for quite a while, sadly.

hopefully ps5 exclusives will overcome that. the blame is on microsoft for giving their exclusive games to PC lmao

 
you can find a lot of criticism coming from me regarding series s

but it at least has a respectable 8-core/16-thread zen 2 cpu accompanied by a 500 gb/s ssd

its ssd and cpu specs are still better than 90% of the pc userbase's XD

by omitting ray tracing, its memory can survive

for its low tflops, series s users are always welcoming of 540p, so there's no problem on that front

but developers will have to accommodate gimped cards like the 2070, 2080, 2080s, 3070 and 3080 for a while now, until nvidia graces "pc" gamers with 16 gb mainstream gpus (the 3060 has 12 gb rofl)

then we shall see truly generational ray-tracing-enabled games with actual high quality textures, unlike the hideosities that are cyberpunk (extreme texture and lod culling) and re village (textures breaking down 2 meters from the camera)
So the 3080 is now a gimped card because it doesn't have 16 GB of vram? Do you even understand the basics of how computers work?

Let's flip the argument: does your PS5 have at least 16 GB of RAM? Or even 32 GB of RAM? On top of an additional 10 GB of VRAM? Do you not see where you are starting to fall the fuck off the cliff of basic argument 101?

If the PS5 had 128 GB of VRAM, it still would not be able to beat a 3080.
 

nkarafo

Member
i hope it dies

imagine being forced to optimize games to run on decrepit 4-6 core cpus and gtx 1060s/1070s (the sx, ps5 and even the series s have 8 cores/16 threads rofl)
imagine being forced to design games around slow hdds or slow sata ssds (even the sx can reach 5 gb/s bandwidth. most "pcmr" users still have 400-550 mb/s sata ssds, which is inferior to what the series x/s/ps5 have)
imagine being forced to design textures and data streaming around the majority of 8 gb vram gpus (the sx will be able to provide 13.5 gb of total memory to games)

i hope microsoft changes their stance on pc and makes xbox-exclusive games. imagine a game taking the full power of 4.8 gb/s ssd bandwidth, 13.5 gb of vram + sampler feedback streaming (which no PC hardware has yet. no, sampler feedback alone does not count, because sampler feedback streaming is one step beyond it) and of course, 12 tflops.

the sx by its gpu power alone is stronger than maybe 70-80% of pc users' machines

decrepit pc hardware will hold back multiplatform games for quite a while, sadly.

hopefully ps5 exclusives will overcome that. the blame is on microsoft for giving their exclusive games to PC lmao
Lol... Imagine consoles dragging PC gaming back since the beginning of time, and now that consoles have finally managed to match or exceed the average PC spec, you want it dead... because it drags your brand new console back?

I mean... this post is so amazing... i'm not even mad.
 

yamaci17

Member
So let me get this straight, the Series S is fine but an equally powerful 1060 and a more powerful 1070 are a problem?

So do you not think owners of 1060/1070 graphics cards can pair them with a modern 6-8 core desktop CPU and an SSD? Do you not think 1060/1070 owners will likely be gaming at resolutions of 1080p or 1440p?

Sorry, but I'm not understanding your logic here. Unlike consoles, PCs are not set in stone; the parts are easily interchangeable and can be upgraded.

because of directstorage, rtx io, and other technologies.

series s/x have directstorage and sophisticated advanced technologies that can take advantage of high speed nvme ssds.

the 1060/1070 simply do not. nvidia specifically built rtx io to make directstorage work.

the series s can still run games at 648p, and that's no problem for the "casual" player base it has, apparently.
 

GHG

Gold Member
It should not have been released, but IMO it is still way easier to adjust a game to run on the S, if you release it on the X, than it is on pc.
The S and X are very similar. Most of the time you can probably lower the resolution and/or the framerate.

Look, it's no secret that I'm not a fan of the Series S, that's well documented here, but if people are going to start propping it up as something good when comparing it to PC hardware then quite frankly they have lost the plot.

The bolded sentence - what do you think is the first thing that mid/lower-spec PC gamers tend to do?
 
because of directstorage, rtx io, and other technologies.

series s/x have directstorage and sophisticated advanced technologies that can take advantage of high speed nvme ssds.

the 1060/1070 simply do not. nvidia specifically built rtx io to make directstorage work.

the series s can still run games at 648p, and that's no problem for the "casual" player base it has, apparently.
Simply show me one game that is held back because of the lack of DirectStorage or RTX I/O. All you need is one example, and your argument might actually hold more than a feather of weight. If you can't do this, you might as well shut the fuck up and stand down, as you're only digging yourself a deeper hole like I said.
 

yamaci17

Member
Look, it's no secret that I'm not a fan of the Series S, that's well documented here, but if people are going to start propping it up as something good when comparing it to PC hardware then quite frankly they have lost the plot.

The bolded sentence - what do you think is the first thing that mid/lower-spec PC gamers tend to do?
it's the technologies that the series s and x support that put them ahead of the 1060/1070

you can't have:

- sampler feedback
- sampler feedback streaming (not even on rtx 3000 series)
- dx12_ultimate
- variable rate shading
- proper texture streaming via directstorage

with old hardware. slamming in a high speed nvme ssd and calling it a day is not enough. there's also the custom decompression block on the xboxes for those ssds, which you don't have on PCs. this is why rtx io is important
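
for what it's worth, here is roughly how a pc game asks the driver whether those dx12 ultimate features are actually present. this is only a minimal sketch using the public d3d12.h feature queries; the helper name and the exact tier thresholds I picked are my own assumptions, not anything from this thread.

#include <d3d12.h>

// Sketch: query the DX12 Ultimate feature set (DXR, VRS, mesh shaders, sampler feedback).
// Tier thresholds below are illustrative assumptions, not hard requirements.
bool SupportsDx12UltimateFeatureSet(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 opts6 = {};
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5, &opts5, sizeof(opts5));
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6, &opts6, sizeof(opts6));
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7, &opts7, sizeof(opts7));

    const bool rayTracing      = opts5.RaytracingTier          >= D3D12_RAYTRACING_TIER_1_1;
    const bool variableShading = opts6.VariableShadingRateTier >= D3D12_VARIABLE_SHADING_RATE_TIER_2;
    const bool meshShaders     = opts7.MeshShaderTier          >= D3D12_MESH_SHADER_TIER_1;
    const bool samplerFeedback = opts7.SamplerFeedbackTier     >= D3D12_SAMPLER_FEEDBACK_TIER_0_9;

    // Pascal-era cards (GTX 1060/1070) report "not supported" for all of these,
    // which is the hardware gap being argued about here.
    return rayTracing && variableShading && meshShaders && samplerFeedback;
}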

if a hypothetical game were designed to use all these technologies exclusively on an xbox console, it would look truly phenomenal. but we will have to wait a couple more years so that pc users can catch up.

what's there not to understand?

--

oh, a final anecdote.

8 core cpus are highly bandwidth-starved in pc configurations due to DDR4. this is what ddr5 aims to fix, giving more bandwidth so that you can keep a high number of cores fed.

a series x has 336 gb/s available to the cpu as bandwidth.
the ps5 has 448 gb/s.

what do we have here with pcs? in the most average scenario, you have 50 gb/s of ddr4 bandwidth to feed 8 cores. in the most extreme case, 60-65 gb/s if you overclock your ram to the moon.

it is clear that pc cpus are bandwidth starved all the time.

even the almighty i9 9900k benefited greatly from going from 3000 to 4000 mhz. it is clear that higher core count cpus like to have more bandwidth available to them.

so even if you go with an 8 core cpu today, you will still not be able to match the consoles' superior bandwidth
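
for reference (assuming the "average scenario" means a dual-channel ddr4-3200 kit, which is my assumption): 3200 MT/s × 8 bytes per transfer × 2 channels = 51.2 gb/s, which is where the ~50 gb/s figure above comes from. keep in mind the console numbers quoted are the whole shared GDDR6 bus, which the cpu and gpu both contend for.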
 
it's the technologies that the series s and x support that put them ahead of the 1060/1070

you can't have:

- sampler feedback
- sampler feedback streaming (not even on rtx 3000 series)
- dx12_ultimate
- variable rate shading
- proper texture streaming via directstorage

with old hardware. slamming in a high speed nvme ssd and calling it a day is not enough. there's also the custom decompression block on the xboxes for those ssds, which you don't have on PCs. this is why rtx io is important

if a hypothetical game were designed to use all these technologies exclusively on an xbox console, it would look truly phenomenal. but we will have to wait a couple more years so that pc users can catch up.

what's there not to understand?
Name a single example, game, or even a demo! UE5 is coming to PC more than likely before directstorage or RTX I/O, and will run much better. I'll be waiting for any possible example.
 

GHG

Gold Member
because of directstorage, rtx io, and other technologies.

series s/x have directstorage and sophisticated advanced technologies that can take advantage of high speed nvme ssds.

the 1060/1070 simply do not. nvidia specifically built rtx io to make directstorage work.

the series s can still run games at 648p, and that's no problem for the "casual" player base it has, apparently.

This is getting laughable.

Directstorage is a DirectX technology and will work with any DirectX 12 GPU.
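
For reference, here's roughly what kicking off a read through the public DirectStorage for Windows API looks like. This is only a minimal sketch from memory of the dstorage.h headers; the helper name and parameters are mine, and I've left out all error handling. Note the only GPU-side requirement is an ordinary ID3D12Device:

#include <dstorage.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Sketch: enqueue one file -> GPU-buffer read through DirectStorage.
void LoadAssetIntoBuffer(ID3D12Device* device, ID3D12Resource* destBuffer,
                         const wchar_t* path, UINT32 sizeBytes)
{
    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));

    ComPtr<IDStorageFile> file;
    factory->OpenFile(path, IID_PPV_ARGS(&file));

    DSTORAGE_QUEUE_DESC queueDesc = {};
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;   // reads come from files on disk
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.Device     = device;                         // any DirectX 12 device
    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    DSTORAGE_REQUEST request = {};
    request.Options.SourceType      = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    request.Source.File.Source      = file.Get();
    request.Source.File.Offset      = 0;
    request.Source.File.Size        = sizeBytes;
    request.Destination.Buffer.Resource = destBuffer;      // data lands in the GPU buffer
    request.Destination.Buffer.Offset   = 0;
    request.Destination.Buffer.Size     = sizeBytes;

    queue->EnqueueRequest(&request);
    queue->Submit();   // hand the batch to the storage stack; completion is signalled via a fence or status array
}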


The Series S/X use standard-specification pcie gen 3 nvme SSDs.

For games like The Medium, which the Series S has to run at 648p, GPUs like the 1070 can run at 1080p at high settings and could probably achieve even more if the settings were dropped to whatever the Series S is running the game at:



And that's an example with a 6 core 1600 CPU.

Please educate yourself before you embarrass yourself further.
 

yamaci17

Member
This is getting laughable.

Directstorage is a DirectX technology and will work with any DirectX 12 GPU.


The Series S/X use standard-specification pcie gen 3 nvme SSDs.

For games like The Medium, which the Series S has to run at 648p, GPUs like the 1070 can run at 1080p at high settings and could probably achieve even more if the settings were dropped to whatever the Series S is running the game at:



And that's an example with a 6 core 1600 CPU.

Please educate yourself before you embarrass yourself further.

lol the 1070 drops below 30 fps in certain scenes in this game, and medium to high settings do not matter. i played that game myself, you can't argue with that game rofl. even at medium settings, a 1070 will drop below 30 fps regardless (besides, there are even heavier scenes in the game, and you practically picked the start of the game, which is the lightest part)

and the pc equivalent of directstorage will never work as efficiently as it does on series s/x. they have special hardware that pcs don't have. get over it. accept it and move on.

rtx io is the equivalent hardware chip, similar to what series s/x/ps5 have. but then again, it will take years for pc users to catch up with these technologies, so these techs will only be used partially and not to a full extent, until all the hardware is capable.

just like tessellation and ambient occlusion. remember that in the first years these techs were introduced, they were subpar in-game because only limited hardware supported them. it was a novelty, but a weak one. with more and more gpus supporting them, we've started to see their true capabilities. rdr 2 is a fine example of what fully fledged tessellation can do.

by the way:

ryzen 1600 can't hold 60 fps in cyberpunk

series x can.

you can keep ignoring the truth

and with that being in mind, i shall ignore you. those who are ignorant shall be ignored
 

Darius87

Member
yamaci17 instead of digging yourself a deeper hole, just tell us that you are jealous of PC gamers for taking all of your exclusives away, and for touching you when you were younger. 'Cause I can't understand where you come up with some of the most nefarious lies and blatant bullshit. What honestly entices you to write all this bullshit that just about anyone with a single-digit IQ can refute easily? There's a reason why so many people jumped on your post and quoted you: you have the most blatant bullshit posted on GAF today.
accusing other people of lying while you yourself literally lied in a previous post in this thread is kinda sad. :messenger_grinning_smiling:
 

nkarafo

Member
but developers will have to accommodate gimped cards like the 2070, 2080, 2080s, 3070 and 3080 for a while now, until nvidia graces "pc" gamers with 16 gb mainstream gpus (the 3060 has 12 gb rofl)
What are you smoking, my man? Did you just appear in the gaming scene out of nowhere in the last couple of months or something?

Multiplatforms stopped targeting PCs decades ago. Consoles are, and always have been, the target platforms. That's why for all these decades PC users have been enjoying the same games at higher resolutions and frame rates. I've been a PC gamer since the late 90's and i've been enjoying console multiplatforms at much better quality/speed from the days of the N64/Dreamcast through to the PS4. Only now am i behind the curve for the first time, because i can't replace my old GPU.

The only thing that drags down multiplatforms for the new consoles right now is the need for cross-gen games to run on last-gen consoles. If you wish for something to die, maybe you should target those.
 

GHG

Gold Member
lol the 1070 drops below 30 fps in certain scenes in this game, and medium to high settings do not matter. i played that game myself, you can't argue with that game rofl. even at medium settings, a 1070 will drop below 30 fps regardless (besides, there are even heavier scenes in the game, and you practically picked the start of the game, which is the lightest part)

and the pc equivalent of directstorage will never work as efficiently as it does on series s/x. they have special hardware that pcs don't have. get over it. accept it and move on.

rtx io is the equivalent hardware chip, similar to what series s/x/ps5 have. but then again, it will take years for pc users to catch up with these technologies, so these techs will only be used partially and not to a full extent, until all the hardware is capable.

just like tessellation and ambient occlusion. remember that in the first years these techs were introduced, they were subpar in-game because only limited hardware supported them. it was a novelty, but a weak one. with more and more gpus supporting them, we've started to see their true capabilities. rdr 2 is a fine example of what fully fledged tessellation can do.

by the way:

ryzen 1600 can't hold 60 fps in cyberpunk

series x can.

you can keep ignoring the truth

and with that being in mind, i shall ignore you. those who are ignorant shall be ignored

The Series S drops below 30fps at 648p using a mixture of medium and high settings with ray tracing turned off.

As for this:

they have special hardware that pcs don't have.

You have to be trolling/joking at this point.

and with that being in mind, i shall ignore you. those who are ignorant shall be ignored

Or maybe not.

 

rofif

Banned
Look, it's no secret that I'm not a fan of the Series S, that's well documented here, but if people are going to start propping it up as something good when comparing it to PC hardware then quite frankly they have lost the plot.

The bolded sentence - what do you think is the first thing that mid/lower-spec PC gamers tend to do?
yeah I am not arguing
 

yamaci17

Member
So a cheap 2017 CPU can't hold 60 fps on a graphically demanding 2020-2021 game? Colour me shocked
Yeah, let me colour you shocked then, with a counter-argument:

- The majority of PC gamers have a GTX 1060, 1650S, 1050 Ti, 1070, 970, RX 580 or RX 570, i.e. midrange GPUs
- The majority of PC gamers have a ryzen 1600, i5 7400, 8400, 9400f, ryzen 2600 or ryzen 3600 (which still can't hold 60 fps in cyberpunk), i.e. midrange, cheap CPUs

This was my argument to begin with. A huge number of PC gamers are cheapskates. they won't upgrade unless they have to. and developers will have to accommodate them, in some form or another.

This is where the problems start: the Series X trumps these specs.

The Series X trumps what an RTX 2070 + 3700x gaming machine can deliver. This is huge.

And people have yet to get RTX series cards. This is why I wish PC gaming would wither. Not to mention, midrange GPUs get more expensive with each passing generation.
 
Lol you wish. Pcs never change.
My pc from 2014 ran its first year nonstop 24/7 with no issues; after that year gta5 came out and it actually needed a new driver, updated, didn't reboot, no issues. It's still going strong today.

My current pc has been problem free as well.

Yeah check RE: Village out.

Runs perfect on consoles. Stutters big time on a 3090 and 10900k.
I did, here is my entire playthrough, stutter free and never dropping below 60:



There are ~20k reviews on steam and 95% of them are positive. Steam reviews are immediately awful if there is some actual widespread technical issue; not the case here.
 

nkarafo

Member
The Series X can't hold 60fps on Cyberpunk either. In performance mode it runs at 50-something even when you're just free roaming and not doing combat or anything demanding.
 

Guilty_AI

Member
Yeah, let me colour you shocked then, with a counter-argument:

- The majority of PC gamers have a GTX 1060, 1650S, 1050 Ti, 1070, 970, RX 580 or RX 570, i.e. midrange GPUs
- The majority of PC gamers have a ryzen 1600, i5 7400, 8400, 9400f, ryzen 2600 or ryzen 3600 (which still can't hold 60 fps in cyberpunk), i.e. midrange, cheap CPUs

This is where the problems start: the Series X trumps these specs.

The Series X trumps what an RTX 2070 + 3700x gaming machine can deliver. This is huge.
Wrong way to analyse this data, pal. You don't look at the percentages but at the actual numbers.
If you did that you'd find out there are around 9 million users with next-gen specs, and that's before any games that actually need such specs to run properly have even appeared.

Also, it seems like the Series X can't hold Cyberpunk 2077 at 60 fps either...
 

V4skunk

Banned
GPU manufacturers should be making special cards for mining, built on a last-gen fab process. That would make everyone happier.
 

nkarafo

Member
Lol you wish. Pcs never change. X570 and ryzen 3700x caused me so many issues... It still does fuck with USB. My keyboard input sticks sometimes due to usb issues... Many other problems
I'm sorry your PC has issues. Mine is perfect. It helps that i know what i'm doing though.

Also my XBOX 360 red ringed.

You realise that is because Cyberpunk is a last gen game for Xbone?
Huh?

Shouldn't a last gen game run much faster on a much more powerful, next gen console?

I think you are confused. Also, i was answering the guy who claimed the Series X can hold 60fps on that game.
 
Alright, claims are being made, and several times "LOL 6 core CPUs" and "LOL 1070s" have come up.

Well, I happen to have such a configuration:
i5-9600k (a 6-core, 6-thread CPU)
GTX 1070 Ti (a "LOL" GPU)

Let me preface this by quickly highlighting another benefit of PC gaming. I upgraded to the above configuration from an i5-3570k/GTX 760 (2GB) config. That earlier config, which is still around and still works great by the way, handled all gen 7 (PSWii60) and prior games that I played *like a charm* at 1080p/60fps. No joke.

In any case, when I upgraded, I was blown away by one of the absolute top level powers that PC currently has: out-of-the-box backwards compatibility. My *entire* libraries on Steam and uPlay just worked with NO work on my part. No having to go to gaming forums to find out if the platform holder is making the game backwards compatible; no hoping that any one game didn't fall within the "1%" of games that were not backwards compatible; no having to guess which games were enhanced to 60fps; no having to just give up and hope for a remaster. No, my upgrade in itself gave a "remaster" of sorts to all of my games, across the board.

So, back to the configuration change. With the CPU/GPU upgrade, I also bought a 1440p G-Sync monitor. And, while I admittedly don't always chase the "latest AAA games" (quite simply because from a gameplay perspective, they're not my cup of tea), my "LOL" configuration has been able to run EVERYTHING I've thrown at it at a native 1440p and ALWAYS above 60 fps. Just yesterday I was messing around with RE3 Remake (having a "fun" playthrough with the unlocked infinite weapons from the in-game shop) and I was at 90fps or above.

Do I run at max settings? Fuck no, I don't need to. But so far they have always been at least high settings, and for most games it's something between High and Ultra.

Coming to the final point of PC: *I* have the choice of how I want to run my game. I'm not stuck with a single configuration. I'm also not stuck with having to pick between two pre-packaged configurations (what are they usually called? "Performance" for the 60fps one and "Fidelity" or something for the graphical one?). I don't have to run RE3 Remake at 90 fps with my "LOL" configuration. I can turn on some bells and whistles and cap it at 60fps. Or, fuck it, I can try to max it out and go even lower on the fps, to 30.

Because it doesn't matter what domain of life we're talking about: whether it's as a video game player, or as a citizen of this country, or as a human being, I like.... CHOICE. Which the PC gives me unparalleled amounts of.

Cheers, all!
 

rofif

Banned
I'm sorry your PC has issues. Mine is perfect. It helps that i know what i'm doing though.

Also my XBOX 360 red ringed.


Huh?

Shouldn't a last gen game run much faster on a much more powerful, next gen console?

I think you are confused. Also, i was answering the guy who claimed the Series X can hold 60fps on that game.
I also know what I am doing.
Some stuff is just out of my control. These x570 motherboards were broken on release.
I had to replace my first mobo since the pc would randomly fail to post.
Then, when I got a more expensive x570, it took a year of uefi updates from amd and gigabyte to get back to the startup time I had with the 2500k. It's now 19 seconds from cold boot to desktop, but it was about a minute on release.
That, and the usb problems are still happening.
There are always problems. Not everyone notices them
 

Guilty_AI

Member


i'm having great laughs over how easily pc people get triggered lol
There's a meme for your post just now, you know



How about you try this:
You forget PC gaming exists since you're clearly not a fan of it;
Forget all these tech/market arguments you came up with, since you clearly aren't familiar with the modern pc market and they don't really help you with anything;
Then you go play games on whatever console(s) you own. You'll probably feel more fulfilled that way.
 
i'm having great laughs over how easily pc people get triggered lol

keep them comments rolling ppl
People engaging you in a debate now constitutes them getting "triggered"?

I was willing to give you the benefit of the doubt, but (whatever little) credibility you had left just evaporated.

One of my high school teachers always told me that one of the easiest ways you could tell that you were winning a debate is when your opponent starts engaging in childish antics instead of sticking to the facts.

You, sir, are losing the debate.
 
The saddest part of the gpu situation is that even when you do find parts in stores, they're still double the price, while consoles will be at msrp. I don't understand this.


If you already have a PC, yes. If you're looking to get one, I don't think it's ever been worse.
This. It's an amazing time to play on PC but only if you are already in the ecosystem.
 