We have a higher IQ so need more complex games. What was the last relevant PC exclusive?
My first sentence was "Sure, higher refresh rates are better. This should be a consensus."
But the reality is that game devs have to develop for consoles. So it's all about them.
If you have a PC, surely you'll go for higher framerates when possible; that's a given.
> Oh, but it does
> What was the last relevant PC exclusive?
Nice non-sequitur. The statement and question it posits have nothing to do with one another. PC having no "relevant" exclusives means the world revolves around consoles? By relevant I suppose you mean AAA derivative trash? MMOs, RTS, simulators, and every other genre probably don't matter in your world where gaming revolves around consoles.
Consoles are the lowest common denominator. For some games they are the base, but not for others.
Curious that you picked 2 games that are first and foremost PC tech games.
Cyberpunk, from launch, was always better on PC, by a huge margin. Not only did it have fewer bugs and performance issues, it also looked better.
And it has the most advanced PC tech, such as DLSS 3.5, RTX, XeSS and soon FSR3.
Alan Wake is using PC-specific tech that the PS5 doesn't have: Mesh Shaders. And then there is the DLSS 3.5 and RTX tech.
These are 2 games that were made with the high specs of PC in mind, and then cut back to fit onto consoles.
You are talking about 2 games that have the biggest sponsorship from Nvidia.
Cyberpunk was 100% PC first. You should read the release reviews and user reviews from that time ...
I picked Cyberpunk 'cause despite running better on PC (obviously) and having unique features, it was still designed to run on last-gen machines.
You really think the game wouldn't be wildly different if it was PC only?
I'm talking about something like Crysis was back in 2007. You could downscale the game all you wanted; it wouldn't be possible to run on the consoles of that time.
Same for Alan Wake II. Xbox supports and uses Mesh Shaders in AW II. The PS5 doesn't have them, but uses Primitive Shaders instead.
DLSS and RTX are nice, but they are hardly important for the game's overall design. Just bells and whistles.
Again: you don't see games like Crysis that you could cut back all you want and would still only be possible on PC.
This "the game was designed for PC first and foremost" talk is only present 'cause devs want to sell PC copies and are in bed with GPU manufacturers.
> Have you noticed how many games run poorly on console? Some dropping frame rate and others going to very low resolution.
> It's like these games and engines were designed for much stronger hardware than consoles.
And yet again I'll mention Crysis, the last meaningful PC game that wouldn't run on consoles of that time even with cutbacks.
And have you noticed that despite running poorly on consoles, they still use tech that fits the scope of what consoles can do? Even running poorly?
It's almost like devs are making games while keeping in mind that they are going to have to make the game work on consoles or something.
> Honest question: is there a way for 60+ fps content to be displayed so that the viewer doesn't perceive the infamous soap-opera effect?
> I gave away my beloved plasma TV 3 years ago for a series of reasons, and since then my vision has never fully adapted to the poor motion resolution of LED-based TVs. 30fps CG scenes and in-engine cutscenes in games are invariably a pain to watch due to sample-and-hold, and TV and movies absolutely require some motion interpolation if I want to watch more than a few minutes of footage.
> On the other hand, on modern screens the soap-opera effect is very obvious and, to use a very abused expression, "not cinematic". Things feel too smooth, and it seems everything is moving too fast. That is perfect for actual gaming, but in cutscenes it's a bit jarring. Would there be some way to make it "feel right" while still maintaining the smoothness?
You have such a strange take on this because you're complaining about both framerates.
> Cyberpunk and Alan Wake II are currently the best looking PC games on the market, and they were made with consoles in mind first and foremost.
....Cyberpunk.... could barely run properly on PS4 and Xbox One. The baseline, back in 2020. That game was not made with consoles first and foremost.
> ....Cyberpunk.... literally couldn't run on PS4 and Xbox One. The baseline, back in 2020. That game was not made with consoles first and foremost.
That's because you only consider Crysis as the last great PC title, while ignoring that games like CP2077 and Alan Wake are amazing showcases for PC tech.
Oh man ...
I've already mentioned those 2 cases.
Cyberpunk was built to fit consoles. Last-gen consoles can "run" it on 2013 laptop CPUs. It runs poorly, but it runs.
Same for Alan Wake. It uses Mesh Shaders, but Xbox can run Mesh Shaders. PS5 uses Primitive Shaders, which have the same purpose.
They look and run better on PC, but they are console games beefed up on PC.
It literally could. It played like ass, but I finished the game on my PS4. I wasn't dreaming.
Make Alan Wake II run on the Switch. Now that is an example of something that literally can't be achieved.
> It's the LG 45" ultrawide and I overpaid to get it at launch at $2500 and here at Black Friday they were down to around $900 at some sites
I find it wise never to look at the sales when your previous purchase is down by a considerable amount.
> By relevant I mean games that sell
Oh, boy. So since Xbox no longer has exclusives, I suppose it's no longer relevant? And since the way you framed your original argument clearly excluded Nintendo, and most current AAA games skip the platform, I guess they're not relevant either. So by consoles, you really meant Playstation, which had a whopping 2 exclusives this year, including a single 1st party. Is that what you meant?
> Funny that you mentioned genres but not games to build your argument
Why would I name games when I had no idea what you meant by "relevant"? Relevant in this industry is the games that make the most money and, hint: they're not exclusives.
> And yet again I'll mention Crysis, the last meaningful PC game that wouldn't run on consoles of that time even with cutbacks.
> And have you noticed that despite running poorly on consoles, they still use tech that fits the scope of what consoles can do? Even running poorly?
> It's almost like devs are making games while keeping in mind that they are going to have to make the game work on consoles or something.
> And just so you know, I would love seeing devs make something like Crysis again for PC.
> But they would be crazy to drop the huge console user base. That's leaving too much money on the table.
This stupid shit again. Consoles now have hardware comparable to an RTX 2070S/Ryzen 3600, which is about where the average gaming PC sits. If you want something that can't run on consoles, not only would you erase 100% of the console market but you'd also erase like 85% of the PC market. This isn't 2005. Who the fuck wants a game that runs at 1080p/30fps on a 4090? Consoles are no longer machines with incredibly unique and exotic configurations like the PS3. They're much closer to PCs. Damn near everything that can run on a 4090 can also run on a PS5. They both have modern hardware. None of that pixel shader 3.0 stuff that can't run on your GPU only supporting pixel shader 2.0.
> Honestly, the real question is whether any of you guys think 60hz-only screens will survive in modern display tech, especially at the rapid pace of improvement we've got with OLEDs and MiniLEDs these days. Many new TVs, LCD screens, phones.... they're all coming out with 90hz or better screens. The Steam Deck OLED was 90hz too.
> I feel like by 2030, 60hz-only screens will be relegated to ultra budget.
90hz is cool, but 120hz lets you get 40 fps.
> It literally could. It played like ass
Yeah, so just like Crysis on the PS360? Cyberpunk dropped to the single digits upon release and sometimes completely locked up on consoles. It was so bad that CDPR went out of their way to hide it from the press, and you're telling us with a straight face that it was "designed" to run on PS4 when all CDPR did was desperately cram something into machines that clearly couldn't handle it.
"How noticeable these jumps are depends on your sensitivity to smoothness."
A direct quote from the video. Some of us clearly need Visine, mucus reducers, and Kleenex more than others around here. I get it, your eyes need the safe spaces of 540hz. To each their own.
> 90hz is cool, but 120hz lets you get 40 fps.
> With 90 you can get 45, and it's fine, but 5 FPS can be a lot to ask from some hardware.
> 120hz would be the perfect spot: 60, 30 and 40 FPS options.
Looking at those motion clarity screenshots, I think we should aim a lot higher than just 120hz if we want truly amazing visual clarity. For the time being it's fine, but 360hz is the endgame here.
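The refresh-rate arithmetic being debated above is just divisibility: a frame rate gets perfectly even frame pacing (without VRR) only if it divides the refresh rate. A minimal sketch; the function name and the 24 fps floor are my own choices, not anything standard:

```python
# Frame rates that divide evenly into a given refresh rate, i.e. the targets
# that get perfectly even frame pacing without VRR. The 24 fps floor is an
# arbitrary cutoff for readability.

def even_targets(refresh_hz: int, min_fps: int = 24) -> list[int]:
    return [refresh_hz // k for k in range(1, refresh_hz // min_fps + 1)
            if refresh_hz % k == 0]

print(even_targets(90))    # [90, 45, 30]
print(even_targets(120))   # [120, 60, 40, 30, 24]
print(even_targets(240))   # [240, 120, 80, 60, 48, 40, 30, 24]
```

This is why 120hz covers 60/40/30 while 90hz only adds 45, and 240hz picks up 80 on top of everything 120hz offers.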
> Oh, boy. So since Xbox no longer has exclusives, I suppose it's no longer relevant? And since the way you framed your original argument clearly excluded Nintendo and most current AAA games skip the platform, I guess they're not relevant either. So by consoles, you really meant Playstation, which had a whopping 2 exclusives this year, including a single 1st party. Is that what you meant?
> Why would I name games when I had no idea what you meant by "relevant"? Relevant in this industry is the games that make the most money and, hint: they're not exclusives.
Thanks, I was about to answer something similar. The elitism of "PC gamers" over consoles, and the complaining about past times when PCs had real games only possible on PC, is tiresome and very outdated.
It's not only that developers take consoles into consideration; they also take PCs into consideration, and the vast majority of them don't sport a 4090, so those statements are nonsensical no matter how you look at it. Hell, most of the time the minimum specs of modern AAA games consider the GTX 1060, an old mid-ranger from 6 years ago. Does that mean the developers don't care about consoles with twice the GPU horsepower? Or are you smart enough to figure out that they almost always take the route that'll lead to the most profitable outcomes? Alan Wake II, as far as I'm aware, is the first or second game designed 100% around current-gen consoles and modern PCs only, and those consoles have been out for almost 3 years.
> Oh, boy. So since Xbox no longer has exclusives, I suppose it's no longer relevant?
Not that I agree with the other guy, but Xbox is quite literally fading into irrelevance thanks to its insane lack of exclusives. That is the worst example you could've picked to try and disprove him, lol
> Yeah, so just like Crysis on the PS360? Cyberpunk dropped to the single digits upon release and sometimes completely locked up on consoles. It was so bad that CDPR went out of their way to hide it from the press, and you're telling us with a straight face that it was "designed" to run on PS4 when all CDPR did was desperately cram something into machines that clearly couldn't handle it.
There was a level cut from Crysis on 360/PS3 due to hardware limitations. It also had downgraded physics.
The Witcher 3 also runs on the Switch.
> Looking at those motion clarity screenshots, I think we should aim a lot higher than just 120hz if we want truly amazing visual clarity. For the time being it's fine, but 360hz is the endgame here.
I think 240Hz with BFI (120Hz with very high motion clarity) will look amazing too.
> I caught that quote as well. I've been saying for a while that I think this varies from person to person, but frankly that is something I've been pointing out to these people who say there should not be any other mode than 30fps, which I think is ridiculous.
My son is a big time gamer, albeit just on Series X, but he swears he can't tell a difference between 60 and 120, and he plays shooters.
> My son is a big time gamer albeit just on Series X but he swears he can't tell a difference between 60 and 120 and he plays shooters
> Even showing him 60 to 240 and he can't tell a difference and has maintained that stance for as long as he has seen faster displays
> I even bought him a 32" 144hz monitor for his desk to play games at 120 on his Series X and he is like, yeah ok but was fine at 60
> I don't get it, the difference is night and day to me.
Your kid ain't L33T.
> My son is a big time gamer albeit just on Series X but he swears he can't tell a difference between 60 and 120 and he plays shooters
Since he doesn't care, it would be really nice if I got that 32" 144hz monitor....
> My son is a big time gamer albeit just on Series X but he swears he can't tell a difference between 60 and 120 and he plays shooters
Point out the motion clarity; something with text will more clearly show what's running at 60 and 120 fps.
> I think 240Hz with BFI (120Hz with very high motion clarity) will look amazing too.
I think BFI is a nice workaround, but it's most effective in 60hz games, since most displays capable of it can't really combine it with HFR.
> Not really. We are having so many games with low internal resolutions that sometimes go back to the PS3/360 era.
> The catch is that they put "4K resolution" on the box, but it's being upscaled from sometimes sub-1080p res.
> Alan Wake II in Performance Mode runs at 872p. It's ridiculous. And it still runs at sub-60 FPS. Same for the Quality mode, which runs at 1272p and sub-30.
> So what do we do? Sacrifice game design in order to have a stable game at a higher frame rate and resolution? Maybe many features in the game wouldn't be possible if they had to make it run decently on that hardware at 60 FPS and higher than 1080p.
> How many years would we have to wait until a Nintendo console could run TOTK at 60 FPS/4K?
> It's a tough topic.
Who cares what the internal resolution is? The question is, does the gaming community notice? If the image quality is good, then does it matter if it runs internally at 720p or 1440p?
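For context on how large those gaps are, the pixel math can be sketched; this assumes 16:9 frames (the actual internal widths those modes use may differ slightly):

```python
# Pixel counts for 16:9 frames at a given height, versus a 4K (2160p) output.
# Widths are derived from the aspect ratio, which is an assumption; real
# internal resolutions may be slightly different.

def pixels_16x9(height: int) -> int:
    width = round(height * 16 / 9)
    return width * height

output_4k = pixels_16x9(2160)                    # 3840 x 2160 = 8,294,400 px
for h in (872, 1272):
    internal = pixels_16x9(h)
    factor = output_4k / internal
    print(f"{h}p: {internal:,} px, ~{factor:.1f}x fewer pixels than 4K")
```

So an upscaler in Performance Mode is reconstructing roughly six 4K pixels from every rendered pixel, which is why motion artifacts show up even when stills look clean.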
> Point out the motion clarity; something with text will more clearly show what's running at 60 and 120 fps.
I have tried more than once, he just doesn't see it.
> Point out the motion clarity; something with text will more clearly show what's running at 60 and 120 fps.
Moving text is a huge thing on my OLED. Diablo 2 at 60fps looks fine in motion vertically but terrible horizontally. This is all cleaned up at 120fps in the PC version on the same TV.
> 120hz would be the perfect spot: 60, 30 and 40 FPS options.
240hz gives you all that, but also 80 fps as well.
Yes, we traded size and convenience for image quality.
I don't think we knew at the time exactly what we were losing.
> Let's not pretend that CRTs had perfect image quality. When I replaced my CRT PC monitor with an LCD screen, I was ecstatic that the image was razor sharp compared to the old CRT. The difference was most striking in the corners of the screen. That's of lesser importance for gaming, but it was a big deal for productivity.
> Another big plus was that you could get much higher resolution screens that weren't absurdly large and heavy like CRTs. And we were rid of overscan.
CRTs being naturally fuzzy image-wise does help give the image a sort of automatic anti-aliasing, whereas LCD and OLED need that built in through software, since they just display everything as it is.
> Not that I agree with the other guy but Xbox is quite literally fading into irrelevance thanks to its insane lack of exclusives. That is the worst example you could've picked to try and disprove him, lol
This quite literally doesn't matter. It still gets almost every game, and devs are even bitching about having to develop for the Series S. They can't even ship their games to Playstation or PC if the game cannot run on the Series S. Only Larian delayed BG3 on Xbox as a result. The argument that the world revolves around consoles because the PS5 gets 1-3 exclusives a year is moronic.
> There was a level cut from Crysis on 360/PS3 due to hardware limitations. It also had downgraded physics.
A whole-ass DLC is missing from the PS4 and X1 version of Cyberpunk because of hardware limitations. Crysis 2 and Crysis 3 are also on consoles and came out day 1 on both platforms. While some might argue that Crysis 2 is a step back from Crysis, Crysis 3 most definitely isn't.
> Cyberpunk on last gen consoles is feature complete. You could play it from start to finish on those consoles. Period. No matter how bad it ran, it ran. It ran like shit and looked like shit, but I played the same missions in the same city as PC players on my PS4.
> The Witcher 3 also runs on the Switch, yes. It's also completely different from Cyberpunk and Alan Wake II, so I don't know why you bother mentioning it.
I bother mentioning it because your stance is utter nonsense. This isn't the early 2000s, where literal hardware limitations prevented new games from running on older hardware. Almost every game can be downscaled enough to run on old hardware. Rift Apart, Returnal, and TLOU Part I, all PS5-only games, can run on a GTX 960 that dates back to 2015. Same for Immortals of Aveum, Remnant II, or Dead Space Remake. Those are current-gen-only games on consoles but run on comparatively last-gen PC specs.
> Let's not pretend that CRTs had perfect image quality. When I replaced my CRT PC monitor with an LCD screen, I was ecstatic that the image was razor sharp compared to the old CRT.
Just for kicks, you should try connecting a modern console to a CRT display.
> Honest question: is there a way for 60+ fps content to be displayed so that the viewer doesn't perceive the infamous soap-opera effect?
"Soap opera effect" only refers to interpolation, which should never be used for gaming (since it adds a buttload of lag).
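The lag claim has a simple floor: an interpolator cannot draw an in-between frame until the *next* real frame exists, so it must delay the picture by at least one source-frame interval. A back-of-the-envelope sketch (the processing term is an assumed, illustrative figure, not a measured value):

```python
# Minimum latency added by motion interpolation: the TV must buffer the next
# real frame before it can synthesize the in-between one, so the floor is one
# source-frame interval. `processing_ms` is an assumed, illustrative figure.

def added_latency_ms(source_fps: float, processing_ms: float = 0.0) -> float:
    return 1000.0 / source_fps + processing_ms

print(round(added_latency_ms(30), 1))  # 33.3 ms floor for 30 fps content
print(round(added_latency_ms(60), 1))  # 16.7 ms floor for 60 fps content
```

Real TVs add processing time on top of that floor, which is why interpolation is tolerable for movies but rough for gaming.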
> BOTW/TOTK, The Last of Us, Ocarina of Time, Mario 64, all of the GTA games, RDR, RDR 2, Bioshock, Mass Effect, etc.
Some GTA games, Bioshock and Mass Effect were 60+fps day 1. And the others could have been 60fps on PC day 1 (TOTK was, lol) if not for potato consoles.
> Some GTA games, Bioshock and Mass Effect were 60+fps day 1. And the others could have been 60fps on PC day 1 (TOTK was, lol) if not for potato consoles.
If you consider PC, then sure, every game is 60 FPS on day one.
> Oh, but it does
It doesn't. Consoles have 99% PC tech inside them. What dictates visuals and performance is the PC.
> because they had to make the game run at 30-ish fps on the PS3 originally.
No one forbade Sony from releasing the game on better hardware at the time (PC) and making it 60+ fps.