
Next-Gen Consoles: 60FPS vs. 30FPS

Best and most jarring examples: NRS games (Injustice 2, MKX and MK11, maybe also Injustice 1)... Worst offender is MK11: menus, intros, fatalities and Fatal Blows all run in choppy, sloppy 30fps while the gameplay itself is the smoothest 60fps... I fucking hate it! Then there was a herald in the modding community putting an end to the "slow downs"... so NRS felt threatened on PC and worked in its own solution, which was borked and fucking inferior and doesn't really work since the last patch... thank you, NRS... no, wait... fuck you... it was meant to be "fuck you"!
 

Shifty

Member
Yes, but as I already mentioned that’s not how it looks in games.

Again, compare it to a game. E.g.:


In which case, nkarafo has the right of it when they say that you're totally able to perceive 60.

Looking specifically at that Crysis footage on the 1080p60 quality setting, the image is composed of a lot of muddy blue-grey colours with little contrast so it doesn't jump out as hard as the white-on-black gif, but the doubled framerate is clearly still there.
It's especially noticeable if you keep an eye on the outcrops near the camera when they're projected against the light-coloured skybox.

So here is a better example in video format:


You also need to take into account that screen size matters: frame rate differences are more noticeable on a big screen than in a tiny gif. So watch that video full screen, just like you would play a game. Also watch the video in HD, otherwise it will be locked to 30fps, which makes it useless for comparison.
This is a much better example since it has a lot of vibrant, contrasting colours.

And can you even tell that GoW was 30FPS anyway?
Boy could I. Dad of War is one of the few games that made me wish I'd bought a PS4 Pro for the performance mode.

And using a SunhiLegend gif as a point of comparison is cheating: he makes liberal use of time-stretching and other editing tricks that change the framerate in real time for the sake of effect.

Skip it until technology is actually there.
The PS2 called. It wants its framerate back.
 
Last edited:

Rest

All these years later I still chuckle at what a fucking moron that guy is.
The PS2 called. It wants its framerate back.
The fact that only a handful of games back then even tried it and developers now still struggle to achieve or maintain it proves my point. But go on thinking whatever it is you want to.
 

Bogroll

Likes moldy games
The difference between 4K and 1080p is not very noticeable unless you have a gigantic TV. My TV is about 50 inches, and unless I press my face against it I don't notice a difference, especially not during gameplay. I do, however, immediately notice frame drops, so I want stable 60fps as a standard; graphics are good enough already.
I agree you do need to be close to tell the difference, and it is a big difference imo when close, especially in driving games when looking into the distance (my TV is 50"). Ideally 4K 60fps, or at least 1440p 60fps.
Having said that, in most cases I'd take 1080p 60fps over 4K 30fps, and I just sit further from the TV.
 

mcz117chief

Member
I agree you do need to be close to tell the difference, and it is a big difference imo when close, especially in driving games when looking into the distance (my TV is 50"). Ideally 4K 60fps, or at least 1440p 60fps.
Having said that, in most cases I'd take 1080p 60fps over 4K 30fps, and I just sit further from the TV.
Yeah, distant objects look meh at 1080p, I agree, but I have 1 and 1.5 diopters, so unless I wear glasses while playing video games I can almost never tell the difference lol. Natural anti-aliasing is the best :p
 

nkarafo

Member
The fact that only a handful of games back then even tried it and developers now still struggle to achieve or maintain it proves my point. But go on thinking whatever it is you want to.
Wut?

The PS2 (and GC/Xbox for that matter) had a higher ratio of 60fps games.

In fact, most arcade racing games were 60fps back then, while in the next gen they regressed to 30. The Colin McRae and WRC games, for instance. FFS, the PS3/360 generation lasted 7+ years and there are hardly any arcade racers that run at 60fps, while the previous generation is filled with them.

And most of the time you didn't even notice the graphical sacrifices. Games like F-Zero GX, Rogue Leader and Metroid Prime 1-2 are all 60fps and still some of the best-looking games on the GameCube.

"Proves your point"? Whatever my man. The only thing you proved is that you don't know much about videogames before the PS3.
 
Last edited:

Shifty

Member
The fact that only a handful of games back then even tried it and developers now still struggle to achieve or maintain it proves my point. But go on thinking whatever it is you want to.
Sounds like you're drinking your own kool-aid there, champ.

The notion that gen 6 stuff didn't push for 60 is laughable, given that most of the leading action franchises that are still around were established in that period. Gen 7's HD development is what fucked everyone, because marketers started demanding the shiniest possible screenshots to go on the back of the box, to the detriment of performance.

There is no "the hardware is too weak" because even the weakest GPU out there can render 60+FPS if it's not being overworked. It's entirely dependent on the amount of geometry and effects being pushed through it by the developers, so you might in fact say "the developers are too weak" (and/or have their priorities ass-backwards).

Do go on being delusional about the business and technological realities behind real-time 3D rendering if you want to though. It's your GAF-given right.
 
Last edited:
60fps on shooters, racing games must, give me that cinematic 4K 30fps with all graphical bells and whistles on third person action adventure games!
 

MilesTeg

Banned
60fps being mandatory doesn't sound like a developer-friendly solution in my (very uneducated) opinion. Letting the devs do what they want sounds like the best option.
 

johntown

Banned
I plan on getting a PS5 anyway (for exclusives), but 60fps being standard for most games would probably have me gaming more on PS5 than on PC. I don't like 30fps at all if I can avoid it.
 

Shifty

Member
60fps being mandatory doesn't sound like a developer-friendly solution in my (very uneducated) opinion. Letting the devs do what they want sounds like the best option.
In theory, but the whole reason we're rabbling over it in the first place is that developers were given free rein over their framerates and, in most cases, chose to sacrifice them for extra shiny.

The true best option would be for the platform holders to mandate a performance / image quality switch for all games, potentially with some exceptions for titles where it truly makes no sense (which are few and far between, imo).

It is more work for developers, but less than it would have been in previous generations since we now live in the age of scalable general-purpose engines like Unreal 4 that have backend systems to automate rendering quality.
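In the simplest case, such a switch boils down to something like this; the struct and preset values below are hypothetical, not any real engine's API:

struct RenderSettings {
    int   targetFps;       // framerate cap the game aims for
    float resolutionScale; // fraction of native output resolution
    int   shadowQuality;   // 0 = low ... 3 = ultra
};

// Two presets a console title could expose in its options menu:
constexpr RenderSettings kQualityMode     { 30, 1.00f, 3 };
constexpr RenderSettings kPerformanceMode { 60, 0.75f, 1 };

const RenderSettings& pickMode(bool preferPerformance) {
    return preferPerformance ? kPerformanceMode : kQualityMode;
}

The engine's scalability backend then maps those few numbers onto dozens of individual rendering options, which is exactly the work that no longer has to be redone per game.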
 
Last edited:

nkarafo

Member
HD development is what fucked everyone, because marketers started demanding the shiniest possible screenshots to go on the back of the box, to the detriment of performance.
This isn't false but it's not the only reason. So here's my theory.

It's also because the arcades died. And also because gaming became trendy and mainstream shortly after.

See, in the arcades you had multiple games in the same room competing with each other for your coins, so they had to look as good as possible, which is why the best ones used state-of-the-art 90s hardware to run with the best graphics and frame rate possible, thus 60fps. If any game was 30fps it would stand out and look like crap next to a Daytona USA or a Sega Rally. A few years later, with the 6th gen of consoles, parity between consoles and 3D arcades was achieved. Gamers could finally enjoy graphical quality and frame rates similar to the arcades, which was a dream come true for many gamers back then.

But the arcades themselves were dying. Companies decided it was better to use console-based hardware (as it was now good enough), like the Naomi or Chihiro, as the top of the line, instead of expensive custom hardware that looked two generations ahead (like the Model 3 in 1996 or the Model 2 in 1994). So people had no reason to go to the arcades anymore: why go there when the best-looking games can be played at home?

During that time gaming also became more mainstream, bringing in tons of new people who had never played games before and had never walked into an arcade (they were not as popular anymore, remember?), while the older gamers had other priorities like raising a family or something. Combine that with HD TVs, where marketing pushed resolution numbers the same way CPU marketing pushed GHz numbers in the 90s (remember HD-Ready and Full HD?). Additionally, more and more people started using the internet (instead of magazines) to check out upcoming games, so the screenshots had to look great; internet video was still in its infancy, at low resolutions and frame rates.

So basically, when the 7th generation started, there were no frame rate standards, and a new obsession with HD resolutions and great-looking screenshots began.

Like you said, it had nothing to do with the hardware, at least not after the extremely weak PS1/Saturn and N64 (those had to make more noticeable sacrifices to achieve 60fps).
 
Last edited:

Acidizer

Banned
[image: 9ldIyaz.png]

This reminds me of the dreaded "mesh effect" on the SEGA Saturn, whilst the PS1 was getting all the transparency goodness.
 
Last edited:

quest

Not Banned from OT
I would expect more of the same to be honest: lots of 30fps games. The culprit won't be the CPU or a weak GPU, it will be ray tracing, which is a performance destroyer even on Nvidia hardware that does it faster. The best hope would be to ask for more performance modes in games that would disable it. Putting pressure on developers to include a performance mode is really the best bet.
 

UltimaKilo

Gold Member
That's exactly how it looks in games. Play any side-scrolling game, say Sonic Generations, on a console and then at 60fps on PC. That's exactly the difference.

In 3D games where you move toward the center of the screen it may be slightly less noticeable, because you focus on the center, where everything moves slower. But you still get the exact same effect around the edges of the screen, where everything moves faster.

You don't need to concentrate. If there were a one-button toggle between 30 and 60fps, you would feel the massive difference in both visual information and responsiveness as you changed it in real time.

The reason you say you can't see the difference is that you don't have a reference, and you adjust to 30fps quickly as your eyes get used to it, so it doesn't bother you. I agree that this can change from person to person. I, for instance, don't need a reference, and my eyes don't adjust to 30fps as fast, because the majority of games I play are 60fps or higher. But everyone can tell the difference in the end.

I usually can't tell the difference, even in side-by-side videos. It has to be a very fast game for me to notice, and even then it's usually not a big deal.
 
It seems like more and more games these days come with a performance mode that prioritizes 60 FPS / higher framerates. If performance modes were mandatory for every game, that would be awesome.
 

Bryank75

Banned
Some games need it and others don't. I like the way Uncharted handles things: 30 for single player, since the graphics are quite demanding, and 60 for multiplayer.
Multiplayer should be 60 as a rule of thumb; I hate that Destiny is 30 on consoles.
 

flacopol

Member
They're always going to sell the "1080p/4K" blah blah but never talk about FPS, because "the human eye can only see..." blah blah crap.

I hope the consoles deliver 60fps, because IMO after half an hour you stop seeing the graphics and start seeing the mechanics of the game.
 

StreetsofBeige

Gold Member
I would expect more of the same to be honest: lots of 30fps games. The culprit won't be the CPU or a weak GPU, it will be ray tracing, which is a performance destroyer even on Nvidia hardware that does it faster. The best hope would be to ask for more performance modes in games that would disable it. Putting pressure on developers to include a performance mode is really the best bet.
Yup.

Consoles look like they will have a good CPU and GPU in 2020. But with all the BS about ray tracing, and another round of hair effects (something they plugged with TressFX maybe 4 years ago, and I remember seeing articles about hair graphics again), it looks like power may be sapped for better lighting and hair strands.

So much for the bullshit days where devs would say... "Oh, we would love to improve physics and AI, but gosh darn, we don't have the power."

Nobody thought that in recent gens devs would still focus on pushing visuals and hire graphics artists over AI crunchers.

Aside from some cool Frostbite destruction over the past 10 years of BF games, I don't think physics or enemy/NPC AI have really improved one bit since the 90s or early 2000s.
 
Last edited:

Rest

All these years later I still chuckle at what a fucking moron that guy is.
Sounds like you're drinking your own kool-aid there, champ.

The notion that gen 6 stuff didn't push for 60 is laughable, given that most of the leading action franchises that are still around were established in that period. Gen 7's HD development is what fucked everyone, because marketers started demanding the shiniest possible screenshots to go on the back of the box, to the detriment of performance.

There is no "the hardware is too weak" because even the weakest GPU out there can render 60+FPS if it's not being overworked. It's entirely dependent on the amount of geometry and effects being pushed through it by the developers, so you might in fact say "the developers are too weak" (and/or have their priorities ass-backwards).

Do go on being delusional about the business and technological realities behind real-time 3D rendering if you want to though. It's your GAF-given right.
I'm sorry, but what the fuck are you talking about? Everyone on this forum understands the idea of balancing performance and graphics; don't act like I don't know this. Consumers care more about nice graphics than about an arbitrary number of screen refreshes per second that most of them don't even recognize. Developers don't care about frame rate. Consumers don't care about frame rate. Even reviewers don't care about frame rate. It doesn't sell games, and developers shouldn't waste time pursuing it until the available hardware no longer forces an either/or choice between frame rate and graphical output.
 

Shifty

Member
I'm sorry, but what the fuck are you talking about? Everyone on this forum understands the idea of balancing performance and graphics; don't act like I don't know this.
Claims of "the hardware can't take it" imply that you lack a basic understanding of how computer graphics work under the hood.

You're quite welcome to show some receipts for this knowledge that makes you think you can speak for everyone, though.

Consumers care more about nice graphics than about an arbitrary number of screen refreshes per second that most of them don't even recognize. Developers don't care about frame rate. Consumers don't care about frame rate. Even reviewers don't care about frame rate. It doesn't sell games, and developers shouldn't waste time pursuing it until the available hardware no longer forces an either/or choice between frame rate and graphical output.
Pff, appeal to the masses. I don't give a toss what makes money, nor what Joe Average thinks of the framerate in his annual sports and shooter franchises. I care about games feeling good to play.

Until we hit magic singularity levels of technology, every piece of rendering hardware we make is going to have a maximum throughput. Your logic of "there's a maximum therefore focus on fidelity" is nonsense, particularly if the justification is "make what sells" rather than "make good games".
 
Last edited:

Fbh

Member
I'd definitely like to see more 60fps games but I'm fine with devs deciding.
With limited hardware I think there are valid arguments for both approaches. I think 60fps definitely enhances the experience in games like Doom, Titanfall 2, DMC5, etc., yet at the same time I'd rather have a game like Horizon or The Witcher 3 look as nice as possible instead of going for 60fps.

It'd also be nice to see more games leave the choice to the user, though, with a few different presets to choose from.
 

Portugeezer

Member
Depends on the game. In an open-world RPG, for example, 30fps allows for more complex worlds than 60fps. For shooters, 60fps feels nicer.

Personally, after 5 minutes I forget what framerate a game is running at, unless it's unstable.
 

Justin9mm

Member
That's why I use a PC instead of a console: you can balance the GFX/FPS ratio as you like, depending on your system and your display. For instance, I have a middle-of-the-road PC and I can't run The Witcher 3 at 60fps / Ultra, so I can choose to lock it at 30, 40 or 50 fps, since I have a 240Hz monitor (so 40fps is evenly distributed) plus a TV that supports 50Hz, so all these options are viable for me. Or I can just run 60fps with more cuts. I also have the option to run games at 80 or 120 fps if they are not very demanding, like how I played Bloodstained at 120fps and enjoyed the crystal-clear (almost CRT quality) side scrolling with minimal ghosting/motion blur.

With consoles I would never have all the options my monitor/TV provides. Sure, there are a few games that give you a 60fps option if you have the Pro/X variants. But even on those consoles the games that let you do this are rare. And I don't think this will ever change with fixed-hardware consoles.
I completely agree with everything you said, and I also have a gaming PC, but console gamers generally don't want options. They just want the game given to them in the best way it can be provided, without all the tinkering. This is why console gaming thrives, and it's the best argument against PC imo.
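For anyone wondering why 40fps counts as "evenly distributed" in the quote above: it's simple divisibility. Here is a rough sketch (the 50fps case is only even on the 50Hz TV mentioned, not on the 240Hz monitor):

#include <cstdio>
#include <initializer_list>

int main() {
    const int refreshHz = 240;
    for (int fps : {30, 40, 50, 60, 80, 120}) {
        if (refreshHz % fps == 0)
            std::printf("%3d fps: each frame held for exactly %d refreshes\n",
                        fps, refreshHz / fps);
        else
            std::printf("%3d fps: does not divide %d Hz evenly (judder)\n",
                        fps, refreshHz);
    }
    return 0;
}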
 

nkarafo

Member
Developers don't care about frame rate. Consumers don't care about frame rate. Even reviewers don't care about frame rate. It doesn't sell games, and developers shouldn't waste time pursuing it until the available hardware no longer forces an either/or choice between frame rate and graphical output.
That's not how fixed hardware works, my man.

Also, you are wrong. Everyone cares about frame rate; it's just that many people don't realise or know exactly what it is. I explained this before.

Average Joe doesn't know what it is, but if you cut the frame rate of CoD or FIFA in half, he will immediately tell the game is sluggish. Most people don't know what anti-aliasing is either, but when you remove it they will see the jaggies. And yet, how many consumers will care? And what about polygon counts? You think the average Joe will notice if you cut the geometry in half in every game? Modern games have so much geometry that even with half of it a game would still look almost the same, unless you zoom in on details and look for sharp edges.

Basically, it's the same with every other visual aspect. You say consumers "don't care" about frame rate; do they care about anti-aliasing, though? Or 4K? Or ambient occlusion? Or poly counts? Or anisotropic filtering? You think they wouldn't buy a game if it had slightly lesser graphics or resolution but a better frame rate? Seems like the opposite is happening.

Look at the games people play the most. Look at Fortnite and Minecraft. Look at all those mobile games casuals play. You still think people care about graphics more than about how responsive a game feels? On the contrary, responsiveness and smooth motion are even more important if you sell to the masses. Have you ever seen a phone operating at 30fps? It doesn't matter whether you know what "frame rate" is, you would still probably throw it out of the window in frustration.

Even worse, have you ever played a PC game with a mouse at 30fps? Nobody can stand it. I assume a VR game at 30fps would be intolerable as well. So I don't know why you ignore every market besides consoles, as if this applies to only a tiny part of the population. It's only in the console market that 30fps (or even lower) can be tolerated, because the controllers themselves smooth out the inputs. You feel like you have good control, but the games still have reduced responsiveness and precision behind the filters. Plus, many games made with consoles in mind are slower-paced, "cinematic" walking/climbing simulators with somewhat automated controls that don't even need to be responsive.

Consumers care, developers care (console devs only act differently because they mistakenly think the average Joe won't care), and reviewers who don't care simply suck at their job (which is true for most of them regardless). I do find it very interesting, though, how much you personally care about what the average consumer might think, and how everyone should cater to their low standards.


I don't give a toss what makes money, nor what Joe Average thinks of the framerate in his annual sports and shooter franchises. I care about games feeling good to play.
I'm with you, but wait... most of the games you mentioned run at 60fps anyway and feel good to play. They feel better this way even for the average Joe, whether he knows it or not. The Call of Duty franchise (the benchmark for big sales for years) has always been a 60fps series on consoles. Same with the Battlefield games. Same with FIFA, NBA 2K and Pro Evolution.

So what's going on? It's almost as if better responsiveness feels better to the majority. It's almost as if smoother-running games were made for the masses all this time, while 30fps console games cater to the "graphics whore" crowd.
 
Last edited:

Whitecrow

Banned
Pff, when you switch a TV from Normal colour mode to Warm or Warm 2, everything looks more yellowish at first, but a lot of the time that's closest to the 6500K colour temperature many games are mastered for.
One must be reasonable, not fear the initial shock of the change, and give the brain time to get used to it.

A well-made 30fps game makes you forget about the frame rate within about 5 minutes of playing.
If you run away from it because you find it jarring in the first second, well, at least don't forget that you can get used to it.
 

Poordevil

Member
This is my theory on 60 vs 30 fps. Your brain knows what smooth motion looks like. What happens when I play a slow game is that at first the game looks sluggish or choppy. But after I play for a while I get used to the slower frame rate and it looks normal. What's going on is that my brain starts filling in the gaps of 30fps so it looks smooth. The problem is that this is a very fatiguing process. The CPU is slacking off, so your brain has to step up and do the heavy lifting. This is why gamers will say you can't tell the difference and that 30fps (or 28, or 26) is just fine.
 

Shifty

Member
I'm with you, but wait... most of the games you mentioned run at 60fps anyway and feel good to play. They feel better this way even for the average Joe, whether he knows it or not. The Call of Duty franchise (the benchmark for big sales for years) has always been a 60fps series on consoles. Same with the Battlefield games. Same with FIFA, NBA 2K and Pro Evolution.

So what's going on? It's almost as if better responsiveness feels better to the majority. It's almost as if smoother-running games were made for the masses all this time, while 30fps console games cater to the "graphics whore" crowd.
That's why I didn't call out any franchises specifically: as you mention, there are some notable ones that ship at 60 and still sell gangbusters.

I'd argue that it's about the brand: smooth gameplay feel is part of those franchises' identity, and the AAA studios that make them are evidently wise to that, otherwise it'd be fidelity > performance across the board. There's also the idea of 'brand power' to consider: in part, CoD and FIFA sell because they're CoD and FIFA, and always will unless their respective publishers find a conclusive way to run them into the ground.

In the more general case with titles that are still AAA but not juggernauts, looking visually pretty is a proven way to attract new customers via advertising and media coverage and thus ends up getting prioritized.

That does raise an interesting question though- how big would the meltdowns be if the next Call of Duty shipped at 30FPS instead of 60? Given that smooth gameplay feel has always been a part of the series, it would be interesting to see whether the mass market would respond negatively enough for it to be considered a notable misstep on Activision's part.

Depends on the game. In an open-world RPG, for example, 30fps allows for more complex worlds than 60fps. For shooters, 60fps feels nicer.
That depends on how you define 'more complex'. If we're talking how much dynamic stuff is in the world itself (enemies, physics objects, NPCs, yadda yadda), then the added strain is going to be more CPU-side than it is GPU-side.

Naturally, more models to render = more GPU cost as well, but that ultimately comes down to the complexity of the geometry rather than the world. There are enough standardized techniques out there to mitigate that (baking, culling, streaming, automated mesh simplification, draw call batching, LOD, etc) for it to be less of an issue on modern engines.
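As an example, the distance-based LOD entry on that list is conceptually just this; the triangle counts and distance thresholds are made up for illustration:

struct Mesh { int triangleCount; };

// Pre-built detail levels for one model: full, medium, low.
const Mesh kLods[] = { {100000}, {25000}, {5000} };

const Mesh& pickLod(float distanceToCamera) {
    if (distanceToCamera < 20.0f)  return kLods[0]; // close-up: full detail
    if (distanceToCamera < 100.0f) return kLods[1]; // mid-range
    return kLods[2]; // far away: a few pixels on screen, cheap mesh is enough
}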

I completely agree with everything you said, and I also have a gaming PC, but console gamers generally don't want options. They just want the game given to them in the best way it can be provided, without all the tinkering. This is why console gaming thrives, and it's the best argument against PC imo.
While it's a valid argument, it's also undermined by the addition of simple performance / quality switches in games released following the inception of mid-gen refreshes like the X and PS4 Pro.

Going full ham with nitty-gritty PC settings on console would indeed scare off part of the target market, but it doesn't need to be an all-or-nothing "no settings or bust" kind of situation.
 
Last edited:
With the next Xbox and PS5 coming in the near future and promising better graphics, 4K, 8K, you fucking name it: would your purchase change if one manufacturer made 60FPS mandatory for every game?

Would a console with 60FPS in all games vs. slightly better graphics but 30FPS in most games change things for you? Which would you rather choose? Personally I care more about framerate and good gameplay than pretty graphics. Which console would you buy?

If the platform holder made 60FPS (or an uncapped framerate option) mandatory on all games, I would actually pick up the PS5.

Otherwise, I probably won't. There aren't enough PS4 games with the option aside from God of War to justify getting a PS5 solely for backwards compatibility, and games like Bloodborne, Persona 5, and Uncharted 4 are doomed to low framerates and high input latency without emulation, because there haven't been any patches for those.

If consoles are trying to act like PCs, developers had better start treating their games as such and future-proof them.
 
Last edited:
You're just used to seeing 60FPS.
Most people are not.
There is nothing wrong with a stable 30FPS.
You are part of the minority.
The majority of big-name games for casuals (we are talking games with a focus on gameplay: Call of Duty, NBA 2KXX, FIFA, Madden, Fortnite, Battlefield) run at 60FPS even on consoles. A ton of games from the PS2 era, and arcade games in general, have always hit that target too. It was standard for games back in the 80s and 90s to run at 60FPS, but that changed in the 32/64-bit era, when hardware acceleration was lacking outside of 3D-accelerated GPUs. Nowadays, AAA developers only miss it because of weak CPUs and because they'd rather slap on a billion different shader effects that aren't optimized.

The only time someone says "framerate doesn't matter", it's either JRPG nuts who never played the NES/SNES-era JRPGs, or people who just play walking simulators and are more concerned with games being movies than with them actually playing like video games.

If developers can't hit 60FPS, they should at least offer an uncapped option like God of War and Kingdom Hearts 3 do, since backwards compatibility and emulating PC gaming have become more of a concern for platform holders.
 

bitbydeath

Member
The majority of big-name games for casuals (we are talking games with a focus on gameplay: Call of Duty, NBA 2KXX, FIFA, Madden, Fortnite, Battlefield) run at 60FPS even on consoles. A ton of games from the PS2 era, and arcade games in general, have always hit that target too. It was standard for games back in the 80s and 90s to run at 60FPS, but that changed in the 32/64-bit era, when hardware acceleration was lacking outside of 3D-accelerated GPUs. Nowadays, AAA developers only miss it because of weak CPUs and because they'd rather slap on a billion different shader effects that aren't optimized.

The only time someone says "framerate doesn't matter", it's either JRPG nuts who never played the NES/SNES-era JRPGs, or people who just play walking simulators and are more concerned with games being movies than with them actually playing like video games.

If developers can't hit 60FPS, they should at least offer an uncapped option like God of War and Kingdom Hearts 3 do, since backwards compatibility and emulating PC gaming have become more of a concern for platform holders.

What about GTA?
That's one of the largest franchises out there, and nobody says anything or cares.
 

lukilladog

Member
As much as I like solid 60fps, I'll gladly trade it for a locked 37fps/74Hz with "supercharged" visuals. Too bad TV manufacturers dropped the ball there, because PC monitors have been doing 75Hz since forever.
 

bitbydeath

Member
Why do you think so many people double-dipped for it on PC?

They didn’t.


PC is a very niche crowd compared to consoles.
 

nkarafo

Member
They didn’t.


PC is a very niche crowd compared to consoles.
I don't get it. You say they didn't, and you post an article that reports they actually did.

That's a huge number of sales considering the game came to PC way too late, and most of those sales were probably people who double-dipped (I know I did). If the game had been released alongside the PS3 and 360 versions in 2013, it would have sold multiple times more on Steam (and the total console sales would be lower, as fewer people would have gotten it on PS3/360).
 
Last edited:

bitbydeath

Member
I don't get it. You say they didn't, and you post an article that reports they actually did.

That's a huge number of sales considering the game came to PC way too late, and most of those sales were probably people who double-dipped (I know I did). If the game had been released alongside the PS3 and 360 versions in 2013, it would have sold multiple times more on Steam (and the total console sales would be lower, as fewer people would have gotten it on PS3/360).

That's like a 90+ million difference. I did state it was in comparison to console sales.
 

nkarafo

Member
That's like a 90+ million difference. I did state it was in comparison to console sales.
A huge amount of those sales are from the original PS3/360 versions. The PC version didn't exist to compete with them. There wasn't even an announcement of a PC version until much later, to prevent people from holding off and getting it later. And when it was finally released it still had to share sales with the PS4/Xbox One versions, and it still managed to sell as much as it did.
 
You're only looking at the statistics, and those most often don't show the full picture. How many tens of millions bought GTA5 at launch on last gen and then again on PC? The same people count in the console sales too.

There's no way in hell I would have voluntarily suffered 30fps if the game had launched at the same time as, or within 6 months of, an unlocked 60+ fps version.
 
Last edited: