
Why is there always a gap between what the hardware engineers claim a system can do vs what the software programmers/developers are able to achieve?

FMX

Member
We all heard about the greatness of the XSX and the PS5, and while it is still early, we can all at this point see the sacrifices that developers have to make to achieve 120fps or to get ray tracing in a game. I even remember hearing that the Xbox One and the PS4 were capable of doing 4K, and we saw how that went. The same can be said for PC graphics cards, as some people spend hundreds on them and then complain about the performance.

Are the hardware engineers protected because they can say "we never said it could do all those things at once," or are the software developers "lazy," like I often read on here? It's shameful that here we are, just entering year 2 of this gen, and I am hearing that there are things these systems will not be able to do until the rumored Pro systems come out. So who is at fault for the gap? Call me stupid, but I was expecting games to do 4K 120fps with ray tracing, and from my understanding that goal is not achievable even on PC. I am even starting to question the overall importance of ray tracing in games, given the performance hit that games have to take to implement it.
 

EverydayBeast

thinks Halo Infinite is a new graphical benchmark
Every year internet nerds beat up on developers. Being a game developer is an incredibly rare job, and I can't remember an actual gamer being enraged at a developer. Sorry, games have bugs, poor fps, etc. Game performance so far this generation is exactly the same as last generation. FPS is a totally irrelevant point and a figment of the OP's imagination. Games like God of War, Spider-Man, and Forza will trounce any 120fps tests. PS5 and XSX will get to 4K and it will scare graphical testers; their paths are lined with gold. And that's the bottom line, buddy.

 

ParaSeoul

Member
Technically they can do everything on the box, so it's not false advertising. The PS4 and Xbox One were never advertised as 4K capable. Even at the time, they knew they were underpowered, because the entire industry thought console gaming was dead, which is why they pushed TV features, always-online, and episodic games so much early on.
 

TheSHEEEP

Gold Member
Does it really anymore? These machines are using essentially standard PC architecture, nothing to learn like Cell or Emotion Engine. I'd say it's because the hardware guys are either talking peak efficiency, or benchmarks don't translate to the real world.
Benchmarks never translate to the real world (apart from game-specific benchmarks).

The truth is that games are software.
You can make software run efficiently and optimized.
You can make software run like shit.

As you said, consoles nowadays are "just" PCs with a custom OS and different default controls. Otherwise, the differences are negligible.
That said, the console OSs are something devs have to learn to work with first - it is definitely possible that it takes a bit until the most efficient ways of doing X on a given OS have been found.
No idea how different the latest PS operating system is from the last-gen one...
 
I think he's on about Pro and XOX.
If that's the case, then technically those machines could do 4K, and many games on the XOX were native 4K. It's just that native 4K is very demanding, and it makes more sense to use upscaling techniques, since upscaled output looks close enough (in most cases).
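The arithmetic behind that trade-off is easy to check. A minimal sketch (plain Python; the resolution figures are the standard display specs) of how the per-frame pixel count grows:

```python
# Pixels per frame at common output resolutions, and the cost
# multiplier relative to 1080p (per-pixel shading work scales
# roughly linearly with pixel count).
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")
```

Native 4K shades 4x the pixels of 1080p every frame, which is why rendering internally at 1440p (1.78x) and upscaling saves well over half the per-pixel work.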
 

BbMajor7th

Member
Same reason a Bugatti Veyron is rarely driven at 250mph, despite it being perfectly capable of reaching that speed: real world scenarios don't often allow for it.

Hardware engineers are building systems in isolation that can certainly do what they say in isolation, but games - like cars - are never simple technical exercises and developers and programmers will utilise those features both for their own purposes and to the best of their abilities. The results will mean that theoretical technical benchmarks are rarely met as the system is never pushed wholly in a single direction.

Then there's talent and resources. I love me some FromSoft, but it's actually mind-boggling to me that a game as limited in technical scope as Bloodborne performs as poorly as it does. Placed alongside TLOU 2 (which targets the same resolution and performance but pushes much more advanced AI, lighting, animation, and physics), it looks positively a generation behind on exactly the same hardware, and yet still somehow manages to turn in worse performance metrics. None of which is to say some devs are 'lazy', but budget, resources, and talent are a reality, and a project like TLOU probably had more of everything.

And it cuts both ways - you'll often see some games doing things that were far outside the scope of the system's original specification, but have been achieved through smart programming workarounds.
 

ButchCat

Member
The engineers' statements are intentionally vague. They aren't lying per se about achieving 4K/120fps with ray tracing; it's just that you probably can't achieve those things if you also want complex game design with crazy polygon counts, shaders, crowd AI, destructibility, and many other features.

Obviously, optimization and funding are a huge part of it too. Very few games from last gen really stand out and reach levels close to what was promised, and that's because of the near-unlimited funding and talent behind them.
 

yurinka

Member
Making games takes a lot of time, and optimizing and fine-tuning them to the max takes even more. Traditionally, developers don't have enough time to highly optimize, especially when a game ships on multiple devices or hardware specs. On top of that, they also need time to learn how to take advantage of the new hardware, and/or have to wait for the proper engine/drivers/firmware/tools/etc. to take full advantage of it.

On top of that, the theoretical limits of what a device can do assume perfect conditions that aren't met in real life, due to hardware and software bottlenecks plus games having to dedicate CPU/GPU/memory to stuff other than drawing shiny things.
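The gap between those theoretical limits and real life starts with how the headline TFLOPS numbers are computed: they assume every shader ALU completes a fused multiply-add on every single cycle. A sketch (the shader counts and clocks are the publicly stated PS5/XSX specs; the 50% utilization figure is purely an illustrative assumption, not a measurement):

```python
def peak_tflops(shader_alus, clock_ghz, flops_per_cycle=2):
    """Theoretical peak: every ALU retiring a fused multiply-add
    (2 floating-point ops) on every cycle, with no stalls."""
    return shader_alus * clock_ghz * flops_per_cycle / 1000.0

ps5 = peak_tflops(2304, 2.23)    # ~10.28 TFLOPS (stated spec)
xsx = peak_tflops(3328, 1.825)   # ~12.15 TFLOPS (stated spec)
print(f"PS5: {ps5:.2f} TFLOPS peak, XSX: {xsx:.2f} TFLOPS peak")

# Real shaders stall on memory, diverge on branches, and idle between
# passes, so sustained throughput is a fraction of peak:
print(f"PS5 at an illustrative 50% utilization: {ps5 * 0.5:.2f} TFLOPS")
```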
 

N1tr0sOx1d3

Given another chance
Does it really anymore? These machines are using essentially standard PC architecture, nothing to learn like Cell or Emotion Engine. I'd say it's because the hardware guys are either talking peak efficiency, or benchmarks don't translate to the real world.
Not true. Sony is still releasing "quality" patches for the PS4 to this day, and with every firmware upgrade, quality increases further…
 

StreetsofBeige

Gold Member
The engineers statements are intentionally vague, they aren't lying per se about achieving 4k/ 120 fps with ray-tracing it's just that they are saying you probably can't achieve these things if you want complex game design with crazy polygon count, shaders, crowd AI, destructibility among many others.
I agree.

It's like car makers saying a car can go 200 km/h. Believe it or not, even low-end cars like Sentras and Civics can hit this (according to the specs). But nobody is going to drive at that speed, not even street racers. And even if someone did, the noise and handling would be atrocious. Hold on to that steering wheel.
 

M16

Member
Most games, I would say, are still running on engines built for last gen. Updating game engines to take advantage of new consoles takes time. Developers at launch, and in the first few years, are just occupied with shipping their games on time; performance and new hardware features are secondary.

This literally happens every generation
 

brian0057

Member
If nothing else, there's one thing I'll always praise Nintendo for: when they reveal a game, you basically get to see what the final product will look like.
For better or for worse, I can't remember the last time the Big N showed a game that didn't look the same when it finally launched.
 

Wooxsvan

Member
So many engines today are written to support a broad set of CPU/GPU/RAM combos, each with its own firmware and low-level APIs. In addition, you have middleware. All this adds layers of abstraction between the game code and the 1s and 0s the chips are actually calculating. The theoretical numbers will never actually be met.
 

*Nightwing

Member
It’s human nature, or at least marketing at work.

When selling something new, you show the most extreme positive outliers as selling points even though the average person can never achieve it with said product.

Like a sports car advertised to go 0-60 in 2.3 seconds. Sure, it's possible for that specific car, under controlled conditions, with a knowledgeable driver, to achieve that. But the same car with additional luxury packages (comfortable leather seats instead of low-weight racing ones, A/C, a hybrid automatic/manual transmission that adds weight, softer suspension for a more comfortable ride with less grip on the road) has no chance in hell of hitting the same targets.

Same with games: they sell the visuals as the best the console can do, without the overhead of the AI, physics, and all the other game mechanics not related to visuals, which all need to run and take processing power away from the highest level the visuals could otherwise achieve.

Add to that, as everyone has mentioned, the actual lack of talent, and voilà!
 

Fbh

Member
I still don't get how, at this point, people still expect $400 consoles to magically be able to run everything devs throw at them with maxed-out graphics at native 4K and 120fps. "Oh no, this $400 console didn't outperform a $1,500 PC GPU, how can this be?!"

Anyway, it's a mix of marketing being intentionally vague and people making stupid assumptions based on that vague information. When a company says their console supports resolutions of up to 4K, it means just that: the console is capable of outputting a 4K image. It doesn't mean every game will run at 4K, or that visual and/or performance sacrifices won't be needed to reach 4K.

I also think engineers often talk about the theoretical max output under absolutely ideal conditions. Sort of how some electric car manufacturers will quote unrealistic ranges because they are calculated under unrealistically ideal conditions (perfectly flat, dry road, no traffic, no wind, new tires, etc.).

If nothing else, there's one thing I'll always praise Nintendo for: when they reveal a game, you basically get to see what the final product will look like.
For better or for worse, I can't remember the last time the Big N showed a game that didn't look the same when it finally launched.

That's one of the advantages of Nintendo usually not announcing games years in advance. By the time they show stuff, it's usually coming out within a year (often less), at which point the visuals and technical aspects should be more finalized.
I'm not saying other publishers aren't intentionally misleading, but I think downgrades are a natural part of game development, and devs/publishers could avoid some outrage by simply not showing games so early.

Even Nintendo gets into bullshit territory when they start showing stuff too early:
 

Mr Moose

Member
PS4 and XBox One never said they were capable of 4K since they didn't even have 4K as a choose-able resolution.
8th Oct. 2013
He also confirmed that the Xbox One will not only support 4K resolution for entertainment media but also for gaming, but ultimately this will depend on the developers.

“Ye! We are really excited about that too. I am sure your audience knows that 4K is the next big thing. Xbox One is a home entertainment system that is built for future. It supports 4K gaming and entertainment. In fact we are shipping an HDMI cable that is 4K rated. So when you get your HDMI cable out of the box it is 4K rated. We are looking forward to bringing 4K capabilities to our consumers in the future, but it depends on the developers,” he added.
 

StreetsofBeige

Gold Member
I still don't get how, at this point, people still expect $400 consoles to magically be able to run everything devs throw at them with maxed-out graphics at native 4K and 120fps. "Oh no, this $400 console didn't outperform a $1,500 PC GPU, how can this be?!"
Because the average person sees some numbers stated and assumes the product can max out performance and quality at those specs at all times.

It's like someone saying, "Why doesn't my car do 10L of gas per 100 km like the brochure says? I'm getting 12L/100 km." Well, the car techs in the lab, rolling a car on an industrial-strength treadmill at a 100% smooth level with all options turned off, are getting 10L. You, me, and everyone else are driving in shitty weather on bumpy roads, and you've got to chauffeur around your fat-ass aunt and uncle every once in a while. So the fuel economy sinks. It makes sense. Heck, when they do fuel economy tests, I don't even think the lab guy sits in the car as a 200lb anchor.

Yet some people will still complain that real-world driving doesn't match perfect lab tests.
 
In my experience, unrealistic expectations like this are almost always the result of people who are fundamentally clueless about how computer hardware works getting caught up in exaggerated fanboy hype. The XSX spec sheet lists 8K resolution? Then Xbox first-party games must be running at 8K. Epic releases a tech demo on PS5 only? It must be because all other platforms, including PC, are too weak to handle it. There were posters on this very board arguing that the PS5 would run that demo better than a 3090, in a thread full of videos showing it running just fine on significantly weaker hardware.

There's just no helping some people.
 

Knightime_X

Member
You can definitely get 4K out of an Xbox One.
They just never mentioned how deep a cutback is required to achieve it.

So technically they're not wrong.
 

bargeparty

Member
I still don't get how, at this point, people still expect $400 consoles to magically be able to run everything devs throw at them with maxed-out graphics at native 4K and 120fps. "Oh no, this $400 console didn't outperform a $1,500 PC GPU, how can this be?!"

Anyway, it's a mix of marketing being intentionally vague and people making stupid assumptions based on that vague information. When a company says their console supports resolutions of up to 4K, it means just that: the console is capable of outputting a 4K image. It doesn't mean every game will run at 4K, or that visual and/or performance sacrifices won't be needed to reach 4K.

I also think engineers often talk about the theoretical max output under absolutely ideal conditions. Sort of how some electric car manufacturers will quote unrealistic ranges because they are calculated under unrealistically ideal conditions (perfectly flat, dry road, no traffic, no wind, new tires, etc.).



That's one of the advantages of Nintendo usually not announcing games years in advance. By the time they show stuff it's usually coming out within a year (usually less) at which point the visuals and technical aspects should be more finalized.
I'm not saying other publishers aren't intentionally misleading, but I think downgrades are a natural part of game development and devs/publishers could avoid some outrage by simply not showing games so early.

Even Nintendo gets into the bullshit territory when they start showing stuff too early:

There's nothing outlandish about that reveal; it's just so early in concept that it doesn't reflect the state or design of the world at release.
 

RoadHazard

Member
Nobody who knows anything has ever claimed these consoles would be able to do 4K120 with RT. That's completely unrealistic; I've certainly never heard Cerny claim anything like that. Not even the most expensive PC GPUs can manage it (except maybe in some very specific scenarios). So if you believed that would happen, that's really on you.

I expected 4K30 and 1440p60 WITHOUT RT (maybe 1440p30 with it), so I've been positively surprised in a few cases.
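The frame-time budget makes the point concrete: whatever the target framerate, simulation, rendering, and any ray tracing must all fit inside a single frame's slice of time. A quick sketch:

```python
def frame_budget_ms(fps):
    """Milliseconds available to simulate and render one frame."""
    return 1000.0 / fps

for fps in (30, 60, 120):
    print(f"{fps}fps -> {frame_budget_ms(fps):.2f} ms per frame")
```

A ray-traced lighting pass that costs a few milliseconds on console-class GPUs (an illustrative figure, not a measurement) fits comfortably inside a 33.3 ms (30fps) budget but consumes most of an 8.3 ms (120fps) one.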
 
Does it really anymore? These machines are using essentially standard PC architecture, nothing to learn like Cell or Emotion Engine. I'd say it's because the hardware guys are either talking peak efficiency, or benchmarks don't translate to the real world.
Judging by how games look near launch compared to late-gen games, yes.
 

Hari Seldon

Gold Member
Because they all use engines, and the engines need to be optimized for specific hardware. No one codes to the metal anymore.
 

MHubert

Member
We all heard about the greatness of the XSX and the PS5, and while it is still early, we can all at this point see the sacrifices that developers have to make to achieve 120fps or to get ray tracing in a game. I even remember hearing that the Xbox One and the PS4 were capable of doing 4K, and we saw how that went. The same can be said for PC graphics cards, as some people spend hundreds on them and then complain about the performance. Are the hardware engineers protected because they can say "we never said it could do all those things at once," or are the software developers "lazy," like I often read on here? It's shameful that here we are, just entering year 2 of this gen, and I am hearing that there are things these systems will not be able to do until the rumored Pro systems come out. So who is at fault for the gap? Call me stupid, but I was expecting games to do 4K 120fps with ray tracing, and from my understanding that goal is not achievable even on PC. I am even starting to question the overall importance of ray tracing in games, given the performance hit that games have to take to implement it.
Okay, but no one said that every game would run at 4K 120Hz with RT on these machines, so I don't get why you would expect that.
 

Lethal01

Member
Are the hardware engineers protected because they can say "we never said that it could do all those things at once"
Totally, and it's not a case of them having an "out" to escape what they promised or anything; it's just common sense that they were never saying all these things were possible at the same time. This isn't even "vague marketing's" fault. I'm sure there are times when marketing has been shitty, but this isn't one of them. That was never something anyone was trying to hint at, and you had no reason to ever expect it.

Call me stupid but I was expecting games to do 4k 120fps with ray tracing but from my understanding that goal is not achievable even on pc.
Sorry bro, but we can all be a little stupid sometimes; that one is 1000% on you.

If someone tells you a card can do 4K, can do ray tracing, and can go beyond 120fps, it's just common sense that they aren't saying it can do all those things at once. More fps ALWAYS means cutbacks somewhere else, and the same goes for resolution, which is why, unless devs intentionally hold back graphics, you will probably keep having to make cutbacks to get there in modern games for the next 20 years. Even in 2040 you may be able to play these games at 8K 240fps, or you could play at 1440p 30fps and have something indistinguishable from reality.
 

SlimySnake

Member
Call me stupid but I was expecting games to do 4k 120fps with ray tracing but from my understanding that goal is not achievable even on pc.
Why?

We had ray-tracing GPUs in the PC space for 2.5 years before the consoles ever came out. We knew the sacrifices everyone had to make to get ray tracing from day one. Every YouTuber playing games at 120-165fps was running them at shit resolutions and really low settings to hit that framerate, even on fancy $1,000 GPUs. I had to run Control at 960p to get it to a stable framerate on the second-best GPU available in 2019. A $700 GPU in a $1,500 PC bought me ray-traced debris and reflections at 960p, DLSS'd to 1440p.

This was before the launch of the consoles, which had their specs revealed almost a year in advance; we knew exactly how powerful they were. Anyone who knew anything about video games knew 4K 120fps with ray tracing was impossible. Except for you, of course, so this thread is basically about your ignorance and, quite frankly, your insane beliefs.
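The 960p figure above lines up with DLSS 2's published scaling ratios: Quality mode renders internally at two-thirds of the output resolution per axis. A sketch (the per-axis factors are NVIDIA's standard DLSS 2 mode ratios):

```python
# Internal render resolution for DLSS 2 output modes.
# Per-axis scale factors are NVIDIA's standard mode ratios.
DLSS_MODES = {
    "Quality":     2 / 3,   # ~66.7% per axis
    "Balanced":    0.58,
    "Performance": 1 / 2,
}

def internal_resolution(out_w, out_h, mode):
    scale = DLSS_MODES[mode]
    return round(out_w * scale), round(out_h * scale)

# 1440p output in Quality mode is rendered at roughly 1707x960,
# i.e. the "960p DLSS'd to 1440p" case described above.
print(internal_resolution(2560, 1440, "Quality"))
```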
 

Dr Bass

Gold Member
The problem is, software programmers these days are coddled and don't know how to do any Assembly.
What are you basing that on? I know how to write assembly. I've written small programming languages that compile to machine code. I guarantee you engine programmers (the ones that matter in game dev) know how to work with assembly; it's basically a requirement. If you're talking about web front-end programmers, fine, that is surely true, but where does the "coddled" idea come from? Programmers tend to work really hard, because there aren't enough of us to go around.

I also find the idea of average gamers calling programmers lazy a hilarious concept.
 
Consoles are never maxed out. You could always, in theory, do more as the tools develop. But in theory even Communism works. In theory.
In reality, games are made on a budget and with a deadline. That's why.
 

RoadHazard

Member
I don't know. Why are you asking me this?

You are claiming people have lied for marketing, presumably in relation to what this thread is about? Or were you just making a general statement with no relevance to the thread you made it in?
 

cireza

Member
You are claiming people have lied for marketing, presumably in relation to what this thread is about? Or were you just making a general statement with no relevance to the thread you made it in?
I was making a general statement that is most certainly relevant, as it has been verified numerous times in the past, in other instances than the one specified here.

The title of the thread takes a shortcut, implying it happens all the time and not only in this specific case.
 