
Was the switch to Unreal Engine 5 a mistake?

It'll be interesting if a lot of games switch to UE5 and it's the PS5 Pro that ends up resolving a lot of the issues.
People will feel like the Pro isn't optional anymore, and Xbox will be pressured to do a Pro of its own, if they aren't already.
 
No. DLSS, FSR and XeSS were a mistake.
Because they allowed devs to shit on optimising their games and just say "our game is optimised with FSR/DLSS/XeSS in mind."
So instead of them being used as a great bonus for framerate, they are used as a necessity for good performance.
Weird opinion. ANY optimisation has the goal of increasing framerate and/or visual effects while reducing the necessary hardware budget.

Any better effect, native resolution, less/no pop-in, higher-LOD models, bigger draw distance, more bounces for RT, etc.: everything comes at a cost. The whole point of optimisation is to cut back on "perfect" stuff in a way that doesn't look much worse but runs significantly better. That's the very reason those upscalers exist. Sure, you can fantasize about them only being used to push an already perfectly fine running game beyond what average people desire, but like any optimisation they get used not as a premium render path but as a core element of the renderer, to achieve better efficiency. Any dev not using at least one of those techs has, imho, just odd priorities. Any "trick" that gives you major performance jumps should be used. That's why different anti-aliasing solutions were built over the years (some we kept even though hardware can now handle the more costly versions, kind of like we now have three upscaling solutions from the three card makers), plus the checkerboarding and temporal reprojection techniques that are or were used in some games. There are a ton of tricks that don't get discussed and paraded as much, but they certainly exist. Didn't some devs explain that "native" hardly means anything anymore, because hardly anything is truly native today anyway?
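To put rough numbers on that trade-off, here's a minimal C++ sketch of the shading cost at a 4K output. The per-axis scale factors are the commonly published DLSS/FSR quality-mode ratios, so treat the exact values as approximate.

Code:
// Rough cost math for upscalers-as-optimisation: how many pixels actually
// get shaded at common internal resolutions for a 4K output.
#include <cstdio>

int main() {
    const double outW = 3840.0, outH = 2160.0;  // 4K output target
    struct Mode { const char* name; double axisScale; };
    const Mode modes[] = {
        {"Native",      1.000},
        {"Quality",     0.667},  // ~1440p internal for a 4K output
        {"Balanced",    0.580},
        {"Performance", 0.500},  // ~1080p internal
    };
    for (const Mode& m : modes) {
        const double w = outW * m.axisScale, h = outH * m.axisScale;
        std::printf("%-12s %4.0f x %4.0f -> %5.1f%% of native shading work\n",
                    m.name, w, h, (w * h) / (outW * outH) * 100.0);
    }
    return 0;
}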
 

Herr Edgy

Member
People misunderstanding game development, what else is new.

One - UE5 is mostly UE4; new branding (logo & editor look) and a bunch of new systems on top. Not that different from if we had jumped from UE4.27 to UE4.31. There is no 'switch' to UE5 in that sense. It's not a new engine that people don't know how to use.
Two - An engine does not dictate game look. It dictates tools and options; and even then, as it's sort-of open source, developers can implement whatever they are missing, or change whatever they don't like, if they have the resources.

That is not to say Unreal Engine is perfect; it's an iterative product. Lumen and Nanite, while impressive in 5.0, are continuously being improved upon, just like most other parts of the engine.

There are base optimizations to rendering tech with each release, but in the end none of these optimizations matter if a developer doesn't know how to make use of the graphics pipeline. There is a performance budget any game has. The question here is twofold:
1) Is the budget reasonable in the first place? (If only high-end PCs are the target, then the target is accomplished even if the game is 'unoptimized' for consoles.)
2) Can the budget be achieved? This is a 'skill issue'. Does the developer have the skill and resources to achieve the target frame rate without detracting from the visual & gameplay experience too much, and if so, in what way?

Again, devs know how UE5 works; they have used UE4. Skill levels will vary, of course. They might not know how to incorporate the new tech that has been added since then in a way that works for their projects, but if so, that would have been the same for UE4 projects. You have tons of rendering options in the form of general graphics techniques, and a bunch of console variables on top that change rendering behavior.

The knowledge to make good use of that is usually limited to graphics programmers and advanced technical artists. If you don't have them on your project, it's guesswork as to how the different rendering techniques interact and influence each other, where it's easy to make gains towards the perf budget, and where it isn't.
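For anyone curious what that console-variable layer looks like from code, here's a minimal sketch against Unreal's IConsoleManager API. It's illustrative only: it compiles only inside a UE module, and cvar names and semantics shift between engine versions.

Code:
// Hedged sketch of driving rendering behavior through a console variable.
#include "HAL/IConsoleManager.h"

void ApplyRenderBudgetTweaks()
{
    // r.ScreenPercentage: internal render resolution as a % of output res.
    if (IConsoleVariable* ScreenPct =
            IConsoleManager::Get().FindConsoleVariable(TEXT("r.ScreenPercentage")))
    {
        // Shade ~56% of the pixels and let the upscaler fill in the rest.
        ScreenPct->Set(75.0f);
    }
}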
 

SJRB

Gold Member
Unless you are an actual videogame software engineer and have actual data on why these games are not performing well, this discussion is ridiculous.

 
I'm disappointed with Lords of the Fallen on PS5. Image quality is doo doo and the visuals are in a constant state of bugging out; it's one of the least stable-looking games I've played. There are some little flickers of promise, such as the particle effects and some distant detail that I suppose is benefitting from Nanite, but yeah. As someone else said, either Unreal 5 is a disappointment or the consoles are, and it's pretty clear to me it's more a matter of the consoles being too weak to really handle it (in the hands of the average dev today who lacks technical chops).

I think it won't be until Hellblade 2, or more likely Gears 6, that a dev with real skill comes along and makes it work on these sort of limited consoles
 
I work with it day to day.

It was massively overhyped and not ready for prime time. It is not an easy process to port over to it, and attempting to use the same features as 4.27 results in worse performance, even using Nanite and the other new optimisations.
Towards the end of this gen it might be ready.

The features are legitimately forward-thinking, but Lumen still breaks in various conditions and there's fuck-all you can do to fix it except let performance drop to 30fps on a 4090 or turn Lumen off. There's lots I could go into, but attempting to use it for production work has been two steps forward and one step back with every point release.

That explains why Lords of the Fallen looks the way it does on my PS5. The visuals are in a state of flux lol. Lumen is def not working properly when you get shadow smears across the screen when a character moves, as well as textures and light flickering into and out of existence
 

Loomy

Thinks Microaggressions are Real
Good thing nobody's talking about art style in this thread then. We're talking about the tech and performance, which hasn't been good, at least not on console.

This is from the OP
So far, I have hardly liked any Unreal Engine 5 games. They have very high hardware requirements, barely run on mid-range PCs, and don't run exactly great on the consoles. They also look just OK visually. The Matrix demo was impressive and Fortnite in UE5 looks good too, but all the rest so far has been more than disappointing, at least for me. Did UE5 impress you?
UE5 has nothing to do with everyone trying to make horror games or dark fantasy souls like games. The games look the way they do because the art director at those studios wanted them to look the way they do.

UE4 had a huge variety of games. UE5 will eventually too.
 

Neilg

Member
People misunderstanding game development, what else is new.

One - UE5 is mostly UE4; new branding (logo & editor look) and a bunch of new systems on top. Not that different from if we had jumped from UE4.27 to UE4.31. There is no 'switch' to UE5 in that sense. It's not a new engine that people don't know how to use.

lol that's not true. it's a huge change in a lot of areas. Not as big as switching to an entirely different engine mid-development, but it's nothing like going from any version of 4 to another version of 4.
 

kruis

Exposing the sinister cartel of retailers who allow companies to pay for advertising space.
It's not an engine problem. Most of the games you're talking about didn't start development on UE5; they updated halfway through to take advantage of some of the features, but lacked the experience to optimize them for the hardware.

I remember UE4 being criticized for being "too heavy" for the PS4/XBO generation, similar to what we're seeing today. By the end of the generation no one was thinking UE4 was a mistake or too much for the consoles

The earliest versions of UE4 probably were too heavy and too buggy for devs to work with. Devs did have to learn how to get the best out of UE4, but at the same time Epic's engine developers were working hard to fix bugs, improve performance and add new features. UE4 was released in 2014; the latest version (4.27) was released in 2021.
 

damidu

Member
Hoping for some Coalition magic to turn the UE5 narrative around.
Currently it feels like an engine that's not great at scaling down to current gen.
 

Spyxos

Gold Member
Hasn't it only been like 3-4 games overall? None of them really big, massive developers who know how to squeeze the most out of UE, besides Fortnite.

Once we see what The Coalition, for example, does with the tech, it will really be telling whether the engine is up to the task
Lords of the Fallen
Fort Solis
Immortals of Aveum
Remnant 2
Robocop Demo
Fortnite
 

Fbh

Member
This gen? yes.
It's the diminishing returns engine:
"Would you like games with the same tired gameplay and marginally better graphics at the cost of massively increased system requirements on PC and/or having to drop down to 720p on consoles to achieve 60fps? We've got the engine for you!!!".

Maybe the PS6 and Nextbox will be able to properly take advantage of it. Though even then, I'll probably always wonder what devs could achieve with the same hardware on a less demanding engine.
But for this gen every time I hear a game is using UE5 my hype and excitement instantly diminish.
 

FoxMcChief

Gold Member
No, but devs need more time to implement shit and make it work, time that corpo won't give them.

Also, ppl need to stop bringing the Matrix up as a metric of quality. That wasn't a game, it was just a tech demo.
Pretty much this. Fortnite is an unfair comparison as well.
 

Go_Ly_Dow

Member
Customised and updated UE4 seems the way to go for current-gen console titles.

UE5 for next-gen console titles.

However, with cross-gen, I'd guess it'll take 2-3 years into next gen until we see games properly using its full feature set.
 

Herr Edgy

Member
lol that's not true. it's a huge change in a lot of areas. Not as big as switching to an entirely different engine mid-development, but it's nothing like going from any version of 4 to another version of 4.
My guy, I work for Epic Games. Most of the engine is the same as before, functionally.
Most of the changes when you look at the UE4 vs. UE5 editor are cosmetic only. Core workflows are similar or even identical to before.
New tech that is creeping in is added on top, not replacing old systems, for the most part.

EDIT:
In fact, upgrading from 4.27 to 5.0 is the exact same experience as going from 4.26 to 4.27.
Right-click your .uproject file, "Switch Engine Version", select your version, adapt to small API changes & rebuild the thing (if it's a C++ project), start it up, and hope that all assets migrated cleanly, which they usually do.
 

Neilg

Member
My guy, I work for Epic Games. Most of the engine is the same as before, functionally.
Most of the changes when you look at the UE4 vs. UE5 editor are cosmetic only. Core workflows are similar or even identical to before.
New tech that is creeping in is added on top, not replacing old systems, for the most part.

EDIT:
In fact, upgrading from 4.27 to 5.0 is the exact same experience as going from 4.26 to 4.27.
Right-click your .uproject file, "Switch Engine Version", select your version, adapt to small API changes & rebuild the thing (if it's a C++ project), start it up, and hope that all assets migrated cleanly, which they usually do.

That's not been our experience at all, and from what I gather, there are others like us who have not had this breezy one-click move over. I spoke to a few people at Unreal Fest about this, including Epic employees. I'm sure for many simple things it is that easy, but lots of people aren't using the engine the exact way Epic has designed it to be used around the needs of the Fortnite team.

Even keeping the setup identical to our 4.27 scenes, not using Lumen, sticking with baked light, the quality of the bake dropped and performance got worse in 5.1.
We just tested 5.3 for another project and VR still doesn't work as well as in 4.27, although it's much closer than it's been and should be resolved in 5.4. Our current project is a raytraced VR scene, so it's not a standard end result, but our last project relied heavily on tessellation for its look. There are a lot of features that got left behind because they weren't a priority.
 

Herr Edgy

Member
That's not been our experience at all, and from what I gather, there are others like us who have not had this breezy one-click move over. I spoke to a few people at Unreal Fest about this, including Epic employees. I'm sure for many simple things it is that easy, but lots of people aren't using the engine the exact way Epic has designed it to be used around the needs of the Fortnite team.

Even keeping the setup identical to our 4.27 scenes, not using Lumen, sticking with baked light, the quality of the bake dropped and performance got worse in 5.1.
We just tested 5.3 for another project and VR still doesn't work as well as in 4.27, although it's much closer than it's been and should be resolved in 5.4. Our current project is a raytraced VR scene, so it's not a standard end result, but our last project relied heavily on tessellation for its look. There are a lot of features that got left behind because they weren't a priority.
Yeah, I wasn't saying the 5.0 version upgrade has to have no side effects. But previous UE4 version upgrades could have them as well.
Migrating from 4.23 to 4.24, for example, didn't work smoothly for any C++ project, as the build file architecture changed.
Point being: rendering changed and there were regressions, yes. I'm not saying 5.0 (or UE5 in general, for that matter) is in a perfect state; I stated that before.
But it's still fundamentally the same engine and a direct upgrade from 4.27. The rebranding and the combined effort focused on differentiating the engine more than a typical UE4 version release would, which is why I compared it to a theoretical jump from 4.27 to 4.31.

If the only fallout is perf regressions due to rendering tech changes, that is what I'd call a smooth migration, as it has nothing to do with the engine-version upgrade itself.
 

MLSabre

Member
I'll say I'd rather have a UE4 game that looks and runs great than a UE5 one that chugs along and barely lets me enjoy the game.
"The last of the old is better than the first of the new"

An old engineering saying: products that have gone through several iterations (updates, in this context) are better than fancy new products that haven't gone through the rigors of practical usage.

It'll take some time before UE5 becomes generally accepted (or rejected outright, if things go south).
 

Neilg

Member
Yeah, I wasn't saying the 5.0 version upgrade has to have no side effects. But previous UE4 version upgrades could have them as well.
Migrating from 4.23 to 4.24, for example, didn't work smoothly for any C++ project, as the build file architecture changed.
Point being: rendering changed and there were regressions, yes. I'm not saying 5.0 (or UE5 in general, for that matter) is in a perfect state; I stated that before.
But it's still fundamentally the same engine and a direct upgrade from 4.27. The rebranding and the combined effort focused on differentiating the engine more than a typical UE4 version release would, which is why I compared it to a theoretical jump from 4.27 to 4.31.

If the only fallout is perf regressions due to rendering tech changes, that is what I'd call a smooth migration, as it has nothing to do with the engine-version upgrade itself.

Fair enough, I didn't realize how much of a pain it was to move between some versions of 4. My experience there was significantly breezier, but I don't believe I went through 4.23 to 4.24 mid-project.
 
In capable hands we've seen it provide great results (Fortnite), so it's too early to judge based on just a few UE5 releases coming mostly from smaller studios. I'll wait for whatever The Coalition puts out before saying if UE5 was a mistake or not.
 
Honestly, the engine has it all to prove so far this gen. Initially it was hyped as the magic-pill engine for this gen, one that would make developing games much easier and more streamlined while delivering amazing visuals with immense flexibility. We all bought into the hype, even developers. In practice, it has delivered underwhelming visuals and abysmal performance that chokes the consoles and even PCs. Ironically, the only ones who seem able to use it well are Epic themselves, and that's not a good look; the output is a far cry from the promised land we were sold.

Theoretically, it should be able to deliver amazing results on the consoles, but... well, we are waiting.

You know, reading this comment actually made me think of a cheeky question: is UE5 Epic's Sega Saturn?

That system was notoriously difficult to program for, but I feel a lot of that was due to the lack of highly optimized and dev-friendly APIs arriving in a timely fashion. Epic have the money, and technically they still have time on their side too. And I don't necessarily see any other 3P engine coming along that would usurp Unreal as the de facto standard. Maybe Unity becomes more popular? I doubt it.

But it's not like UE has to be usurped by any one singular engine to "miss its moment", so to speak. It could just be a bunch of other engines becoming preferred picks among studios, including proprietary engines. If UE5 doesn't see major adoption for this generation of consoles that would suck, but perhaps its time to shine is the back half of this gen and once 10th-gen systems are ready to release.
 

SABRE220

Member
You know, reading this comment actually made me think of a cheeky question: is UE5 Epic's Sega Saturn?

That system was notoriously difficult to program for, but I feel a lot of that was due to the lack of highly optimized and dev-friendly APIs arriving in a timely fashion. Epic have the money, and technically they still have time on their side too. And I don't necessarily see any other 3P engine coming along that would usurp Unreal as the de facto standard. Maybe Unity becomes more popular? I doubt it.

But it's not like UE has to be usurped by any one singular engine to "miss its moment", so to speak. It could just be a bunch of other engines becoming preferred picks among studios, including proprietary engines. If UE5 doesn't see major adoption for this generation of consoles that would suck, but perhaps its time to shine is the back half of this gen and once 10th-gen systems are ready to release.
Naw, there's no real alternative in the industry currently. CryEngine has been basically dead since last gen, Unity is committing suicide, and funnily enough devs would just choose UE4 and customize it if they didn't like UE5. Epic have a vast grip on the market, and devs don't want to go through the hassle of moving to a completely new platform, especially with Epic's robust documentation and support.

UE5 has it all to prove, yes, but the technology itself is impressive; they just need to make it practical and efficient enough to deliver on console-spec hardware. Otherwise this gen might just skip the engine and the next-gen consoles will be the right fit for it, but a lot can change in 5-6 years.
 
I'll wait before judging. I'm an avid advocate for proprietary in-house engines that fit the studio's needs. Look how many great engines we have: RE Engine (all Capcom games since RE7, excluding MH Rise), Snowdrop (The Division's engine, still used by Massive Entertainment in recently announced titles like Star Wars Outlaws), Decima (Guerrilla Games, and Death Stranding 1 and 2), Square's unnamed engine used for FF XVI (believed to be a heavily modified Crystal Tools), the other Sony studios' in-house engines (they share technology with each other, which bolsters their innovation and production pipelines), EA's Frostbite (showing its age but still a great engine for some impressive graphical showcases), and CryEngine, which has some incredible graphical potential as well.

Unreal is just one in a myriad of game engines. It has impressive stuff that many developers still aren't using or aren't familiar with, and that's why we haven't gotten any mind-blowing UE5 games just yet. I don't blame any developer packing their stuff to move to UE5, but it does feel like we lose more than we gain with a move like that. There isn't one engine that single-handedly does everything and sits on top of all the others when it comes to games. So no, I don't believe UE5 was a mistake, but I don't think it is the be-all-end-all engine some preach around here.
 
The engine is very forward-thinking. It will be very good in the coming years, although perhaps not so much in 2023, when the ability to run your games at 90+ FPS is for some reason regarded as the height of video gaming.

Silly people don't realize that video games have yet to achieve their true ultimate final form of ~240 fps with the PS5 Pro. Of course, 240 fps is just a stepping stone towards 500 fps, which in turn will only lay the foundation for 20,000 fps, but they have to start somewhere...
Once upon a time I played console games at 30 fps and 60 fps for years.

60 was great whenever achievable and always preferable.

Now I have a 4090 I play at 120fps on a LG C1.

Guess what?

60 fps is noticeably worse than 120. Noticeably. In all areas of experience.

If 240 is that much better than 120, tell me why I should go back and pretend it's all meaningless?
 

Raonak

Banned
If you're gonna switch to UE5, you gotta make sure you have the developer resources to understand and optimise it properly.
 

CGNoire

Member
That explains why Lords of the Fallen looks the way it does on my PS5. The visuals are in a state of flux lol. Lumen is def not working properly when you get shadow smears across the screen when a character moves, as well as textures and light flickering into and out of existence
Shadow smears, or lightmap smearing, is what I was afraid of. It's not a bug but temporally unresolved lighting, indicating that the system running Lumen isn't up to the task.
 

BbMajor7th

Member
I'm not impressed so far, but I'm not a big visuals whore. Fast loading, steady performance and excellent gameplay are what I'd like from this generation.
 

magnumpy

Member
Once upon a time I played console games at 30 fps and 60 fps for years.

60 was great whenever achievable and always preferable.

Now I have a 4090 I play at 120fps on a LG C1.

Guess what?

60 fps is noticeably worse than 120. Noticeably. In all areas of experience.

If 240 is that much better than 120, tell me why I should go back and pretend it's all meaningless?
That's what I'm talking about, but set your sights higher. Forget this ~hundreds of FPS nonsense and realize we could easily get into the low thousands with today's machines. Of course, that is just a stepping stone on the way to millions and ultimately billions of FPS. true gaming ♥
 

KXVXII9X

Member
Hi-Fi Rush says hi.

How you doin?




I know it's using UE4, but the point is how versatile the engine is. It's just a tool; when used correctly you can achieve any result.
I was just thinking about Hi-Fi Rush reading this. UE4 has been used for many stylized games: Sifu, Stray, Dragon Quest XI, and Pikmin 4, for example. I wish there were more stylized UE5 games/projects. Fortnite looks great.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
No. DLSS, FSR and XeSS were a mistake.
Because they allowed devs to shit on optimising their games and just say "our game is optimised with FSR/DLSS/XeSS in mind."
So instead of them being used as a great bonus for framerate, they are used as a necessity for good performance.

Native resolution is for fools.
Even when checkerboarding was first introduced, that should have been a sign that native rendering wasn't going to be a thing much longer.

I have no idea how you can say devs are shit at optimising their games when utilizing upscaling is a form of optimization, cuz you don't actually need to render everything at native res when the difference will be barely noticeable.
Devs have been using different resolutions for different buffers forever.

If a dev runs post-processing after the upscaling pass, games look just as good as native, if not better in the case of DLSS.

But a reduction in overall graphical presentation in order to hit native res is very noticeable.

Hell, there are so few PS4 Pro and PS5 games that run at native 4K that I'm honestly shocked anyone wants native 4K at all when we have XeSS and DLSS (FSR can join the party too, but I ain't using it).
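For the curious, the checkerboarding mentioned above is the same trade in spatial form. A toy C++ sketch of the reconstruction step, with hypothetical types; real implementations also reproject the history buffer with motion vectors, which this skips.

Code:
// Toy checkerboard resolve: each frame shades only half the pixels in an
// alternating checker pattern; the holes are filled from the previous
// frame's result, so both halves get refreshed over any two frames.
#include <vector>

struct Frame {
    int w = 0, h = 0;
    std::vector<float> pix;  // one value per pixel; unshaded holes included
};

// parity flips 0/1 every frame to select which half was shaded.
void CheckerboardResolve(const Frame& current, const Frame& history,
                         int parity, Frame& out)
{
    out = current;
    for (int y = 0; y < out.h; ++y)
        for (int x = 0; x < out.w; ++x)
            if (((x + y) & 1) != parity)  // this pixel wasn't shaded this frame
                out.pix[y * out.w + x] = history.pix[y * out.w + x];
}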
 

MidGenRefresh

*Refreshes biennially
Most impressive games this gen:

Cyberpunk 2077: Phantom Liberty (path tracing)
Alan Wake II (looks like a pre-rendered cinematic)
Horizon Forbidden West (best visuals on console ever)
Spider-Man 2 (neck-breaking traversal speed and little to no pop-in)
Microsoft Flight Simulator (how is this even possible? I don't know)

And you can go on and on. Unreal Engine 5 is only good at marketing. They have the brand and nothing else to show.
 

MidGenRefresh

*Refreshes biennially
No. DLSS, FSR and XeSS were a mistake.
Because they allowed devs to shit on optimising their games and just say "our game is optimised with FSR/DLSS/XeSS in mind."
So instead of them being used as a great bonus for framerate, they are used as a necessity for good performance.

DLSS was a mistake because it's so amazing that other companies had to come up with half-assed solutions of their own? Lol.
 

Dice

Pokémon Parentage Conspiracy Theorist
Devs are just starting to get a feel for it. When they learn how to do things like use path-traced baked lighting for the majority of a scene and Lumen on dynamic objects, for a balance of performance and looks, combined with Nanite, you'll see how genius the engine is. It's just so advanced, with so many innovative solutions, that it's beyond the conventional training devs have had. Let them get used to it and figure out how to maximize results.
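As a rough illustration of that split (hypothetical types and functions, not any actual UE API), the idea is simply that only movable objects pay the realtime GI cost each frame:

Code:
// Hypothetical sketch of a hybrid lighting split: static geometry reads
// precomputed (e.g. path-traced, baked) lighting; only movable objects
// pay for realtime GI (Lumen, in UE5's case) every frame.
struct Object   { bool isStatic; };
struct Radiance { float r, g, b; };

Radiance SampleBakedLightmap(const Object&) { return {0.20f, 0.20f, 0.22f}; } // stand-in
Radiance EvaluateRealtimeGI(const Object&)  { return {0.25f, 0.22f, 0.20f}; } // stand-in

Radiance IndirectLight(const Object& obj)
{
    // The majority of the scene is static, so the expensive realtime path
    // only runs for the handful of dynamic objects.
    return obj.isStatic ? SampleBakedLightmap(obj) : EvaluateRealtimeGI(obj);
}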
 

yamaci17

Member
No. DLSS, FSR and XeSS were a mistake.
Because they allowed devs to shit on optimising their games and just say "our game is optimised with FSR/DLSS/XeSS in mind."
So instead of them being used as a great bonus for framerate, they are used as a necessity for good performance.
They aren't a mistake, and this is a big misconception: games never ran at native before these either. Native is a lie. It is as fake as XeSS, DLSS and FSR are.

From 2016 onwards, most games have relied on "temporal accumulation". They use "temporal anti-aliasing". This is the critical part. Developers rely on temporal accumulation to UNDERSAMPLE effects, such as the resolution trees are rendered at, or the resolution shadows are rendered at. Hey, you can be at full native 1440p SHADER resolution, and Digital Foundry duly counts 1440p, but oh, you render your shadows at 1/8 of the screen resolution! And they only look "coherent" with temporal accumulation. Oh, your trees are at 1/64 of the screen resolution, and they look MUDDY and shit (like what happens with RDR2 if you play it at 1440p/1080p). But native is native, right?

NATIVE was NEVER native in modern games that already heavily RELIED on temporal accumulation. The only thing that is native about them is the "shader" resolution, which is what pops up if you count pixels. If you look at things more closely, you will see that more than half the effects are rendered at 1/2 resolution or lower and made coherent and full with temporal accumulation. And guess what happens then? What happens when you "temporally" accumulate 1/2 resolution worth of shadows into full resolution? You guessed it: they appear smudgy, blurry and garbage unless you play at "4K". This is why most modern TAA games suck at 1080p and 1440p "NATIVE": so many effects are rendered at poor, low resolutions.

This is why I chuckle whenever I see someone proudly saying "I play at native!", as if native has any meaning whatsoever with modern implementations of temporal accumulation. Yeah, good luck playing RDR2 at so-called "native 1080p". The game clearly looks like 540p THE SECOND you move your camera. Trees are rendered at abnormally low resolutions, foliage also. Quite literally the entire game is in shambles once you disable TAA; almost half of the game is never rendered. THE GAME LITERALLY RENDERS 540p worth of pixels all the time, but just tells you that it has a 1080p shader resolution. THAT's it.

The SECOND you move the camera, the illusion breaks, temporal accumulation breaks, and everything appears as it really is: 540p. The game literally loses 50% worth of resolution once you move the camera. YET you will be proud. You will be proud that you played at the MYTHIC NATIVE RESOLUTION. Oh, SO NATIVE. Looks SO GOOD!

I've had enough of this bullshit.
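For what it's worth, the mechanic behind all of this reduces to a very simple model. A toy single-pixel C++ sketch (made-up numbers; real TAA blends per pixel, with reprojection and history clamping):

Code:
// Each frame blends a fraction of new (undersampled, jittered) samples
// into a running history. Hold still and detail converges over many
// frames; reject the history (fast motion, a cut) and you are back to the
// raw undersampled input described above.
#include <cstdio>

int main() {
    const double target = 1.0;  // the fully resolved value for this pixel
    const double alpha  = 0.1;  // per-frame blend weight into the history
    double history = 0.0;       // starts at the raw undersampled estimate

    for (int frame = 1; frame <= 30; ++frame) {
        history += alpha * (target - history);  // exponential accumulation
        if (frame % 10 == 0)
            std::printf("frame %2d: %3.0f%% resolved\n", frame, history * 100.0);
    }

    history = 0.0;  // history rejected: camera moved, reprojection failed
    std::printf("after rejection: %3.0f%% resolved again\n", history * 100.0);
    return 0;
}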
 

Panajev2001a

GAF's Pleasant Genius
It's not an engine problem. Most of the games you're talking about didn't start development on UE5; they updated halfway through to take advantage of some of the features, but lacked the experience to optimize them for the hardware.

I remember UE4 being criticized for being "too heavy" for the PS4/XBO generation, similar to what we're seeing today. By the end of the generation no one was thinking UE4 was a mistake or too much for the consoles
Agreed, but some people also get burned because they think in-house is always more wasteful and that switching will save tons of money, as if it makes the effort zero or something magical (it's typically some bean-counter level making these kinds of calls, more than the in-the-trenches devs themselves).
 

Panajev2001a

GAF's Pleasant Genius
They aren't a mistake, and this is a big misconception: games never ran at native before these either. Native is a lie. It is as fake as XeSS, DLSS and FSR are.

From 2016 onwards, most games have relied on "temporal accumulation". They use "temporal anti-aliasing". This is the critical part. Developers rely on temporal accumulation to UNDERSAMPLE effects, such as the resolution trees are rendered at, or the resolution shadows are rendered at. Hey, you can be at full native 1440p SHADER resolution, and Digital Foundry duly counts 1440p, but oh, you render your shadows at 1/8 of the screen resolution! And they only look "coherent" with temporal accumulation. Oh, your trees are at 1/64 of the screen resolution, and they look MUDDY and shit (like what happens with RDR2 if you play it at 1440p/1080p). But native is native, right?

NATIVE was NEVER native in modern games that already heavily RELIED on temporal accumulation. The only thing that is native about them is the "shader" resolution, which is what pops up if you count pixels. If you look at things more closely, you will see that more than half the effects are rendered at 1/2 resolution or lower and made coherent and full with temporal accumulation. And guess what happens then? What happens when you "temporally" accumulate 1/2 resolution worth of shadows into full resolution? You guessed it: they appear smudgy, blurry and garbage unless you play at "4K". This is why most modern TAA games suck at 1080p and 1440p "NATIVE": so many effects are rendered at poor, low resolutions.

This is why I chuckle whenever I see someone proudly saying "I play at native!", as if native has any meaning whatsoever with modern implementations of temporal accumulation. Yeah, good luck playing RDR2 at so-called "native 1080p". The game clearly looks like 540p THE SECOND you move your camera. Trees are rendered at abnormally low resolutions, foliage also. Quite literally the entire game is in shambles once you disable TAA; almost half of the game is never rendered. THE GAME LITERALLY RENDERS 540p worth of pixels all the time, but just tells you that it has a 1080p shader resolution. THAT's it.

The SECOND you move the camera, the illusion breaks, temporal accumulation breaks, and everything appears as it really is: 540p. The game literally loses 50% worth of resolution once you move the camera. YET you will be proud. You will be proud that you played at the MYTHIC NATIVE RESOLUTION. Oh, SO NATIVE. Looks SO GOOD!

I've had enough of this bullshit.
Funny that we are talking about this in a UE5 thread, because ghosting and other glitches due to all these techniques coming together were actually quite evident in The Matrix demo (just focus on the main character's head and rotate the camera, for example).

Still, it is a somewhat fair point, as games have been composited out of layers of blended buffers, each at various resolutions, for a long while. It is still worth calling out when more and more resolution gets dropped in the hope that these algorithms pick up the slack (there is a limit).
 

muno

Neo Member


Opinions like this suck. People don't understand how much work it is to train new hires on an engine that has little to no documentation. Fox Engine is 10 years old and hasn't been updated in over 5. Most of the people working on this remake have no idea how to use it, and it is harder to work with when it comes to dynamic lighting solutions and material simulation. For instance, Fox Engine has its own implementation of PBR that requires a specific workflow when making or editing textures. It is important to understand and properly apply this workflow; your textures won't look like they should if you don't. The material editor in Unreal is far more streamlined and is similar to other 3D creation suites such as Blender. It's just far easier to work with.

As an Unreal developer myself, having solutions like Blueprints, Lumen and Nanite makes it far easier to iterate and build out a mock-up for testing. That allows more time to be spent on the game itself, and since the engine is well documented, less time is spent learning it.
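To make the PBR point concrete: most engines, Unreal included, use some variant of the metallic/roughness Cook-Torrance model sketched below, but each remaps its texture inputs differently, which is why textures authored for one engine's workflow need converting for another. These are the standard GGX, Smith-Schlick and Fresnel-Schlick terms from the literature; the exact remaps here are illustrative, not Fox Engine's or Unreal's actual code.

Code:
// Specular part of a metallic/roughness PBR model for one light.
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// n, v, l, h must be unit vectors (normal, view, light, half-vector).
float SpecularGGX(Vec3 n, Vec3 v, Vec3 l, Vec3 h, float roughness, float f0)
{
    const float PI = 3.14159265f;
    float a  = roughness * roughness;  // perceptual-roughness remap
    float a2 = a * a;
    float nh = std::max(dot(n, h), 0.0f);
    float nv = std::max(dot(n, v), 1e-4f);
    float nl = std::max(dot(n, l), 1e-4f);
    float vh = std::max(dot(v, h), 0.0f);

    float denom = nh * nh * (a2 - 1.0f) + 1.0f;
    float D = a2 / (PI * denom * denom);                              // GGX normal distribution
    float k = a / 2.0f;                                               // Smith-Schlick approximation
    float G = (nv / (nv * (1 - k) + k)) * (nl / (nl * (1 - k) + k));  // geometry / shadowing
    float F = f0 + (1.0f - f0) * std::pow(1.0f - vh, 5.0f);           // Fresnel-Schlick
    return D * G * F / (4.0f * nv * nl);
}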
 