
Was the switch to Unreal Engine 5 a mistake?

We will see. So far, UE5 games have been a bit of a letdown. But I'm not a developer.

Also, I've read that Capcom is making a next-gen version of RE Engine, codenamed REX (RE Engine neXt generation), and that they will be looking into licensing it out, similarly to UE5.
 

Ribi

Member
I wonder if things would be different if Epic still depended on UE licensing as their main source of revenue.

[Image: Epic Games revenue chart]
Mind you, Fortnite was still in beta when it was making billions.
 
In the long run, it might be a good thing, with developers being able to push the engine come the PS6, etc. In a way, this is almost like a training phase where we are seeing experimentation, and, as always, most experiments fail, but that's the point. I will say that the best-looking games I have seen this generation are all on proprietary engines, and I can't see that being any different come next gen.
 
I think it's because most are still using UE4 and other engines built for the PS4 era; UE5 is still not well optimized for the PS5 and PCs. They will get there.
 
Yeah, from the Digital Foundry video it looks like it needs proper optimization; they should work out traversal stutter and shader compilation stutter, and also work more on multi-threaded performance.
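To illustrate the shader compilation point: the hitch players feel is the cost of building a pipeline state the first time something is drawn with it. Here is a rough conceptual sketch of why warming a cache behind a loading screen removes the stutter; the names and types are made up for illustration, not UE5's actual API.

```cpp
// Conceptual sketch only (hypothetical names, not UE5's real API):
// why precompiling shaders at load time removes in-game hitches.
#include <string>
#include <unordered_map>
#include <vector>

struct PipelineState { int gpu_handle = 0; };  // stand-in for a compiled PSO

// Stand-in for the expensive driver compile (tens to hundreds of ms).
PipelineState CompilePipeline(const std::string& shader_key) {
    return PipelineState{static_cast<int>(shader_key.size())};
}

class PsoCache {
    std::unordered_map<std::string, PipelineState> cache_;
public:
    // Run behind the loading screen: pay the compile cost once, up front.
    void Warm(const std::vector<std::string>& keys_used_by_level) {
        for (const auto& k : keys_used_by_level) cache_[k] = CompilePipeline(k);
    }
    // With a warm cache, draw time is a lookup. With a cold cache, the
    // compile happens mid-frame, and that one long frame is the stutter.
    const PipelineState& Get(const std::string& key) {
        auto it = cache_.find(key);
        if (it == cache_.end()) it = cache_.emplace(key, CompilePipeline(key)).first;
        return it->second;
    }
};
```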

Assets look really solid. All the upscaler plugin integration is there. More titles should ship with a software/hardware ray tracing switch.

Image clarity should be a focus here too. Nice assets won't do much when image quality is subpar and there are stutters. Plus, performance isn't there yet.

Then, and only then, will it be a nice industry standard.

Hopefully Epic works this one out by the time the second batch of UE5 games drops.

I remain hopeful, but I will temper my expectations and hype till then.

But hopefully when Witcher 4 or Cyberpunk 2 drops, it will be sorted out, sooner rather than later.

It feels like the engine is still in an alpha or beta state and was released too soon.

Or maybe some devs are at fault here and there, switching engines mid-development.

I'm no dev, mind you, but that's how it looks through the eyes of a gamer, seeing all of these games that have issues here and there.

But it looks like it could and should be worked out in the near future.

Some of the issues look like engine issues to me, based on the DF video.

So that's why it feels like a crapfest lately, I guess.

But not all of the issues.

I will wait a bit as I have mentioned.

I got other things to do than just spread hate or whatnot.

It's just games, but it feels like they've only just started to test UE5 when it should have been dandy from the start, and some games pay the price...

But hopefully it gets there.
 

Killer8

Member
The engine is great; it's the hardware that can't keep up with it just yet. Lumen is a fantastic ray-traced lighting solution, and Nanite basically vanquishes LOD pop-in like black magic. Pushing LOD, for example, has traditionally brought hardware to its knees, so it's not surprising to see the high requirements when Nanite is used. Its inclusion goes such a long way toward eliminating one of the biggest visual flaws in motion, though, that I think it's well worth the performance hit.
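To make the LOD point concrete, here is roughly what traditional discrete LOD selection looks like; a conceptual sketch with made-up types, not actual UE5 or Nanite code. Each threshold crossing swaps the whole mesh at once, and that swap is the pop-in Nanite is designed to remove.

```cpp
// Sketch of traditional discrete LOD selection (illustrative only).
// The renderer snaps between fixed detail levels at distance thresholds,
// and each snap is a visible "pop".
#include <vector>

struct Mesh { int triangle_count; };

struct LodChain {
    std::vector<Mesh> levels;      // levels[0] = full detail
    std::vector<float> distances;  // switch thresholds, ascending

    const Mesh& Select(float distance_to_camera) const {
        for (size_t i = 0; i < distances.size(); ++i)
            if (distance_to_camera < distances[i]) return levels[i];
        return levels.back();      // coarsest level beyond the last threshold
    }
};

// Nanite, by contrast, picks detail per small cluster of triangles, so the
// transition is near-continuous and no single frame shows a whole-mesh swap.
```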

It's also worth noting that all of the games released so far are AA, indie, or fairly low-budget AAA projects. The fact that gamers are saying "it isn't much better looking than UE4 games" is something of a good sign. Why? Because if a middle-market developer of today, using UE5, is able to make a game look as good as or even better than a big AAA production on UE4 last gen, that's a big deal. We've moved forward.

A lot is often said about pushing the envelope in the AAA bleeding-edge space, but not enough is said about allowing developers of all sizes to push better visuals with the budget they have. The Immortals developer highlighted in an interview the immense time savings that the engine allows. In a world where game budgets and development times are bloating out of control, UE5 allowing developers to rein that in can only be good for the medium.
 
That's what I'm talking about, but set your sights higher. Forget this ~hundreds of FPS nonsense and realize we could easily get into the low thousands with today's machines. Of course, that is just a stepping stone on the way to millions and ultimately billions of FPS. True gaming ♥
I get the joke, but I can actually perceive the difference between 30 and 60, and 60 and 120. I have yet to go above that, but if it provides a better experience, then... why not make that the default?

That's a real question.

Maybe there is an imperceptible difference between 120 and 240, or 240 and 480, but why would we not want to find out, evolve the industry, and settle on the value that provides the greatest experience?

I have serious doubts that 99.99% of gamers would be able to perceive the difference between 1000 fps and 10,000 fps, but let's find out while we're here.
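For reference, the diminishing returns are easier to see in frame times than in frame rates; a quick back-of-the-envelope calculation:

```cpp
// Doubling (or 10x-ing) fps always shrinks the frame time,
// but the absolute saving gets smaller at every step.
#include <cstdio>

int main() {
    const double steps[] = {30, 60, 120, 240, 480, 1000, 10000};
    for (int i = 1; i < 7; ++i) {
        double before = 1000.0 / steps[i - 1];  // frame time in ms
        double after  = 1000.0 / steps[i];
        std::printf("%5.0f -> %5.0f fps: %6.2f ms -> %6.2f ms (saves %.2f ms)\n",
                    steps[i - 1], steps[i], before, after, before - after);
    }
}
// 30 -> 60 saves ~16.7 ms per frame; 1000 -> 10000 saves only ~0.9 ms,
// which is why each jump gets harder and harder to perceive.
```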
 

Kataploom

Gold Member
No. DLSS, FSR and XeSS were a mistake.
Because they allowed devs to shit on optimising their games and just say "our game is optimised with FSR/DLSS/XeSS in mind."
So instead of them being used as a great bonus for framerate, they are used as a necessity for good performance.
I'll play devil's advocate here and say that, while I agree that native resolution should be the baseline rather than upscaled, graphical leaps currently require way more hardware, time, and money to reach... So yeah, it's shit, but a necessary evil in order to reach decent visual fidelity improvements.

Technology has matured a lot, and diminishing returns will be a thing until engines implement the latest hardware features such as SFS, SSDs, mesh shaders, RT, and whatnot. Thing is, even if they did today, those features require way more power than current consoles have just to make a leap in fidelity while keeping more or less the same performance/IQ as we've seen in the latest UE5 games.
 
They aren't a mistake, and this is a big misconception: games never ran at native before that either. Native is a lie. It is as fake as XeSS, DLSS and FSR are.

From 2016 onwards, most games have relied on "temporal accumulation". They use "temporal anti-aliasing". This is the critical part: developers rely on temporal accumulation to UNDERSAMPLE effects, such as the resolution trees are rendered at, or the resolution shadows are rendered at. Hey, you can be at full native 1440p SHADER resolution, and the oh-so-Digital-Foundry counts 1440p. But oh, you render your shadows at 1/8 of the screen resolution! And they only look "coherent" with temporal accumulation. Oh, your trees are at 1/64 of the screen resolution, and they look MUDDY and shit (like what happens with RDR2 if you play it at 1440p/1080p). But native is native, right?????

NATIVE was NEVER native with modern games that already heavily RELIED on temporal accumulation. The only thing that is native about them is the "shader" resolution, which is what pops up if you count pixels. If you look at things more closely, you will see that more than half the effects you see are being rendered at 1/2 resolution and made coherent and full with temporal accumulation. And guess what happens then? What happens if you "temporally" accumulate 1/2 resolution worth of shadows into full resolution? You guessed it: they appear smudgy, blurry and garbage unless you play at "4K". This is why most modern TAA games suck at 1080p and 1440p "NATIVE": so many effects are being rendered at poor, low resolutions.
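For what it's worth, here is what temporal accumulation amounts to in a bare-bones sketch; purely illustrative, not any engine's actual code. This frame's undersampled result is blended with reprojected history, and when the history is rejected on camera movement you are left with the raw low-resolution input.

```cpp
// Bare-bones temporal accumulation (illustrative only). A cheap,
// undersampled effect (e.g. half-resolution shadows) is blended with the
// previous frames' accumulated result, converging over several frames
// toward a full-resolution-looking image.
struct Color { float r, g, b; };

Color Lerp(Color a, Color b, float t) {
    return {a.r + (b.r - a.r) * t,
            a.g + (b.g - a.g) * t,
            a.b + (b.b - a.b) * t};
}

// history: result reprojected from previous frames
// current: this frame's cheap, undersampled sample
// blend:   low = lean on history (detailed when still, ghosts when moving);
//          high = lean on the new sample (stable but visibly low-res)
Color Accumulate(Color history, Color current, float blend = 0.1f) {
    return Lerp(history, current, blend);
}

// When the camera moves fast, reprojection fails and the history must be
// thrown away; for a few frames you see only the raw undersampled input.
// That is the "looks like 540p the second you move" effect described above.
```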

This is why I chuckle whenever I see someone proudly saying "I play at native!", as if native has any meaning whatsoever with modern implementations of temporal accumulation. Yeah, good luck playing RDR2 at so-called "native 1080p". The game clearly looks like 540p THE SECOND you move your camera. Trees are rendered at abnormally bad, low resolutions; foliage too. Quite literally the entire game is in shambles once you disable TAA. Like, almost half of the game is never rendered. THE GAME LITERALLY RENDERS 540p worth of pixels all the time, but just tells you that it has a 1080p shader resolution. THAT's it.

The SECOND you move the camera, the illusion breaks, temporal accumulation breaks, and everything appears like it should: 540p. The game literally loses 50% worth of resolution once you move the camera. YET! YOU WILL BE PROUD. YOU WILL BE PROUD THAT YOU PLAYED AT THE MYTHIC NATIVE RESOLUTION. Oh, SO NATIVE. Looks SO GOOD!

I've had enough of this bullshit.
This is a weird post.

Games don't have a native resolution.
They have an internal resolution, which can be anything, and an output resolution, which can also be anything.
Assets are also created at various resolutions.
Shaders can be various resolutions.

Playing games at your display's native resolution is important and produces the best image quality (unless you're on a CRT, since a CRT doesn't have a native resolution / isn't a fixed grid).

Rendering different parts of a game at different resolutions has been done since almost forever.
Ever change the shadows setting in a game's menu?

And not every game uses TAA.
And "temporal" just means it's time-based, i.e., it samples data from previous frames.
It doesn't mean lower resolution.
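A minimal illustration of that internal-vs-output split, with hypothetical types rather than any real graphics API:

```cpp
// The scene renders into a buffer at one resolution and is scaled to the
// display at another; neither number is "native" in any deeper sense.
struct RenderTarget { int width, height; };

struct FrameConfig {
    RenderTarget internal_rt;  // what the shaders actually render at
    RenderTarget output_rt;    // what the display is fed
};

FrameConfig MakeConfig(int display_w, int display_h, float render_scale) {
    return {
        {static_cast<int>(display_w * render_scale),   // e.g. 0.67 on a 4K
         static_cast<int>(display_h * render_scale)},  // screen ~ "1440p internal"
        {display_w, display_h},
    };
}
// Individual effects (shadows, reflections, foliage) can use yet other
// resolutions, which is why a single "native" number says very little.
```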
 

Shut0wen

Member
No, what's happening now is the most common thing. Though I will say Unreal Engine 5 being publicly released a year after next gen started hasn't helped; most engines used to be announced a year or two before the next-generation consoles were even announced. Apparently shifting development from UE4 to 5 isn't that much of a big deal. Plus, I doubt any game that's come out has even had the chance to modify the engine.
 

DaGwaphics

Member
So far, I have hardly liked any Unreal Engine 5 games. They have very high hardware requirements, barely run on mid-range PCs, and are not exactly great on the consoles. They also look just OK visually. The Matrix demo was impressive, and Fortnite in UE5 looks good too, but all the rest so far has been more than disappointing, at least for me. Did UE5 impress you?

UE5, like Unity, is an engine that is used by developers of all sizes. Just like with UE4, a lot of the projects that get released with it don't have a huge budget. I'm sure when the biggest projects get released they will impress.
 

magnumpy

Member
I get the joke, but I can actually perceive the difference between 30 and 60, and 60 and 120. I have yet to go above that, but if it provides a better experience, then... why not make that the default?

That's a real question.

Maybe there is an imperceptible difference between 120 and 240, or 240 and 480, but why would we not want to find out, evolve the industry, and settle on the value that provides the greatest experience?

I have serious doubts that 99.99% of gamers would be able to perceive the difference between 1000 fps and 10,000 fps, but let's find out while we're here.

Fair enough. I imagine it is somewhat subjective; perhaps some people are very sensitive to frame rate and others aren't. Perhaps this should be something you can select; I know some games will let you prioritize visuals or prioritize frame rate.
 

WitchHunter

Banned
In order to avoid another AI winter, you have to sell new hardware to the totally brainwashed "gamers" who only get mild erections when they see higher FPS numbers. So it's a perfect vehicle to drive more GPU sales.
 

winjer

Gold Member
Nah, but putting slow CPUs in the current-generation consoles was indeed a mistake, though.

I'm always surprised at how some people expect a $400 console from 2020 to match a $5,000 PC from 2023.
You have to consider that consoles have to work within a tight budget of cost, power, and die space.
 

Rubim

Member
I'm always surprised at how some people expect a $400 console from 2020 to match a $5,000 PC from 2023.
You have to consider that consoles have to work within a tight budget of cost, power, and die space.
The issue with that comment is:
Do you really think you need a $5k PC to outmatch the PS5's CPU?
 

twilo99

Member
I'm always surprised at how some people expect a $400 console from 2020 to match a $5,000 PC from 2023.
You have to consider that consoles have to work within a tight budget of cost, power, and die space.

Why blame UE5 for trying to push fidelity and graphics ahead by relying on the latest modern hardware available?

As the budget option using older/weaker hardware, consoles can't really be expected to showcase the latest graphics... they become more of a bottleneck.
 

winjer

Gold Member
Why blame UE5 for trying to push fidelity and graphics ahead by relying on the latest modern hardware available?

As the budget option using older/weaker hardware, consoles can't really be expected to showcase the latest graphics... they become more of a bottleneck.

That has nothing to do with some people expecting a $400 console to perform like a high-end PC.
As with every generation, consoles will lag behind a high-end PC, especially a few years after launch.
What I'm talking about is having realistic expectations of what a console can do.
 

Klik

Member
I wonder if both the PS5 and Xbox are too weak for proper full-scale UE5 games.

I feel like the true UE5 potential will be unleashed with the PS6/next Xbox and 50+ TF.
 

Akuji

Member
Every damn iteration of the engine, the same talk.

Are these always new people? I feel like it should be kinda known by now that these transitions need time, and always have.
The first games are hardly ever the ones you will remember a decade later.

UE4 was talked down in exactly the same way.
 