Ezquimacore
"Why should anyone care about which engine devs choose as long as the game is good and performs great?"

That's your answer: because Unreal is good and performs great.
If so, that will be bad for game engine diversity: if everyone is using the same thing, it will all look the same and feel the same.
Lastly, don't forget that Tencent owns 40% of Epic Games.
"If so, that will be bad for game engine diversity: if everyone is using the same thing, it will all look the same and feel the same."

Borderlands, Gears of War and Fortnite say hi.
Thus another reason not to want an Unreal Engine monopoly in the gaming industry: Tencent/China will have their hands on everything in the Western world, and that is bad for everyone.
"From devs who know the engine. A lot of devs inexperienced with it still have that Unreal look, although that's probably more to do with them than the engine. There's a lot Unreal probably isn't suitable for."

The SDK includes a lot of assets and shaders; if you use those, you'll end up with a generic game. The issue is when devs switch engines and try to cut those costs at the same time.
This whole "every game will look alike" thing is hilarious. The 360/PS3 generation made everyone paranoid.

The jump to HD was a tough one for many studios, so they just Ctrl+C/Ctrl+V'd a lot of the basic tools from Unreal's toolbox in order to deliver AAA games.

But does Mirror's Edge look like Batman: Arkham City? BioShock Infinite? Dishonored? A Hat in Time?

Good studios were able to modify it to their needs.
"The issue here is Epic = bad. At least that's the narrative on here. Same goes for Microsoft."

That's the general way people look at things on GAF: Microsoft is bad no matter what they do. Epic is good when they put out a demo on PS5 first, and no longer good when the same tech is shown on Xbox later. We saw that with people claiming there was some kind of downgrade after it was shown on Xbox.
"Who cares about Epic's position amongst the other engines? The real tragedy is that a Chinese company is allowed to gobble up 4(9?) percent of the company's shares. This is a gift-wrapped money funnel into the Chinese regime. Appalling."

Look at how difficult it is to cut Russia out of the world economy; now try to do it with China.
"After a big game dev like CDPR said they will switch over to UE5 for The Witcher 4 instead of using their own in-house REDengine, I started to get worried."

I think they needed to switch after what happened with Cyberpunk 2077. A lot of the studios that end up switching do it because they figure out the hard way that they don't have the resources or skills to keep up with the big engines.
"If so, that will be bad for game engine diversity: if everyone is using the same thing, it will all look the same and feel the same."

Nope, games don't have to use the engine's built-in assets (though small and indie studios will definitely go down this road a lot).
"Tencent/China will have their hands on everything in the Western world, and that is bad for everyone."

Don't they already? We should definitely get away from this, not embrace it. However, we are not starting from a position of purity; this ship sailed back in the '70s.
"The monopoly in question is their Unreal game engine, the latest being Unreal Engine 5."

If you combine UE5 with all the other commercial (and, heck, throw open source in there too) licensable engines on the market, they probably just barely reach around 10% of Unity's userbase.
"The SDK includes a lot of assets and shaders; if you use those, you'll end up with a generic game. The issue is when devs switch engines and try to cut those costs at the same time."

Have you guys ever even used Unreal? It does not come with many assets or shaders; where do y'all get this nonsense from? There are barebones example shaders, like five of them, and that's about it. You have free templates/projects and the Marketplace where you can get additional stuff, but that's a different thing altogether.
The first time you use a new engine there's usually a big investment: as flexible as UE is, you need to implement your own workflow and dependencies if the ones supplied are not ideal, and development usually takes longer.
There was indeed a "look" that UE3 was prone to on X360/PS3, but, as you said, it was a mix of factors. For starters, the X360 and PS3 were quite RAM-deprived, so one solution was reducing color depth, which actually helps twice: it reduces source texture color depth (shrinking the space taken in RAM, on the HDD, or on DVD/Blu-ray) and it masks lower-color-depth buffers and even some dithering.
It was preferable for devs at the time, especially seeing games like Gears of War and Call of Duty 4 getting away with the sepia look. It was also something already in motion in the prior generation: "realistic" games were already going for sepia overtones due to RAM constraints as well. Also remember that the Xbox 360 was stuck with DVDs; the PS3 didn't have that problem, but its RAM pool design meant less memory for textures/graphics, which further increased the bottleneck for multiplatform games. (Read: textures had to fit on DVDs courtesy of the X360, and had to fit in 256 MB, counting the framebuffer, courtesy of the PS3.)
The direct alternative, saturating the palette, would create visible artefacts (think JPG compression and chroma subsampling), which is what happened with BioShock Infinite and Dishonored to a fault. Mirror's Edge sidestepped this nicely, though, most likely with black-and-white texture work and overlays or tinting.
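The color-depth trick described above can be sketched in a few lines. Packing 8-bit-per-channel color into a 16-bit RGB565 word halves the memory per pixel, and round-tripping shows the precision loss (the banding that dithering was used to hide). This is a generic illustration of the technique, not any engine's actual code; the resolution and formats are just example numbers.

```python
def to_rgb565(r, g, b):
    """Pack an 8-bit RGB triple into a 16-bit RGB565 word (5-6-5 bits)."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def from_rgb565(word):
    """Expand back to 8 bits per channel; the low bits are gone for good."""
    r = (word >> 11) & 0x1F
    g = (word >> 5) & 0x3F
    b = word & 0x1F
    # Replicate the high bits into the low bits to fill the 8-bit range.
    return ((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2))

# A 16-bit buffer is half the size of a 32-bit one -- exactly the kind
# of saving that mattered in a 256 MB memory pool:
width, height = 1280, 720
print(width * height * 2)  # bytes at 16 bpp (RGB565)
print(width * height * 4)  # bytes at 32 bpp (RGBA8888)

# Round-tripping shifts the values slightly; over a smooth gradient
# those uniform steps show up as visible banding:
print(from_rgb565(to_rgb565(200, 100, 50)))  # -> (206, 101, 49)
```

The same arithmetic applies to textures on disc: fewer bits per texel means smaller files and a smaller RAM footprint, at the cost of color fidelity, which a muted sepia palette conveniently disguises.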
It doesn't help that UE3 was indeed more RAM-hungry than the more barebones engines that were meant for previous-gen games and got updated, which meant even less RAM for colorful textures. IMO other engines had a better balance on that front, but even then the difference was not huge in hindsight.
I feel that's still true: UE4/5 is still more demanding than other engines, which is why it's seldom used on mobile and on underpowered consoles like the Switch (and when it is, it shows). But we're getting to a point on home consoles where it doesn't matter as much (on the texture side of things it already didn't matter with the PS4/Xbox One). Plus, the investment to support PS5/XSX with "next-gen tech" that pushes boundaries is probably steep, and Unreal Engine 5 seems like it might have an advantage, especially if Nanite reaches a usable state in third-party development.
I'd prefer a healthy engine/dev-tool market with lots of competition (which is in fact at risk), but the fact is Unreal Engine has no direct competitors that match its investment and independent status. Epic is not at fault for the ineptitude of others, though; I haven't seen monopolistic plays from them, like buying a competitor just to shut it down.
"Have you guys ever even used Unreal? It does not come with many assets or shaders; where do y'all get this nonsense from?"

I haven't. I read over the years that they were making libraries with such things available.
"Imagine a Monopoly game made with this monopoly engine."

A ray-traced doggo piece constructed out of 1 trillion polys.
"You're right. No one cares."

/thread
"This is quite a thread fail. Is that why the OP abandoned it?"

OP should have known how we feel 'bout them libs and their "Forced Engine Diversity" woke nonsense that's RUINING video games. I hate it when game devs use another engine just for the sake of "engine diversity" to appeal to the woke mob.