
Black Myth: Wukong - Official Trailer | gamescom 2023

See combat against fearsome foes in this latest trailer for Black Myth: Wukong, an upcoming action-adventure RPG from developer Game Science. Black Myth: Wukong is set in the 16th century and draws on traditional Chinese folklore; the game will be available in 2024.


Experience the beauty of Black Myth: Wukong with performance-multiplying NVIDIA DLSS 3 in this extended gameplay reveal.



12 Minutes of New Gameplay (English Dub & Subtitles):

 
Waited a long time for this one. If it releases I'll be pretty happy. Seems like an Atomic Heart situation for me: I never expected it to release, but it turned out to be a nice, short and sweet game.

Hope this is the same
 

Clear

CliffyB's Cock Holster
Cool, except that isn't true (it's mostly down to the developers, not the game engines, especially with a mature engine like UE)

Wrong. The only reasons developers re-engineer or customize an engine are that the built-in feature set doesn't cover their needs, or that performance is lacking. And why would anyone adopt an engine going in with the assumption that the engine they are paying for is lacking in performance?

So it's always a matter of how far below expectation the final product is, and never above target.

Yes, developer quality/investment matters, because art and design quality are massively impactful. But expecting devs to magically pull performance way above the norm is simply unrealistic. The whole notion of third-party engines is that the core render tech is already optimized, because that's a key part of what these technologies sell, along with a pre-built pipeline and other productivity conveniences.

Stuff like native render resolution and VRAM/RAM usage, as a consequence of geometric and texture density, is very predictable, and because the target hardware has hard resource limits, these budgets end up being decided relatively early in development. The parameters have to be stayed within, with the major delta in the end result being the performance level achieved at those targets.
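
As a rough, back-of-the-envelope sketch of that budgeting arithmetic (every number below is invented for illustration, not taken from any real title, engine, or console):

```python
# Hypothetical VRAM budget sketch; all figures are made up for illustration.

def texture_vram_mb(width, height, bytes_per_texel=1.0, mip_overhead=1.33):
    """Approximate VRAM for one block-compressed texture, including its mip chain."""
    return width * height * bytes_per_texel * mip_overhead / (1024 * 1024)

TARGET_VRAM_MB = 10_000  # pretend console-class budget

render_targets_mb = 3840 * 2160 * 4 * 3 / (1024 * 1024)   # a few 4K buffers
textures_mb = 1000 * texture_vram_mb(2048, 2048)           # ~1000 resident 2K textures
geometry_mb = 1500                                          # meshes, LODs, etc. (a guess)

total = render_targets_mb + textures_mb + geometry_mb
print(f"estimated use: {total:.0f} MB of {TARGET_VRAM_MB} MB budget")
# If the estimate blows the budget, the options are lower texture density,
# lower native resolution, or more aggressive streaming -- which is why these
# parameters get locked in early.
```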

So when you see early footage, chances are it's the absolute best-case scenario, running on the most performant and least constrained hardware. By the end of development, when all the content and the overall vision are complete AND it also needs to function on the weakest target hardware, trims and compromises are almost guaranteed to have been applied across the board for consistency.

It's not an attempt to deceive when the final result shows cut-backs or reduced performance; it's just that there are limits on what can be optimized back.
 

midnightAI

Member
But you said we know the performance/fidelity based on the game engine. You cannot determine that, because it's the skill of the developer working with that engine that determines the outcome. Whether you modify the engine via plugins or other tools is irrelevant, because most engines allow this by default.

Some games are high frame rate and look great, some are high frame rate but basic looking, some are low frame rate but look amazing, some are low frame rate and look like crap... all on the same engine (especially UE, because it's open to any developer and is the most widely used engine).
 

Clear

CliffyB's Cock Holster

Do you not understand that there are baselines set in terms of performance when dealing with any shared technology? And if you know where these baselines are, you can predict with reasonable accuracy the variance above and below them based on user skill.

Naturally the potential "floor" differential can be much greater than the ceiling "above expectation", because it's subtractive versus additive. So in terms of the average over x iterations, what we see as the norm is likely to be on the lower end; but even if the observed norm slightly under-represents the baseline potential, that doesn't mean the high end is going to be unexpectedly performant!

Because ultimately it's about deviation from the projected baseline, and you aren't likely to achieve that without radical re-engineering or some other impactful novel approach. And the plain truth is that people who create novel solutions tend to have a track record of that prior to adopting third-party tech!
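
A toy way to picture that asymmetry (purely illustrative numbers, nothing measured): simulate many projects whose downside from a shared baseline is much larger than their upside, and look at the spread.

```python
import random

# Toy model of the floor/ceiling asymmetry: downside deviations from a shared-
# engine baseline are large, upside deviations are small. Numbers are invented.
random.seed(0)
BASELINE_FPS = 60

samples = []
for _ in range(1000):
    deviation = random.gauss(0, 3) - abs(random.gauss(0, 10))  # small upside, big downside
    samples.append(BASELINE_FPS + deviation)

print(f"average: {sum(samples) / len(samples):.1f} fps")  # lands below the baseline
print(f"best:    {max(samples):.1f} fps")                 # barely above it
print(f"worst:   {min(samples):.1f} fps")                 # far below it
```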
 

midnightAI

Member
But this isn't just some hardly used game engine with a low baseline and a low ceiling; this is Unreal Engine. If a developer made a crap-looking game on UE, that's on the developer, not the engine. Sure, there is a ceiling (kind of; engines can be updated, and new plugins and such can be made to push past it).

All I am saying is, every developer will get different performance and fidelity out of an engine because talent differs at each studio. Otherwise all games would have the same fidelity and performance, which is what you alluded to (maybe you meant the theoretical max of the engine, I dunno).

Anyway, this is all off-topic, but this game looks nowhere near as good as its first showing, in my opinion.
 

Exentryk

Member
This is day 1. I do hope it has enough story content, though, and isn't just a boss rush or structured like the Souls games.
 

Clear

CliffyB's Cock Holster
New Gameplay (English Dub & Subtitles)





Ehhh. Looks like a prettier version of Nioh/Wo Long but with way less interesting gameplay. Obviously the gameplay may get more involved, but the combat looked like it lacked the sort of push-pull reactivity between player and target that marks out the best experiences of this type.
 


Most exciting gameplay I've seen recently from this.

The third-person hack-and-slash genre is getting a bit saturated, but I think this stands out.

The art style and especially the shape-shifting are what set it apart. Just smacking dudes about with your staff for 30 hours would get old.

I think I've seen enough now for it to no longer be vapourware!

What's encouraging is that if you go back to the original gameplay trailer, all the mechanics shown there are still here. Except being a fly, but that's probably a bit OP, so likely a late-game power.
 

Exentryk

Member
The world looks very interesting and unique. The combat can be made to look really flashy if you get good at it. I was hoping the game would be more like a traditional RPG with more story/characters/towns/inns/etc., but it seems to be a souls-like (which I'm not a fan of). But we have seen some very cool story cutscenes, which do give me hope that there is more.

Regardless, the world and combat here are a lot more interesting, so I'll likely end up enjoying it despite the game having those annoying souls-like elements. Day 1 on Steam.

Story cutscene if you haven't seen it before:

 