
Final Fantasy real time tech demo - Luminous Studio [Up3: Survey/Screens/Video]

I feel it would also be kind of weird if a ton of companies were investing what must cumulatively be hundreds of millions of dollars in technology that won't work on next-gen consoles.

Exactly. These demos aren't there to show off engines and graphical output that'll never be achieved in the next 8 years. They're not running some kind of novelty expo.
 

RedSwirl

Junior Member
If I can see an actual Final Fantasy battle going on and town exploration with this level of graphical fidelity, we will have truly reached "next gen". At that point I also think we'll finally be beyond the need for FMV in these games.
 

XiaNaphryz

LATIN, MATRIPEDICABUS, DO YOU SPEAK IT
I get what you mean, but when you have the SE guy talking about real-time changes/interactions while the camera work, lighting, and asset switching are shown through Maya menus/tabs, isn't that weird?

Not at all, as again it's very similar to what was done with TFU. S-E is just driving the changes through their editor. Assuming they're doing it like LEC did, how the engine is registering those changes would be exactly how it would react in-game. There's no point in creating a remotable engine if the results you're getting aren't representative of the final result - the reason to set this sort of thing up is to be able to quickly preview and iterate without having to wait for a rebuild.

Shouldn't all this stuff happen during playback of the executable (not the one running from source) and maybe have a UI of its own?

I think I see what you're trying to say. That's one approach, but the one being demoed I feel would be more efficient at streamlining workflow. If your artists are using these authoring tools, it would be best to save time during the more iterative tweaking stages by being able to see how their changes are occurring live in the engine, as opposed to making a change, then pushing it to the build, loading up the runtime, looking at how the asset looks, figuring out what needs to be tweaked, then going back to the authoring tool and starting it all over again.

So yeah, while the runtime they're using may be a dev or debug build as opposed to a retail one, that doesn't necessarily mean it won't be representative of what a release build would look like. If anything, I'd expect a release build to have better performance - and yes, in that case if they wanted to show off what they were doing through the editor they'd have to make sure it was triggered appropriately - but the fact that some attribute change was being driven by code as opposed to an external source doesn't negate the fact it's still all real-time.
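To make that workflow concrete, here's a minimal C++ sketch of the general pattern described above (the names and the in-process queue are hypothetical stand-ins, not S-E's actual tooling): an editor pushes attribute edits to the running engine, and the engine drains and applies them at the start of every frame, so the preview you're looking at is the engine's own real-time output.

```cpp
// Sketch: live attribute edits flowing from an external editor into a running
// engine. A real setup would use a socket or named pipe; here a thread stands
// in for the editor connection so the example is self-contained.
#include <chrono>
#include <iostream>
#include <map>
#include <mutex>
#include <queue>
#include <string>
#include <thread>

struct AttributeEdit {            // one change made in the authoring tool
    std::string object;           // e.g. "Agni_Hair"
    std::string attribute;        // e.g. "specular_intensity"
    float value;
};

class LiveEditQueue {             // thread-safe mailbox between editor link and engine
public:
    void push(AttributeEdit e) {
        std::lock_guard<std::mutex> lock(mutex_);
        edits_.push(std::move(e));
    }
    bool pop(AttributeEdit& out) {
        std::lock_guard<std::mutex> lock(mutex_);
        if (edits_.empty()) return false;
        out = std::move(edits_.front());
        edits_.pop();
        return true;
    }
private:
    std::mutex mutex_;
    std::queue<AttributeEdit> edits_;
};

int main() {
    LiveEditQueue queue;
    std::map<std::string, float> sceneState;  // stand-in for real scene data

    // Stand-in for the IPC listener that would receive edits from the editor.
    std::thread editorLink([&] {
        queue.push({"Agni_Hair", "specular_intensity", 0.8f});
        std::this_thread::sleep_for(std::chrono::milliseconds(30));
        queue.push({"KeyLight", "intensity", 2.5f});
    });

    // Engine frame loop: apply whatever the editor sent, then render.
    for (int frame = 0; frame < 5; ++frame) {
        AttributeEdit edit;
        while (queue.pop(edit))
            sceneState[edit.object + "." + edit.attribute] = edit.value;

        std::cout << "frame " << frame << ": " << sceneState.size()
                  << " live-edited attribute(s)\n";   // render() would go here
        std::this_thread::sleep_for(std::chrono::milliseconds(16)); // ~60 fps tick
    }
    editorLink.join();
}
```

The important property is the one described above: the edits go through the same code path the game itself uses, so what the artist sees is what the engine produces, just triggered from outside.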
 

Kisaya

Member
I just want a main FF game that isn't turn based :l

This looks amazeee though. Too bad there's a low chance of ever seeing this again, but it's good to know that visually their games will look great next gen.
 
Not at all, as again it's very similar to what was done with TFU. S-E is just driving the changes through their editor. Assuming they're doing it like LEC did, how the engine is registering those changes would be exactly how it would react in-game. There's no point in creating a remotable engine if the results you're getting aren't representative of the final result - the reason to set this sort of thing up is to be able to quickly preview and iterate without having to wait for a rebuild.



I think I see what you're trying to say. That's one approach, but the one being demoed I feel would be more efficient at streamlining workflow. If your artists are using these authoring tools, it would be best to save time during the more iterative tweaking stages by being able to see how their changes are occurring live in the engine, as opposed to making a change, then pushing it to the build, loading up the runtime, looking at how the asset looks, figuring out what needs to be tweaked, then going back to the authoring tool and starting it all over again.

This is clearer now, but it still feels definitely shady, because it rests heavily on "presuming this works, it will look like this"; it sounds more like something scripted on the fly than genuinely generated while it's happening.


So yeah, while the runtime they're using may be a dev or debug build as opposed to a retail one, that doesn't necessarily mean it won't be representative of what a release build would look like. If anything, I'd expect a release build to have better performance - and yes, in that case if they wanted to show off what they were doing through the editor they'd have to make sure it was triggered appropriately - but the fact that some attribute change was being driven by code as opposed to an external source doesn't negate the fact it's still all real-time.

I get this, I remember watching the Sony Bend guys running Uncharted: Golden Abyss at 12-15 fps while they were in a "realtime in-engine" workspace, even with some of the textures/assets missing to help the output.
So their way of changing the attributes definitely plays better, but I can't shake the feeling that the right approach to tech demos is the Zelda Wii U one I linked previously: it surely gives a lot less control over the scene, but it's undeniable that it's happening in real time.

The Agni's one sounds like you mess with some factors on a timeline, let the engine prerender, and then show it as if it were real time.
 

djtiesto

is beloved, despite what anyone might say
Mhm menu based, my bad. But yeah, pretty much why I'm excited for Versus. Would like to see FFXV have a different battle system too ;x

Boo you, there's enough action RPGs out there for you to play, especially nowadays. Turn-based RPGs on the other hand? Not really.
 
Mhm menu based, my bad. But yeah, pretty much why I'm excited for Versus. Would like to see FFXV have a different battle system too ;x

If FFXV didn't have menus/wasn't turn based to some degree then you can bet there would be a shitstorm of epic proportions. People claiming SE are 'westernising' it and whatnot.
 

Kisaya

Member
Boo you, there's enough action RPGs out there for you to play, especially nowadays. Turn-based RPGs on the other hand? Not really.

:l Sure, I could go play all the action RPGs out there, but I like Final Fantasy too. Not hating on turn-based or anything, I'd just like one action-based FF that isn't a spin-off.
I won't be that disappointed if they decide to go menu/turn-based anyway. As long as they execute the game well ;p
 

Durante

Member
I get this, I remember watching the Sony Bend guys running Uncharted: Golden Abyss at 12-15 fps while they were in a "realtime in-engine" workspace, even with some of the textures/assets missing to help the output.
So their way of changing the attributes definitely plays better, but I can't shake the feeling that the right approach to tech demos is the Zelda Wii U one I linked previously: it surely gives a lot less control over the scene, but it's undeniable that it's happening in real time.
I don't understand your point at all. So if you have fantastic integration between your authoring tools and your new engine (something particularly important for S-E who have had productivity problems this gen), then you shouldn't show that integration in a tech demo? Why?
 
I don't understand your point at all. So if you have fantastic integration between your authoring tools and your new engine (something particularly important for S-E who have had productivity problems this gen), then you shouldn't show that integration in a tech demo? Why?

I'm not arguing against the tool, which for now seems to be doing something already known in the industry, just with prettier assets, thanks in part to specialized hardware that will hardly be found in a home console. What bothers me is how much they stressed "realtime" when, in the end, it's something in between.

Luckily, the chances of a second FFVII HD tech demo fiasco seem lower, but they're on the overhyping train as usual.
 

USIGSJ

Member
One more cool thing from the demo that games could use: light shafts where the light source is not screen-space dependent.

[Image: Agni's Philosophy screenshot showing light shafts]
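For context on why that matters: the usual screen-space god rays are a radial blur computed from the light's position on screen, so they break down as soon as the source moves off-screen, while world-space light shafts come from marching the view ray and testing visibility against the light. Here's a toy C++ sketch of that second idea (Vec3 and visibleFromLight are made-up stand-ins, not anything from the demo):

```cpp
// Toy volumetric light-shaft integration: march the view ray through the scene
// and accumulate in-scattering wherever the point is actually lit, so the shaft
// survives even when the light source is off-screen or behind the camera.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

Vec3 add(Vec3 a, Vec3 b)    { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3 scale(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / len, v.y / len, v.z / len};
}

// Stand-in for a shadow-map or voxel lookup: is this world-space point lit?
bool visibleFromLight(Vec3 p) {
    return p.y > 1.0f;   // pretend everything below y = 1 is in shadow
}

// Integrate scattering along the view ray in world space.
float lightShaft(Vec3 origin, Vec3 dir, float maxDist, float density) {
    dir = normalize(dir);
    const int   steps    = 64;
    const float stepSize = maxDist / steps;
    float scattered = 0.0f;
    float transmit  = 1.0f;                        // light left after absorption
    for (int i = 0; i < steps; ++i) {
        Vec3 p = add(origin, scale(dir, stepSize * (i + 0.5f)));
        if (visibleFromLight(p))
            scattered += transmit * density * stepSize;
        transmit *= std::exp(-density * stepSize);
    }
    return scattered;
}

int main() {
    float s = lightShaft({0, 2, 0}, {0, -0.2f, 1.0f}, 20.0f, 0.15f);
    std::printf("accumulated in-scattering: %.3f\n", s);
}
```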
 

dramatis

Member
I'm not arguing against the tool, which for now seems to be doing something already known in the industry, just with prettier assets, thanks in part to specialized hardware that will hardly be found in a home console. What bothers me is how much they stressed "realtime" when, in the end, it's something in between.

Luckily, the chances of a second FFVII HD tech demo fiasco seem lower, but they're on the overhyping train as usual.
The key point here is that the Zelda demo is a gameplay demo. Its primary function is to demonstrate the game and the use of the WiiU gamepad as opposed to showing off graphics.

Agni's Philosophy is a tech demo. It showcases the real-time graphical capabilities of Luminous Engine. The demonstration following the demo shows the ease of using the engine editors.

While it is running on a high-end PC, it is real-time and not "staged", which is the word you used to begin with. What Unreal4 is showing is obviously not on a console either, but since the next-gen consoles are unavailable, they can't show you what things will look like. But you're not going to turn around and call every tech demo staged (or are you?).

Ironically, one of the benefits reaped from Crystal Tools for this project is SE's experience with developing Maya/XSI/Photoshop plugins, so it's not surprising to have real-time editing from Maya to Luminous. CryEngine touts a "Live Sync" with Maya too, it's just not part of the free SDK.
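As a rough illustration of the DCC-side half of such a "live sync" (a generic sketch with a hypothetical callback interface, not Maya's or CryEngine's actual plugin API): the plugin listens for attribute changes in the authoring tool and forwards each one to the running engine as a small message.

```cpp
// Sketch: a DCC plugin turning every attribute tweak into a message for the
// running engine. DccHost and EngineLink are hypothetical stand-ins; a real
// plugin would hook the host's change-notification API and a real IPC channel.
#include <functional>
#include <iostream>
#include <sstream>
#include <string>

struct DccHost {                  // stand-in for the host application's hook
    std::function<void(const std::string& node,
                       const std::string& attr,
                       float value)> onAttributeChanged;

    void simulateUserEdit(const std::string& node, const std::string& attr, float v) {
        if (onAttributeChanged) onAttributeChanged(node, attr, v);
    }
};

struct EngineLink {               // stand-in for a TCP/named-pipe connection
    void send(const std::string& message) {
        std::cout << "-> engine: " << message << "\n";
    }
};

int main() {
    DccHost host;
    EngineLink engine;

    // Register the hook: every attribute tweak becomes a tiny "set" message.
    host.onAttributeChanged = [&](const std::string& node,
                                  const std::string& attr, float value) {
        std::ostringstream msg;
        msg << "set " << node << "." << attr << " = " << value;
        engine.send(msg.str());
    };

    // The artist drags a couple of sliders in the authoring tool...
    host.simulateUserEdit("KeyLight", "intensity", 2.5f);
    host.simulateUserEdit("Agni_Hair", "specular_intensity", 0.8f);
}
```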
 
I refuse, REFUSE to get hyped.

I remember, back when this gen was announced, that the FFXIII videos shown were what convinced me to get a PS3. Imagine my profound sadness when I played that game. :(

It does look very pretty, but I'm watching with much, much more caution.
 

legacyzero

Banned
Finally got to watch this.

GAT DAMN!! I sincerely hope that next gen Final Fantasy is based on this concept, only without the dudes running around in favelas with AK's....

Less Somali pirates, more dragons and magic. The first part was awesome, then when the pirates burst in I was like meh.

THIS.
 

Anuxinamoon

Shaper Divine
Ahahah My husband and I watched this trailer yesterday and he was like "Holy... okay yes I can see why they didn't even want to know about me when I applied for a job there! HAHAH!"

Just amazing. It's on another level, and it has a strong female lead. I love how they approached this, and the real-time cloth and hair, and omg the lighting. So good.

One thing I did mention in the survey was that I hope they can pull off this kind of quality throughout a sustained development. It's all well and good to have amazing in-game graphics, but you need to be able to deliver them on a reasonable production schedule without burning out 90% of your team. That's the key, and that key is found in artist tool support and smart asset creation.
Work smarter, not harder.

Anyway, wow, totally amazing! I wish them all the best and can't wait to see what else they can achieve!
 
Don't know how I missed this...

Yeah, it looks pretty impressive, and it also looks imperfect in all the right places to make it clear it's actually a real-time thing and not CG.


If this is what next-gen Final Fantasy is, then sure, I'll give it a look.

FFXIII is still one of the best-looking games this gen despite all its other flaws, and it sucks they couldn't have used that engine more this generation.
 

Cartman86

Banned
I'll just say that this E3 is the first in maybe 5 years that I've bothered to actually download a game trailer to my PC because I wanted to see the visuals in perfect clarity. Between this, Star Wars, and Watch Dogs.
 
The key point here is that the Zelda demo is a gameplay demo. Its primary function is to demonstrate the game and the use of the WiiU gamepad as opposed to showing off graphics.

I don't mean to act like a, erhm, you know, but could you please link me to that gameplay demo?
Because the one I linked shows nothing about gameplay; it shows real-time controls over lighting and the camera while a cutscene plays, and that does show off the graphics of an engine in real time.
A tech demo.

Agni's Philosophy is a tech demo. It showcases the real-time graphical capabilities of Luminous Engine. The demonstration following the demo shows the ease of using the engine editors.

Real-time graphical capabilities changed in the Maya workspace and then injected into the game engine: that's different.
The only shot where I see Luminous Engine assets looks like something out of an After Effects/Premiere workspace: http://agnisphilosophy.com/files/img/philosophy/screenshots/original_l.jpg
It does show something like real-time camera controls, but there's JPEG artifact noise only on the game window, which makes me question how genuine this capture is; or maybe it's a film-like noise effect à la Siren. The EXIF data doesn't help.

Still, the fact that the window is named CutSceneEditor MainWindow suggests there are other parts of the engine that handle other things.

Please understand that I'm just trying to learn something about this; it's not a witch hunt.

While it is running on a high-end PC, it is real-time and not "staged", which is the word you used to begin with. What Unreal4 is showing is obviously not on a console either, but since the next-gen consoles are unavailable, they can't show you what things will look like. But you're not going to turn around and call every tech demo staged (or are you?).

Playing the "implying" game isn't my thing, gonna pass on that, sorry.
Epic has always been clear about the specs used in their demos, and most of the time they're interactive too; the Gears 2 meat cube is a case in point (oh, but that's gameplay).
Square Enix demos are different.

Ironically, one of the benefits reaped from Crystal Tools for this project is SE's experience with developing Maya/XSI/Photoshop plugins, so it's not surprising to have real-time editing from Maya to Luminous. CryEngine touts a "Live Sync" with Maya too, it's just not part of the free SDK.

Interesting, but Crytek demos can't compare: the past ones from GDC even ran on different platforms, showing off how differently the hardware performs, and that footage is without any doubt real time.
 

Kagari

Crystal Bearer
I went to an hour closed door demo of this. Some really interesting things came out of it. Interview and impressions incoming later.
 

Kagari

Crystal Bearer
Many staff were on hand including the character designer and Takeshi Nozue who is also working on FF Versus XIII.
 

Quazar

Member
Hashimoto couldn't specify when Luminous-powered games will hit the market, but he smiled and said, "It won't be that far from now."

While this is the next-generation of Square Enix's game engines, Hashimoto said the engine can also be adapted for current-generation consoles, but stopped short of confirming a current generation Luminous Studio game.

Hmmm
 

XiaNaphryz

LATIN, MATRIPEDICABUS, DO YOU SPEAK IT
Interesting, but Crytek demos can't compare: the past ones from GDC even ran on different platforms, showing off how differently the hardware performs, and that footage is without any doubt real time.

Still not quite sure why you're hung up on this. You can still have an engine be real-time and reacting to external input like an authoring tool. It's not spitting out images and playing them back as a movie file or anything. I'm sure if this demo was running on lesser hardware, you'd also see performance get impacted.

Again, the whole point of integrating real-time game engine output with your authoring tools is so you can see how things are rendered in the final output without having to boot up the engine separately. It would defeat the purpose if what it's showing isn't representative of what the engine would produce, or if it couldn't display it at interactive framerates (which is the worst case, e.g. lighting render preview tools for CG production where renders would take hours; ideally, for in-game runtime previewing you obviously want full framerate).
 

Aroo

Neo Member
Am I the only one who doesn't care? I played about 30 minutes of Final Fantasy on the PS3. The characters looked way better, but they still had that annoying "jerk neck left-right" method of showing emotion.

This looks nice, but I don't care if it puts me to sleep just like the first one on the PS3.
 

raven777

Member
Yes it can be scaled. Part of it is used for Versus on PS3 already.

Yeah, this was mentioned in Nomura's interview a while ago. Looking at Agni's Philosophy, I feel a little optimistic about bringing the Luminous Engine lighting to Versus. The Versus footage we got last year didn't seem to have great lighting, but it was just one piece of footage and the quality wasn't great.
 

squidyj

Member
Uhm, if this was meant to answer my question, sorry, I don't get it :p



I get what you mean, but when you have the SE guy talking about real-time changes/interactions while the camera work, lighting, and asset switching are shown through Maya menus/tabs, isn't that weird?
Shouldn't all this stuff happen during playback of the executable (not the one running from source) and maybe have a UI of its own?

WiiU had a techdemo of Zelda that played like this on the console.
http://www.youtube.com/watch?v=arHNcSMXaBk

...Maybe replicating the Maya UI is the best way to make the content-creation experience comfortable for content creators?
 

Anuxinamoon

Shaper Divine
Still not quite sure why you're hung up on this. You can still have an engine be real-time and reacting to external input like an authoring tool. It's not spitting out images and playing them back as a movie file or anything. I'm sure if this demo was running on lesser hardware, you'd also see performance get impacted.

Again, the whole point of integrating real-time game engine output with your authoring tools is so you can see how things are rendered in the final output without having to boot up the engine separately. It would defeat the purpose if what it's showing isn't representative of what the engine would produce, or if it couldn't display it at interactive framerates (which is the worst case, e.g. lighting render preview tools for CG production where renders would take hours; ideally, for in-game runtime previewing you obviously want full framerate).

This, seriously. For the longest time I have wanted my 3D app to be so integrated into the game engine that it was the game engine (I believe there was a SIGGRAPH talk on this back in '06).
Having to switch between apps is a time waster, and the more blurred the lines between the apps get, the happier I will be.
I have high hopes for their tools, don't let me down, Square! Show me that you embrace a less manual, more organic approach to game dev.
 
Still not quite sure why you're hung up on this. You can still have an engine be real-time and reacting to external input like an authoring tool. It's not spitting out images and playing them back as a movie file or anything. I'm sure if this demo was running on lesser hardware, you'd also see performance get impacted.

Nothing to argue with here, but we haven't seen that engine at work, just the output.

Again, the whole point of integrating real-time game engine output with your authoring tools is so you can see how things are rendered in the final output without having to boot up the engine separately. It would defeat the purpose if what it's showing isn't representative of what the engine would produce.

Also nothing against this; it's the fact that the engine shown changing assets in real time is Maya, while they say it's theirs, that baffles me.

...Maybe replicating the Maya UI is the best way to make the content-creation experience comfortable for content creators?

Well, that would be fun. Is it Maya, or a Maya lookalike?
The icons are the same as in my 2011 copy.
 

Durante

Member
Nothing to argue with here, but we haven't seen that engine at work, just the output.



Also nothing against this; it's the fact that the engine shown changing assets in real time is Maya, while they say it's theirs, that baffles me.
I think you still don't get it. They replace the Maya preview renderer with their own engine. So what you see is the real-time output of the engine, in Maya. It's not so hard to understand really. Also, how have we "not seen that engine at work" when we have seen its output? The output is what the engine does.
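Put another way, the plugin swaps out the viewport renderer, so the viewport becomes a window onto the engine. A conceptual C++ sketch (the ViewportRenderer interface below is invented for illustration, not Maya's actual viewport-override API):

```cpp
// Conceptual sketch of replacing a DCC's built-in preview renderer with the
// game engine's renderer. The interface below is hypothetical.
#include <iostream>
#include <memory>

class ViewportRenderer {                       // what the host expects to call
public:
    virtual ~ViewportRenderer() = default;
    virtual void renderViewport(int width, int height) = 0;
};

class DefaultPreviewRenderer : public ViewportRenderer {
public:
    void renderViewport(int w, int h) override {
        std::cout << "built-in preview render " << w << "x" << h << "\n";
    }
};

class GameEngineViewportRenderer : public ViewportRenderer {
public:
    void renderViewport(int w, int h) override {
        // A real implementation would sync scene data, render with the engine,
        // and copy the engine's framebuffer into the viewport.
        std::cout << "engine render " << w << "x" << h
                  << " (same code path as an in-game frame)\n";
    }
};

int main() {
    std::unique_ptr<ViewportRenderer> viewport =
        std::make_unique<DefaultPreviewRenderer>();
    viewport->renderViewport(1280, 720);

    // Loading the plugin swaps the renderer: from then on, what you see in the
    // viewport is literally the engine's real-time output.
    viewport = std::make_unique<GameEngineViewportRenderer>();
    viewport->renderViewport(1280, 720);
}
```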
 