
Game Graphics Technology | 64bit, procedural, high-fidelity debating

dr_rus

Member
Damn. I've been hoping to see some devs give their real-world results on how useful the ID buffer really is compared to "software" solutions.
The ID buffer is a software solution too - it's just another value being written to memory when rendering, to identify a primitive. You still need to do things with it to actually use it anywhere.

This is really, really clever

I'm shocked Horizon is using FXAA as well. Never seen such good results from it before. I guess that tangram pattern works well with it.
They are using FXAA + TAA (+ soft resolve + sharpening on Pro). It's not just FXAA.
 
hmm, didn't Half-Life 2 already do the screen stuff? this somehow does not seem very impressive to me... could be just me, though.

Far from an expert on this, but I guess they don't need to render scenes into the world to render something to a texture. Half-Life 2 did have to render what was seen on screens into its world first.
Rendering something to a texture is not new in itself; the way it is done makes the difference (I think).
 

zooL

Member
took a course in computer graphics, and rendering to a frame buffer texture and then putting that texture on geometry is simple stuff... but I guess I just don't understand what the new idea here is, haha! no offense...
 

KKRT00

Member
Yeah, none of this is new, though that implementation seems to be one of the better ones.

Yep, though I don't know that the holographic projection tech itself is new.
It's definitely the best implementation of a second viewport I've seen in any game, and it seems super customizable, with the ability to relight elements in real time.

----
im missing something i think

Really? Can you name any other game that does something like that to the degree they are showing it here?
The applications of this tech, in its full scope, are immense for any game like that.
 

Snefer

Member
Yep, though I don't know that the holographic projection tech itself is new.
It's definitely the best implementation of a second viewport I've seen in any game, and it seems super customizable, with the ability to relight elements in real time.

----


Really? Can you name any other game that does something like that to the degree they are showing it here?
The applications of this tech, in its full scope, are immense for any game like that.

Render to texture like they show there has been easy to achieve for a long time, even in Unreal 3. Now, there might be lots of optimisations in how it fits into the pipeline, culling etc. They talk about some level-of-detail optimisations based on the size of the render texture, but other than that I can't tell what is very new about it, tbh.
 
Render to texture like they show there has been easy to achieve for a long time, even in Unreal 3. Now, there might be lots of optimisations in how it fits into the pipeline, culling etc. They talk about some level-of-detail optimisations based on the size of the render texture, but other than that I can't tell what is very new about it, tbh.

It is definitely cool how it scales the RTT's quality based upon so many factors, but I guess the wholly unique thing is mainly how the holographic projections use it?

Most games for those kinda things tend to just use normal geometry with a holo-rim shader on it, don't they (or in DOOM 2016's case, with a dither + a holo rim shader)? Instead of it being a volume being projected into world space via RTT like it is here.
 

Snefer

Member
It is definitely cool how it scales the RTT's quality based upon so many factors, but I guess the wholly unique thing is mainly how the holographic projections use it?

Most games for those kinda things tend to just use normal geometry with a holo-rim shader on it, don't they (or in DOOM 2016's case, with a dither + a holo rim shader)? Instead of it being a volume being projected into world space via RTT like it is here.

Yeah, but what does that MEAN? I assume they just grab a secondary scene and then render that in place, basically, within a predefined volume. So the only thing that would be new in that case is how it fetches the geo, because what they are doing in their demos is easy peasy to do in other engines too. Fetching another scene isn't all that tricky.
 

Snake29

RSI Employee of the Year
It's different because everything in the background is in realtime. So when they add this to the PU and some player walks behind the character, you will see it in realtime at your holodesk.

This will also mean that they can add body cams to some exploration suits, so that the crew at the bridge can have a realtime video feed to follow the expedition party. Same as we've seen in the movie Prometheus.
 

Sylfurd

Member
So the GG presentation on Anti-Aliasing, Checkerboarding, spherical area lights and height fog was posted.

The AA and Checkerboarding sections are rather awesome. They make note of how using full-resolution hints (like the triangle index buffer, depth, or alpha-tested stuff) was too expensive for them to get 2160c running at 30fps in Horizon (probably bandwidth bound), so they opted for doing everything at checkerboarded resolution... albeit rotated and rearranged so that FXAA and TAA make it look the way it does in the end, whilst being efficient.

Quite fascinating.

It also explains the detail pattern you see in Horizon screenshots at 4K, where it does not resolve anything smaller than 2x2 pixel pairs. Kinda awesome.
[image: checkerboard pixel pattern]
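For intuition, the spatial half of a checkerboard resolve can be sketched like this: only half the pixels are shaded each frame, and the gaps are filled from neighbours. A real implementation like Guerrilla's also reprojects the previous frame with motion vectors and layers the FXAA/TAA resolve on top; this toy version is spatial-only:

```python
import numpy as np

def checkerboard_fill(rendered: np.ndarray, parity: int) -> np.ndarray:
    """Spatial-only fill: pixels with (y + x) % 2 == parity were rendered;
    the other half is averaged from the four cardinal neighbours, which all
    lie on the rendered half of the checkerboard."""
    h, w = rendered.shape
    out = rendered.astype(float).copy()
    ys, xs = np.indices((h, w))
    missing = (ys + xs) % 2 != parity
    # Reflect-pad so border pixels also get four rendered-parity neighbours.
    p = np.pad(out, 1, mode='reflect')
    neighbours = (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]) / 4.0
    out[missing] = neighbours[missing]
    return out
```

Flipping `parity` every frame and blending the reprojected previous frame into the missing half is what recovers (most of) the full-resolution signal.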
What is the point of the tangram transform?
 

Snefer

Member
It's different because everything in the background is in realtime. So when they add this to the PU and some player walks behind the character, you will see it in realtime at your holodesk.

This will also mean that they can add body cams to some exploration suits, so that the crew at the bridge have a realtime video feed to follow the expedition party. Same as we've seen in the movie Prometheus.

Yeah, that's what I mean. None of that is new at all from a technical perspective.
 

pottuvoi

Banned
Yeah, that's what I mean. None of that is new at all from a technical perspective.
Yes, we have seen similar tech many times before.
One of the more interesting things in their implementation is the automatic render-to-texture size evaluation, and the support for multiple cameras with displays in each other's view.
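One plausible way such automatic sizing could work (pure speculation on my part, not CIG's actual heuristic) is to match the texture resolution to the pixels the display actually covers on screen:

```python
import math

def rtt_resolution(display_size_m, distance_m, vfov_deg, screen_h_px, max_res=1024):
    """Pick a render-to-texture resolution that roughly matches the on-screen
    pixel coverage of an in-world display (hypothetical heuristic)."""
    # Pixels per metre at this distance under a standard perspective projection.
    px_per_m = screen_h_px / (2.0 * distance_m * math.tan(math.radians(vfov_deg) / 2.0))
    covered_px = display_size_m * px_per_m                 # rough on-screen size
    res = 2 ** math.ceil(math.log2(max(covered_px, 1.0)))  # next power of two
    return int(min(res, max_res))

# A 1m display 2m away at 60° vertical FoV on a 1080p screen:
print(rtt_resolution(1.0, 2.0, 60.0, 1080))  # 512
```

Walk closer and the texture steps up; walk away and it steps down, so off-screen render cost tracks what you can actually see.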
 
Can someone explain what the advantage of not "cheating" at holograms is, besides just being cool tech-wise? It gets you the exact same result, no?

Cheating works in fixed scenarios; a method like this allows many gameplay elements to be expressed through this tech. Not only NPCs and scripted moments, but players can also push out real-time content. One example may just be the Reliant Mako news van.
 

bonej

Member
This is really fascinating.

Neural Network modelled ambient occlusion - it approaches ground truth in a pretty cool way.
http://theorangeduck.com/media/uploads/other_stuff/nnao.pdf
[image: NNAO vs. ground-truth AO comparison]
Similar idea to Deep Shading: http://deep-shading-datasets.mpi-inf.mpg.de/
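The core idea: instead of a hand-tuned heuristic combining depth/normal samples, a small trained network is evaluated per pixel. A toy sketch of that kind of inference (random weights stand in for trained ones here; the real network's size and inputs differ):

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy, untrained weights; an actual NNAO-style network ships trained ones.
W1, b1 = 0.1 * rng.normal(size=(16, 8)), np.zeros(8)
W2, b2 = 0.1 * rng.normal(size=(8, 1)), np.zeros(1)

def nn_ao(features: np.ndarray) -> float:
    """features: flattened depth/normal samples from one pixel's neighbourhood."""
    h = np.maximum(features @ W1 + b1, 0.0)  # ReLU hidden layer
    z = (h @ W2 + b2)[0]
    return float(1.0 / (1.0 + np.exp(-z)))   # occlusion factor in [0, 1]
```

Per pixel, this is just a few small matrix multiplies, which is exactly the workload a dedicated inference ASIC would eat for breakfast.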

I have been asking myself when we'll get a dedicated ASIC for NN inferencing in mainstream consumer cards. NNs have lots of potential in video compression, video enhancement (such as super resolution or motion interpolation), post-process effects, etc.
 

bonej

Member
With so many CB solutions out there (Ubisoft, Guerrilla, EA, even UE4 has one), I wonder if the "best" one is going to crystallize, and if they will receive names like other rendering techniques have. It would also be interesting if you were able to choose the upscaling/reconstruction algo, like you can with the several AA solutions, for example.

But we are still at the beginning, and CB seems to be much more complex than AA solutions, with more possible tweaks, so in this early period the fragmentation leads to faster innovation.
 

nOoblet16

Member
So I've been thinking about something. Destiny 2 on console suffers from an issue of low FoV. Now, I've always been curious about low FoV on consoles, because why couldn't it have been "just a bit more"? But at the end of the day, it perhaps comes down to performance issues in a few parts, with the developers settling on an FoV that's the lowest common denominator for the entire game.

However, Destiny is a special case, because it has a third-person mode, with an extremely wide FoV akin to the PC version's FoV turned up to 100, that players can switch to anytime they wish, as many times as they want, and it has the exact same performance as first-person mode. (I can't remember if increasing the FoV on PC also affects the third-person FoV; it probably doesn't.)

So it stands to reason that the game is indeed rendering much more than we assume, and that the culling frustum is much larger than what the first-person FoV suggests. In which case... why do you guys think console versions don't offer an FoV slider?
 

illamap

Member
So I've been thinking about something. Destiny 2 on console suffers from an issue of low FoV. Now, I've always been curious about low FoV on consoles, because why couldn't it have been "just a bit more"? But at the end of the day, it perhaps comes down to performance issues in a few parts, with the developers settling on an FoV that's the lowest common denominator for the entire game.

However, Destiny is a special case, because it has a third-person mode, with an extremely wide FoV akin to the PC version's FoV turned up to 100, that players can switch to anytime they wish, as many times as they want, and it has the exact same performance as first-person mode. (I can't remember if increasing the FoV on PC also affects the third-person FoV; it probably doesn't.)

So it stands to reason that the game is indeed rendering much more than we assume, and that the culling frustum is much larger than what the first-person FoV suggests. In which case... why do you guys think console versions don't offer an FoV slider?

Maybe it is just a design decision where they assume most players play on an average-size TV fairly far away, and don't account for those with big TVs or those playing on a monitor. Performance-wise, Destiny 2 seems to have plenty of headroom, at least on PS4.

I can't say with full confidence how a higher FoV affects framerate, but shouldn't it push LODs back, thus helping with performance? And shouldn't developers account for the worst-case scenario anyway, when viewing the game area from far away? Maybe a higher FoV would have caused objects to pop in at the farthest LODs. Anyway, just my graphics-noob thoughts.
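The LOD intuition checks out geometrically: screen-size-based LOD selection keys off an object's projected size, and for a fixed screen height, widening the FoV shrinks every projection. A quick sketch (standard perspective projection, nothing Destiny-specific):

```python
import math

def projected_height_px(object_h_m, distance_m, vfov_deg, screen_h_px=1080):
    """On-screen height of an object under a standard perspective projection."""
    return screen_h_px * object_h_m / (
        2.0 * distance_m * math.tan(math.radians(vfov_deg) / 2.0))

# Same 2m object at 10m, two vertical FoVs: wider FoV -> smaller on screen,
# so a size-based LOD picker switches to coarser LODs sooner.
narrow = projected_height_px(2.0, 10.0, 55.0)
wide = projected_height_px(2.0, 10.0, 70.0)
print(narrow, wide)  # wide < narrow
```

So a wider FoV can trade a bigger culling workload against cheaper per-object detail, which fits the "third-person mode costs the same" observation.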
 
Hi Everyone,
I really have enjoyed setting up this thread, contributing, learning, and reading everyone else's contributions here. GAF is just full of people who care and know a heck of a lot more than I do. I am grateful. Like many of the niche communities on GAF, we were interested in something specific, hungry for knowledge about it, and willing to engage in friendly and caring discussion concerning it. Always helping each other out, and hopefully linking and creating content that made others more knowledgeable or interested at the same time.

That time has come to an end for me here on NeoGAF, and I will be going elsewhere.

I hope that anyone who stumbles across this thread can realise what a great thing we had going here, and then they will realise that the same thing will be found elsewhere.

So forgive me my ranting, my specificity, and most especially, my biases and terrible typing.

You can find me on Twitter until the next portal opens up. We will rise again to nerd out about the resolution of the third shadow cascade in your favourite game.
Best,
Dictator
 

Domaje

Member
Hi Everyone,
I really have enjoyed setting up this thread, contributing, learning, and reading everyone else's contributions here. GAF is just full of people who care and know a heck of a lot more than I do. I am grateful. Like many of the niche communities on GAF, we were interested in something specific, hungry for knowledge about it, and willing to engage in friendly and caring discussion concerning it. Always helping each other out, and hopefully linking and creating content that made others more knowledgeable or interested at the same time.

That time has come to an end for me here on NeoGAF, and I will be going elsewhere.

I hope that anyone who stumbles across this thread can realise what a great thing we had going here, and then they will realise that the same thing will be found elsewhere.

So forgive me my ranting, my specificity, and most especially, my biases and terrible typing.

You can find me on Twitter until the next portal opens up. We will rise again to nerd out about the resolution of the third shadow cascade in your favourite game.
Best,
Dictator
It's spelled ResetERA, not RsetERA.
 