
Unreal Engine 4 Kite Open World Cinematic

E-Cat

Member
The current consoles, at the time of launch, were around 4-5 years behind bleeding edge single-GPU performance.

Extrapolating from this, if the next batch of consoles is to be launched around 2020, it is not at all unreasonable to expect them to match GPUs from 2015 or 2016. That, plus new arch bells and whistles from the late teens.

I'm really hoping for a 2016 level of performance, because that's when we'll _finally_ get a node shrink from 28nm --> 14nm and exceed the 10 TFLOPS barrier. By that time, it will have been four years--instead of the usual two--without a new lithography process, with all the perf gains since early 2012 coming not from Moore's Law but from architectural improvements and bigger die sizes alone. Madness! And totally unprecedented.
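For anyone wondering where these TFLOPS figures come from: peak single-precision throughput is just shader cores x clock x 2, since a fused multiply-add counts as two ops. A minimal C++ sketch; the PS4 numbers are the real specs, the 14nm part is a made-up example:

```cpp
#include <cstdio>

// Peak single-precision FLOPS = shader cores x clock x 2,
// since a fused multiply-add counts as two floating-point ops.
double peak_tflops(int shaderCores, double clockGhz) {
    return shaderCores * clockGhz * 2.0 / 1000.0; // GFLOPS -> TFLOPS
}

int main() {
    // PS4 GPU: 18 CUs x 64 lanes = 1152 shaders at 800 MHz -> ~1.84 TFLOPS.
    std::printf("PS4:            %.2f TFLOPS\n", peak_tflops(1152, 0.8));
    // Hypothetical 14nm-era part: 3584 shaders at 1.4 GHz -> ~10 TFLOPS.
    std::printf("14nm (made up): %.2f TFLOPS\n", peak_tflops(3584, 1.4));
}
```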
 

KidJr

Member
They ARE individually modelled and instanced around the scene so that comparison is a shitty one.

I was just about to repost to ask that question; no textures are THAT good at giving flat surfaces depth.

I don't mean to turn this into a comparison thread at all, but perhaps a better comparison would be Uncharted's rocks (they're modelled individually, I think?). What I don't understand is why, when applying textures, there is such a difference in quality (these rocks look far superior to Uncharted's, imo). Is it an issue of bandwidth (allowing better-quality textures) or the lighting effects that makes Uncharted's geometry look somewhat plastic in comparison to this?

I mean, this is the only demo I've seen that comes close to the infamous Sorcerer demo (which I know many people are dubious is even possible on the PS4).
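On the instancing point: the trick is that the detailed rock mesh is stored once and drawn many times under different transforms, so geometry this dense stays affordable. A toy C++ sketch of the idea (invented types, not the actual UE4 API):

```cpp
#include <cstdio>
#include <vector>

// The heavy geometry is stored once; each instance only adds a transform.
// Types here are invented for illustration, not the UE4 API.
struct Mesh      { int vertexCount; /* vertex data lives here, once */ };
struct Transform { float x, y, z, scale, yawDeg; };

struct InstancedMesh {
    Mesh mesh;                        // shared by every instance
    std::vector<Transform> instances; // cheap per-instance data
};

int main() {
    InstancedMesh rocks{ Mesh{50000}, {} };  // one detailed, photo-scanned rock
    for (int i = 0; i < 1000; ++i)           // scattered 1000 times
        rocks.instances.push_back({i * 2.0f, 0.0f, i * 3.0f,
                                   1.0f + (i % 5) * 0.1f, float(i * 37 % 360)});
    std::printf("vertices stored: %d, rocks drawn: %zu\n",
                rocks.mesh.vertexCount, rocks.instances.size());
}
```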
 

KKRT00

Member
I think that to emphasize the realtime nature of the demo, they could have given users slight control of the camera within pre-determined limits: just let it wobble or zoom a tiny bit (so as not to worry about dynamic culling). If they are not doing this, they should! (Think MGS4 realtime cutscenes.)

It looks spectacular btw, kudos to the team for this achievement!

They give you full control over the camera at their booth, and they will release this demo and the full world to the public in several weeks.
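Something like the constrained camera the quoted post describes is simple enough to sketch: user input gets clamped to pre-set limits, so the demo never shows an angle it wasn't authored for. Illustrative C++ with made-up limit values:

```cpp
#include <algorithm>
#include <cstdio>

// User input nudges the camera, but every axis is clamped to pre-set
// limits, so the renderer never shows an angle the demo wasn't built for.
// The limit values are made up.
struct ConstrainedCamera {
    float yaw = 0, pitch = 0, zoom = 1;
    void nudge(float dYaw, float dPitch, float dZoom) {
        yaw   = std::clamp(yaw + dYaw,     -10.0f, 10.0f); // degrees
        pitch = std::clamp(pitch + dPitch,  -5.0f,  5.0f);
        zoom  = std::clamp(zoom + dZoom,    0.9f,   1.1f); // +-10%
    }
};

int main() {
    ConstrainedCamera cam;
    cam.nudge(25.0f, -2.0f, 0.3f); // user tries to swing far right and zoom in
    std::printf("yaw %.1f  pitch %.1f  zoom %.2f\n", cam.yaw, cam.pitch, cam.zoom);
}
```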
 
The big difference for me is that, as good as the rocks look in Ryse and Crysis, in this demo the rocks look like they've been individually modelled as opposed to textures/shaders on a flat surface. It's little things like that that make a huge difference for me.

In the Twitch video they show that they did exactly that. They photo-scanned the rocks and de-lighted them so they could apply their own lighting to them, etc.

You could also see this in a different light: it isn't necessarily the engine and the graphics card that make all the difference, it's the work behind the assets. You could make those other examples from Ryse & co. look drastically better if you were willing to spend the work hours and the money on that.
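For the curious, de-lighting at its crudest is just dividing the captured colour by an estimate of the lighting baked into the scan, leaving an albedo the engine can re-light however it wants. Epic's actual pipeline is obviously far more sophisticated; this C++ sketch only shows the core idea, with illustrative values:

```cpp
#include <algorithm>
#include <cstdio>

// A scanned texel is roughly (albedo x captured lighting); dividing out a
// lighting estimate recovers an albedo the engine can re-light.
float delight(float captured, float lightEstimate) {
    float albedo = captured / std::max(lightEstimate, 1e-4f); // avoid div by 0
    return std::clamp(albedo, 0.0f, 1.0f);
}

int main() {
    // A texel photographed in shadow reads dark even if the rock is light grey.
    std::printf("recovered albedo: %.2f\n", delight(0.18f, 0.3f)); // -> 0.60
}
```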
 

orioto

Good Art™
[Kite5.gif]

The Last Guardian vibes from that place and light...
 

KidJr

Member
In the Twitch video they show that they did exactly that. They photo-scanned the rocks and de-lighted them so they could apply their own lighting to them, etc.

You could also see this in a different light: it isn't necessarily the engine and the graphics card that make all the difference, it's the work behind the assets. You could make those other examples from Ryse & co. look drastically better if you were willing to spend the work hours and the money on that.

Which leads me to believe we're unlikely to see a game like this in real life, not because of horsepower but because it's just not commercially viable to dedicate that much time to stuff like texture-mapping rocks lol
 
The current consoles, at the time of launch, were around 4-5 years behind bleeding edge single-GPU performance.

Extrapolating from this, if the next batch of consoles is to be launched around 2020, it is not at all unreasonable to expect them to match GPUs from 2015 or 2016. That, plus new arch bells and whistles from the late teens.

I'm really hoping for a 2016 level of performance, because that's when we'll _finally_ get a node shrink from 28nm --> 14nm and exceed the 10 TFLOPS barrier. By that time, it will have been four years--instead of the usual two--without a new lithography process, with all the perf gains since early 2012 coming not from Moore's Law but from architectural improvements and bigger die sizes alone. Madness! And totally unprecedented.

4-5 years behind? What?

Are you really comparing the PS4 to a GTX 280 or an HD 4870?
 

E-Cat

Member
4-5 years behind? What?

Are you really comparing the PS4 to a GTX 280 or an HD 4870?

Don't forget that the Xbox One also counts as a current-gen console. :p So, no, I'm not comparing the PS4 to an HD 4870. Computationally, it's somewhere around the 2009 high-end level, but with a more modern architecture--which, as I was alluding, is analogous to how it will be with the ninth generation of consoles.
 

Durante

Member
How is there a difference between AMD and Nvidia TFLOPs?
Are you talking about drivers?
It's not just drivers, it's the entire hardware stack and its average efficiency for game-like rendering tasks. That's traditionally a bit higher per-FLOP on the NV side right now.

Of course, this doesn't change the actual FLOPs, and such "adjustments" are inherently flawed. But perhaps not more so than using FLOPs (or any other single number!) as a metric for rendering performance in the first place.
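To make the flaw concrete: any such adjustment boils down to multiplying paper FLOPs by a fudge factor for how busy a real workload keeps the hardware. A toy C++ sketch; the utilization numbers are invented purely for illustration:

```cpp
#include <cstdio>

// Any per-FLOP "adjustment" amounts to scaling paper FLOPs by how much of
// the machine a real rendering workload keeps busy. The utilization
// numbers below are invented purely for illustration.
double effective_tflops(double peakTflops, double utilization) {
    return peakTflops * utilization;
}

int main() {
    // Two hypothetical cards with identical paper specs can still diverge.
    std::printf("Card A: %.2f effective TFLOPS\n", effective_tflops(5.0, 0.85));
    std::printf("Card B: %.2f effective TFLOPS\n", effective_tflops(5.0, 0.75));
}
```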
 
Some of the close-ups didn't look too good, like the grass. I just wasn't blown away by this overall. Samaritan wows me more, even though I'm aware this demo probably has much better tech.
 

Deleted member 17706

Unconfirmed Member
What? There's tons of pop-in.

After watching it a few times, there are definitely a good number of LOD transitions, but I still contend that they're much less noticeable than in most games, and certainly less noticeable than how UE3 managed texture LOD.
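For context, the pop-in being debated is the moment a classic distance-based LOD scheme crosses a threshold and swaps meshes. A minimal C++ sketch with made-up distances:

```cpp
#include <cstdio>

// The mesh swaps to a coarser level as it recedes; the "pop" is the frame
// in which a threshold is crossed. The distances are made up.
int select_lod(float distance) {
    const float thresholds[] = {25.0f, 75.0f, 200.0f}; // metres
    int lod = 0;
    for (float t : thresholds)
        if (distance > t) ++lod; // farther away -> coarser mesh
    return lod;                  // 0 = full detail ... 3 = coarsest
}

int main() {
    const float samples[] = {10.0f, 50.0f, 150.0f, 500.0f};
    for (float d : samples)
        std::printf("distance %5.0f m -> LOD %d\n", d, select_lod(d));
}
```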
 

Zil33184

Member
It's not just drivers, it's the entire hardware stack and its average efficiency for game-like rendering tasks. That's traditionally a bit higher per-FLOP on the NV side right now.

Of course, this doesn't change the actual FLOPs, and such "adjustments" are inherently flawed. But perhaps not more so than using FLOPs (or any other single number!) as a metric for rendering performance in the first place.
I wonder how it would be possible to benchmark this outside of software conditions. Unless we're comparing a particularly optimized example on GCN vs. an equivalent on Maxwell, there might just be implementation-level differences that favour one over the other.

I do agree that using FLOPS counts is a faulty way of comparing two architectures, especially if the comparison is between GPUs 4 to 5 years apart. ;)
 

Vuze

Member
Wonder what the download size will be. The handful of assets from the demo they released a while ago already clocked in at about 7 GB...
 

Xiraiya

Member
I mean, it looks incredibly impressive but nothing mind-blowing at a distance. HOWEVER, it manages to hide a lot of its LOD in ways that make it barely noticeable while also having crazy amounts of detail close up.

I'm thoroughly impressed.
 