
Rise of the Tomb Raider Cutscenes are 1440x1080, not 900p

Mivey

Member
Did you use a calculator?
I always carry it with me.

 

Adam M

Member
I think it's a joke about The Order.
Finally someone got why I wrote that nonsense :D Funny that others started explaining the pixel count, haha.

I wrote that to point out that this is just nitpicking and playing with numbers... play the game; anyone actually playing won't notice that tiny loss of detail from couch distance.
Of course native resolution is best, but you've been overreacting to this resolution subject every time since The Order.
 

thelastword

Banned
1600x900 vs 1440x1080 is hardly significant; the game will still look blurry and it's still upscaled. The only reason I can see they opted for 1440x1080 is that "1080" is still part of the equation and therefore more palatable for PR purposes. It will be interesting to see if these cutscenes never drop frames from 30fps.
 

Pif

Banned
Between 900p and 1080pr, I still don't know which one GAF despises the most after all this time.

On one hand, 900p provides a neatly scalable 16:9 image, which looks right at home when upscaled to FullHD. On the other hand, 1080pr, despite having the marketing power of making people believe it is in fact FullHD 1080p, is a boxier image missing a couple bajillion pixels on each side.

Oh boy.

The console wars were never so entertaining. Thank you, Tomb Raider on Xbox One, you're a good chap. Now please come back a year later on PS4 and bring some more pixels along, maybe also a few more frames, or fewer, that's OK too. Either way, we're gonna be here waiting.
 

Mohasus

Member
1920x1080 = 2073600
1440x1080 = 1555200

The difference is 518,400. I am not a native speaker, so forgive my ignorance of words for quantities. "A few" is used to describe more than half a million, yes?

What a way to complicate things.

1440 is 3/4 of 1920; therefore 1920x1080 is 33% bigger, or 1440x1080 is 25% smaller.
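
If anyone wants to double-check those figures, here's the arithmetic as a few lines of Python (numbers straight from the post above):

```python
full_hd  = 1920 * 1080   # 2,073,600 pixels
cutscene = 1440 * 1080   # 1,555,200 pixels

print(full_hd - cutscene)      # 518400 pixels dropped
print(full_hd / cutscene - 1)  # ~0.3333 -> 1920x1080 is 33% bigger
print(1 - cutscene / full_hd)  # 0.25    -> 1440x1080 is 25% smaller
```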
 

Shpeshal Nick

aka Collingwood
Is there a reason why so much misinformation about this particular game keeps being spread in the media, to the point that it continually requires Brian's clarification?

Never seen this happen so often on any other game.

1600x900 vs 1440x1080 is hardly significant; the cutscenes will still look blurry and it's still upscaled. The only reason I can see they opted for 1440x1080 is that "1080" is still part of the equation and therefore more palatable for PR purposes. It will be interesting to see if these cutscenes never drop frames from 30fps.

Corrected
 

dark10x

Digital Foundry pixel pusher
Pre-rendered cutscenes made games like Max Payne 3 and Hitman Absolution like 30 gigs each, so kill that shit

Back on topic, this is fine by me.
Nearly every cut scene is real-time. It wouldn't save any space in this case.
 

Inuhanyou

Believes Dragon Quest is a franchise managed by Sony
Sounds good. Sounds like the cutscenes are pretty strenuous on the engine itself, so a drop makes sense.

I wonder what it'll be like on PS4 and PC.

And also, I wonder if the 360 has the same resolution drop in cutscenes.

*edit* I'm a dumbass, the 360 cutscenes are only pre-rendered versions of the XB1 game condensed down to 720p. I completely forgot, even though we found out just yesterday.
 

dark10x

Digital Foundry pixel pusher
Sounds good. Sounds like the cutscenes are pretty strenuous on the engine itself, so a drop makes sense.

I wonder what it'll be like on PS4 and PC.

And also, I wonder if the 360 has the same resolution drop in cutscenes.
I think it's clear that the PC version will run as well as the hardware can support.

I would not be surprised if the PS4 version delivered 1080p across cutscenes and gameplay *BUT* at 30fps rather than the unlocked frame-rate of Definitive Edition. Would be great to see them try for 60fps on PS4 but I would be surprised if they could pull it off.

I suspect the 360 version will run at sub-720p throughout the game. Maybe I'm wrong, though, haven't seen it yet.

*edit* I'm a dumbass, the 360 cutscenes are only pre-rendered versions of the XB1 game condensed down to 720p. I completely forgot, even though we found out just yesterday.
Was that revealed somewhere? Can't wait to get my hands on it to see what they've done. Pre-rendering the XO scenes makes sense, but I would have preferred to see them try their hand at real-time scenes on 360.
 

Inuhanyou

Believes Dragon Quest is a franchise managed by Sony
I think it's clear that the PC version will run as well as the hardware can support.

Well aware of that; I mean just in terms of performance in general, how optimized it will be with that kind of performance disparity between cutscenes and gameplay, etc. Please be done by Nixxes, so we don't have to worry about that.

I would not be surprised if the PS4 version delivered 1080p across cutscenes and gameplay *BUT* at 30fps rather than the unlocked frame-rate of Definitive Edition. Would be great to see them try for 60fps on PS4 but I would be surprised if they could pull it off.

You're a pretty smart tech guy; apparently all the volumetric lighting on XB1 is being done through async compute. Given PS4's huge compute advantage in this area, what say you about the performance benefits?

I would not mind if they had an unlocked fps for gameplay and maybe a locked 30 for cutscenes... again, please be done by Nixxes. I actually trust them with the polish of the game far more than the actual devs.

Was that revealed somewhere? Can't wait to get my hands on it to see what they've done. Pre-rendering the XO scenes makes sense, but I would have preferred to see them try their hand at real-time scenes on 360.

I'm pretty sure I heard that yesterday from a source playing the game... but you're making me doubt myself, let me find the source.
 

Kezen

Banned
Well aware of that; I mean just in terms of performance in general, how optimized it will be with that kind of performance disparity between cutscenes and gameplay, etc. Please be done by Nixxes, so we don't have to worry about that.



You're a pretty smart tech guy; apparently all the volumetric lighting on XB1 is being done through async compute. Given PS4's huge compute advantage in this area, what say you about the performance benefits?

I would not mind if they had an unlocked fps for gameplay and maybe a locked 30 for cutscenes... again, please be done by Nixxes. I actually trust them with the polish of the game far more than the actual devs.



I'm pretty sure I heard that yesterday from a source playing the game... but you're making me doubt myself, let me find the source.

The 360 SKU has Xbox One cutscenes (pre-rendered). The leaked 360 gameplay revealed that.
 
But the most important part of the game, you know, playing the actual game, is full 1080p. You don't play Tomb Raider to watch cutscenes.



It's a story-based adventure game; you actually do play to watch the cutscenes to see what the heck is going on story-wise...
 

Red Hood

Banned
I'm an absolute idiot when it comes down to the technical aspects, but how does this make sense? I mean, it implies it loses the 16:9 ratio which I'm sure is not the case, but what else is happening to the image quality exactly?
 
I'm an absolute idiot when it comes down to the technical aspects, but how does this make sense? I mean, it implies it loses the 16:9 ratio which I'm sure is not the case, but what else is happening to the image quality exactly?

Rendered at that resolution and then stretched to 1920x1080. The source image is in the wrong aspect ratio and gets stretched out to 16:9; this way it retains the full 1080 pixels of vertical resolution and thus has better image quality than, say, 1600x900. At 1440x1080 the final output quality is fairly close to full 1080p to untrained eyes.
 

AlStrong

Member
I'm an absolute idiot when it comes down to the technical aspects, but how does this make sense? I mean, it implies it loses the 16:9 ratio which I'm sure is not the case, but what else is happening to the image quality exactly?

Pixels don't need to be square (although it is kind of hip, I hear).

They just adjust the FOV independently. It's a similar deal with anamorphic widescreen movies.
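
To put rough numbers on that (just a sketch; the 60-degree vertical FOV is an example value I'm assuming, not anything from the game):

```python
import math

vfov = math.radians(60.0)     # example value, assumed for illustration

display_aspect = 1920 / 1080  # 16:9, what the viewer sees
buffer_aspect  = 1440 / 1080  # 4:3, the actual render target

# The projection uses the DISPLAY aspect, not the buffer aspect, so the
# scene lands in the 1440-wide frame horizontally compressed.
hfov = 2 * math.atan(math.tan(vfov / 2) * display_aspect)
print(f"horizontal FOV: {math.degrees(hfov):.1f} deg")  # ~91.5 deg

# Pixel aspect ratio the output scaler has to undo: pixels are 4/3 as wide.
print(f"pixel aspect ratio: {display_aspect / buffer_aspect:.3f}")  # 1.333
```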
 

SRTtoZ

Member
Alan Wake's cutscenes on PC look like ass because of this type of thing; it really took me out of the game.
 

Skux

Member
I'm an absolute idiot when it comes down to the technical aspects, but how does this make sense? I mean, it implies it loses the 16:9 ratio which I'm sure is not the case, but what else is happening to the image quality exactly?

You can also have a pixel aspect ratio. In this case the image is stretched horizontally so the pixels are wider, thereby filling a 16:9 screen space.
 

HTupolev

Member
Viva Pinata :D edit: erm... although I guess that was just the patch tess for the garden, no deformation.
I'm mostly referring to semi-tacky but expressive realtime surface deformation. Looking at Lara dash about in the snow there gives similar vibes to splishy-splashing in Halo 3.

I'm an absolute idiot when it comes down to the technical aspects, but how does this make sense? I mean, it implies it loses the 16:9 ratio which I'm sure is not the case, but what else is happening to the image quality exactly?
They render a squished image, then stretch it out.

Basically, the GPU might render an image that, if it were interpreted as a bitmap on a square pixel grid, looks like this:

[image: the frame as rendered, horizontally squished]


Then they scale it horizontally into the final aspect ratio, and the result looks correctly unsquished.

[image: the same frame stretched out to 16:9]
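
If you want to replicate that second step yourself, here's a minimal Pillow sketch; squished.png is a hypothetical saved frame, and the console's output scaler does roughly the same thing in hardware:

```python
from PIL import Image  # pip install Pillow

# Hypothetical input: a 1440x1080 frame rendered with a 16:9 projection,
# so its content is horizontally compressed by a factor of 1440/1920.
frame = Image.open("squished.png")

# Stretch horizontally only: 1440 -> 1920 wide; the 1080 rows are untouched.
unsquished = frame.resize((1920, 1080), Image.LANCZOS)
unsquished.save("unsquished.png")
```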
 

thelastword

Banned
Alan Wake's cutscenes on PC look like ass because of this type of thing; it really took me out of the game.
I've been playing some PC games lately and some of these low-res cutscenes look really bad. Every game should do its videos in-engine and at proper aspect ratios.
 

HoodWinked

Member
Lots of drive-by comments which I think are getting the wrong idea.

First of all, there are a lot of games where cutscenes have drops in framerate, for example https://www.youtube.com/watch?v=tkB8gpPzMkw

This is because cutscenes may use higher-quality assets and jacked-up LoD so things look good. Well, that causes some frame drops.

Tomb Raider was doing a GOOD THING by dropping down to 1600x900 so that the cutscenes run smoothly. The good news is that they've done some engineering work and are now able to push the resolution up to 1440x1080, which is a modest improvement in pixels.

The gameplay itself is supposedly 1920x1080.

For scaling images, it really depends.

1600x900 upscaled to 1920x1080 will degrade pretty much the whole image, but it's a consistent degradation. 1440x1080 upscaled to 1920x1080 only scales horizontally, so you get image distortion in some places and less in others.
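
Quick sketch to illustrate the two paths:

```python
# Per-axis upscale factors needed to reach a 1920x1080 output.
for w, h in [(1600, 900), (1440, 1080)]:
    print(f"{w}x{h}: horizontal x{1920 / w:.3f}, vertical x{1080 / h:.3f}")

# 1600x900:  horizontal x1.200, vertical x1.200 -> every pixel is resampled
# 1440x1080: horizontal x1.333, vertical x1.000 -> rows stay intact,
#            only columns are interpolated
```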
 

oneida

Cock Strain, Lifetime Warranty
What's the buzz on this game? I haven't really been following it. I liked the first one well enough to finish it; are people expecting this to be as good or better?
 

Harlequin

Member
What's the buzz on this game? I haven't really been following it. I liked the first one well enough to finish it; are people expecting this to be as good or better?

If you liked the first one, you should probably get this. If you didn't (that's me), you really shouldn't.
 
What's the buzz on this game? I haven't really been following it. I liked the first one well enough to finish it; are people expecting this to be as good or better?

At least as good; it's got a greater focus on the titular tombs, though, so hopefully it'll be even better.
 

Keihart

Member
I'm mostly referring to semi-tacky but expressive realtime surface deformation. Looking at Lara dash about in the snow there gives similar vibes to splishy-splashing in Halo 3.


They render a squished image, then stretch it out.

Basically, the GPU might render an image that, if it were interpreted as a bitmap on a square pixel grid, looks like this:

[image: the frame as rendered, horizontally squished]


Then they scale it horizontally into the final aspect ratio, and the result looks correctly unsquished.

[image: the same frame stretched out to 16:9]

That looks like Driveclub, I think...
Stealth Driveclub 1080p brag?
 