
Biggest graphical leap in a series in a single generation?

ITT: Lacking reading comprehension, bullshots, stretched shots and emulator shots. GAF, I am disappoint.

My answer: Final Fantasy VII to Final Fantasy VIII.

320px-Reno_sprengt_die_S%C3%A4ule.png
linear-ff8.png


date4.jpg
ptStXdS.jpg


Final_Fantasy_VII_%28PSX%29_009.png
5.jpg


Of course I can't definitively prove those haven't been meddled with, but unlike some other posts in here these are at least all the same and - more importantly - native res screenshots.
 
Well, all the obvious ones have been beaten to death, so I'm gonna say Infamous 1 vs. 2.
Lots of improvements overall.



N64 vs. NGC isn't exactly the same gen. Might as well compare Jak and Daxter to Uncharted 3.

That and the 64 game actually ran at 60 fps. Misleading gif.
 
Winner winner chicken dinner time!

Motortoon GP1

M4wsEhy.jpg


Gran Turismo 2
EnlnQ3X.jpg


(and come on, they totally count as a "series", as the former clearly evolved into the latter)
 
You might actually be able to trick someone with an emulated/PC shot of FFVIII's Angelo as the end point of a generation and LR as the start of one. The fact the PS1 had a better dog model from the same company is kind of scary.
Wind-Waker-Windfall.jpg

the-legend-of-zelda-twilight-princess-wii-screenshots-20060825051212628.jpg


just kidding
or not ¬_¬
While that's easily written off as stylistic differences... it'd hold even better if you used the Space World demo, I think. Granted, that was a tech demo... but the fact that the tech demo, which had no other concerns (albeit was very hastily developed), was more or less outdone by a similarly styled final game that had a lot more to worry about is pretty impressive.
 
You might actually be able to trick someone with an emulated/PC shot of FFVIII's Angelo as the end point of a generation and LR as the start of one.

While that's easily written off as stylistic differences... it'd hold even better if you used the Space World demo, I think.

It would be difficult to make a model that bad and not be trying to. They must have let someone's kid do that in like half an hour.
 
Biggest leap of a series on the same hardware you ask?
Fatal Fury.

fatalfury-1.png

Fatal_Fury-w350-h500.jpg

Great choice. What you can't see in those screenshots is just how gorgeously smooth the animation in Mark of the Wolves was. Hard to believe it's all the same hardware.

Jak and Daxter to Jak 3:

5391632cbc760.png

5391634a2ded0.png

Also a great choice. The leap was biggest between part 1 and part 2, though. The character models just exploded in detail.

My contribution: Shadow Dancer vs Shinobi 3 on Genesis.

http://youtu.be/bjPKiaBuQ-Q


http://youtu.be/epdselGCXkI
 
The thing is... if it's 96 bits per pixel (which is pretty typical for PS360 Gbuffers), using tiling to reach 720p would be a questionable decision. Tiling isn't free, you're already able to hit 1152x720 (or theoretically even 1200x720) without tiling, and "720p" hardly has much of a nativeness bonus these days, so you'd be making a pretty significant sacrifice to permit the use of just 10% higher spatial sampling.
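The arithmetic behind that claim can be sketched out. This is a rough check, assuming a 96 bits-per-pixel (12-byte) G-buffer format and the 360's 10 MiB of eDRAM, as the post describes:

```python
# Which render-target sizes fit the Xbox 360's 10 MiB eDRAM without tiling,
# assuming a 96 bits-per-pixel (12-byte) G-buffer format.
EDRAM_BYTES = 10 * 1024 * 1024  # 10 MiB of eDRAM
BYTES_PER_PIXEL = 96 // 8       # 96 bpp = 12 bytes

def gbuffer_bytes(width, height, bpp_bytes=BYTES_PER_PIXEL):
    """Total G-buffer footprint for a given resolution."""
    return width * height * bpp_bytes

def fits_without_tiling(width, height):
    return gbuffer_bytes(width, height) <= EDRAM_BYTES

# 1152x720 squeaks in; full 1280x720 does not.
print(fits_without_tiling(1152, 720))  # True  (~9.49 MiB)
print(fits_without_tiling(1280, 720))  # False (~10.55 MiB)
```

So at 96 bpp you can already hit 1152x720 untiled, and tiling would buy you only the remaining ~11% of pixels up to 1280x720, which is the "questionable decision" being described.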

As you note, Crytek did try their hand at deferred rendering on 360... and their choice was exactly the same as Bungie's, 1152x720. Most games that use framebuffer tiling on 360 try to get a decently large boost out of it.

Halo Reach didn't use tiling, and Crytek didn't try deferred rendering; deferred lighting is different from deferred rendering/shading à la Killzone 2/Killzone 3. The only performance hit you get from tiling is from rendering additional geometry. The whole purpose of tiling is to avoid going sub-HD because of the eDRAM's memory limit; it wouldn't be much use if you had to go sub-HD even with it. More on it here: http://forum.beyond3d.com/showthread.php?t=60118

Also, GTA4 was deferred rendered at 720p (with 2xMSAA to boot), and I think GTA5 is too. Trine 1 and Trine 2 used deferred rendering as well, and so did NFS: Hot Pursuit... all of them native 720p on 360. BF3 and BF4 use deferred rendering as well, but they cut a few lines off the top and bottom (1280x704) for performance reasons rather than to fit in eDRAM, since even with that setup they cannot fit inside the eDRAM and have to use tiling anyway.
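The BF3/BF4 point checks out numerically, at least under the hypothetical assumption of a 96 bpp G-buffer layout (DICE's actual format may differ):

```python
# Why even a trimmed 1280x704 target can still overflow the 360's 10 MiB
# eDRAM, assuming (hypothetically) a 96 bpp deferred G-buffer layout.
EDRAM_BYTES = 10 * 1024 * 1024

def overflow_kib(width, height, bpp=96):
    """How far a render target overshoots eDRAM, in KiB (negative = fits)."""
    return (width * height * bpp // 8 - EDRAM_BYTES) // 1024

print(overflow_kib(1280, 704))  # 320 KiB over budget, so tiling is still needed
print(overflow_kib(1152, 720))  # -520, i.e. 520 KiB of headroom, fits untiled
```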
 
Halo Reach didn't use tiling
I know, that's my point. Using tiling to boost from 1152x720 to 1280x720 while using a 96 bits-per-pixel buffer format is arguably fairly wasteful; the gain in resolution is simply not large enough to justify the cost.

The only performance hit you get from tiling is from rendering additional geometry
And the work figuring out which objects fit to which tiles, and technically also a bit of shuffling the tiles around while the frame is being rendered.

It's not extremely expensive, but comments from developers make it sound like it's not entirely negligible either. For instance, dealing with the overhead to use two tiles might look reasonable if you're aiming for a 14MB buffer format (since you just boosted capacity by 40%), but it'll look less reasonable if you're only aiming for a 10.5MB buffer format (since you're not boosting your capacity by much at all). The relative hit of tiling is bigger in the latter case; you get less bang for your buck.
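That bang-for-buck argument can be put in numbers. A sketch, where the 10 MiB eDRAM size is the only hard figure and the 14 MB / 10.5 MB targets are the post's own examples:

```python
# Relative payoff of paying the two-tile overhead on the 360.
# Tiling splits the frame so each tile's buffers fit in eDRAM separately;
# the "gain" is how much more buffer you use than an untiled frame allows.
EDRAM_MIB = 10.0

def capacity_gain(target_mib):
    """Fractional capacity boost actually used by going to two tiles."""
    return target_mib / EDRAM_MIB - 1.0

print(f"{capacity_gain(14.0):.0%}")   # 40% more buffer than untiled
print(f"{capacity_gain(10.5):.0%}")   # only 5% more -- poor bang for buck
```

Either way you pay roughly the same tiling overhead (binning objects, re-submitting geometry), so the 10.5 MB case gets far less return on it.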

(Of course, it's probably fairly complicated, considering where the costs are going to occur. For instance, Bungie brought up the complaint that one of its relevant impacts was latency, since the CPU had to figure more stuff out before kicking off a new frame.)

Now, if it's something other than 96 bits-per-pixel (which it could be, but even you used "deferred lighting" to suggest what Halo 4 might be doing, which is frequently 96 on PS360), my argument starts to fall apart.

and crytek didn't try deferred rendering, deferred lighting is different from deferred rendering/shading ala Killzone 2/Killzone 3.
Yes, I misworded that. Shoulda been "deferred lighting."
 
My answer: Final Fantasy VII to Final Fantasy VIII.

To be fair, the backgrounds in both are amazing. If you bumped up the resolution, they'd still be good.
 
Fatal Fury: King of Fighters -> Garou: Mark of the Wolves
(Man, it's hard finding good pics for these games...)

gfs_86394_2_12.jpg


garou000.jpg
As I've already written, it's the best candidate because it was a perfect storm of causes:
- long life of the ecosystem thanks to the MVS's popularity. That meant the development tools greatly improved over the decade.
- the first game came out early on and was average looking even for the time, compared to launch games like Magician Lord.
- above all, what gave it a decisive advantage over other 2D systems was that cartridge space grew exponentially over the span of a decade. The last Fatal Fury is thirteen times bigger than the first, which meant space to fit finely crafted art and animation was never a problem in the later part of the NG's life, and those assets could be accessed almost instantly.
This is a very important factor; just imagine a SNES game without the space limitation (IMO the SNES was heavily constrained by cartridge space later on).

Basically, the first FF looks more or less like a SNES game, while Garou was a fine fit for the Dreamcast.
Almost a three-generation leap on the same hardware.
 
They were, but I thought we were comparing games in a generation on one console. If we are crossing platforms then the difference will be fairly huge.

Fair enough. Actually -- very fair -- once I read the OP's semi-angry edit emphasizing same console; not quite sure why that's a limiting factor, but it's their thread and they can specify whatever parameters they wish. As a participant, I can only respect the rules :-)

If the OP doesn't have any other implicit limitations for graphical leaps, then I'd probably back the crowd citing Garou 1 -> Garou: MotW. The fact that they jumped from ugly graphics and semi-choppy animation in Garou 1 to beautiful graphics and animation bordering on silky in MotW speaks volumes. But then again, subjective minds can debate the Neo Geo's "same hardware" profile by questioning the fact that they enhanced the whole ROM technology as the years went by. If I recall correctly, the earlier games would display something like "MAX 330 (something or the other)..." which in later years changed to "GIGA Power (something or the other)..."

PS: Besides Garou MotW, another marvelous end-of-life (ish) title for the Neo Geo is Last Blade 2. Really pretty...
 
My number one answer hasn't changed since the last time this question came up.

Virtua Fighter, Saturn, 1994:

vf1saturn800x600tue3q.png


Virtua Fighter 2, Saturn, a single year later:

vf2saturn800x600l9eos.png


Flat-shaded polygons became textured polygons! 320x224 became 704x480! 30fps became 60fps! All at the same time! What's going to top that?
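Those jumps compound, too. A back-of-the-envelope sketch of the raw pixel throughput implied by the resolution and framerate figures above:

```python
# Back-of-the-envelope pixel throughput for the VF1 -> VF2 jump on Saturn.
def pixels_per_second(width, height, fps):
    return width * height * fps

vf1 = pixels_per_second(320, 224, 30)   # flat-shaded, 320x224 @ 30fps
vf2 = pixels_per_second(704, 480, 60)   # textured, 704x480 @ 60fps
print(vf2 / vf1)  # roughly 9.4x the pixels pushed per second, one year apart
```

And that's before counting the cost of texturing every one of those pixels instead of flat-shading them.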
 
On mobile, so bear with me not posting screenshots (maybe someone else can), but infamous to infamous 2 was an incredible step at the time.
 
Too bad the pop-in for The Witcher 2 was terrible.
And The Witcher 2 looks, in a way, like the most supercharged PSP game ever, as if its power was increased by magnitudes to be THE most powerful gaming platform ever... and it still can't get rid of that damn dithering.
 