
Evolution of Graphics Technology

MadPanda

Banned
I just gave an example: KZ:SF and Horizon. And yet you still don't address the specifics? Those videos don't look DRASTICALLY different at all! Why? Because a lot of the techniques are REUSED!

I gave an example of volume smoke. And it gets ignored. What the hell?
Have you played these two games, or are you just comparing videos? I've played both of them, and to my eyes the difference is night and day. Maybe some things are the same, but playing both of them, there's a big difference.

Assassin's Creed Black Flag vs Odyssey is a big leap. Call of Duty Ghosts vs Modern Warfare is like two completely different generations. Forza Horizon 2 vs Forza Horizon 4 is a big difference. And so on and so forth.

I can't guarantee it will be the same this gen too, as I don't know the future, but I'd be very surprised if Assassin's Creed in 2026 and Call of Duty are not much better-looking games.
 

Sun Blaze

Banned
Have you played these two games, or are you just comparing videos? I've played both of them, and to my eyes the difference is night and day. Maybe some things are the same, but playing both of them, there's a big difference.

Assassin's Creed Black Flag vs Odyssey is a big leap. Call of Duty Ghosts vs Modern Warfare is like two completely different generations. Forza Horizon 2 vs Forza Horizon 4 is a big difference. And so on and so forth.

I can't guarantee it will be the same this gen too, as I don't know the future, but I'd be very surprised if Assassin's Creed in 2026 and Call of Duty are not much better-looking games.
For one, their art styles and environments are totally different, so it's kinda hard to compare them there.

For two, Black Flag was a game developed on PS3/360 and ported over to PS4/X1, so I'm not sure the comparison is admissible.

Forza Horizon 2 is another one that was an Xbox 360 game ported to the Xbox One.
 

Clear

CliffyB's Cock Holster
It also brings up another point that turns into arguments here - that technology is moved forward by the console exclusives. That is just completely wrong, and yet I get shunned for saying otherwise.

Uptake/general usage of technology *IS* moved forward by console exclusives, because it's an economic consequence of closed-system development. While PC tech will always be at the vanguard, it's generally a trickle-down effect, because only a subset of users can enjoy the benefit. This makes devs less willing and able to really lean on the tech, because the justification for investment simply isn't there.

The result is the perception that consoles are further ahead than they actually are; the incentives and financial resources to implement are just more favorable.

Where consoles have technologically excelled most is in establishing standards for optimal coding practices and methodologies, and again that's a result of closed architectures imposing hard limits on what can be done. Ambition becomes a true double-edged sword, as it can result in dismal frame rates when no amount of optimization can make up the shortfall in resources.

It would be unfair to characterize all PC code-dev as sloppy and less disciplined, but it is a far more prevalent issue, because the temptation is always there to lean on better hardware to hit targets, knowing that it offers instant vindication, if only for an advantaged minority of users. So yeah, although there are many brilliant minds and high-performing teams within the sector, it's basically an environmental consequence.

The key thing is that games tend to be less simulations than illusions; they are about experience and effect more than absolute accuracy. So whoever creates the best illusion, be it by visual trickery, judicious resource budgeting, or simple art-direction choices, tends to be perceived as the winner.
 

MadPanda

Banned
For one, their art styles and environments are totally different, so it's kinda hard to compare them there.

Forza Horizon 2 is another one that was an Xbox 360 game ported to the Xbox One.


Not really. Forza Horizon 2 on Xbox One vs Xbox 360 are different games on different engines.

"It's based in the same world, it's based on the same themes," Playground Games creative director Ralph Fulton said of the Xbox 360 version. "Rather than thinking of them as the same game on different platforms, they are different games inspired by the same ideas."
www.polygon.com/platform/amp/2014/6/23/5834042/forizon-horizon-2s-xbox-360-version-is-a-different-game-than-on-xbox

Ryse was to be a Kinect 360 game, then it was salvaged as an Xbox One game, yet people are using it as an example.
 
Last edited:
Nope. However, it depends on whether you want Cyberpunk 2077 MAX settings or not. Because I don't see every game popping up doing that kind of rendering as a norm. Not when an RTX 3090 is rendering it all in the 30-40s FPS. If you think the PS5/XSX version that comes out next year is what we will see all generation, then I'm OK with that.

That's not what I said. I said the leap will be similar (with all the bells and whistles).

Right now Cyberpunk on PS5 looks nowhere near as good as Spider-Man, Demon's Souls, or heck, even Assassin's Creed.

Will the next-gen patch change that? Maybe.

But Naughty Dog, Sony Santa Monica, Rockstar, Guerrilla, and Team Kojima will undoubtedly blow Cyberpunk out of the water.

Peak console performance isn't coming from a game developed with last gen in mind, from a company that doesn't give a shit about console optimization.
 

Lethal01

Member
Not a generational leap like comments such as "if it's doing this so early on, imagine what it will look like in 3yrs..." suggest. We both know there is a lot of hyperbole going on with the Sony crowd. No use in defending that.

How is this hyperbole? If games come out that look as good as the UE5 demo, then saying "think of how much better the next Uncharted will look" is totally fair, since it will look way better.
 
Last edited:

VFXVeteran

Banned
How is this hyperbole? If games come out that look as good as the UE5 demo, then saying "think of how much better the next Uncharted will look" is totally fair, since it will look way better.

Nah bro. You've been around these parts and on the speculation threads long enough to read between the lines. Next time someone makes a bold claim about "wait till you see what's in store in 3yrs," ask them what specifically they are imagining. You'll quickly turn your head when they start saying they expect Cyberpunk-like RT settings on a console that you know can only render about as well as a 2080 at best, and with the knowledge that DLSS is nowhere in the console hardware to remedy the computation costs. They really believe that the developers are THAT far behind in understanding the hardware that they can pull BIG gains from a GPU with low-end RT performance.

Also, to add: you are really hung up on the UE5 demo. Because you think it's the best-looking thing you've seen so far (despite giving mad praise to the rendering of Cyberpunk), it's going to be hard for us to agree on observations and therefore on rendering tech's impact as a whole as the generation unfolds. FYI.
 
Last edited:

VFXVeteran

Banned
cyberpunk does not even look good.

RwEtbfr.gif
 

Kataploom

Gold Member
From what I can recall, Remember Me was the first game to heavily implement PBR. It was certainly quite the looker at the time of release.
It did, and it looked absurdly good... I remember playing it almost maxed with a GTX 560 with only 1GB of VRAM at 1080p@60fps (almost locked), and it looked better in many cases than some 8th-gen games.

I literally felt I was playing a next-gen game; when I came to see actual next-gen games' graphics, they didn't look very impressive to me (I had already seen a lot of those things on PC in 2012 and 2013 games).
 

Lethal01

Member
Nah bro. You've been around these parts and on the speculation threads long enough to read between the lines. Next time someone makes a bold claim about "wait till you see what's in store in 3yrs," ask them what specifically they are imagining. You'll quickly turn your head when they start saying they expect Cyberpunk-like RT settings on a console that you know can only render about as well as a 2080 at best, and with the knowledge that DLSS is nowhere in the console hardware to remedy the computation costs. They really believe that the developers are THAT far behind in understanding the hardware that they can pull BIG gains from a GPU with low-end RT performance.

Also, to add: you are really hung up on the UE5 demo. Because you think it's the best-looking thing you've seen so far (despite giving mad praise to the rendering of Cyberpunk), it's going to be hard for us to agree on observations and therefore on rendering tech's impact as a whole as the generation unfolds. FYI.

I'd just rather stick to a single example and try to discuss a few points about it; I don't have time to write 12 paragraphs each post pulling examples from multiple games. I think that would lead to things quickly getting messy and just wasting time without really getting anywhere.

Me talking about the UE5 demo isn't because it's the greatest rendering of all time or something. It's important because it shows a clear leap from what we are currently seeing ON the consoles vs what is possible when they are "mastered", which is the point of this thread.
Yes, obviously real-time ray tracing is the pinnacle of graphics, but we aren't talking about the pinnacle; we are discussing how much the consoles will improve, and the UE5 demo shows they can improve tremendously.

As others have said, much of this discussion seems to boil down to you thinking a word like "amazing" is too high praise for the jump that will be coming.

I'm sure there are people making crazy claims like the PS5 is gonna be at 16K by the end of the generation. But I'm assuming you didn't make this thread expecting people to try to argue that those claims are true.
 
Last edited:

Ryu Kaiba

Member
Not a clear step up? Better lighting? :pie_roffles: This is just more proof of how blind and delusional you are. The UE5 demo looks like an offline render running in real time, even REAL devs have said that, but hey, you think CDPR's game has better lighting because it has better shaders which make the lighting pop? Prove it. What's the complexity of the shaders compared to UE5? Why does it look better compared to UE5?
Now, Cyberpunk looks good, but it still has a gamey look to it and is nowhere near UE5 quality. The only reason you're saying this is because at the moment only the PS5 has rendered visuals like UE5; not even your overpriced $2K 3090 has shown it can run this.


I'm not a developer, just your average gaming enthusiast, and this UE5 demo looks better than any game I've ever seen.
 

carsar

Member
I find it amusing to judge rendering by the performance of my hardware.
My GPU (a 980 Ti) can handle most games released since 2013 at 4K 30fps at acceptable settings. There is no big difference (2x) between Assassin's Creed Unity and HZD in terms of my rig's performance. Past gen is another story - many of those games run at 4K 120fps, so current gen has roughly a 4x heavier render load. Old cross-gen games (BF4, Black Flag, etc.) are in the middle, at about 4K 60fps.
And new cross-gen games like 2077 and MSFS 2020 run at 4K 15fps only.
So I can see the steps: past gen > cross-gen = 2x past gen > next gen = 2x cross-gen.
In terms of technologies I see the same. Past gen looks obviously worse, cross-gen has some past/next gen techs, and next gen looks obviously better.
The difference within the gen cycle exists (I'm not talking about cross-gen titles and extremely early/late releases), but it is subtle and not worth the hype imho.
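Putting rough numbers on those steps - a quick sketch, assuming per-frame render cost scales inversely with frame rate at fixed 4K settings, and using the frame rates above (the tier labels are just shorthand):

# Relative per-frame render cost, assuming cost ~ 1/fps at fixed settings.
fps_on_980ti = {
    "past gen": 120,       # many last-gen titles
    "old cross-gen": 60,   # BF4, Black Flag, ...
    "current gen": 30,     # AC Unity, HZD, ...
    "new cross-gen": 15,   # Cyberpunk 2077, MSFS 2020
}
baseline = fps_on_980ti["past gen"]
for tier, fps in fps_on_980ti.items():
    print(f"{tier:>13}: ~{baseline / fps:.0f}x the per-frame cost of past gen")

Which gives the 1x > 2x > 4x > 8x stepping described above.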
 

stetiger

Member
Thanks for your input! Welcome to the thread!

Would you say, in your opinion, that with everything that's going on with a game like Cyberpunk (especially on a high end PC GPU) running at max throughput @4k, RTX set to Ultra with DLSS 2.0, that you'd see a console exclusive game exceed the amount of rendering going on at the same settings with NO DLSS in hardware? We've had guys claim that "there will be 1st party games on PS5 that will look better than this game from a technical perspective". Many Sony gamers here want to use subjective comparisons in art direction to denote a game "looking better" technically. The problem with that is they never punchline their comments with "subjectively" and declare that optimization has made a $500 box do "more" than a $2000 high end PC. This is the crux of argument after argument on these boards. I believe that if more developers were on these boards (besides me and a handful of others) putting things into perspective, we'd see less of these ridiculous claims.

Yeah, I don't think we will see that kind of performance jump. I think what they mean to say, if you read between the lines, is that you will see games running on PS5 that look better and run better than Cyberpunk on a 3090 right now. But those optimizations will also translate to PC.

However, there is one amazing thing about the PS5 worth mentioning. The geometric output of the PS5 is higher than the 3090's, because triangle throughput (triangles per second) scales with clock frequency. And as of right now the PS5 can output more triangles than the 3090. Unfortunately for the fanboys, there is more to GPUs than just triangles. But if a game were built from scratch where triangle throughput is the bottleneck, the PS5 would scale better, as would the 6800, 6800 XT, and 6900. This is really important because Unreal Engine's Nanite scales with triangle count; resolution is secondary. So on a technical level, you should expect the PS5 to perform better than the Series consoles and better than all Nvidia GPUs on the market today. Pretty cool, huh! I suspect Sony designed their GPU that way to work well with UE5. And I also suspect you will be surprised. If UE5 becomes the de facto engine next gen, like UE4 this gen, then the PS5 is looking to punch above its weight (in teraflops) this gen.
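To make the triangle-rate point concrete, here is a minimal back-of-the-envelope sketch: the theoretical peak is just triangles-per-clock times clock frequency. The per-clock figure below is an assumption for illustration, not a published spec; only the 2.23 GHz PS5 peak clock is a known number.

# Peak triangle throughput: tris/s = triangles per clock * clock frequency.
def peak_tris_per_sec(tris_per_clock: float, clock_ghz: float) -> float:
    """Theoretical peak triangle rate, in triangles per second."""
    return tris_per_clock * clock_ghz * 1e9

# Illustrative only: assumes 4 triangles/clock (not a confirmed figure)
# at the PS5's 2.23 GHz peak clock.
print(f"{peak_tris_per_sec(4, 2.23) / 1e9:.2f} Gtris/s")

A higher clock raises that peak linearly, which is the whole argument; whether a real workload ever gets near the peak is a separate question.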
 

VFXVeteran

Banned
Me talking about the UE5 demo isn't because it's the greatest rendering of all time or something. It's important because it shows a clear leap from what we are currently seeing ON the consoles vs what is possible when they are "mastered", which is the point of this thread.
Yes, obviously real-time ray tracing is the pinnacle of graphics, but we aren't talking about the pinnacle; we are discussing how much the consoles will improve, and the UE5 demo shows they can improve tremendously.

So, in your opinion, the UE5 demo is a clear leap for consoles. OK. Let me ask you this.

If you took away the Nanite tech, would you still think that the overall UE5 demo would be a tremendous improvement?
 

Alexios

Cores, shaders and BIOS oh my!
Thought this would be about the evolution of video game graphics, from text to ASCII to all sorts of 2D, sprite scaling, vectors, faux 3D, polys, tris, voxels, wireframe, flat shading, texture mapping, Gouraud shading, lighting techs, up to today's stuff like volumetrics, tessellation & RT. It's just troll shitposting...

Dunno about any game becoming better than every launch game ever in a gen, but devs plenty of times talk about how the sequel to X game has more polygons, effects and so on than their own last one, since like the 32-bit days when they said it about Lara's boobs and hair and such in Tomb Raider...
 
Last edited:

Lethal01

Member
So, in your opinion, the UE5 demo is a clear leap for consoles. OK. Let me ask you this.

If you took away the Nanite tech, would you still think that the overall UE5 demo would be a tremendous improvement?

Nah, probably wouldn't say tremendous; it could still be a really big improvement, but having Lumen AND Nanite takes it from big to massive.
 
Last edited:

VFXVeteran

Banned
Yeah, I don't think we will see that kind of performance jump. I think what they mean to say, if you read between the lines, is that you will see games running on PS5 that look better and run better than Cyberpunk on a 3090 right now. But those optimizations will also translate to PC.

And that's impossible from a technical standpoint, if we judge by all the graphics algorithms that CDPR put into the game (which is pretty much everything I know of, save the Nanite tech). That's pretty much just as "dreamy".

However, there is one amazing thing about the PS5 worth mentioning. The geometric output of the PS5 is higher than the 3090's, because triangle throughput (triangles per second) scales with clock frequency.

Makes sense. How much of a difference you'd be able to tell is another matter.

And as of right now the PS5 can output more triangles than the 3090. Unfortunately for the fanboys, there is more to GPUs than just triangles.

Much more important things like lighting/shading.
 

VFXVeteran

Banned
Nah, probably wouldn't say tremendous; it could still be a really big improvement, but having Lumen AND Nanite takes it from big to massive.

But Lumen is nothing different from what we've already seen. Even now a lot of guys can't even tell the difference, and Cyberpunk/Metro/Control/Crysis Remastered have it all over the place. The UE5 demo doesn't look any better (aside from geometry) shading-wise than any of those games.
 
It did, and it looked absurdly good... I remember playing it almost maxed with a GTX 560 with only 1GB of VRAM at 1080p@60fps (almost locked), and it looked better in many cases than some 8th-gen games.

I literally felt I was playing a next-gen game; when I came to see actual next-gen games' graphics, they didn't look very impressive to me (I had already seen a lot of those things on PC in 2012 and 2013 games).
Talking about Remember Me made me dig up some of the stuff I browsed back then :D
 

Sun Blaze

Banned
Just gonna put this out there as it may shed some light on the whole "devs are unfamiliar with the PS5" argument.


SGJ4CoU.png


Nz1MCO7.png


Evidently, the PS5 is nowhere near as hard to develop for as the PS3 was, which explains in no small part why early PS3 and late PS3 games look so markedly different.
I said as much earlier, and Mark Cerny, I think, copied me.
 
Last edited:

Clear

CliffyB's Cock Holster
Cerny's philosophy is so simple and obvious on the surface, yet it really does appear that it's a profoundly important consideration that a lot of the time isn't factored in, or at least not given the priority it warrants.

To me it really does seem like a reaction to the trajectory of Sony's hardware under Kutaragi, and how that impacted everything.
 
Time, money, and experience will always impact development, but the underlying techniques through a generation are largely the same. You can't quantify how a game looks, but you can break it down into how it's built. Those pieces rarely change through a generation for games built exclusively on that hardware.

I think the perception comes from the first 1-2 waves of games being cross-gen or targeting a more conservative spec than the final devkit. Look at games like Uncharted 4 vs TLOU2, and the differences are way less drastic than something like COD Ghosts vs MW. The only exception is Crysis Remastered and its software RT solution, even though that's 1) not a totally new technique and 2) reliant on the mid-gen upgrades, aka new hardware.

I would argue we should see a much larger difference this generation, solely due to the advances around mesh/primitive shading, RT techniques, and SSD asset streaming.

Are these new techniques? No, but they will help with the time and efficiency of game development and will impact the look of games significantly by the 2nd-3rd round of games this gen.
 

Fafalada

Fafracer forever
CLAIM #1 - No matter what the generation, developers will ALWAYS need to start their learning process of the hardware all over again from ground zero.
Having said that, many people also believe that with a given hardware architecture that has been thoroughly benchmarked against a known example (i.e. PS5 vs 2080), that architecture isn't being utilized to its full potential concerning graphics technology - despite no remarkable change in graphics algorithms, which leads to claim #2
That's conflating multiple things. For most generations in the past two decades, the learning process was almost entirely down to changes in usable algorithms - not HW utilization. And yes, that includes the beginning (PS2 gen) and the end (PS4 gen). Whether or not said algorithms pre-date the consoles (most of the time they did - sometimes by several decades) isn't really the topic of the debate.
To make some lists - the PS2 gen popularized accumulation motion blur, pseudo-HDR/bloom, image post-processing, realtime volumetric shadows, and eventually (at the tail end of it) normal maps.
360 - full HDR, shadow maps, large open worlds, screen-space lighting, POM...
PS4 - PBR, temporal AA and upsampling, image reconstruction, realtime GI, etc.
None of the lists are exhaustive - the point is more that software took time to adopt things, or even to pick what was most impactful. You can argue this gen is 'different' - but we have multiple variables that are subject to new experimentation (RT, VRS, SSD, heck, probably practical applications of ML too), in addition to just playing with more compute. Realtime isn't just picking off a laundry list of algorithms based on assumptions about what fits - that's only the starting point in every project.

CLAIM #2 - The hardware is nowhere near fully utilizing its GPU at the start of the generation.
Finally, people think that the relationship between technology and output is exponential. The phrase "if the game looks this good at the start of this generation, imagine..." gets said over and over again in threads concerning 1st party games.
I mean - you may want to define 'utilization' first. Even some of the PS2 launch titles had high utilization at points, especially those that didn't run at 60 - they just weren't doing particularly useful stuff with it. And while hardware familiarity is a contributing element, and it absolutely impacts all console launches, it never yields 'exponential' improvements - we're talking tens of percent.
But the 'useful' stuff changes as the approaches evolve, as per point 1, of course.

CLAIM #3 - Graphics technology has an exponential output based on linear time. A game today will look exponentially better in 2-3yrs.
I want to see the receipts on whoever said this. I get that people get hyperbolic online - but really, has anyone said that and 'meant' it, ever? Like exponentially, really?
 

Lethal01

Member
But Lumen is nothing different from what we've already seen. Even now a lot of guys can't even tell the difference, and Cyberpunk/Metro/Control/Crysis Remastered have it all over the place. The UE5 demo doesn't look any better (aside from geometry) shading-wise than any of those games.

Once again, this isn't about whether there is a game or a technique that is better on PC. We are talking about how the games that release on PS5 later will improve over what is currently being released. And I'd say the lighting in the UE5 demo looks way better than the lighting in games that are currently out, and better than something like Control or Crysis Remastered running in backwards-compatibility mode.

The most important thing to understand here is that the Unreal demo is a clear jump from any game currently out on PS5.

Not to mention it was totally dynamic.
 
Last edited:

Lethal01

Member
Just gonna put this out there as it may shed some light on the whole "devs are unfamiliar with the PS5" argument.


SGJ4CoU.png


Nz1MCO7.png


Evidently, the PS5 is nowhere near as hard to develop for as the PS3 was, which explains in no small part why early PS3 and late PS3 games look so markedly different.
I said as much earlier, and Mark Cerny, I think, copied me.

To me this is him trying to say it will be easier than PS3 while allowing for more improvement over time than the PS4 did.
 
Skyrim struggled to maintain 27 FPS on PC vs the 360 (it actually underperformed the 360 by 4-7 FPS and came in 2 FPS under the 360 on average), and in fact was outperformed by the 360 even when a generationally better card (the 5870) was installed.

A card that was a year and a half newer - a generation more advanced, too - underperformed the 360 on PC.

Skyrim did not outperform its 360 counterpart until the 6870 was released - a card with two generations of new technology under its hood. Only a card two generations better than what was in the 360 offered a steady 30 FPS and above.

This is where the argument falls apart: Skyrim was chosen because it was finally a game that could be cited, factually, as tapping out the 360's hardware at near 100%.

And PCs underperformed the 360 until they were equipped with the LATEST next-gen GPU, not simply a generationally better next-gen GPU.

History will again repeat itself: consoles will begin to outperform PCs equipped with 3080s about 4 1/2 - 5 1/2 years into the consoles' lifecycle, as devs begin to fully utilize consoles and begin to ignore/completely disregard custom development on older PC hardware (i.e. the 3080).

And this is where this useless battle about consoles vs PCs completely falls apart. It is based on the 3080, and the 3080 will get ignored by developers - while those same developers will say "hey, there's still 45% of untapped potential here on this console" - and consoles will outperform generationally better PC hardware due to this certainty.

Make no mistake, the Series X will outperform the 3080 once console dev has matured and utilized 100% of the horsepower on consoles... and the game makers begin to shift dev standards to focus on newer bleeding-edge PC hardware while ignoring the older, last-gen 3080.

The only way this trend will not occur is if Microsoft completely ignores DLSS 2.0 and does not provide an equivalent with MLRender - an overhead API built directly into DirectX to offer a direct answer to DLSS 2.0 and the like.

If MS ignores MLRender and does not offer a direct answer to this machine-learning DLSS, then PCs may remain more powerful than consoles - but only if Microsoft decides not to implement 100% MLRender functionality in the years ahead.

Because the difference between consoles and PCs is that on PC, developers never in fact fully tap into the potential of or fully utilize a GPU like the 3080 five years into its lifecycle. It takes 8-9 years for a GPU to be almost fully utilized at 100% of its potential, not because developers aim to utilize that GPU, but because they focus on developing for a common standard known typically (but not always) as the DirectX GPU stack. If you are expecting a game to fully harness the potential of your specific GPU, unless machine learning comes in to correct this issue by coding custom variables for the 3080, expect to wait 13 years while developers ignore your PC hardware and work towards a common GPU standard.

And after 9 years of waiting, you are still left essentially waiting for developers to utilize the remaining 5-7 percent of the GPU that was being ignored.

Funny enough

When devs do finally begin to ignore the 3080 because of newer GPUs coming down the pipe,

the Series X - in the console arena - will still oddly be considered bleeding-edge tech, nearly fully underutilized. Which is why it will begin to outperform the 3080 as devs ignore PC hardware and focus on fully utilizing consoles at 100%. By the time consoles are in fact utilized at 100% to deliver peak performance and visuals, devs will have only managed to utilize 65% of the performance of the 3080.

Thank God PC's will have the 4080/5080 to fall back on by that time.
 
Last edited:

Boss Man

Member
The improvements I expect to see come more from the assumption that there's just a lot of talent working on newer tech. If those same developers were always targeting the latest and greatest (and most expensive) PC specs, I would no longer expect significant improvements over time with consoles.

Another way to look at it is to say that consoles are holding PC games back, but I think that’s the wrong take because if the console market didn’t exist the games (or at least their budgets) probably wouldn’t either.

Edit: I guess you could call this claim #1, but it’s less about learning the technology and more about finding new tricks that are enabled by it.
 
Last edited:
PC developers focus on the latest hardware; the 3080 will not be the latest GPU hardware 4 1/2 years from now, and as such, consoles will begin to overtake what is currently considered bleeding edge on PC.

This is why the 360 beat the fastest GPUs (with comparable CPUs) released from 2006 to 2007 in Skyrim benchmarks made solely to demonstrate that consoles do in fact beat more advanced PC hardware late into a console's life cycle, as developers attempt to fully utilize console hardware.

Any disparaging cries stating that this trend will not continue well into this generation do so by plainly railing against the facts.

The only way this may change is by harnessing ML dev tools that fully utilize older GPUs, as developers will always ignore older GPU hardware on PC (i.e. the 3080 will be ignored while the Series X still undergoes heavy dev cycles) in favor of the latest PC hardware.
 

Raploz

Member
I think VFXVeteran is right. The technical side didn't change much this past gen, and on PS5 we won't see games that now run at dynamic 4K 30 later running at 4K 60 + mind-blowing ray tracing. However, I also think what really matters is the end result, and for many PS4 games the newer (exclusive) titles do look much better (subjectively), even if they're using the same techniques. I know the point of this thread is to discuss the technical aspects, but does it really matter how developers achieve improvements in a game's visuals?

I apply to this the same philosophy I apply to music. If the singer is awful but they autotune it well and the beat is nice, what's not to like? 😌
 

Ryu Kaiba

Member
I'd just rather stick to a single example and try to discuss a few points about it; I don't have time to write 12 paragraphs each post pulling examples from multiple games. I think that would lead to things quickly getting messy and just wasting time without really getting anywhere.

Me talking about the UE5 demo isn't because it's the greatest rendering of all time or something. It's important because it shows a clear leap from what we are currently seeing ON the consoles vs what is possible when they are "mastered", which is the point of this thread.
Yes, obviously real-time ray tracing is the pinnacle of graphics, but we aren't talking about the pinnacle; we are discussing how much the consoles will improve, and the UE5 demo shows they can improve tremendously.

As others have said, much of this discussion seems to boil down to you thinking a word like "amazing" is too high praise for the jump that will be coming.

I'm sure there are people making crazy claims like the PS5 is gonna be at 16K by the end of the generation. But I'm assuming you didn't make this thread expecting people to try to argue that those claims are true.
So, in your opinion, the UE5 demo is a clear leap for consoles. OK. Let me ask you this.

If you took away the Nanite tech, would you still think that the overall UE5 demo would be a tremendous improvement?
If you took away the improvement, would the improvement still be an improvement? lol

You keep talking like geometry doesn't matter, when clearly it's far more noticeable to most people.
It's easier to tell that a circle is not really a circle than it is to tell that some area behind a trash can isn't properly shaded.
 

Ryu Kaiba

Member
Cyberpunk looks good as long as you're not around streets filled with people. The denizens walking the streets make everything look fake and videogamey, and the lack of smart AI doesn't help it. Hopefully they improve the AI over time. If we had a time machine to look at video games 20 years from now, super-smart AI is what would blow our minds, more so than graphics. That's the true next leap for video games.
 
Last edited:

eot

Banned
Skyrim struggled to maintain 27 FPS on PC vs the 360 (it actually underperformed the 360 by 4-7 FPS and came in 2 FPS under the 360 on average), and in fact was outperformed by the 360 even when a generationally better card (the 5870) was installed.

A card that was a year and a half newer - a generation more advanced, too - underperformed the 360 on PC.
Stop making shit up

eaHg7m5nNuMw9BrAm9AUpg-970-80.png

SwXvrxKkSacqL97Ki9sZe5-970-80.png
 
Last edited:
Stop making shit up

eaHg7m5nNuMw9BrAm9AUpg-970-80.png

SwXvrxKkSacqL97Ki9sZe5-970-80.png

The test wasn't utilizing the BEST CPU, as most benchmarks would have, but a comparable... yet faster CPU. And it was a controlled test scenario. To run the test and simulate the level of technology being utilized in the 360, they had more than level 1 FXAA enabled - the 360 equivalent utilized was 4x FXAA, as it is only fair to implement AA at the level of the 360. The 360 does not use FXAA, so the equivalent FXAA-to-MSAA ratio was found.

And also 16x AF. These benchmarks do not match the purpose or, in fact, the settings used by the Futuremark community to test whether a PC with hardware that wasn't the best, but was generationally better than the 360, could maintain a steady framerate above 25 FPS at specifically high detail settings that matched consoles verbatim - particularly in the area of FXAA.

Futuremark replicated the spec/detail equivalent of the 360 on PC - which these tests do not come close to if they are not utilizing 4 levels of FXAA.

At the correct settings, meant to simulate a fidelity in fact higher than what was used as standard on PC, the 360 did not struggle and more consistently maintained 27-30 FPS. While the PC did in fact hit 60 FPS, it did so while looking at obscure corners of low-detail door assets or blank areas of the sky, and as such, that kind of gameplay was kept to a minimum and only occurred to demonstrate that yes, a PC could hit 60 FPS at those settings, but it floundered at maintaining a steady framerate of 24-27 FPS in actual gameplay.

Phil Spencer maintained that their anti-aliasing solution on 360 was closer to (yet far superior to) 8x MSAA, if devs had time to correctly implement it. Hence the 4x FXAA settings utilized for this controlled test on PC.

I'm sure I am not the ONLY PC gamer that remembers ignoring the fact that to really exceed the 360 blow for blow while maintaining a framerate higher than 24 FPS, the NEWEST GPU was a must, due to poor to non-existent coding optimization on older (yet better) hardware. And that, until corrected, many GPUs at 16x AF had horrendous errors/glitches/unplayable framerates due to poor coding optimization, which underscored the result of the test: that PCs without the most bleeding-edge hardware were poorly coded for.

The tests were made to prove that older hardware is in fact poorly coded for/completely ignored until industry standards push GPU optimization forward.

The tests were also made to prove that devs were not utilizing the newer, better GPU hardware anywhere near in full (as was demonstrated by the mass of issues running the game at actual high settings, as opposed to what is seen in these screenshots), despite how much PC devs focus on the LATEST GPU hardware.

In many of the tests, the GPU utilization achieved by the devs in question was probably in the range of 45%-65%.
 
Last edited:

FireFly

Member
Sure I am not the ONLY PC gamer that remembers ignoring the fact that to really exceed the 360 blow for blow, the NEWEST GPU was a must. And that, until corrected, many GPUs at 16x AF had horrendous errors/glitches/unplayable framerates due to poor coding optimization.
Yes, but why are you talking about the 5870 at all? The series goes X1800 XT -> X1900 XT/X -> 3870 -> 4870 -> 5870.

The comparison should be made with the X1800 XT and X1900 XT/X, not the 5870, which is comparable to or faster than the Xbox One's GPU.

(I had an X1800 XT, which was a high-end GPU at the end of 2005, and within a few years I had to upgrade due to a lack of shader power. So the comparison is interesting, but you seem to be misremembering the cards involved.)
 
Last edited:

eot

Banned
The test wasn't utilizing the BEST CPU, as most benchmarks would have, but a comparable... yet faster CPU. And it was a controlled test scenario. To run the test and simulate the level of technology being utilized in the 360, they had more than level 1 FXAA enabled - the 360 equivalent utilized was 4x FXAA, as it is only fair to implement AA at the level of the 360. The 360 does not use FXAA, so the equivalent FXAA-to-MSAA ratio was found.
The post I replied to didn't mention any specific test, lol; maybe you should learn to be a bit more specific. If your point is that a console with a low-level API will see better performance when a dev tailors their code to that hardware, compared to it running through a general API on PC, then you didn't need quite so many words. In the end, who cares? You're still going to get better performance on a good PC for most games during most of the console cycle.
 