
NextGen hardware (even next-gen PC hardware) graphics are still FAR from Photorealistic

GymWolf

Member
4x a PS5 is around 36-40 TF, right?
Sounds like a big fat number for a console, to be honest... but I'm not an expert.
Maybe GPU technology is going to make huge jumps in 8 years; I think even the most expert people on this planet can't establish this right now, can they?
 
Last edited:
4x a PS5 is around 36-40 TF, right?
Sounds like a big fat number for a console, to be honest... but I'm not an expert.
Maybe GPU technology is going to make huge jumps in 8 years; I think even the most expert people on this planet can't establish this right now, can they?
You said it: 4x.

The actual TFLOPS jump from PS4 to PS5 is 5.6x. And keep in mind that this is despite RDNA needing fewer TFLOPS than GCN for similar performance; the real GCN-to-RDNA performance jump is about 8x (for RDNA1; RDNA2 might be even greater). Also, PS5 focused on the SSD and I/O while having a more conservative GPU design. SSD costs are likely to drop significantly in 8 years, and it is likely a more ambitious GPU design can be used in the PS6 with the freed-up budget.

edit:

If the PS6 used a similarly conservative GPU design and had a jump like the PS5's, that'd put it at 58 TFLOPS. But given that SSD costs are likely to be a non-issue next gen, I'm assuming a more ambitious GPU design might be used.
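The extrapolation above can be sanity-checked in a few lines. This is a rough sketch: the 1.84 and 10.28 TFLOPS figures are the public PS4/PS5 GPU specs, and repeating the same jump for a hypothetical PS6 is this post's assumption, not a prediction:

```python
# Back-of-the-envelope generational-jump arithmetic.
PS4_TF = 1.84    # PS4 GPU, public spec
PS5_TF = 10.28   # PS5 GPU, public spec

jump = PS5_TF / PS4_TF      # actual PS4 -> PS5 TFLOPS multiplier
ps6_guess = PS5_TF * jump   # assume the PS6 repeats the same jump

print(f"PS4 -> PS5 jump: {jump:.1f}x")                # ~5.6x
print(f"Naive PS6 estimate: {ps6_guess:.0f} TFLOPS")  # ~57 TFLOPS
```

Note this compares raw TFLOPS only; as the post says, architectural efficiency gains (GCN vs. RDNA) make the real performance gap larger than the raw numbers suggest.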
 
Last edited:
The bar for photorealism keeps moving upwards as graphics get better. People have been saying games or movies look "photorealistic" since the 90s. I remember reading a review of a basketball game for PS1 around that time where the author described the graphics as being nearly indistinguishable from a real game on TV. I can laugh about it now, but back then it didn't seem like much of a stretch.
 
Last edited:

VFXVeteran

Banned
And? Still doesn't confirm anything. Let's stop throwing "facts" around until we see gameplay.

It confirms that I know how it was rendered. You sound like several of the posters here who doubted my inside info on PS5 and PS exclusives coming to PC. Give me the benefit of the doubt, even if it hurts hearing what I'm saying.
 

VFXVeteran

Banned
you said it 4x.

The actual TFLOPS jump from PS4 to PS5 is 5.6x. And keep in mind that this is despite RDNA needing fewer TFLOPS than GCN for similar performance; the real GCN-to-RDNA performance jump is about 8x (for RDNA1; RDNA2 might be even greater). Also, PS5 focused on the SSD and I/O while having a more conservative GPU design. SSD costs are likely to drop significantly in 8 years, and it is likely a more ambitious GPU design can be used in the PS6 with the freed-up budget.

edit:

If the PS6 used a similarly conservative GPU design and had a jump like the PS5's, that'd put it at 58 TFLOPS. But given that SSD costs are likely to be a non-issue next gen, I'm assuming a more ambitious GPU design might be used.

You are ignoring cost.

The 2080Ti came out about 18 months ago. How much do they cost now? EVGA XC 2080Ti @ $1,338. Mine cost me $1,279. The price went UP over 18 months.

The primary reason next-gen consoles can't keep pace with PC hardware is cost. There may very well be a 58 TFLOPS graphics card in 7 years (I seriously doubt it), but it certainly won't be in a $500 console. In any case, it still wouldn't be enough to render the CG in my post in real time.
 
Sure, but slow and steady advancement is undeniable. Just compare recent games to, let's say, late PS3 ones. It's a huge upgrade, but because it's happening slowly you don't really get to appreciate it. For example, I had that moment in Modern Warfare when I was just stunned by how good it looked, and still does (OG PS4). Guns are unbelievably realistic, same with operators. The level of detail is honestly like nothing seen before.

I can only imagine when devs start using full power of next gen how good everything will look.
 

VFXVeteran

Banned
Sure, but slow and steady advancement is undeniable. Just compare recent games to, let's say, late PS3 ones. It's a huge upgrade, but because it's happening slowly you don't really get to appreciate it. For example, I had that moment in Modern Warfare when I was just stunned by how good it looked, and still does (OG PS4). Guns are unbelievably realistic, same with operators. The level of detail is honestly like nothing seen before.

I can only imagine when devs start using full power of next gen how good everything will look.

MW on next-gen will look like the PC version @ Ultra settings on a 2080Ti. SP looks better than MP by a good bit, which is why it's not a steady 60 FPS @ 4K. Bandwidth will bottleneck the TFLOPS in the next-gen consoles, not leaving enough room to add content unless they sacrifice image quality and/or FPS.
 
Last edited:

Sagii86

Member
It confirms that I know how it was rendered. You sound like several of the posters here who doubted my inside info on PS5 and PS exclusives coming to PC. Give me the benefit of the doubt, even if it hurts hearing what I'm saying.

It doesn't "hurt" since I'm not that invested. I see a fact, I point it out. I've been around the industry long enough to know that "insider" information means nothing when it comes to the final release of a movie, or a game for that matter. I'm sure you know the amount of doubters we had back when RAD first teased The Order: 1886 in 2013, since you have been around as a professional.
 
Last edited:

StreetsofBeige

Gold Member
Graphics are still gamey, except for those awesome rendered stills.

The kicker that makes graphics gamey is still the animation, which I don't think is much better than last gen. It's just that the textures look better.

If you really want to see shit graphics/animations, play any game and move your character or watch enemies go up and down a flight of stairs. It's like they are levitating and gliding as opposed to actually balancing and planting a set of feet on each step.

The stairs might as well not even look like stairs. Just make it sloped like a triangle.
 
Last edited:

VFXVeteran

Banned
It's the same routine every generation when a studio raises the bar visually. I remember the comments from those skeptical "professionals" when RAD teased The Order: 1886 back in 2014. We had the exact same threads back then. Managing expectations is one thing, assuming

RAD made a game that looked beautiful from an artistic perspective. Nowhere in that game was tech somehow pushed. If you work in the industry as a professional, you should attest to that. The same limits in lighting continued the entire generation. There was no new advancement in lighting until RTX came along.

It doesn't "hurt" since I'm not that invested. I see a fact, I point it out. I've been around the industry long enough to know that "insider" information means nothing when it comes to the final release of a movie, or a game for that matter. I'm sure you know the amount of threads we had back when RAD first teased The Order in 2013.

What "fact" are you pointing out?

Lastly, my "inside" information meant everything... and hurt a lot of PS players. It wasn't my fault that what I heard was factual.
 
Last edited:

Sagii86

Member
RAD made a game that looked beautiful from an artistic perspective. Nowhere in that game was tech somehow pushed. If you work in the industry as a professional, you should attest to that. The same limits in lighting continued the entire generation. There was no new advancement in lighting until RTX came along.



What "fact" are you pointing out?

Lastly, my "inside" information meant everything... and hurt a lot of PS players. It wasn't my fault that what I heard was factual.


The fact that you know nothing about how games will look next gen. It's all assumptions.
 

Poordevil

Member
Not a huge fan of photorealism, but I think in gaming it has its place. Racers and flight sims benefit from photorealism. MS Flight Sim 2020 is really creeping up on that vision. Gotta admit it is looking very impressive.

 

VFXVeteran

Banned
The fact that you know nothing about how games will look next gen. It's all assumptions.

Should we go back to the PS4 demos that came out last gen, which all ended up missing from PS4 games the entire generation? Or the fact that Sony devs tell me all those demos were done on PC hardware? I know how a game will look based on the power the hardware can put out, knowing the actual GPU hardware and what the API gives me. How about you?
 
Last edited:

Sagii86

Member
Should we go back to the PS4 demos that came out last gen, which all ended up missing from PS4 games the entire generation? Or the fact that Sony devs tell me all those demos were done on PC hardware? I know how a game will look based on the power the hardware can put out, knowing the actual GPU hardware and what the API gives me. How about you?


We've been there, I don't disagree about PC being the main platform dictating the visuals.
I'm talking about you claiming you know something we don't.
 

Kenpachii

Member
Should we go back to the PS4 demos that came out last gen, which all ended up missing from PS4 games the entire generation? Or the fact that Sony devs tell me all those demos were done on PC hardware? I know how a game will look based on the power the hardware can put out, knowing the actual GPU hardware and what the API gives me. How about you?

I totally agree with your observation. We are nowhere near photorealism, and it will take many, many years, if not decades, to even get there. Everything from Uncharted / Tomb Raider to AC Odyssey still looks very, very cartoony, with limitations everywhere: from grass to object details, to animations, to even particle effects or physics or density or AI, etc.

Pretty much everything.
 

VFXVeteran

Banned
We've been there, I don't disagree about PC being the main platform dictating the visuals.
I'm talking about you claiming you know something we don't.

Ok. If you can't contribute to the discussion with any meaningful data, then you really should leave the thread. I know a lot that you guys don't. Just because I don't put up screenshots of a game that's coming out doesn't mean I don't know how shit will look. But you keep thinking I'm talking out of my ass until the games come out. When the first one appears (assuming it's not being released on PC), you damn sure better have some graphics knowledge to back up your claims.

I can tell you this. Next-gen won't look like the gifs I put up. Period.
 

Sagii86

Member
Ok. If you can't contribute to the discussion with any meaningful data, then you really should leave the thread. I know a lot that you guys don't. Just because I don't put up screenshots of a game that's coming out doesn't mean I don't know how shit will look. But you keep thinking I'm talking out of my ass until the games come out. When the first one appears (assuming it's not being released on PC), you damn sure better have some graphics knowledge to back up your claims.

I can tell you this. Next-gen won't look like the gifs I put up. Period.


No need to get upset. I never said they will. My involvement started when you began debunking the details shown in the Hellblade 2 trailer. We can agree to disagree.
 

PSYGN

Member
When people say photorealistic they are talking in the context of games today, where people may be fooled into believing it's real at a glance. Most of the believability comes down to lighting and inanimate objects. Of course someone like you with a trained eye may see the flaws immediately, but understand most people don't have that eye. Faces are still not there yet; our brains read far more into faces than anything else, for natural reasons. I think some of the Unreal Engine demos we saw would fool many people into thinking they were real if taken as still shots, and as such this thread seems somewhat pompous.
 
Last edited:

VFXVeteran

Banned
When people say photorealistic they are talking in the context of games today, where people may be fooled into believing it's real at a glance. Most of the believability comes down to lighting and inanimate objects. Of course someone like you with a trained eye may see the flaws immediately, but understand most people don't have that eye. Faces are still not there yet; our brains read far more into faces than anything else, for natural reasons. I think some of the Unreal Engine demos we saw would fool many people into thinking they were real if taken as still shots, and as such this thread seems somewhat pompous.

Agreed. I have no problem with that. What I do have a problem with is people who declare things that are completely false. Subjective opinions are welcome, but scientific facts have to be the line in the sand.
 
You are ignoring cost.

The 2080Ti came out about 18 months ago. How much do they cost now? EVGA XC 2080Ti @ $1,338. Mine cost me $1,279. The price went UP over 18 months.

The primary reason next-gen consoles can't keep pace with PC hardware is cost. There may very well be a 58 TFLOPS graphics card in 7 years (I seriously doubt it), but it certainly won't be in a $500 console. In any case, it still wouldn't be enough to render the CG in my post in real time.
The 1060 was around 980 performance. The 2060 was around 1080 performance.

The 3060 is likely to be around 2080 performance at under $300. The 3070 may very well have 2080 Ti performance at under $500. Prices might be a bit higher if AMD doesn't bring something competitive.
 

sneas78

Banned
We are not far off. I don't get why you'd even make a thread about it. I'm expecting Lion King-like graphics by PS7, maybe even before that on PC.
 

rnlval

Member
All,

Every single generation, I get a lot of people who think games will approach the photoreal quality found in CG movies. And every year we get absurd threads claiming cutscenes from exclusive games are within that realm. We even have some developers bold enough to make those claims, knowing that the hardware isn't anywhere close to backing them up. Yet, here we are...

Instead of fighting the opinions to no avail, I'd like to instill a dose of reality based on the "judging" terms you yourselves use to claim these games will look CG (even in cutscenes).

Let's take a look at the cream-of-the-crop rendering from current-gen and next-gen cutscenes and compare it to the best CG from the hit Netflix original series Love, Death & Robots. You should quickly come to the conclusion that we are still far, far away.

(SNIP for post size)


In closing: before everyone gets ahead of themselves and starts making threads about these consoles (or even Ampere-equipped PCs) putting out photoreal, CG-quality games, please take a look at this thread, take a deep breath, and continue to wait for it.



The top picture is a real-life photo and the bottom picture is in-game ray tracing.

 

FranXico

Member
Thanks for the thread, VFXVeteran.
I do feel people haven't been insisting on photorealism too much lately, though. Truth be told, console gamers are always hoping to get graphics closer to CGI each gen, and always settle for what they do get. Art direction will do it for the most part, I guess.
 

SmokSmog

Member
4x a PS5 is around 36-40 TF, right?
Sounds like a big fat number for a console, to be honest... but I'm not an expert.
Maybe GPU technology is going to make huge jumps in 8 years; I think even the most expert people on this planet can't establish this right now, can they?
This year Nvidia will have workstation GPUs with up to 8K cores; an 8K-core GPU at 2 GHz = 32 TFLOPS.
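That figure follows from the standard peak-FP32 arithmetic: cores × clock × 2 FLOPs per cycle (one fused multiply-add). A quick sketch, where the 8,192-core and 2 GHz numbers are just the post's hypothetical, not a confirmed spec:

```python
def peak_tflops(cores: int, clock_ghz: float, flops_per_cycle: int = 2) -> float:
    """Peak FP32 throughput: cores * clock * FLOPs-per-cycle (2 for an FMA)."""
    return cores * clock_ghz * flops_per_cycle / 1000.0

# The post's hypothetical: 8K cores at 2 GHz
print(peak_tflops(8192, 2.0))  # 32.768 -> the "32 Teraflops" in the post
```

Note this is a theoretical peak; as others in the thread point out, real-world performance per TFLOPS varies a lot between architectures.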
 

Reindeer

Member
People are forgetting that the PS6, in 8 years' time, will probably be made on 3nm or less, with an architecture much more efficient than what we have now. So in reality it doesn't need 40-50 TFLOPS, as the efficiency of future architectures, along with evolving graphical tools and things like machine learning, will probably mean you could theoretically have a 25-30 TFLOPS GPU that outperforms a 40-50 TFLOPS GPU based on current architecture. We might have to wait for true photorealism till we get to the PS7 (if consoles still exist then), but I think PS6 games in 8-10 years' time could look almost photorealistic.
 
Last edited:

webber

Member
I love videogames.
I'm impressed by what a mere PS4 can accomplish in COD MW Warzone, TLoUPII, RDR2 etc...
I love Love, Death + Robots.
The closest thing to next-gen graphics I see in LDR is the Shapeshifter episode, excluding the fur on the werewolves.
But that's what makes it interesting and fascinating, the evolution of technologies.
It gets better and better every year and if we look back at what we had 30 years ago...
... I think we're doing a good job.
Can't wait to see what next gen brings.
The main difference between CGI films/animation and games is that they prioritize fixed visuals rather than real-time interaction and gameplay; that's why they look so good.
And as we get closer and closer to that CGI but not "true" CGI style of visuals at playable frame rates I get more and more excited.
 

rnlval

Member
Precomputation is used because the hardware isn't powerful enough to compute it in real time. That's a big disadvantage.



Man, you are really talking out of your mind, dude. Do you work in the industry at all?

Let's see how dumb your statement is. If a 30xx GPU comes with a performance delta of 20 TFLOPS, you are claiming that a PS6 will have a GPU by AMD (who are known for being years behind in tech) doing 60-80 TFLOPS? That's literally a GPU that's not even designed yet by either company. Are you that much of a Sony warrior that you just make up figures in your head to try to belittle the top graphics company in the world? Why? Your claims are so outrageous I'm considering putting you on my /ignore list, because they're just so far from reality.



1080Ti introduced 2017
PS4 introduced 2013
PS5 introduced 2020 at barely 1080Ti levels.

That's a 7-year gap from one gen to the next, using 3-year-old tech. Do the math.



I know devs at ND, and they understand the limitations of consoles and, more importantly, how far away they are from CG. There are a LOT of artists at ND who have worked in the film industry. If you haven't actually created a CG shot or developed code for a CG movie, and then code for a game, you wouldn't understand the large gap between games and CG.


The Sapphire Radeon RX-5700 XT Nitro SE, with a 1996 MHz average clock, has 10.219 TFLOPS.


On Techpowerup's averages, the Sapphire Radeon RX-5700 XT Nitro SE (10.219 TFLOPS average) is slightly faster than the GTX 1080 Ti.
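The 10.219 TFLOPS figure checks out against the same peak-FP32 arithmetic (shaders × clock × 2 FLOPs per cycle). The 2,560 stream processors are the RX 5700 XT's public spec; the 1996 MHz is the average clock quoted above:

```python
shaders = 2560      # RX 5700 XT stream processors (public spec)
clock_mhz = 1996    # average clock quoted in the post above

# 2 FLOPs per shader per cycle (one fused multiply-add)
tflops = shaders * clock_mhz * 2 / 1_000_000
print(f"{tflops:.3f} TFLOPS")  # ~10.22; the quoted 10.219 is the same number truncated
```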
 

sendit

Member
Next gen doesn't start until Sony says it does. They have a machine in the works that will tap into the cerebrum of your brain:



The only thing standing between you and photorealistic graphics is your imagination.

 
Last edited:

Nickolaidas

Member
VFXVeteran.

Creating strawmen in your stead, then acting all defensive and countering them with his own.

Truly next-gen stuff.

Here's the thing.

Games will never match the CGI of their own gen, because CGI will always be ten steps ahead. In 20 years, computer games will look better than 2000's CGI, but worse than 2020's CGI.

CGI is created on crème de la crème PCs, while games (console and PC) are not.

Once more, VFXVeteran, the captain obvious with a messiah complex, attempts to open our eyes by telling us what the world already knows: that games cannot look as good as CGI movies.

Whoop-de-fucking-do.
 

Sushen

Member
I'll take highly stylized, well-art-directed graphics over photorealistic graphics any day of the week. Purely photorealistic graphics are boring to me, as I see them in real life every day.
 

sendit

Member
VFXVeteran.

Creating strawmen in your stead, then acting all defensive and countering them with his own.

Truly next-gen stuff.

Here's the thing.

Games will never match the CGI of their own gen, because CGI will always be ten steps ahead. In 20 years, computer games will look better than 2000's CGI, but worse than 2020's CGI.

CGI is created on crème de la crème PCs, while games (console and PC) are not.

Once more, VFXVeteran, the captain obvious with a messiah complex, attempts to open our eyes by telling us what the world already knows: that games cannot look as good as CGI movies.

Whoop-de-fucking-do.

Agreed. Pre-rendered scenes will always be several steps ahead of real-time graphics, as the bar keeps rising, like you said. To put this in perspective, a single frame in Toy Story 4 took 60 to 160 hours to render.

"Pixar uses its own “render farm,” a massive supercomputer that ranks among the 25 largest computers on the planet. Pixar used a 294-core render farm during production of the first Toy Story. The latest installment employed an astounding 55,000 cores. "

Reference
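For scale, the gap those numbers imply between offline and real-time rendering can be worked out directly. A back-of-the-envelope sketch using the 60-hour-per-frame low end quoted above and film's 24 fps frame budget:

```python
render_hours_per_frame = 60   # low end of the quoted 60-160 hours per frame
film_fps = 24                 # a film frame must display every 1/24 s

realtime_budget_s = 1 / film_fps
gap = render_hours_per_frame * 3600 / realtime_budget_s

# ~5,184,000x: Pixar's offline renders are millions of times over
# a real-time budget, even on a massive render farm.
print(f"Offline rendering here is ~{gap:,.0f}x slower than real time")
```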

Not sure if OP is trolling. :messenger_neutral:
 
Last edited:

Nickolaidas

Member
Not sure if OP is trolling. :messenger_neutral:

He just likes creating his own little debates in order to stroke his ego.

I mean, anyone who looked at The Order: 1886 and said "this looks like real life!" was obviously saying it as hyperbole, as praise for the game's visuals.

When I was 10 and had just gone from playing on my Amstrad CPC 128K to a freakin' Amiga 500, I watched Shadow of the Beast and my jaw hit the floor. I said, "This looks like real life!!" while I knew it didn't, because that was my way of saying how much better the graphics of my recently bought 16-bit computer looked compared to my previous 8-bit one's.

Now VFXVeteran plays word lawyer and attempts to debunk the hyperbole 2-3 people used here, with GIFs and examples and whatnot, in a glorious self-wanking session, ignoring the fact that nobody REALLY disagreed with him, ever.

It's … a little sad, really.
 
Last edited:
In 20 years, computer games will look better than 2000's CGI

Sorry, I need to pick you up on your grammar here. Do you mean CGI from the year 2000 (as your apostrophe implies) or CGI from 2000 to 2009?

If the former, then we're WAY past that now.

Remember the infamous Killzone 2 target render? Killzone 3 basically ended up looking better than that, and Killzone Shadowfall knocks it completely out of the park.
 

Nickolaidas

Member
Sorry, I need to pick you up on your grammar here. Do you mean CGI from the year 2000 (as your apostrophe implies) or CGI from 2000 to 2009?

If the former, then we're WAY past that now.

Remember the infamous Killzone 2 target render? Killzone 3 basically ended up looking better than that, and Killzone Shadowfall knocks it completely out of the park.
I'm talking about CGI movies like the 2000s' Beowulf, not CGI cutscenes in games made in 2000. A PS6 game with AAA production values (i.e. a Naughty Dog game) will probably match Beowulf in terms of graphics.

Real-life graphics are a trickier matter. Anything that has hair is much harder to emulate in real-time graphics, while flat-surfaced objects are much easier. Looking at games like RE2make and RE3make, you can see that simple objects like an old PC monitor are almost flawless, while human hair and animal fur leave a lot to be desired.
 

sneas78

Banned
Agreed. Pre-rendered scenes will always be several steps ahead of real-time graphics, as the bar keeps rising, like you said. To put this in perspective, a single frame in Toy Story 4 took 60 to 160 hours to render.

"Pixar uses its own “render farm,” a massive supercomputer that ranks among the 25 largest computers on the planet. Pixar used a 294-core render farm during production of the first Toy Story. The latest installment employed an astounding 55,000 cores. "

Reference

Not sure if OP is trolling. :messenger_neutral:
I think you and that other joker are going to be dead wrong. You see, right now we don't run to movies or shows to see which movie/show has the best graphics. At some point games will get to that point too. 1 million TF isn't going to do anything a 200 TF can't. It will be like playing Pong on an iPhone 4 vs. an iPhone 15: it won't matter. It will get there. The hardware will get there.
 
Last edited:

ahmyleg

Banned
What's up with some of the crazy people in this thread saying they don't want, or didn't ask for, photorealistic graphics? Why wouldn't you want visuals that look "real", even if they're stylized too? Why would you even care about next gen if you don't care about getting to photorealism? That's been the goal since the early '90s. Sheesh. Yes, graphics are fantastic today and will look even better next gen, but they can be better still. Prob gonna be a long time before photoreal, though.
 