Next-gen hardware (even next-gen PC hardware) graphics are still FAR from photorealistic

VFXVeteran

Banned
All,

Every single generation, I get a lot of people who think games will approach the photoreal quality found in CG movies. And every year we get absurd threads claiming cutscenes from exclusive games are within that realm. We even have some developers bold enough to make those claims, knowing that the hardware isn't anywhere close to backing them up. Yet here we are...

Instead of fighting these opinions to no avail, I'd like to instill a dose of reality based on the same "judging" terms you use when claiming these games will look CG (even in cutscenes).

Let's take a look at the cream-of-the-crop rendering from current-gen and next-gen cutscenes and compare it to the best CG from the hit Netflix original series Love, Death & Robots. You should quickly come to the conclusion that we are still far, far away.

#1 Claim - This console, with optimization (whatever that is), will be able to create cutscenes near CG-level quality (if not matching it) thanks to the enormous geometry bandwidth from the SSD! Look at The Order: 1886.
[GIFs: The Order: 1886 werewolf cutscene]


#1 Fact - It doesn't take a hard stare to see that these shots are too gamey. While the color grading is great, everything else falls apart. The fog in the background is 2D, the werewolf's skin is not accurate, his muscles aren't animated, and his fur doesn't have enough fidelity.

Here's LDR's implementation of a werewolf:

[GIFs: Love, Death & Robots werewolf]


These images render every single hair follicle as geometry curves with multiple verts per strand of hair, multi-bounce GI lighting within the fur, full path tracing, extremely high-resolution textures (with layers), sub-surface scattering, and enough geometry primitives that LOD isn't even needed.
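To give a rough sense of scale, here's a back-of-envelope estimate, in Python, of the memory that strand geometry alone would take. Every number below is an assumption I'm picking for illustration, not a figure from the show:

    # Back-of-envelope: memory for a fur groom stored as curve geometry.
    # All figures are illustrative assumptions, not production numbers.
    strands = 5_000_000        # assumed strand count for a full-body groom
    verts_per_strand = 12      # assumed control verts per hair curve
    bytes_per_vert = 32        # assumed packing: position, tangent, width, UV

    geometry_bytes = strands * verts_per_strand * bytes_per_vert
    print(f"fur geometry alone: {geometry_bytes / 2**30:.2f} GiB")  # ~1.79 GiB

Under those assumptions, one character's fur would eat a huge slice of a console's entire unified memory before you load a single texture, which is why games fake fur with shells, cards, or far sparser strands.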

#2 Claim - PS4 hardware rendered this Spider-Man cutscene with just 1.8 TFLOPS! You declare that this is amazing-looking graphics! Look at the particles and dust! Imagine what it would look like with 10 TFLOPS? 12 TFLOPS?

[GIF: Spider-Man cutscene]


#2 Fact - In this Spider-Man cutscene, the FX are jarringly bad compared to CG. The smoke disappears almost immediately, it isn't fully volumetric, and the particles aren't lit by the same light sources as the scene, so they "pop" out at you.

Here's LDR's implementation of true volumetric lighting, where particles persist and the FX are rendered with very complex algorithms not possible on hardware today (or even tomorrow). A toy sketch of the basic technique follows the GIF below.

[GIF: LDR volumetric lighting]
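For the curious, even the cheapest form of this - single scattering through a uniform fog, marched along the eye ray - looks roughly like the toy Python sketch below. The constants and the uniform-density assumption are mine, not from any particular renderer:

    import math

    def volume_march(ray_o, ray_d, steps=64, step_len=0.1,
                     sigma_t=0.8, sigma_s=0.5):
        """Toy single-scattering march through a uniform medium.
        Real FX sample a 3D density field, shadow-march toward each
        light, and apply a phase function; all omitted here."""
        transmittance = 1.0
        radiance = 0.0
        p = list(ray_o)
        for _ in range(steps):
            p = [p[i] + ray_d[i] * step_len for i in range(3)]
            density = 1.0  # assumed constant everywhere
            radiance += transmittance * sigma_s * density * step_len
            transmittance *= math.exp(-sigma_t * density * step_len)
        return radiance, transmittance

    print(volume_march((0.0, 0.0, 0.0), (0.0, 0.0, 1.0)))

Even this toy version needs dozens of volume samples per pixel; film-quality volumes add a shadow march per light per sample, and that's where real-time budgets collapse.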



#3 Claim - Hellblade 2's detail is incredible. The XSX is capable of CG in cutscenes - look at the facial animation and particles!

[GIF: Hellblade 2 trailer]


#3 Fact - Hellblade 2 is the closest thing to CG I've seen from a trailer, but it still has things that give it away as a game. Look at the big pixels in the particles. The fog in the background doesn't have enough range either - you can see the transition - and it's a 2D effect. The facial animation is incredible, though (it uses mocap).

Here's LDR's implementation with all of the tricks forced by less powerful hardware removed. The rendering quality appears close when you compare face to face, but notice the lighting - the difference is remarkable. Check out the volumetric lighting and how real particles are supposed to look (not the colored circles of the old Quake days). The blood render simply won't be possible in real time for at least a decade: those are real fluid simulations with sub-surface scattering, using implicit surfaces that blend spherical primitives together, ray-marching to get the final material color, and fully physics-based. (A toy sketch of the implicit-surface idea follows the GIFs.)

[GIFs: LDR face and blood fluid-simulation shots]
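If you're wondering what "implicit surfaces that blend spherical primitives" means, it's the classic metaball idea: sum a smooth falloff from every sphere and ray-march until the blended field crosses an iso-level. A toy sketch (blob placement and falloff are my own illustrative choices, not the show's solver):

    # Toy metaball field: spheres blended by summed falloffs.
    BLOBS = [((0.0, 0.0, 5.0), 1.0), ((0.8, 0.3, 5.2), 0.7)]  # (center, radius)

    def field(p):
        # Inverse-square falloff per blob; the surface sits at iso-level 1.0.
        total = 0.0
        for (cx, cy, cz), r in BLOBS:
            d2 = (p[0] - cx)**2 + (p[1] - cy)**2 + (p[2] - cz)**2
            total += r * r / max(d2, 1e-8)
        return total

    def ray_march(origin, direction, t_max=20.0, step=0.02):
        # Walk the ray until the blended field crosses the iso-level.
        t = 0.0
        while t < t_max:
            p = [origin[i] + direction[i] * t for i in range(3)]
            if field(p) >= 1.0:
                return t
            t += step
        return None

    hit = ray_march((0.0, 0.0, 0.0), (0.0, 0.0, 1.0))
    print(f"surface hit at t = {hit:.2f}" if hit else "miss")

The film version then shades that surface with sub-surface scattering and drives the blobs with an actual fluid solve - each of those multiplies the cost again.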


#4 Claim - Cyberpunk will lead the way for next-gen consoles with ray-tracing and incredible lighting.

[GIFs: Cyberpunk 2077]


#4 Fact - While Cyberpunk has excellent art direction, great use of color, and incredible level design, it is still restricted by the hardware when it comes to truly accurate path-traced lighting and expensive shaders.

Again, here is what we truly want: hyper-real scenes and excellent-quality materials that are physically accurate.

[GIFs: LDR hyper-real scenes]



Here is a scene in LDR that could *possibly* make it to next-gen targets as gameplay - but it's highly unlikely!

[GIF: LDR scene]



In closing: before everyone gets ahead of themselves and starts making threads about these consoles (or even Ampere-equipped PCs) putting out photoreal, CG-quality games, please take a look at this thread, take a deep breath, and continue to wait for it.
 

Mista

Banned
Oh yeah definitely they're far from being photorealistic and that alone is brilliant. I don't play games for photorealistic looks

Games today look magnificent and they'll look even more beautiful next-gen. I don't need photorealism in my games

I like it the way it is because even if they are not photorealistic, they still look so damn impressive, and that is enough for me

Also, I haven't seen anyone here talk about anything being photorealistic, and I am here every day, so this thread is a bit weird? But I appreciate the detailed explanation

Back to topic. Game graphics have their own uniqueness, so asking for photorealistic graphics is dumb. No one can tell me that this doesn't look GREAT

All pictures taken by me:

[Six screenshots]

In other words, fuck photorealism when I can get all of this.
 

Captain Hero

The Spoiler Soldier
Oh yeah definitely they're far from being photorealistic and that alone is brilliant. I don't play games for photorealistic looks

Games today look magnificent and they'll look even more beautiful next-gen. I don't need photorealism in my games

I like it the way it is because even if they are not photorealistic, they still look so damn impressive, and that is enough for me

Also, I haven't seen anyone here talk about anything being photorealistic, and I am here every day, so this thread is a bit weird? But I appreciate the detailed explanation

Back to topic. Game graphics have their own uniqueness, so asking for photorealistic graphics is dumb. No one can tell me that this doesn't look GREAT

All pictures taken by me:

[Six screenshots]

In other words, fuck photorealism when I can get all of this.


That's what we all want, really... not photorealism! PS and XB are offering the best games in the gaming industry, and guess what? No photorealism.
 

VFXVeteran

Banned
Oh yeah definitely they're far from being photorealistic and that alone is brilliant. I don't play games for photorealistic looks

Games today look magnificent and they'll look even more beautiful next-gen. I don't need photorealism in my games

I like it the way it is because even if they are not photorealistic, they still look so damn impressive, and that is enough for me

Also, I haven't seen anyone here talk about anything being photorealistic, and I am here every day, so this thread is a bit weird? But I appreciate the detailed explanation

Back to topic. Game graphics have their own uniqueness, so asking for photorealistic graphics is dumb. No one can tell me that this doesn't look GREAT

All pictures taken by me:

[Six screenshots]

In other words, fuck photorealism when I can get all of this.


OK, let's put it another way: ANY kind of realism is far from CG. The last example I gave, with Cyberpunk, is hyper-realism, and there is still nothing out there that will touch offline rendering. That's my point, actually: the hardware just isn't there yet. Games will look great for games. But you'll get some knuckleheads making threads about the exclusives rivaling CG... it's only a matter of time.
 

Mista

Banned
OK, let's put it another way: ANY kind of realism is far from CG. The last example I gave, with Cyberpunk, is hyper-realism, and there is still nothing out there that will touch offline rendering. That's my point, actually: the hardware just isn't there yet. Games will look great for games. But you'll get some knuckleheads making threads about the exclusives rivaling CG... it's only a matter of time.
Yeah, and again, nobody asked for it. So you're coming from the future. Maybe we will get people making threads about it and asking for it, but for now I haven't seen anyone actually wanting photorealistic graphics or anything close. Of course we don't mind superbly made cutscenes, but as for gameplay, nobody wants that, because I'm sure people know it would only take away what actually makes games games. But yeah, I feel we will have some people asking for it next-gen.
 
All,

Every single generation, I get a lot of people who think games will approach the photoreal quality found in CG movies. And every year we get absurd threads claiming cutscenes from exclusive games are within that realm. We even have some developers bold enough to make those claims, knowing that the hardware isn't anywhere close to backing them up. Yet here we are...

Instead of fighting these opinions to no avail, I'd like to instill a dose of reality based on the same "judging" terms you use when claiming these games will look CG (even in cutscenes).

Let's take a look at the cream-of-the-crop rendering from current-gen and next-gen cutscenes and compare it to the best CG from the hit Netflix original series Love, Death & Robots. You should quickly come to the conclusion that we are still far, far away.

#1 Claim - This console, with optimization (whatever that is), will be able to create cutscenes near CG-level quality (if not matching it) thanks to the enormous geometry bandwidth from the SSD! Look at The Order: 1886.
[GIFs: The Order: 1886 werewolf cutscene]


#1 Fact - It doesn't take a hard stare to see that these shots are too gamey. While the color grading is great, everything else falls apart. The fog in the background is 2D, the werewolf's skin is not accurate, his muscles aren't animated, and his fur doesn't have enough fidelity.

Here's LDR's implementation of a werewolf:

[GIFs: Love, Death & Robots werewolf]


These images render every single hair follicle as geometry curves with multiple verts per strand of hair, multi-bounce GI lighting within the fur, full path tracing, extremely high-resolution textures (with layers), sub-surface scattering, and enough geometry primitives that LOD isn't even needed.
Except that CG wolf looks far more cartoony and fake. I'm sure The Scorpion King is up there too, but you can call it art direction or whatever - there are many games that look significantly better than The Scorpion King.

It's like some of the effects in Endgame: the Spider-Man armor and the Iron Man armor looked rushed - videogamey and fake looking.
The blood render simply won't be possible in real time for at least a decade: those are real fluid simulations with sub-surface scattering, using implicit surfaces that blend spherical primitives together, ray-marching to get the final material color, and fully physics-based.


[GIF: LDR blood simulation]
Maybe it will take a decade, but a lot of things can be precomputed, and there are plenty of advances in real-time graphics.









In closing: before everyone gets ahead of themselves and starts making threads about these consoles (or even Ampere-equipped PCs) putting out photoreal, CG-quality games, please take a look at this thread, take a deep breath, and continue to wait for it.
No one's talking about everything being photoreal. Right now there are architectural visualizations in Unreal that basically look photoreal. With photogrammetry, objects and locations can look extremely realistic.



Look at the environments in Project Mara, for instance.



Long flowing hair will likely not look very realistic for the foreseeable future. Real-time fluid simulations are limited, although there have been some advances in that area. Precomputed animation and lighting might be possible for hair and fluids, though.

The thing is, even Pixar did not use ray tracing broadly until 2013; many studios likely used it in a limited capacity before then. Yet by the PS6 we can expect most games to make heavy use of ray tracing. Techniques like subsurface scattering were missing even from Hollywood CG not so long ago, yet they have been adapted for real-time use.
 
OK, let's put it another way: ANY kind of realism is far from CG. The last example I gave, with Cyberpunk, is hyper-realism, and there is still nothing out there that will touch offline rendering. That's my point, actually: the hardware just isn't there yet. Games will look great for games. But you'll get some knuckleheads making threads about the exclusives rivaling CG... it's only a matter of time.

Cyberpunk 2077 is a relevant case study, because back when they premiered the reveal trailer, around 2012, one of their alleged goals was to eventually deliver near-CGI graphical fidelity.

Some eight years on, what they have delivered seems to fall short. You could argue the E3 2018 trailer's initial subway shot comes close enough. Maybe it's a cutscene, because all the gameplay we've seen seems to belong to a previous computational era by comparison. With each showing, the game looks less impressive. I do think claims of yet another downgrade are legitimate, though debatable and to be determined. Hopefully I'll be proven wrong once the PR barrage restarts.

The art direction is stylized, taking what I assume is a very conscious and deliberate step away from hyperrealism. Whether that decision was purely creative, or factored in their entirely understandable inability to deliver on that naïve 2012 promise with 2014 hardware, only the devs can answer in full. When expectations meet reality, reality tends to win.

We'll likely get incremental progress towards photorealism. Illumination finally seems to be under control, but animation, real-time soft-body physics, etc., still betray the trademark shortcomings associated with videogames. By the end of the generation the standard will have moved upward, but so will expectations. Surely you remember the faces and comments of the folks coming out of Bethesda's first public Oblivion demos, in awe, the glare of what they thought were lifelike graphics still in their retinas.

In the end, it is as legitimate for gamers to ask for photorealism as it is to ask for credible AI or innovative gameplay. It is entirely legitimate to express one's aspirations. Then it's up to the AAA studios to decide whether that's the way they want to go or not.
 
Last edited:

GymWolf

Member
not to be that guy, but no shit we are still absurdly far from photorealism or modern cg quality...

nice analysis tho, great choice of gifs.
 
Last edited:

DESTROYA

Member
I might be in the minority with this opinion, but I think games already look pretty amazing. I don't think I would care for photorealism in my games.
 

hyperbertha

Member
The claim is that SSDs will get us far closer to CG in a way TFLOPS just won't. Like the ND dev said, you can render each hair and wrinkle on Drake's face in a scene when you can stream 7+ GB of data just for the next second. Of course, pure photorealism will need far more, like full ray tracing and real-time hair physics simulation, etc.
 

bitbydeath

Member
In closing: before everyone gets ahead of themselves and starts making threads about these consoles (or even Ampere-equipped PCs) putting out photoreal, CG-quality games, please take a look at this thread, take a deep breath, and continue to wait for it.

This thread largely covers only the current gen, though. We've seen almost nothing of next-gen yet.
 

Stuart360

Member
not to be that guy, but no shit we are still absurdly far from photorealism or modern cg quality...

nice analysis tho, great choice of gifs.
We are many gens away from movie CGI, never mind pure photorealism.

AVATAR is 11 years old now, and I bet it will be at least 3 or 4 more gens before we get, say, an open-world game that looks as good as this -

[GIFs: Avatar film shots]
 

GymWolf

Member
before real photorealism we need a couple of gens of MASSIVE improvement in interaction and destruction.

seeing all these realistic but untouchable environments literally breaks immersion more than shitty hair or lighting.

i want to shoot a wall with a shotgun and use the debris as projectiles for my slingshot just because i fucking want/can; this is gonna increase gameplay choice more than ever while looking fucking cool in the meantime.

also more realistic and dynamic animations and interaction between soft bodies and the environment; for now it's all stiff and unrealistic even in games with advanced skeletal systems like rdr2.

and some big steps in AI; seeing enemies and npcs moving like stiff morons is another huge turn-off for immersion, maybe more than anything really.
 
Last edited:

GymWolf

Member
We are many gens away from movie CGI, never mind pure photorealism.

AVATAR is 11 years old now, and I bet it will be at least 3 or 4 more gens before we get, say, an open-world game that looks as good as this -

[GIFs: Avatar film shots]
that's why i say "absurdly far". :lollipop_squinting:

also avatar can be a miracle of cg, but to me it looks like shit; i don't know why, maybe an art-design thing.
 

bitbydeath

Member
that's why i say "absurdly far". :lollipop_squinting:

also avatar can be a miracle of cg, but to me it looks like shit; i don't know why, maybe an art-design thing.

That first image alone shows a really long neck that then cuts to a normal-sized neck when the shot changes to the side.
 

VFXVeteran

Banned
Except that CG wolf looks far more cartoony and fake.

Are you serious?

Long flowing hair will likely not look very realistic for the foreseeable future. Real-time fluid simulations are limited, although there have been some advances in that area. Precomputed animation and lighting might be possible for hair and fluids, though.

Precomputed isn't real-time. We are going for real-time here.

The thing is, even Pixar did not use ray tracing broadly until 2013; many studios likely used it in a limited capacity before then. Yet by the PS6 we can expect most games to make heavy use of ray tracing. Techniques like subsurface scattering were missing even from Hollywood CG not so long ago, yet they have been adapted for real-time use.

The PS6 will not even have the bandwidth to render these CG graphics in real time. Even if you assume a PS6 with an Ampere 30xx, it still won't be enough. You are grossly underestimating the amount of data it takes to make CG. Perhaps you could take a course on implementing CG FX (e.g., in Maya or Houdini) to see how far away we are, because just telling some of you isn't enough for you to believe it.
 

VFXVeteran

Banned
This thread largely covers only the current gen, though. We've seen almost nothing of next-gen yet.

I showed you Hellblade 2, which for now destroys anything we've seen of next-gen. And it's a cutscene.

You guys need to look at some hard data from these CG shots. I'll try to get some for you when the ridiculous "this game looks CG" threads start to roll in. Otherwise it's really hard for you to imagine how much data moves per frame in these CG shots.
 
Last edited:
Those claims are PR buzzwords (Nvidia promising "Toy Story graphics", same with the PS2, etc.) and nothing more (yes, I have zero respect for PR).

Thanks op for your explanations. 👍👏
 

VFXVeteran

Banned
The claim is that SSDs will get us far closer to CG in a way TFLOPS just won't. Like the ND dev said, you can render each hair and wrinkle on Drake's face in a scene when you can stream 7+ GB of data just for the next second. Of course, pure photorealism will need far more, like full ray tracing and real-time hair physics simulation, etc.

An SSD isn't going to get us closer to CG than more RAM will. That's just a fact. DDR5 RAM moves roughly 20-30 GB/s vs. 5.5 GB/s for Sony's SSD. Why do you keep focusing on streaming data when the absolute preference is having that data already in RAM (which is much, much faster than the SSD) so you don't need to fetch it at all? In fact, the ultimate solution is to have all the data resident in GDDR VRAM, which moves at > 600 GB/s.
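Do the per-frame math yourself. At 30 FPS, using the rough figures above (the DDR number is my assumed midpoint of the 20-30 GB/s range):

    # Rough per-frame data budget at 30 fps for each memory tier.
    # Bandwidth figures (GB/s) are the approximate ones quoted above.
    tiers = {"PS5 SSD": 5.5, "DDR system RAM": 25.0, "GDDR VRAM": 600.0}

    FPS = 30
    for name, gb_per_s in tiers.items():
        mb_per_frame = gb_per_s * 1024 / FPS
        print(f"{name:>14}: {mb_per_frame:8.0f} MB per frame")
    #        PS5 SSD:      188 MB per frame
    # DDR system RAM:      853 MB per frame
    #      GDDR VRAM:    20480 MB per frame

Streaming helps you refill memory between shots; it doesn't change what the GPU can actually touch each frame.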
 
Last edited:

GymWolf

Member
I showed you Hellblade 2, which for now destroys anything we've seen of next-gen. And it's a cutscene.

You guys need to look at some hard data from these CG shots. I'll try to get some for you when the ridiculous "this game looks CG" threads start to roll in. Otherwise it's really hard for you to imagine how much data moves per frame in these CG shots.
still, you know better than me that by the end of the gen hellblade 2 is gonna be "cute" and nothing more; we saw the jump during this gen from the first titles to the end-of-era titles.
nobody was expecting shit like tsushima or tlou2 on a 1.8tf machine at the beginning of this gen, literally NOBODY.

doesn't change your point, it's just to add some context.
 
Last edited:

VFXVeteran

Banned
still, you know better than me that by the end of the gen hellblade 2 is gonna be "cute" and nothing more; we saw the jump during this gen from the first titles to the end-of-era titles.
nobody was expecting shit like tsushima or tlou2 at the beginning of this gen, literally NOBODY.

That's a huge fallacy. There is nothing technical that differentiates the late games of this gen from the early ones.

Uncharted 4's techniques for SSS, baked lighting, etc. were already in use in The Order: 1886. The tech was a constant throughout the entire generation, and it will be the same next generation. The little bumps in speed mid-gen only let games render at higher resolutions or faster frame rates.
 
Last edited:

GymWolf

Member
That's a huge fallacy. There is nothing technical that differentiates the late games of this gen from the early ones.

Uncharted 4's techniques for SSS, baked lighting, etc. were already in use in The Order: 1886. The tech was a constant throughout the entire generation, and it will be the same next generation. The little bumps in speed mid-gen only let games render at higher resolutions or faster frame rates.
so you are telling me that nothing in the next gen is gonna be better than hellblade 2? because killzone 4 and infamous 3 right now look cute and nothing more in their respective genres.
serious doubt, but whatever dude; we are gonna see who is wrong when the first next-gen games from the wizards of the hardware come out.
 
Last edited:

bitbydeath

Member
I showed you Hellblade 2, which for now destroys anything we've seen of next-gen. And it's a cutscene.

You guys need to look at some hard data from these CG shots. I'll try to get some for you when the ridiculous "this game looks CG" threads start to roll in. Otherwise it's really hard for you to imagine how much data moves per frame in these CG shots.

And that’s from a small dev at the beginning of the gen. Not the big guns.
 
Precomputed isn't real-time. We are going for real-time here.
Except it doesn't matter. Precomputation is being used to allow interactive real-time graphics with far higher fidelity.

With precomputation you can have interactive simulations, such as deformations, running up to 2,000 times faster. Suddenly something that takes a minute per frame to render can run at 30 fps interactively.
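The pattern is simple: pay the cost once offline, then just sample the result at runtime. A toy illustration of the precompute-then-sample idea (the "expensive solve" is a stand-in I made up, not any real deformation solver):

    import math

    # Stand-in for an expensive per-sample solve (say, one deformation step).
    def expensive_solve(x):
        return sum(math.sin(x * k) / k for k in range(1, 2000))

    # Offline: bake the solve into a table once.
    N = 512
    table = [expensive_solve(i / (N - 1) * math.pi) for i in range(N)]

    # Runtime: linear interpolation into the baked table, O(1) per query.
    def baked_solve(x):
        f = x / math.pi * (N - 1)
        i = min(int(f), N - 2)
        t = f - i
        return table[i] * (1 - t) + table[i + 1] * t

    x = 1.234
    print(expensive_solve(x), baked_solve(x))  # near-identical, tiny cost

Real systems bake much fancier things (reduced deformation bases, light probes), but the trade is the same: memory and authoring time in exchange for per-frame compute.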


The PS6 will not even have the bandwidth to render these CG graphics in real time. Even if you assume a PS6 with an Ampere 30xx, it still won't be enough. You are grossly underestimating the amount of data it takes to make CG. Perhaps you could take a course on implementing CG FX (e.g., in Maya or Houdini) to see how far away we are, because just telling some of you isn't enough for you to believe it.
You know I was talking about the PS6, not the PS5, in that quote, right?

The PS6 is likely to be 3-4x as powerful as Nvidia's 30xx series. With cost-per-transistor drops likely returning thanks to EUV, it might have 32-64 GB of RAM and over 1 TB/s of bandwidth.
edit:
Otherwise it's really hard for you to imagine how much data moves per frame in these CG shots.
The biggest Transformer, Devastator, had 32 GB of textures. Keep in mind that such a level of detail is likely the highest LOD; at any time only part of the Transformer is likely to be on screen when he's close, and when he's far away a lower LOD can be used.

Toy Story 2 had between 4 and 40 million polygons per frame. I've heard some PS4 titles have more than 10 million polygons per frame.
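And the LOD point is just distance-based selection, something like this toy rule (the thresholds and asset names are invented for illustration):

    # Toy distance-based LOD pick: only the detail the shot can use is needed.
    LODS = [(10.0, "LOD0: full 32 GB hero asset"),
            (50.0, "LOD1: reduced mesh, 4K mips"),
            (float("inf"), "LOD2: proxy mesh, 1K mips")]

    def pick_lod(distance_m):
        for max_dist, lod in LODS:
            if distance_m <= max_dist:
                return lod

    for d in (5.0, 30.0, 200.0):
        print(f"{d:6.1f} m -> {pick_lod(d)}")

So the peak size of an asset says little about what has to be resident in any single frame.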
That's a huge fallacy. There is nothing technical that differentiates the late games of this gen from the early ones.
Let's look at last gen for an example of the changes from early games to later games.
 
Last edited:

VFXVeteran

Banned
so you are telling me that nothing in the next gen is gonna be better than hellblade 2?
serious doubt, but whatever dude.

The Hellblade 2 scene was a cutscene. The game won't look anywhere near like that.

Remember all the cutscenes in this generation of games? They ALL transitioned from really high-quality rendering to low, "gamey" quality during gameplay. It is a huge step down. I'm telling you that even the Hellblade 2 cutscene won't be matched in gameplay, let alone bettered, at the end of the generation. We developers have been given the hardware as it is now; we will push it to whatever we can do now, not later. Remember, there is no large learning curve like in the PS3 days - APIs are much better and drivers are friendlier.
 

GymWolf

Member
Are you serious?



Precomputed isn't real-time. We are going for real-time here.



The PS6 will not even have the bandwidth to render these CG graphics in real time. Even if you assume a PS6 with an Ampere 30xx, it still won't be enough. You are grossly underestimating the amount of data it takes to make CG. Perhaps you could take a course on implementing CG FX (e.g., in Maya or Houdini) to see how far away we are, because just telling some of you isn't enough for you to believe it.
wait a moment, you think that a ps6 in 6-7-8 years is gonna have inside it a gpu that comes out in a couple of months, in 2020?

and how do you explain the xsx having a gpu almost on par with a 2080 super, the second most powerful gpu on the market today??

did i misunderstand that 30xx you wrote?
 
Last edited:

GymWolf

Member
The Hellblade 2 scene was a cutscene. The game won't look anywhere near like that.

Remember all the cutscenes in this generation of games? They ALL transitioned from really high-quality rendering to low, "gamey" quality during gameplay. It is a huge step down. I'm telling you that even the Hellblade 2 cutscene won't be matched in gameplay, let alone bettered, at the end of the generation. We developers have been given the hardware as it is now; we will push it to whatever we can do now, not later. Remember, there is no large learning curve like in the PS3 days - APIs are much better and drivers are friendlier.
dude, some real-time cutscenes in tlou2 don't look light-years away from that H2 cutscene (especially the faces and facial animation); you really underestimate people like ND or santa monica with 10 tf under their asses...
 
Last edited:

JordanN

Banned
Every single generation, I get a lot of people who think games will approach the photoreal quality found in CG movies. And every year we get absurd threads claiming cutscenes from exclusive games are within that realm.
CGI is always a moving target.
I don't think there's any doubt that games at least look better than a lot of 90s-era CG.

[Image: 90s-era CG comparison]



It's only when you get to around the 2000s that the gap still reveals itself.

But I would argue it comes down to art style and budget. This scene is "technically" dated, but artistically speaking, it's hard to find many games that put Star Wars to shame, just based on George Lucas's creativity.

[Image: Star Wars scene]
 
Last edited:

VFXVeteran

Banned
Except it doesn't matter. Precomputation is being used to allow interactive real-time graphics with far higher fidelity.

Precomputation is used because the hardware isn't powerful enough to compute it in real time. That's a big disadvantage.

The PS6 is likely to be 3-4x as powerful as Nvidia's 30xx series. With cost-per-transistor drops likely returning thanks to EUV, it might have 32-64 GB of RAM and over 1 TB/s of bandwidth.

Man, you are really talking out of your mind, dude. Do you work in the industry at all?

Let's see how dumb that statement is. If a 30xx GPU comes in with a performance delta of 20 TFLOPS, you are claiming that a PS6 will have a GPU from AMD (which is known for being years behind in tech) doing 60-80 TFLOPS? That's literally a GPU that hasn't even been designed yet by either company. Are you that much of a Sony warrior that you just make up figures in your head to belittle the top graphics company in the world? Why? Your claims are so outrageous that I'm considering putting you on my /ignore list, because they're just so far from reality.

wait a moment, you think that a ps6 in 6-7-8 years is gonna have inside it a gpu that comes out in a couple of months, in 2020?

and how do you explain the xsx having a gpu almost on par with a 2080 super, the second most powerful gpu on the market today??

1080 Ti introduced in 2017
PS4 introduced in 2014
PS5 introduced in 2020 at barely 1080 Ti levels.

That's a 6-year gap from one gen to the next, using 3-year-old tech. Do the math.

dude, some real-time cutscenes in tlou2 don't look light-years away from that cutscene; you really underestimate people like ND or santa monica with 10 tf under their asses...

I know devs at ND, and they understand the limitations of consoles and, more importantly, how far away they are from CG. There are a LOT of artists at ND who have worked in the film industry. If you haven't actually created a CG shot or written code for a CG movie and then code for a game, you wouldn't understand the large gap between games and CG.
 

GymWolf

Member
Precomputation is used because the hardware isn't powerful enough to compute it in real time. That's a big disadvantage.



Man, you are really talking out of your mind, dude. Do you work in the industry at all?

Let's see how dumb that statement is. If a 30xx GPU comes in with a performance delta of 20 TFLOPS, you are claiming that a PS6 will have a GPU from AMD (which is known for being years behind in tech) doing 60-80 TFLOPS? That's literally a GPU that hasn't even been designed yet by either company. Are you that much of a Sony warrior that you just make up figures in your head to belittle the top graphics company in the world? Why? Your claims are so outrageous that I'm considering putting you on my /ignore list, because they're just so far from reality.



1080 Ti introduced in 2017
PS4 introduced in 2014
PS5 introduced in 2020 at barely 1080 Ti levels.

That's a 6-year gap from one gen to the next, using 3-year-old tech. Do the math.



I know devs at ND, and they understand the limitations of consoles and, more importantly, how far away they are from CG. There are a LOT of artists at ND who have worked in the film industry. If you haven't actually created a CG shot or written code for a CG movie and then code for a game, you wouldn't understand the large gap between games and CG.
but the hellblade 2 cutscene is not cg; microsoft people talked about real time, or in-engine at worst...

also there is no way in hell the ps6 will have a gpu like the 30xx series in 2027.
maybe not 4x more powerful like the other member said, but much, much better than a 30xx, cmon...

and i'm not sure about this one, but i think the gpu inside the ps5 is more similar to a 2070 super than a 1080 ti in terms of power and architecture; it has ray tracing, rdna2 and other modern features.
 
Last edited:

Azelover

Titanic was called the Ship of Dreams, and it was. It really was.
I wish they'd change course a little bit. I don't just want photorealistic graphics.

I'd like to see a standard controller with some true innovation to it, or simply the return of some really neat controller features, like the IR sensor the Wii used... I'd love to have that incorporated into a modern console. It's like we went backwards when it was abandoned. There are certain games that can't be played anymore for that reason.
 

Sagii86

Member
The Hellblade 2 scene was a cutscene. The game won't look anywhere near like that.

Remember all the cutscenes in this generation of games? They ALL transitioned from really high-quality rendering to low, "gamey" quality during gameplay. It is a huge step down. I'm telling you that even the Hellblade 2 cutscene won't be matched in gameplay, let alone bettered, at the end of the generation. We developers have been given the hardware as it is now; we will push it to whatever we can do now, not later. Remember, there is no large learning curve like in the PS3 days - APIs are much better and drivers are friendlier.


That's the point where you start assuming things based on nothing. You don't have any evidence showing it won't look like that, or even better, from some other talented studio along the way. Let's just wait and see before we jump to conclusions.
 
Last edited:

VFXVeteran

Banned
but the hellblade 2 cutscene is not cg; microsoft people talked about real time, or in-engine at worst...

It's "captured", which basically means it was recorded at 24 FPS - not indicative of real time. We don't know what resolution, nor any of the specifics of what was used. The gameplay will most certainly NOT look like it.
 

VFXVeteran

Banned
That's the point where you start assuming things based on nothing. You don't have any evidence showing it won't look like that, or even better, from some other talented studio along the way. Let's just wait and see before we jump to conclusions.

I know the producer of that demo. He works for Epic Games (along with others I know who work there). Let's just leave it at that.
 

GymWolf

Member
It's "captured", which basically means it was recorded at 24 FPS - not indicative of real time. We don't know what resolution, nor any of the specifics of what was used. The gameplay will most certainly NOT look like it.
i added some things to my post, please read; this discussion is interesting to say the least.
 

Sagii86

Member
It's "captured", which basically means it was recorded at 24 FPS - not indicative of real time. We don't know what resolution, nor any of the specifics of what was used. The gameplay will most certainly NOT look like it.

Many trailers this gen were recorded at 24 FPS, mainly for marketing purposes, some with much less visual fidelity. It doesn't indicate anything.
 

VFXVeteran

Banned
Many trailers this gen were recorded at 24 FPS, mainly for marketing purposes, some with much less visual fidelity. It doesn't indicate anything.

Do you know what the capture process entails? Have you loaded up the UE4 Editor, made a shot, and then captured it?
 
The Hellblade 2 scene was a cutscene. The game won't look anywhere near like that.

Remember all the cutscenes in this generation of games? They ALL transitioned from really high-quality rendering to low, "gamey" quality during gameplay. It is a huge step down. I'm telling you that even the Hellblade 2 cutscene won't be matched in gameplay, let alone bettered, at the end of the generation. We developers have been given the hardware as it is now; we will push it to whatever we can do now, not later. Remember, there is no large learning curve like in the PS3 days - APIs are much better and drivers are friendlier.

To be honest, the transition from cutscene to gameplay is super smooth in God of War, and as far as I can tell the quality is the same.
Man, you are really talking out of your mind, dude. Do you work in the industry at all?

Let's see how dumb that statement is. If a 30xx GPU comes in with a performance delta of 20 TFLOPS, you are claiming that a PS6 will have a GPU from AMD (which is known for being years behind in tech) doing 60-80 TFLOPS? That's literally a GPU that hasn't even been designed yet by either company. Are you that much of a Sony warrior that you just make up figures in your head to belittle the top graphics company in the world? Why? Your claims are so outrageous that I'm considering putting you on my /ignore list, because they're just so far from reality.
The year the PS4 released, the 780 card released (the equivalent of the 30xx for this gen). That had 4 TFLOPS of compute. The PS5 is being compared to the Radeon VII, with 13.8 TFLOPS of GCN compute, and the Series X to even more. And that's ignoring that RDNA is likely more efficient at gaming per TFLOP. That's 3x the flops for the Series X (12 TFLOPS vs. 4), and nearly 3x for the PS5 (10 TFLOPS vs. 4). If they were using GCN tech they could easily be 14-15+ TFLOPS, but of course they'd be less efficient.

Keep in mind there was a significant delay in EUV in the intervening years, and thus a diminished rate of progress. EUV now seems to be fully working, and the rate of progress is back up again.

A PS6 having 3-4x the performance of a 3080 is not out of the question if we keep 8-year console generations (cycles).
 
Last edited: