
Digital Foundry Forspoken analysis; hint: it sucks

Buggy Loop

Member
Wow, seems we can't even poke fun a bit about all that PS5 I/O craze without a knight jumping in to delete posts/threads. Where's the thread that discussed DirectStorage for Forspoken that was made just a few hours ago?

We had mountains of magical expectations for that tech, and claims that nobody could match its dedicated I/O. What fun is it if we can't even keep the receipts?
 

Deerock71

Member
That has to be the LONGEST list of bullet points I have EVER seen!
Im Out GIF
 

Stiltzkin88

Neo Member
I think people are missing the point with DStorage on PC being slightly faster than PS5. It's CPU bound (data decompression), not SSD bound (data throughput). An example being Metro Exodus, which is very CPU bound when loading a chapter and in turn is faster on PC than on console.
Mark Cerny stated that the custom I/O work put in would be the equivalent of an additional 11 Zen 2 cores doing the work on top of the 8 (possibly 7 max for game use) in the CPU. This game is clearly focused on decompressing data on the CPU. The 12900K is still very high-end, out of the vast majority of PC gamers' reach, and is magnitudes faster than the PS5 CPU on raw performance, excluding I/O stuff.

Combined with DStorage and this class of CPU ($400 to $600) with 16 very, very fast cores, it's only barely perceptibly faster than a PS5 with 8 much slower cores. I'm more impressed with the console at that price/performance, as this is the amount of brute force needed just to be barely faster when it should be a lot faster in reality (see above).
That custom I/O is doing something good for data decompression.

One more thing: I'm hearing that GPU decompression will make things even faster, and that is correct, but not in this game, as it simply isn't taking advantage of any large pools of data being moved that require super-fast storage access, on either the PC or PS5, as shown by the SSDs of choice being used.

It's important that people don't get ahead of themselves, as the PS5 is barely being used as is, mostly in data throughput. No engine is taking full advantage of complex compression algorithms on PS5, Xbox, or PC due to the long cross-gen period. Only a few games have tried to use it to any degree, like the new Ratchet game at launch.
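The CPU-bound vs SSD-bound distinction can be made concrete with a toy pipeline model. This is only a sketch; all sizes and throughputs below are illustrative placeholders, not numbers from DF's testing:

```python
# Toy model of level loading: data is read from the SSD and then
# decompressed on the CPU. Once the pipeline is full, whichever
# stage is slower dominates the total load time.

def load_time_seconds(compressed_gb, ssd_gbps, decompress_gbps):
    """Time for a two-stage read+decompress pipeline: the slower
    stage is the bottleneck."""
    return compressed_gb / min(ssd_gbps, decompress_gbps)

# Hypothetical scenario: 4 GB of compressed level data.
level = 4.0

# Fast Gen4 SSD (7 GB/s) but CPU decompression limited to ~2 GB/s:
cpu_bound = load_time_seconds(level, ssd_gbps=7.0, decompress_gbps=2.0)

# Same SSD, decompression offloaded to dedicated hardware (~9 GB/s):
io_bound = load_time_seconds(level, ssd_gbps=7.0, decompress_gbps=9.0)

print(f"CPU-bound load: {cpu_bound:.2f}s")  # 2.00s, SSD speed barely matters
print(f"I/O-bound load: {io_bound:.2f}s")   # 0.57s, now the SSD is the limit
```

In the CPU-bound case, swapping in an even faster SSD changes nothing, which is consistent with the Metro Exodus observation above.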
 

jimmypython

Member
So what was with all the tech demos they released on PC over the years to showcase the engine, when the game doesn't even work well on PC? Just lies?

I think it is time to get rid of this engine, which may not bode well for the studio named after it lol
 

SlimySnake

Flashless at the Golden Globes
Wow, seems we can't even poke fun a bit about all that PS5 I/O craze without a knight jumping in to delete posts/threads. Where's the thread that discussed DirectStorage for Forspoken that was made just a few hours ago?

We had mountains of magical expectations for that tech, and claims that nobody could match its dedicated I/O. What fun is it if we can't even keep the receipts?
The thread was deleted and I was told to keep the talk in this thread, so go ahead and post what you want to say.

And I posted exactly what Cerny said about the I/O block. He pointed out that the PS5 I/O block is equivalent to 9 Zen 2 cores when it comes to decompression.

The fact that a 16-core, 24-thread, 5.2GHz i9 is needed to match the PS5's loading times is proof that Cerny was right. I wish the test was a bit more scientific. He should've used a faster SSD and capped the 3600 (6 cores, 12 threads) at the PS5's 3.5GHz. We would've seen just how much better the PS5 I/O is at handling decompression with the CPU clock, the threads available to games, and the SSD all being equal. Though right now it's obvious that the game's loading is CPU bound, seeing as the 3600 takes almost 3 seconds more to load the same level on the same SSD.

This vindicates Cerny.
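The back-of-envelope behind that Cerny figure can be written out explicitly. A minimal sketch: the per-core Kraken decode rate below is a hypothetical placeholder, only the ~8-9GB/s typical compressed throughput is a publicly stated PS5 figure, and the result scales inversely with the assumed per-core rate (one reason different core counts get quoted in threads like this):

```python
# Rough arithmetic behind the "9 Zen 2 cores" decompression claim.
# PER_CORE_GBPS is an assumed placeholder, not a measured number.

TYPICAL_OUTPUT_GBPS = 9.0  # PS5 decompressed throughput at typical ratios (public figure)
PER_CORE_GBPS = 1.0        # hypothetical Kraken decode rate per Zen 2 core

cores_equivalent = TYPICAL_OUTPUT_GBPS / PER_CORE_GBPS
print(f"hardware decompressor ~ {cores_equivalent:.0f} Zen 2 cores of work")
```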
 

OsirisBlack

Banned
Don't tell me FFXVI uses the same engine? :messenger_pensive:
No, it runs on a heavily modified version of the FFXIV engine (Crystal Tools lineage). The team considered other engines, but this is the one they already knew, and they didn't want to take the time learning a new system, so they updated the one they were already using with more modern features. There's a whole demonstration on it.
 
As an Xbox user I can say "Thanks for beta testing." Man, that does feel good lol.
Luminous Engine isn't worth planting a flag down for your console of choice. There's nothing coming after this beta test except for a price drop.

It's just sad that this game didn't make the move to Unreal like KH.
 

DenchDeckard

Moderated wildly
I think people are missing the point with DStorage on PC being slightly faster than PS5. It's CPU bound (data decompression), not SSD bound (data throughput). An example being Metro Exodus, which is very CPU bound when loading a chapter and in turn is faster on PC than on console.
Mark Cerny stated that the custom I/O work put in would be the equivalent of an additional 11 Zen 2 cores doing the work on top of the 8 (possibly 7 max for game use) in the CPU. This game is clearly focused on decompressing data on the CPU. The 12900K is still very high-end, out of the vast majority of PC gamers' reach, and is magnitudes faster than the PS5 CPU on raw performance, excluding I/O stuff.

Combined with DStorage and this class of CPU ($400 to $600) with 16 very, very fast cores, it's only barely perceptibly faster than a PS5 with 8 much slower cores. I'm more impressed with the console at that price/performance, as this is the amount of brute force needed just to be barely faster when it should be a lot faster in reality (see above).
That custom I/O is doing something good for data decompression.

One more thing: I'm hearing that GPU decompression will make things even faster, and that is correct, but not in this game, as it simply isn't taking advantage of any large pools of data being moved that require super-fast storage access, on either the PC or PS5, as shown by the SSDs of choice being used.

It's important that people don't get ahead of themselves, as the PS5 is barely being used as is, mostly in data throughput. No engine is taking full advantage of complex compression algorithms on PS5, Xbox, or PC due to the long cross-gen period. Only a few games have tried to use it to any degree, like the new Ratchet game at launch.

While everything you say is true, and I agree, the raw facts are: if you have a decent CPU, which nowadays still only uses a small number of threads for games, you can use the remaining cores to offer swift performance that beats out the PS5 with a 3.5GB/s drive... it bodes well for the future of games. The PS5 SSD and I/O haven't really had a chance to flex in anything but loading times, in a select few games, over Xbox and PC, and I feel DirectStorage shows that none of us will have to suffer poorly performing games or games being held back by inferior I/O on Xbox or PC.

Developers are free to do what they want.

The Xbox can basically hang with the PS5, which has been proven, and now we know PC can beat them both quite easily, even with 3.5GB/s drives.

What about PCs with 7GB/s drives? And similar CPUs to the PS5's? PC never stops advancing; a 13400F probably smashes that 3600. Couple that with a 7GB/s Gen 4 NVMe drive and it's going to be further ahead.

PC was and is always the platform where, if you choose to invest, you blow console tech away. The PS5 I/O and SSD were touted as something no hardware could compete with... well, now we know that before the gen even properly started, they had actually been eclipsed. This is not some jab; it's the truth, and the way it has always been with PCs.

My 13900KS and 7,600MB/s NVMe should destroy the PS5, but we are talking like £800 of tech in just those two components.

Once GPU decompression is unlocked, we could be looking at near-instant loading on PC with the right hardware.

It's the only good news coming out of Forspoken, tbh. Plus we are only on DirectStorage 1.1, so things will improve.

I just hope we see devs utilising this insane I/O like Cerny mentioned in his chat, because no one really is.
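For the curious, the batching idea DirectStorage is built around can be mimicked in plain Python. This is only a sketch of the concept, not the real API: zlib stands in for GDeflate, a thread pool stands in for the runtime's request queues, and DirectStorage 1.1 would hand the decompression to the GPU instead. The asset names and sizes are made up:

```python
# Sketch of DirectStorage-style loading: submit many small
# read+decompress requests as one batch and let a pool service them,
# instead of issuing them one at a time on the game's main thread.
import zlib
from concurrent.futures import ThreadPoolExecutor

# Fake "on-disk" assets: a dict of compressed blobs (stand-in for a pak file).
assets = {f"chunk_{i}": zlib.compress(bytes([i % 256]) * 4096) for i in range(64)}

def service_request(name):
    """One request: 'read' the blob, then decompress it."""
    return name, zlib.decompress(assets[name])

def load_batch(names, workers=8):
    """Submit the whole batch at once; results arrive as workers finish."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(service_request, names))

loaded = load_batch(list(assets))
print(len(loaded), len(loaded["chunk_0"]))  # 64 4096
```

Threads work here even under Python's GIL because zlib releases it during decompression; the broader point is that the main thread only submits a batch and collects results, which is the shape of the win DirectStorage offers regardless of where the decode runs.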
 

Thebonehead

Banned
I said it before in another thread, but worth repeating again.

It comes to something when the game's biggest technological achievement comes from using a Microsoft storage library.
 

Stiltzkin88

Neo Member
While everything you say is true, and I agree, the raw facts are: if you have a decent CPU, which nowadays still only uses a small number of threads for games, you can use the remaining cores to offer swift performance that beats out the PS5 with a 3.5GB/s drive... it bodes well for the future of games. The PS5 SSD and I/O haven't really had a chance to flex in anything but loading times, in a select few games, over Xbox and PC, and I feel DirectStorage shows that none of us will have to suffer poorly performing games or games being held back by inferior I/O on Xbox or PC.

Developers are free to do what they want.

The Xbox can basically hang with the PS5, which has been proven, and now we know PC can beat them both quite easily, even with 3.5GB/s drives.

What about PCs with 7GB/s drives? And similar CPUs to the PS5's? PC never stops advancing; a 13400F probably smashes that 3600. Couple that with a 7GB/s Gen 4 NVMe drive and it's going to be further ahead.

PC was and is always the platform where, if you choose to invest, you blow console tech away. The PS5 I/O and SSD were touted as something no hardware could compete with... well, now we know that before the gen even properly started, they had actually been eclipsed. This is not some jab; it's the truth, and the way it has always been with PCs.

My 13900KS and 7,600MB/s NVMe should destroy the PS5, but we are talking like £800 of tech in just those two components.

Once GPU decompression is unlocked, we could be looking at near-instant loading on PC with the right hardware.

It's the only good news coming out of Forspoken, tbh. Plus we are only on DirectStorage 1.1, so things will improve.

I just hope we see devs utilising this insane I/O like Cerny mentioned in his chat, because no one really is.
The gen started over 2 years ago, and last I checked the PS5 was the standard to beat; that's why you had guys like Tim Sweeney, and any dev willing to talk, praising it to the high heavens. That's pretty long for the PC platform to catch up, and not the normal timescale I'm used to, nor the spec demands needed when it did catch up. The SSD is irrelevant to this discussion; 3.5GB/s doesn't beat 5.5-7GB/s if you believe Mark Cerny, due to traditional SSDs only having two command queues and the PS5 having six to improve data-moving efficiency, hence the speed requirements he asserted as a "base". As for the Xbox holding its own, why wouldn't it? Multiplatform games have to target the lowest common denominator when developing their code; any minor differences in favor of a particular platform are just that. That happens every gen, and it doesn't represent evidence of the Xbox keeping up with PS5 so much as it represents that DirectStorage is great for streamlining things going forward for multiplatform and PC, which is Microsoft's way, and that's great for making things easier (DX12 + DStorage).

Sony's approach with the PS5 is proprietary, aimed at pushing things massively ahead of the competition for their own developers, since they won't be limited in what they can access and push within the hardware. Sony, or more specifically Cerny, went above and beyond what devs asked for and delivered an extremely forward-thinking, innovative design. The simple fact is Microsoft dropped the ball when it came to their machine's focus, even more so than last gen, and cobbled together a last-minute speed boost to their SSD to save face because they got wind of what Sony was doing. Microsoft played it safe, unfortunately, so the baseline is lower for developers to work with.
Hardware will always trump a software approach to the same thing, and the PS5 has significantly more investment in the area where it matters most (max data throughput, zero bottlenecks that could throttle performance if demands are placed too heavily on it as the gen goes on). The two consoles are more unique and different from each other this time than last gen, in many ways that people greatly underestimate; they assume that because both are rocking Zen 2/RDNA 2 they must be identical, which couldn't be further from the truth.

What I'm interested in is the long-term of a console, considering its overall max power capability, vs the short-term of a PC where, as you say, components are constantly releasing every year during one console gen. Of course you can brute-force ahead at any time with the right money. So when I say long-term, I mean I fully expect that PS first-party games down the line, if ported to the PC platform, would need a good bit more than a 7GB/s SSD to run comfortably with no stuttering; that at least is a given just for that one part alone. As an example on the GPU side of things, you'd be mad to bet on the RTX 3080 10GB and 3070 8GB not suffering degrading performance vs consoles when texture requirements go through the roof down the line, even more so the 3070. That sounds crazy for the 3080, doesn't it, but it will happen; it's only a case of how much. This happens every gen without fail. DLSS will help a little, but not much, to stave off the impact on VRAM. That's what is exciting going forward, as I want to see what is needed long-term to match or exceed the consoles, and to find out what can't keep up long-term that was once regarded as more powerful. If we're talking third party, sure, you're good; that's to be expected with the Xbox 2.4GB/s base.

A point to remember is that a console roughly doubles its spec-sheet performance by the end of a gen. Late-gen Xbox 360/PS3 (2011) needed an 8800 GTX or 9800 GTX just to match them (only multiplats, not first party), when earlier you would have laughed at someone saying that, looking at the GPU specs alone on paper. In the PS4 gen, people were saying an i3 + 2GB 750 Ti combo was all you needed, and look how that turned out: late gen it was an i5 + 4GB GTX 960 or 1050 Ti. Hell, the PS4 was very close to the R9 290 by the end (same GCN) and had no problem running rings around the GTX 780 Ti late gen. There are reasons for these things happening; outdated architecture vs a better console API is one of the reasons (not the only one) it beats the 780 Ti when people were sure it would always be better. The point is, paper specs don't mean crap for a console long-term, and I constantly see people naively repeating the same mistakes and then wondering why their "better" hardware can't keep up down the line.
 

hyperbertha

Member
Bro, not trying to be rude here, but did you invest in the IP or something? You defend this game quite fiercely even when the criticism it has received is more than valid considering its terrible technical state.
He pre-ordered the game and is now in denial about its shittiness.
 

DenchDeckard

Moderated wildly
The gen started over 2 years ago, and last I checked the PS5 was the standard to beat; that's why you had guys like Tim Sweeney, and any dev willing to talk, praising it to the high heavens. That's pretty long for the PC platform to catch up, and not the normal timescale I'm used to, nor the spec demands needed when it did catch up. The SSD is irrelevant to this discussion; 3.5GB/s doesn't beat 5.5-7GB/s if you believe Mark Cerny, due to traditional SSDs only having two command queues and the PS5 having six to improve data-moving efficiency, hence the speed requirements he asserted as a "base". As for the Xbox holding its own, why wouldn't it? Multiplatform games have to target the lowest common denominator when developing their code; any minor differences in favor of a particular platform are just that. That happens every gen, and it doesn't represent evidence of the Xbox keeping up with PS5 so much as it represents that DirectStorage is great for streamlining things going forward for multiplatform and PC, which is Microsoft's way, and that's great for making things easier (DX12 + DStorage).

Sony's approach with the PS5 is proprietary, aimed at pushing things massively ahead of the competition for their own developers, since they won't be limited in what they can access and push within the hardware. Sony, or more specifically Cerny, went above and beyond what devs asked for and delivered an extremely forward-thinking, innovative design. The simple fact is Microsoft dropped the ball when it came to their machine's focus, even more so than last gen, and cobbled together a last-minute speed boost to their SSD to save face because they got wind of what Sony was doing. Microsoft played it safe, unfortunately, so the baseline is lower for developers to work with.
Hardware will always trump a software approach to the same thing, and the PS5 has significantly more investment in the area where it matters most (max data throughput, zero bottlenecks that could throttle performance if demands are placed too heavily on it as the gen goes on). The two consoles are more unique and different from each other this time than last gen, in many ways that people greatly underestimate; they assume that because both are rocking Zen 2/RDNA 2 they must be identical, which couldn't be further from the truth.

What I'm interested in is the long-term of a console, considering its overall max power capability, vs the short-term of a PC where, as you say, components are constantly releasing every year during one console gen. Of course you can brute-force ahead at any time with the right money. So when I say long-term, I mean I fully expect that PS first-party games down the line, if ported to the PC platform, would need a good bit more than a 7GB/s SSD to run comfortably with no stuttering; that at least is a given just for that one part alone. As an example on the GPU side of things, you'd be mad to bet on the RTX 3080 10GB and 3070 8GB not suffering degrading performance vs consoles when texture requirements go through the roof down the line, even more so the 3070. That sounds crazy for the 3080, doesn't it, but it will happen; it's only a case of how much. This happens every gen without fail. DLSS will help a little, but not much, to stave off the impact on VRAM. That's what is exciting going forward, as I want to see what is needed long-term to match or exceed the consoles, and to find out what can't keep up long-term that was once regarded as more powerful. If we're talking third party, sure, you're good; that's to be expected with the Xbox 2.4GB/s base.

A point to remember is that a console roughly doubles its spec-sheet performance by the end of a gen. Late-gen Xbox 360/PS3 (2011) needed an 8800 GTX or 9800 GTX just to match them (only multiplats, not first party), when earlier you would have laughed at someone saying that, looking at the GPU specs alone on paper. In the PS4 gen, people were saying an i3 + 2GB 750 Ti combo was all you needed, and look how that turned out: late gen it was an i5 + 4GB GTX 960 or 1050 Ti. Hell, the PS4 was very close to the R9 290 by the end (same GCN) and had no problem running rings around the GTX 780 Ti late gen. There are reasons for these things happening; outdated architecture vs a better console API is one of the reasons (not the only one) it beats the 780 Ti when people were sure it would always be better. The point is, paper specs don't mean crap for a console long-term, and I constantly see people naively repeating the same mistakes and then wondering why their "better" hardware can't keep up down the line.

Where did Microsoft speed up the SSD last minute?

You can't magic something like the Velocity Architecture out of your ass last minute. It's obvious the Series consoles were well thought out and designed to do exactly what was needed and asked of them. Developers asked for SSDs with 1GB/s speed, which is over 20 times the speed of a standard HDD, and Microsoft gave them 2.4GB/s. Sony went ham and gave them what, 5.5GB/s? Or 7? And decided that was the easiest component to talk about at length, as on paper it looks twice as fast as the Series X drive.

So, in essence, they did what any clever marketer would do and highlighted where their product outshines the competition. Won the crowd over with tech jargon like an incredible snake-oil salesman.

In reality we haven't seen anything that shows the actual real-world benefit of that magical I/O vs the competition, but I am more than happy to take suggestions. When Ratchet & Clank is proven to run on a drive with half the speed of the PS5's, you know that no developer is using the PS5 SSD to the degree Mark Cerny fed us.

We are two years in and still waiting... I have faith that we won't see it this gen. Not anything that couldn't run on PC or Xbox. But I'm willing to wait and see what comes, and will take the L if it appears.
 

sachos

Member
This is not terrible graphics, c'mon now. I am just trying to be fair here.
No offense taken. I just like the game, especially the movement in the open world and the graphics.
I can understand people not liking the characters or story, but the gameplay and graphics are quite solid imo
kqGYl61.jpg

TVKGnFe.jpg

3RR7s1A.jpg

nfWMQIL.jpg


And it's not even showing the fantastic HDR. This is how it looks with HDR... now clearly, this is not TERRIBLE GRAPHICS.
zjVRUhm.jpg

WCxaB0s.jpg
That's your setup? Damn bro, looking good!
 

01011001

Banned
I am more than happy to take suggestions. When ratchet and clank is proven to run on a drive with half the speed of the ps5 drive you know that no developer is using the ps5 ssd to what Mark Cerny fed us.

Not to mention that it still happens that some games just load faster on Series X, which you'd think would be impossible given the SSD specs alone.

 

Vick

Member
Not to mention that it still happens that some games just load faster on Series X, which you'd think would be impossible given the SSD specs alone.


Outdated video btw; along with graphics taking a generational leap, load times also saw a significant upgrade on PS5 after the patch.
Apparently, though, only in Performance Mode for some reason.

Edit:

What's to laugh about, aries_71?

MYQquIp.jpg


Not different enough for you junior?
 

Moriah20

Member
The 3070 isn't exactly a spring chicken, but it's still well above the average PC user's graphics card, and it's also better than the PS5's GPU. Yet I literally cannot play the game without the textures looking like washed-out shit, no matter how much I lower the settings.
Consider this: I can play RDR2 on near max and get over 60fps; I can play Spider-Man and achieve the same; Forza Horizon 5, etc. Much more impressive-looking open-world games. And 8GB of VRAM not being "enough" is a joke, as there are games with much better texture work that run just fine. This is just awful optimization.

Luminous is clearly not up to the task. They develop their own engine tech, yet it seems far behind the competition. The game is an absolute hog with very little to show for it.
And the game itself... yeah.
 

Stiltzkin88

Neo Member
Where did Microsoft speed up the SSD last minute?

You can't magic something like the Velocity Architecture out of your ass last minute. It's obvious the Series consoles were well thought out and designed to do exactly what was needed and asked of them. Developers asked for SSDs with 1GB/s speed, which is over 20 times the speed of a standard HDD, and Microsoft gave them 2.4GB/s. Sony went ham and gave them what, 5.5GB/s? Or 7? And decided that was the easiest component to talk about at length, as on paper it looks twice as fast as the Series X drive.

So, in essence, they did what any clever marketer would do and highlighted where their product outshines the competition. Won the crowd over with tech jargon like an incredible snake-oil salesman.

In reality we haven't seen anything that shows the actual real-world benefit of that magical I/O vs the competition, but I am more than happy to take suggestions. When Ratchet & Clank is proven to run on a drive with half the speed of the PS5's, you know that no developer is using the PS5 SSD to the degree Mark Cerny fed us.

We are two years in and still waiting... I have faith that we won't see it this gen. Not anything that couldn't run on PC or Xbox. But I'm willing to wait and see what comes, and will take the L if it appears.
Where did I say VA was last minute? I said a software solution will always be worse than hardware, which is correct. And as for marketing, they couldn't wait to talk about their amazing speed out of the gate and placed extra emphasis on the GPU terafloppys, when we all know those matter very little in the real world.
"The world's most powerful console" was replaced with "the world's most powerful Xbox", so spare me the nonsense on marketing. Cerny delivered hard facts for nearly an hour, even to the point of boring to death certain people who prefer special buzzwords to keep them engaged.

I never said the Xbox was poorly designed; I said it was great for streamlining access across not just Xbox but the PC platform. That was their goal, but in terms of investment and focus in that area, they made a compromise by investing more into their GPU and the like. It's quite easy to see this. And speaking of a software approach vs hardware: CPU cycles will be needed, and the more demanding the game, the more cycles will be needed. PC won't have this problem if you have an amazing CPU to brute-force it. The PS5 will have zero cycles being wasted at all, as a nice benefit of the hardware-only approach. That is one aspect of being more efficient.

Ratchet could easily run on an Xbox if it were ported properly; it runs at 60fps on a PS5, and it's a launch game. It runs on a PS4 engine, so Insomniac did the best they could in a tight launch window.
"Long-term", however, there would be big problems.

Magical SSD I/O? Please stop sounding like a fanboy. I've stated that Cerny's approach is, technically and in hardware design, better for a fixed platform on this topic. It has more theoretical potential overall, and additionally won't be constrained by a multiplat approach to game making, as Xbox is doing now with Xbox/PC first party merged. Their devs will have to take weaker-specced, multi-configured PC hardware into consideration, which will eat extra dev time to polish what they already have. These aren't me saying these things; this is the reality that Microsoft is committing to.
 

DenchDeckard

Moderated wildly
Where did I say VA was last minute? I said a software solution will always be worse than hardware, which is correct. And as for marketing, they couldn't wait to talk about their amazing speed out of the gate and placed extra emphasis on the GPU terafloppys, when we all know those matter very little in the real world.
"The world's most powerful console" was replaced with "the world's most powerful Xbox", so spare me the nonsense on marketing. Cerny delivered hard facts for nearly an hour, even to the point of boring to death certain people who prefer special buzzwords to keep them engaged.

I never said the Xbox was poorly designed; I said it was great for streamlining access across not just Xbox but the PC platform. That was their goal, but in terms of investment and focus in that area, they made a compromise by investing more into their GPU and the like. It's quite easy to see this. And speaking of a software approach vs hardware: CPU cycles will be needed, and the more demanding the game, the more cycles will be needed. PC won't have this problem if you have an amazing CPU to brute-force it. The PS5 will have zero cycles being wasted at all, as a nice benefit of the hardware-only approach. That is one aspect of being more efficient.

Ratchet could easily run on an Xbox if it were ported properly; it runs at 60fps on a PS5, and it's a launch game. It runs on a PS4 engine, so Insomniac did the best they could in a tight launch window.
"Long-term", however, there would be big problems.

Magical SSD I/O? Please stop sounding like a fanboy. I've stated that Cerny's approach is, technically and in hardware design, better for a fixed platform on this topic. It has more theoretical potential overall, and additionally won't be constrained by a multiplat approach to game making, as Xbox is doing now with Xbox/PC first party merged. Their devs will have to take weaker-specced, multi-configured PC hardware into consideration, which will eat extra dev time to polish what they already have. These aren't me saying these things; this is the reality that Microsoft is committing to.

Pretty cool right.jpg 😎
 

DaGwaphics

Member
This is not terrible graphics, c'mon now. I am just trying to be fair here.
No offense taken. I just like the game, especially the movement in the open world and the graphics.
I can understand people not liking the characters or story, but the gameplay and graphics are quite solid imo
kqGYl61.jpg

TVKGnFe.jpg

3RR7s1A.jpg

nfWMQIL.jpg


And it's not even showing the fantastic HDR. This is how it looks with HDR... now clearly, this is not TERRIBLE GRAPHICS.
zjVRUhm.jpg

WCxaB0s.jpg

Those aren't actually from this game, though, right? Every shot looks terrible, like someone just took an old game and made a high-resolution texture pack for it. That last one is especially horrifying.
 

rofif

Can’t Git Gud
For a PS2 game. ;)
I've posted many screenshots already. The game looks fantastic: big areas with volumetrics and great geometric detail. It really is stunning at points.
Stop following the sheep and trolling. Try giving the game a chance yourself.
 

rofif

Can’t Git Gud
Those aren't actually from this game, though, right? Every shot looks terrible, like someone just took an old game and made a high-resolution texture pack for it. That last one is especially horrifying.
These are from the game and look fantastic. Stop looking for bad things about it. There is nothing old-game about the presentation. I took pics of the TV because it shows the HDR a bit better. Maybe view the pics on a bigger screen and not a phone, idk. I think it's one of the better-looking open-world games.
 

DaGwaphics

Member
These are from the game and look fantastic. Stop looking for bad things about it. There is nothing old-game about the presentation. I took pics of the TV because it shows the HDR a bit better. Maybe view the pics on a bigger screen and not a phone, idk. I think it's one of the better-looking open-world games.

Beauty is in the eye of the beholder, I guess. There is literally nothing impressive about those images to me. The last one with the distant staircase is particularly bad; there is a flatness to it. Something with the lighting just does not hit right. The buildings look almost like cardboard cutouts, almost like faux 3D (like a super-high-res version of The Sims 1, LOL). Textures are sharp but somehow completely unrealistic-looking in all the wrong ways. It's not a looker for me from the shots you posted. 🤷‍♂️
 