Graphics Technology Discussion: All games on consoles and PCs.

Lol. No. Take a look at the pc performance thread. Pretty low end hardware is mopping the floor with the ps4 performance. Trying to equate a ps4 to a high end pc is...delusional.

Looking at the Steam hardware stats, NeoGAF has a pretty fucked up definition of what low, mid, and high end PCs are.
 
This doesn't seem right. Clouds in Driveclub use Simulview Trueksy, which is a very advanced sky system that handles clouds very realistically, in a way that can affect gameplay. For example, the clouds hide the sun for a certain time and move, casting dynamic shadows; when they suddenly expose the sun, you get blinded or dazzled by its beams and can't drive very well. It can alter stealth game gameplay too: http://simul.co/

You can see Driveclub on their site, and now they support UE4 too.

It's Simul trueSKY. I couldn't find their site without going to the YouTube page you linked. That page links to their alpha state plug-in for Unreal Engine 4.
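To make the cloud-occlusion idea concrete, here is a minimal sketch of how a volumetric sky system could drive a gameplay-affecting glare effect. This is not trueSKY's actual API; sampleCloudDensity() and every constant here are assumptions for illustration.

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };
static Vec3 add(Vec3 a, Vec3 b)    { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 scale(Vec3 v, float s) { return {v.x * s, v.y * s, v.z * s}; }

// March samples from the eye toward the sun through the cloud layer and turn
// accumulated density into a visibility factor via Beer-Lambert transmittance.
// sampleCloudDensity is a hypothetical stand-in for whatever the sky system exposes.
float sunVisibility(Vec3 eyePos, Vec3 sunDir, float (*sampleCloudDensity)(Vec3)) {
    const int   kSteps = 16;      // assumed sample count
    const float kStep  = 200.0f;  // assumed meters per step
    float opticalDepth = 0.0f;
    for (int i = 1; i <= kSteps; ++i) {
        Vec3 p = add(eyePos, scale(sunDir, kStep * static_cast<float>(i)));
        opticalDepth += sampleCloudDensity(p) * kStep;
    }
    return std::exp(-opticalDepth);  // 1 = clear sky, ~0 = fully overcast
}

// Gameplay hook: when clouds part, visibility jumps toward 1 and the glare
// term spikes -- the "suddenly dazzled while driving" effect described above.
float glareStrength(float visibility, float lookDotSun) {
    float facing = std::max(0.0f, lookDotSun);  // only when facing the sun
    return visibility * facing * facing;        // assumed squared falloff
}
```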
 
Looking at the Steam hardware stats, NeoGAF has a pretty fucked up definition of what low, mid, and high end PCs are.

If I remember correctly, Steam's stats listed the most common graphics card among Steam users as the 9800 GT (my old card). I don't think the majority of Steam users have cards decent enough to play new games.
 
I can't believe you made a thread about this to continue the flamebait you started in the Uncharted 4 thread! You are really stubborn!

Dude, don't come in here trolling me. It'll get you banned. This thread is not intended to "flame" any game. It's to talk about graphics technology in all games and compare them to one another (something I failed to achieve in the UC4 thread).

TLOU uses the Geomerics Enlighten system with real-time radiosity enabled.

I don't see real-time radiosity in the game. You have any links or videos?

Please don't pretend you know shit about video game tech. It will be a great waste of time to try to discuss anything with you 'cause the word doesn't even exist in your vocabulary! Cheers!

You don't know me or my background.. so to make such blanket statements would be considered trolling. Either constructively talk about graphics tech in this thread or go elsewhere.
 
It is simply disappointing that Microsoft and Sony both aimed at very low-end CPU performance and very modest GPUs with X1/PS4. I do understand why though. They simply could not justify near-bleeding edge hardware as they did when 360 and PS3 launched in '05 and '06. PS4 and Xbox One had to make a profit from the start.

Agreed. This is one of the reasons why I believe we are already hitting a wall with some of the more recent games coming out on next-gen. I feel that 30fps is going to be the norm.
 

RoboPlato

I'd be in the dick
Agreed. This is one of the reasons why I believe we are already hitting a wall with some of the more recent games coming out on next-gen. I feel that 30fps is going to be the norm.

I have to ask again since this is a more appropriate place for the discussion, but I'm really confused about your stance on framerate. You're disappointed when a game is 30fps because of the lower framerate, but also disappointed when a game is 60fps and not pushing as much tech. I don't get how it would be possible to please you outside of a high end PC setting.
 
That is a vague statement considering what hardware is in the particular PC. Saying it's super smooth on PC is wrong; saying it's super smooth on a high end PC is right, and most don't own a high end PC. As a PC gamer myself with high end hardware, yes, it is smooth, but to be frank the PS4 version holds up very close to a high end PC.

Not really..

[image: performance benchmarks]


Most GPUs can run this game very efficiently.

Also, PS4 and X1 are missing some AI features, and they're limited to 30fps with some dips.. I would hardly call that "holds up very close to a high end PC".
 
I don't see real-time radiosity in the game. You have any links or videos?

http://www.gamezone.com/news/naughty-dog-the-last-of-us-squeezes-every-last-drop-of-power-from-ps3

"For us, it's nice to really have mastered the PS3 at this point and be able to push things where we haven't been able to push before," he said, noting the game's use of real-time radiosity on the flashlight. That is, when the flashlight hits a surface, the light bounces around the environment. So if you shine it at an orange wall, it will bounce orange light around the environment. "

You don't know me or my background.. so to make such blanket statements would be considered trolling. Either constructively talk about graphics tech in this thread or go elsewhere.

"It's only because we learned the PS3 so well that we can finally push graphics that are this complex,"

Something you completely denied in the Uncharted 4 thread by insisting that no learning cycle can squeeze more out of the console and that new features can't happen.

You're also spreading false statements about a company, built on nothing but your impressions and eyesight:
I don't see real-time radiosity in the game.

without any solid proof, just to beef up your own point of view. Who is trolling now? You want to force people to agree with you that ND's tech is useless and that the devs aren't skilled enough to achieve anything technically spectacular (based on your own eyes and personal dislikes and feelings), while it isn't the truth.
 

Kezen

Banned
Version 4 will exclusively target the next console generation, Microsoft's successor for the Xbox 360, Sony's successor for the Playstation 3 - and if Nintendo ships a machine with similar hardware specs, then that also. PCs will follow after that.
In retrospect this is hilarious considering how far ahead high-end PC hardware was even back in 2013.

It seems Epic expected much, much more.
 

Nzyme32

Member
You don't know me or my background.. so to make such blanket statements would be considered trolling. Either constructively talk about graphics tech in this thread or go elsewhere.

Excellent. I like the push to keep this thread on topic instead of allowing it to start devolving into the childish nonsense of other threads that get derailed.

On topic. I love Nvidia's FleX interactions, and I hope to see similar things incorporated into other engines soon enough. I recall how amazing it was to interact with a highly physically malleable world in HL2's Source engine way back when, but there is still so much that could be done as far as materials and substances go. Some of the graphics technologies pushing for that, together with the increasing power of the hardware, are really starting to reach a critical point.
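For anyone curious what unified-particle systems like FleX boil down to at their core, here is a toy position-based-dynamics step. It is a simplified illustration of the general approach, not FleX's API; the ground-plane clamp stands in for FleX's much richer constraint set (fluids, cloth, rigids).

```cpp
#include <vector>

// Toy position-based-dynamics step: Verlet-style particles plus constraint
// projection, the general shape of unified particle solvers.
struct Particle { float x, y, z; float px, py, pz; }; // current + previous position

void pbdStep(std::vector<Particle>& particles, float dt, float gravityY) {
    for (Particle& p : particles) {
        // Infer velocity from the previous position (Verlet integration).
        float vx = p.x - p.px, vy = p.y - p.py, vz = p.z - p.pz;
        p.px = p.x; p.py = p.y; p.pz = p.z;
        p.x += vx;
        p.y += vy + gravityY * dt * dt; // gravityY is negative, e.g. -9.8
        p.z += vz;
        // Constraint projection: just a ground plane here; real solvers
        // iterate over many constraint types per substep.
        if (p.y < 0.0f) p.y = 0.0f;
    }
}
```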
 
http://www.gamezone.com/news/naughty-dog-the-last-of-us-squeezes-every-last-drop-of-power-from-ps3

"For us, it's nice to really have mastered the PS3 at this point and be able to push things where we haven't been able to push before," he said, noting the game's use of real-time radiosity on the flashlight. That is, when the flashlight hits a surface, the light bounces around the environment. So if you shine it at an orange wall, it will bounce orange light around the environment. "

I will look for this when I play the game today.

And I never accused them of NOT being able to do it. I said I don't see it. That's not being judgmental.

"It's only because we learned the PS3 so well that we can finally push graphics that are this complex,"

PS3 hardware isn't the same as PS4 hardware. It won't take nearly as long to learn the hardware on PS4.

Something you completely denied in the Uncharted 4 thread by insisting that no learning cycle can squeeze more out of the console and that new features can't happen.

I never said that. I said I won't assume future-gen improvements based on past-gen data. This generation is different from last gen, therefore things won't evolve the same way.

You're also spreading false statements about a company, built on nothing but your impressions and eyesight:

I never made a "false" statement about ND, dude! WTF are you talking about?

You want to force people to agree with you that ND's tech is useless (based on your own eyes and personal dislikes and feelings), while it isn't.

LOL! I never said ND was useless... Why am I defending myself in my own thread? If you want to speak to me, PM me. Keep this nonsense out of this thread.
 

Ziffles

Member
The earlier iterations of the Unreal Engine had an interesting dithering feature when running in software mode, which was a sort of crude method to hide the blocky pixelation of stretched-out textures without resorting to the filtering that GPUs provided.

Before/after example: [images missing]
Apparently it was a fairly simple algorithm and not very taxing on the CPU. Makes me wonder if perhaps some PS1 games could have utilized it.
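I don't know Unreal's exact algorithm, but classic ordered (Bayer) dithering of the texture coordinates fits the description and is about this cheap. A hedged sketch, with the texture layout and wrap behavior assumed:

```cpp
#include <cstdint>

// Ordered (Bayer) dithering of texture coordinates: jitter which texel gets
// sampled per screen pixel, trading blockiness for a fine noise pattern.
// u and v are texel-space coordinates, assumed non-negative.
static const int kBayer4x4[4][4] = {
    { 0,  8,  2, 10},
    {12,  4, 14,  6},
    { 3, 11,  1,  9},
    {15,  7, 13,  5},
};

uint32_t sampleDithered(const uint32_t* texels, int texW, int texH,
                        float u, float v, int screenX, int screenY) {
    // One table lookup per pixel -- no filtering math, hence cheap on a CPU.
    float jitter = kBayer4x4[screenY & 3][screenX & 3] / 16.0f; // 0 .. 15/16
    int tu = static_cast<int>(u + jitter) % texW;
    int tv = static_cast<int>(v + jitter) % texH;
    return texels[tv * texW + tu];
}
```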

Doubtful. The PS1 couldn't even handle perspective correction. Also those images are 1920x1080. Most PS1 games ran at 320x240, so even having that kind of filtering at that resolution would probably look just as bad as having none at all.
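For reference, the perspective correction the PS1 lacked comes down to interpolating u/w, v/w, and 1/w instead of u and v directly, plus a per-pixel divide. A minimal sketch of both, with t as the interpolation factor along a span:

```cpp
// Affine vs. perspective-correct texture mapping across a span, where wA/wB
// are the endpoints' depths. The PS1 interpolated u and v directly (affine),
// producing the familiar texture warping on large polygons.
struct UV { float u, v; };

UV affineUV(UV a, UV b, float t) {
    return { a.u + (b.u - a.u) * t,
             a.v + (b.v - a.v) * t };   // warps under perspective
}

UV perspectiveUV(UV a, UV b, float wA, float wB, float t) {
    float invW   = (1.0f / wA) + (1.0f / wB - 1.0f / wA) * t;
    float uOverW = (a.u / wA) + (b.u / wB - a.u / wA) * t;
    float vOverW = (a.v / wA) + (b.v / wB - a.v / wA) * t;
    return { uOverW / invW, vOverW / invW }; // the per-pixel divide the PS1 lacked
}
```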
 
I have to ask again since this is a more appropriate place for the discussion, but I'm really confused about your stance on framerate. You're disappointed when a game is 30fps because of the lower framerate, but also disappointed when a game is 60fps and not pushing as much tech. I don't get how it would be possible to please you outside of a high end PC setting.

Not disappointed at all.. I have a high end PC and am quite satisfied with both new tech *and* running at 60fps.

My stance is that most games on the next-gen consoles won't be able to achieve that -- as already demonstrated by the plethora of games (with advanced features) released this year. That may change in the future, but for now, it is what it is.
 
Crytek's translucency-map based skin shading. It actually captures back-lighting translucency, unlike purely screen-space based techniques (basically every game ever).

The ears: [image]
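As a rough illustration of thickness-map translucency (in the spirit of Crytek's approach, though the falloff and constants here are assumptions, not CryENGINE code): a precomputed per-texel thickness lets thin features like ears glow when lit from behind.

```cpp
#include <algorithm>
#include <cmath>

// Thickness-map translucency sketch: a precomputed per-texel "local thickness"
// drives how much back lighting leaks through thin geometry.
float backTransmittance(float thickness,       // sampled from the translucency map, 0..1
                        float viewDotNegLight, // dot(viewDir, -lightDir): light behind surface
                        float absorption) {    // material absorption coefficient, assumed
    float behind  = std::max(0.0f, viewDotNegLight);   // only counts back lighting
    float falloff = std::exp(-thickness * absorption); // thin spots transmit more
    return behind * falloff;                           // thin + backlit = the ear glow
}
```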
 

RoboPlato

I'd be in the dick
Not disappointed at all.. I have a high end PC and am quite satisfied with both new tech *and* running at 60fps.

My stance is that most games on the next-gen consoles won't be able to achieve that -- as already demonstrated by the plethora of games (with advanced features) released this year. That may change in the future, but for now, it is what it is.

While I do agree that most games on consoles will be 30fps, and I don't see that changing any time soon, I do think we are starting to trend up a bit in terms of framerate. BF, Halo, and Uncharted are all major series going from 30fps to 60fps this gen, which I think may influence the number of linear games that aim for 60fps. Open-world games will likely still be 30 outside of MGSV. I just hope that 30fps games are truly a consistent 30 and not the mess they became later on last gen.
 
While I do agree that most games on consoles will be 30fps, and I don't see that changing any time soon, I do think we are starting to trend up a bit in terms of framerate. BF, Halo, and Uncharted are all major series going from 30fps to 60fps this gen, which I think may influence the number of linear games that aim for 60fps. Open-world games will likely still be 30 outside of MGSV. I just hope that 30fps games are truly a consistent 30 and not the mess they became later on last gen.

Yes, a locked 30fps with no dips and excellent tech would be satisfying!
 

UnrealEck

Member
Crytek's translucency-map based skin shading. It actually captures back-lighting translucency, unlike purely screen-space based techniques (basically every game ever).

This is where Ryse really improves on what they had with Crysis 3: the character models.
But then there are things they left out, like water caustics. Why they didn't just enable them, I have no idea.
 
In retrospect this is hilarious considering how far ahead high-end PC hardware was even back in 2013.

It seems Epic expected much, much more.

Epic, unrealistically, was pushing for more. The initial Elemental demo was running on an i7 CPU and a GTX 680. A $500+ GPU and $300+ CPU were never in the cards.

People argue over semantics like the wholesale price of components, but don't account for the cost of labor and shipping.
 

Teremap

Banned
Looking at the Steam hardware stats, NeoGAF has a pretty fucked up definition of what low, mid, and high end PCs are.
Let's not carry on this conversation; it's off-topic.
Sorry, but I have to respond to this because it's used as a talking point so often:

1. Steam has tens of millions of users on its service. Even if only 5% of users have a GTX 670 or higher video card (it's actually more than that, but this is just an example), that is an absolutely massive number.
2. The hardware survey is opt-in, which can skew the numbers (probably doesn't because the sample size is so large, but you never know).
3. Many people install Steam on machines that are not gaming-capable, which again skews the numbers.

Low, mid, and high-end GAMING PCs are what we're actually concerned about when we speak on this forum. Any PC that is not gaming-capable is completely irrelevant for any conversation regarding gaming hardware, and is usually abused in arguments to sell the idea that gaming-capable PCs are an extreme rarity next to gaming consoles. Obviously, this is not just false, but an outright lie, and I would really appreciate it if more posters would refrain from doing so.

PS3 hardware isn't the same as PS4 hardware. It won't take nearly as long to learn the hardware on PS4.

I never said that. I said I won't assume future-gen improvements based on past-gen data. This generation is different from last gen, therefore things won't evolve the same way.
Thank you for this.

I'm really tired of seeing people expecting miracles to be squeezed out of these limited machines. It's illogical, wishful thinking. Very tiresome to see again and again.
 
I would like to ask a couple of vsync questions and this seems a good place to do so.

In theory would a 120 refresh display have half the additional latency of a 60 refresh display in regards to vsync? Kind of like so?

1 * 60rf vsync w/o TB = 120rf vsync w TB
1/2 * 60rf vsync w TB = 120rf vsync w TB
1/4 * 60rf vsync w TB = 240rf vsync w TB
1/2 * 60rf vsync w/o TB = 240rf vsync w TB

So in theory a 240 refresh display with Vsync and TB would have half the latency of a 60 refresh display running Vsync without TB?

If this is true is there a realistic target refresh rate where any latency caused by Vsync and TB would be imperceptible to the user?

EDIT - I am referring to DirectX TB mainly. IIRC OpenGL TB does not add additional latency.
 

vpance

Member
I'm really tired of seeing people expecting miracles to be squeezed out of these limited machines. It's illogical, wishful thinking. Very tiresome to see again and again.

TBH, I'm seeing far more posts of general disdain for console hardware than these miracle posts that you're referring to.

Current gen consoles will be squeezed just as much as past gens. If anything, I think most devs decided to revert to traditional PC-style coding practices afforded by the increased amount of RAM and the x86 architecture, when ideally they should be maintaining the practice of job systems a la SPURS on PS3, across both CPU and GPU. First-party devs like ND and GG will show this in practice.
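For reference, the "job system" style being described reduces to small self-contained work items fed to worker threads rather than one big serial loop. A toy sketch; SPURS-style schedulers are far more sophisticated:

```cpp
#include <condition_variable>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

// Minimal job system: worker threads pull small work items off a shared
// queue. Real engine schedulers add dependencies, priorities, and stealing.
class JobSystem {
public:
    explicit JobSystem(unsigned workers) {
        for (unsigned i = 0; i < workers; ++i)
            threads_.emplace_back([this] { workerLoop(); });
    }
    ~JobSystem() {
        { std::lock_guard<std::mutex> lock(m_); done_ = true; }
        cv_.notify_all();
        for (auto& t : threads_) t.join();
    }
    void submit(std::function<void()> job) {
        { std::lock_guard<std::mutex> lock(m_); jobs_.push(std::move(job)); }
        cv_.notify_one();
    }
private:
    void workerLoop() {
        for (;;) {
            std::function<void()> job;
            {
                std::unique_lock<std::mutex> lock(m_);
                cv_.wait(lock, [this] { return done_ || !jobs_.empty(); });
                if (done_ && jobs_.empty()) return;
                job = std::move(jobs_.front());
                jobs_.pop();
            }
            job(); // run animation, culling, particle batches, etc.
        }
    }
    std::vector<std::thread> threads_;
    std::queue<std::function<void()>> jobs_;
    std::mutex m_;
    std::condition_variable cv_;
    bool done_ = false;
};
```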
 

KKRT00

Member
TBH, I'm seeing far more posts of general disdain for console hardware than these miracle posts that you're referring to.

Current gen consoles will be squeezed just as much as past gens. If anything, I think most devs decided to revert to traditional PC-style coding practices afforded by the increased amount of RAM and the x86 architecture, when ideally they should be maintaining the practice of job systems a la SPURS on PS3, across both CPU and GPU. First-party devs like ND and GG will show this in practice.

But most engines are already job based.
 

AmyS

Member
Agreed. This is one of the reasons why I believe we are already hitting a wall with some of the more recent games coming out on next-gen. I feel that 30fps is going to be the norm.

I also don't believe these consoles are going to last until 2020-2021 before their successors arrive, which would be the case if this gen goes as long as 360/PS3 did.

I think 6 years would be the sweet spot - that would mean another 5 solid years for Xbox One and PS4 before another cycle starts. Not too short, but not overly long.
 

Fafalada

Fafracer forever
Teremap said:
I'm really tired of seeing people expecting miracles to be squeezed out of these limited machines.
There are no miracles, but squeezing a closed box is how optimization works. The notion that it's down to some mythical "architecture complexity" had more to do with marketing and wishful thinking than with anything in the real world.
 
There are no miracles, but squeezing a closed box is how optimization works. The notion that it's down to some mythical "architecture complexity" had more to do with marketing and wishful thinking than with anything in the real world.

The exotic architecture of the PS3 is what made it "hard to get performance out of." It wasn't advertising; SPUs were not exactly easy to take advantage of in meaningful ways that didn't require tons of work.
 

Fafalada

Fafracer forever
Dictator93 said:
The exotic architecture of the PS3 is what made it "hard to get performance out of."
Getting peak performance is hard work on any hardware; finding improved context for it, even more so.
"Exoticness" of architecture comes into play on the scale of fighting with project deadlines, not ~decade cycles of natural tech progression.
 

Arulan

Member
I would like to ask a couple of vsync questions and this seems a good place to do so.

In theory would a 120 refresh display have half the additional latency of a 60 refresh display in regards to vsync? Kind of like so?

1 * 60rf vsync w/o TB = 120rf vsync w TB
1/2 * 60rf vsync w TB = 120rf vsync w TB
1/4 * 60rf vsync w TB = 240rf vsync w TB
1/2 * 60rf vsync w/o TB = 240rf vsync w TB

So in theory a 240 refresh display with Vsync and TB would have half the latency of a 60 refresh display running Vsync without TB?

If this is true is there a realistic target refresh rate where any latency caused by Vsync and TB would be imperceptible to the user?

EDIT - I am referring to DirectX TB mainly. IIRC OpenGL TB does not add additional latency.

Yes, with regard to the input latency associated with vsync, buffering frames, etc., you'll notice a significant decrease. However, it's not that easy to equate input latency values. For example, because 120Hz or 240Hz is much harder to reach in terms of rendering power, it's much more difficult to get to the point where you'd be stalling your GPU, which is what adds input latency when you're over your refresh rate while using vsync. This makes TB vsync more appealing on 120Hz displays, for instance, in terms of input latency. That said, you could roughly state that a 120Hz display will cut the latency associated with vsync by half.

I don't really think it's feasible, however, to look at 240Hz+ displays to solve our vsync input latency issues when we already have G-Sync.
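To put rough numbers on the "half the latency" claim, counting worst-case whole-frame waits (a back-of-the-envelope model, not a measurement of any real driver):

```cpp
#include <cstdio>

// Worst-case whole-frame waits as a crude model of vsync latency. Real
// pipelines vary; this only illustrates why doubling the refresh rate
// roughly halves vsync-related latency.
int main() {
    const double refreshRatesHz[] = {60.0, 120.0, 240.0};
    for (double hz : refreshRatesHz) {
        double frameMs = 1000.0 / hz;
        // ~2 frames: render one frame, wait for the next flip (double buffer).
        // ~3 frames: triple buffering can queue roughly one extra frame.
        std::printf("%5.0f Hz: frame %5.2f ms | ~2 frames %5.2f ms | ~3 frames %5.2f ms\n",
                    hz, frameMs, 2.0 * frameMs, 3.0 * frameMs);
    }
    return 0;
}
```

That works out to roughly 33ms of worst-case vsync wait at 60Hz versus roughly 17ms at 120Hz, which lines up with the rough halving described above.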
 

Peltz

Member
Sorry, but I have to respond to this because it's used as a talking point so often:

1. Steam has tens of millions of users on its service. Even if only 5% of users have a GTX 670 or higher video card (it's actually more than that, but this is just an example), that is an absolutely massive number.
2. The hardware survey is opt-in, which can skew the numbers (probably doesn't because the sample size is so large, but you never know).
3. Many people install Steam on machines that are not gaming-capable, which again skews the numbers.

Low, mid, and high-end GAMING PCs are what we're actually concerned about when we speak on this forum. Any PC that is not gaming-capable is completely irrelevant for any conversation regarding gaming hardware, and is usually abused in arguments to sell the idea that gaming-capable PCs are an extreme rarity next to gaming consoles. Obviously, this is not just false, but an outright lie, and I would really appreciate it if more posters would refrain from doing so.


Thank you for this.

I'm really tired of seeing people expecting miracles to be squeezed out of these limited machines. It's illogical, wishful thinking. Very tiresome to see again and again.

I think we've already seen many miracles on them. But I'm easy to please and I don't have a gaming PC.
 
Yes, with regard to the input latency associated with vsync, buffering frames, etc., you'll notice a significant decrease. However, it's not that easy to equate input latency values. For example, because 120Hz or 240Hz is much harder to reach in terms of rendering power, it's much more difficult to get to the point where you'd be stalling your GPU, which is what adds input latency when you're over your refresh rate while using vsync. This makes TB vsync more appealing on 120Hz displays, for instance, in terms of input latency. That said, you could roughly state that a 120Hz display will cut the latency associated with vsync by half.

I don't really think it's feasible, however, to look at 240Hz+ displays to solve our vsync input latency issues when we already have G-Sync.

Thank you for the reply.
 
Sorry, but I have to respond to this because it's used as a talking point so often:

1. Steam has tens of millions of users on its service. Even if only 5% of users have a GTX 670 or higher video card (it's actually more than that, but this is just an example), that is an absolutely massive number.
2. The hardware survey is opt-in, which can skew the numbers (probably doesn't because the sample size is so large, but you never know).
3. Many people install Steam on machines that are not gaming-capable, which again skews the numbers.

Low, mid, and high-end GAMING PCs are what we're actually concerned about when we speak on this forum. Any PC that is not gaming-capable is completely irrelevant for any conversation regarding gaming hardware, and is usually abused in arguments to sell the idea that gaming-capable PCs are an extreme rarity next to gaming consoles. Obviously, this is not just false, but an outright lie, and I would really appreciate it if more posters would refrain from doing so.

Sure, that's a way of simply dismissing anything that doesn't match your idea of a "GAMING PC".
 
Just played more Alien: Isolation today.. this is the only game out currently that has real-time dynamic indirect lighting. You can see it with the flashlight (when you turn it on). It takes about a second, but you'll notice it casts secondary bounced light beyond the direct path of the flashlight. Simply amazing to look at.

In contrast, I looked for this in TLOU: Remastered (since the devs said it's a feature) and it simply isn't there.
 
Just played more Alien: Isolation today.. this is the only game out currently that has real-time dynamic indirect lighting. You can see it with the flashlight (when you turn it on). It takes about a second, but you'll notice it casts secondary bounced light beyond the direct path of the flashlight. Simply amazing to look at.

In contrast, I looked for this in TLOU: Remastered (since the devs said it's a feature) and it simply isn't there.

I remember this effect in the PS3 version of the game; it was only noticeable to me in the subway part, where it's really dark.
 