
Do you SEE games as the developer intended?

Hostile_18

Banned
Does everyone have their displays on warm 2 with no post processing, correct colour space etc to get the image as intended by the developers?

I have a feeling even amongst hardcore gamers many use a cooler/default setting along with dynamic contrast etc.

Which do you prefer the image as: Reference (warm 2, no post-processing) or Preference (default/cooler temp + post-processing etc.)? And is this the same setting you use for both movies and games?

(Obviously for 100% accuracy you'd have to get your individual TV calibrated by an expert with specialised equipment but that's not practical for most people).
 
Last edited:

Hostile_18

Banned
You assume all development is made on calibrated equipment!

Saying that, I calibrated my TV as close to correct specs as possible.

Is that not the case for most game developers? Film and TV show content is made on calibrated equipment, isn't it?

I have my set as close to the standard as possible for both. I do find it more important in films/TV though, for accurate skin tones.
 
Last edited:

Hostile_18

Banned
My Toshiba HDTV is always set to “Vivid” mode and for me things look great.

That's wildly changing the image that the developer creates. Not wrong of course if that's your preference, but you're not seeing the scenes as they were meant to be seen.
 

DunDunDunpachi

Patient MembeR
How do you tell if your settings are "as the developer intended"?

Anyway, the answer is yes, I try to tweak my display to eliminate any artifacting, warping, color bleeding, lag, etc from the display itself. This is why I have several CRT TVs and PC CRT monitors. It's much easier to get old games looking good on those displays.
 

Hostile_18

Banned
How do you tell if your settings are "as the developer intended"?

Anyway, the answer is yes, I try to tweak my display to eliminate any artifacting, warping, color bleeding, lag, etc from the display itself. This is why I have several CRT TVs and PC CRT monitors. It's much easier to get old games looking good on those displays.

Basically getting as close to a 6500K white point as possible (usually warm 2 on most TVs), turning off all post-image processing such as dynamic contrast, sharpening etc., and using the correct colour gamut and a gamma of 2.2 :)
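
If it helps to see what "a gamma of 2.2" actually does to the picture, here's a rough Python sketch of the maths (purely illustrative, not taken from any particular TV or game):

def encode_gamma(linear, gamma=2.2):
    # Content is mastered with the inverse curve (1/2.2).
    return linear ** (1.0 / gamma)

def decode_gamma(signal, gamma=2.2):
    # The panel raises the incoming signal to the power 2.2 to produce light.
    return signal ** gamma

# A 50% signal comes out at roughly 21.8% of peak luminance at gamma 2.2,
# which is why a wrong gamma setting visibly lifts or crushes shadows.
print(decode_gamma(0.5))  # ~0.218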
 

Jigsaah

Gold Member
Does everyone have their displays on warm 2 with no post processing, correct colour space etc to get the image as intended by the developers?

I have a feeling even amongst hardcore gamers many use a cooler/default setting along with dynamic contrast etc.

Which do you prefer the image as: Reference (warm 2, no post-processing) or Preference (default/cooler temp + post-processing etc.)? And is this the same setting you use for both movies and games?

If there are any developers on here, what do you advise people set their TVs to with regards to games?

(Obviously for 100% accuracy you'd have to get your individual TV calibrated by an expert with specialised equipment but that's not practical for most people).

Fuck what they intended... I'll play it how it looks good to me.

Da hell?
 

Daymos

Member
You assume none of us play on a handheld with a standardized screen that the developer has in hand.
 
Last edited:

Hostile_18

Banned
Fuck what they intended... I'll play it how it looks good to me.

Da hell?

I know what you're saying, but does it not make sense to play their game how they envisioned it?

Do you want scenes professionally colour graded by the people who make the content or let your tv choose what a scene should look like?

Do you want blue snow or white snow in a scene, for example? If you use a cooler colour temp, a scene with a bit of blue in it will have it in excess, throwing off the image completely.
 

Hostile_18

Banned
You assume none of us play on a handheld with a standardized screen that the developer has in hand.

Well, true, but if you ever play in docked mode you'd need to make sure your TV is set up correctly to match the handheld's calibrated output (and most TVs out of the box are not set up correctly).
 
I honestly never really thought about it. It is dumb on my part, as it is an interesting question and makes sense. I have my TV calibrated to be as natural as possible, as I cannot stand to see "movie mode" or other strange modes that add and change colors to be unnatural.
 
Is that not the case for most game developers? Film and TV show content is made on calibrated equipment, isn't it?
We would have to get an industry-wide survey on this. I suspect many either don't have the resources for this or don't care that much, but I may be very wrong, so don't quote me on it... it's a feeling I got from working at different places; there is not always an emphasis on what seems important to the enthusiast, yet it works.

I'm no believer in "what they intended" at all costs. What if you have a smaller TV and some reflections on it during the day? Or what if you emulate the game and get a 720p 30fps title to run above 100fps (like some do with Zelda: Breath of the Wild)?

This is also often used in cinema to argue for selling theater tickets (I find I have a better experience at home compared to most theaters, my sound is well calibrated, and I prefer a good TV over a projector).
 

bati

Member
I shit on the developer's artistic vision. It's my way or the highway. I still remember when the moron in charge of Battlefield 3 visuals insisted the blue colour hue was a good way to enhance the game's visuals - I called that shit out on day one and was subsequently vindicated when it became a major movement against the art director's retarded approach to the whole thing. Felt even better when BF4 launched without this shit.

It also warms my heart when people openly call out chromatic aberration and other bullshit.

Tldr: a lot of developers in charge of visuals have no fucking idea what they're doing.
 
Last edited:

Tesseract

Banned
i generally do try to adhere to the dev's vision whenever possible, but sometimes it's hard in multiplayer games where lower settings provide significant advantages

i am however absolutely enjoying apex legends at ultra settings, and slaying
 

shark sandwich

tenuously links anime, pedophile and incels
I fiddle with the picture until human skin is a glowing orange color. Then turn on smooth motion so I get a flawless 120 FPS experience.
 

Hostile_18

Banned
We would have to get an industry-wide survey on this. I suspect many either don't have the resources for this or don't care that much, but I may be very wrong, so don't quote me on it... it's a feeling I got from working at different places; there is not always an emphasis on what seems important to the enthusiast, yet it works.

I'm no believer in "what they intended" at all costs. What if you have a smaller TV and some reflections on it during the day? Or what if you emulate the game and get a 720p 30fps title to run above 100fps (like some do with Zelda: Breath of the Wild)?

This is also often used in cinema to argue for selling theater tickets (I find I have a better experience at home compared to most theaters, my sound is well calibrated, and I prefer a good TV over a projector).

Well, for reflections, backlighting doesn't usually affect the image quality or presentation (it should always be adjusted to the ambient light in the room).

I'm strictly talking about colour accuracy rather than fps or resolution etc :).

I just can't imagine most AAA developers not working on calibrated screens, especially when cinematography is more important than ever. I know when I first got my OLED TV I was playing Uncharted 4 and it made a massive difference... was I on a cold, frosty island or a tropical island with the sun shining down, etc.
 

Hostile_18

Banned
I remember when I first played Grand Theft Auto 5 and there was a trophy for racing along a "sun kissed street" at the beginning, and I was like WTF, it looks cold as fuck. That's when I started learning that most TVs' default mode has far too much blue in it (it gives the appearance of whiter whites but in fact throws everything off). The "correct" Movie mode or warm 2 looked too yellow at first, but it's amazing how quickly you adjust. We as consumers are too used to seeing incorrect displays with blue whites.
 
Last edited:
I doubt many developers bother to calibrate their displays to the standard OP is talking about, which is for cinema content.

My home theater is calibrated for cinema because I use it to watch movies. I don't really care how that affects games because I know developers aren't looking at cinema-calibrated displays most of the time.
 
This thread just makes me want an OLED ; ;
OLED can be very bad for gaming due to games having fixed UI elements like HUDs that sit there on the screen for hours at a time.

 
Last edited:

Pejo

Member
OLED can be very bad for gaming due to games having fixed UI elements like HUDs that sit there on the screen for hours at a time.

Yea this is one of the things holding me back for now. Wonder if we'll ever have a display capable of true blacks without a burn-in problem. First plasma, now OLED.
 

Kamina

Golden Boy
Why is “warm 2” correct? It looks like a piss filter to me. Also what is “normal” then?
 

Hostile_18

Banned
This thread just makes me want an OLED ; ;

If you get a modern OLED it's incredibly unlikely to have burn-in.

I've put hundreds of hours into mine and the perfect blacks for gaming are amazing. (Full screen white isn't quite as good, but no TV is perfect).
 
Last edited:

Hostile_18

Banned
Why is “warm 2” correct? It looks like a piss filter to me. Also what is “normal” then?

Normal is what they use in the shops to attract you to the TV. Warm 2 is usually the closest to what the content is mastered to. Try it for a few days and then go back and you'll be amazed.
 

Hostile_18

Banned
Thought this was worth a bump now we're busier again, as it's an often neglected facet of gaming and so important to the experience.

It's amazing how even amongst us hardcore gamers so many of us have our TVs set up incorrectly (as were my sets for many years).
 

TUROK

Member
Nah. I don't like the look of the warm color temperature. I spend a lot of time tweaking settings, but always to whatever looks best to me.
 
Does everyone have their displays on warm 2 with no post processing, correct colour space etc to get the image as intended by the developers?

I have a feeling even amongst hardcore gamers many use a cooler/default setting along with dynamic contrast etc.

Which do you prefer the image as: Reference (warm 2, no post-processing) or Preference (default/cooler temp + post-processing etc.)? And is this the same setting you use for both movies and games?

(Obviously for 100% accuracy you'd have to get your individual TV calibrated by an expert with specialised equipment but that's not practical for most people).

I always set my TV on game mode, and normal or warm 1 color.
It makes the majority of games look very good.

And I also turn off every filter, option and such.
 
Last edited:

Hostile_18

Banned
I always set my TV on game mode, and normal or warm 1 color.
It makes the majority of games look very good.

Warm 1 is a great stepping stone to adjust to a more accurate colour temperature (for some sets it's actually closer to the correct setting than warm 2).

With games it's harder to see what's correct compared to a movie, where you have real life for reference. Snow being truly white and skin tones are where accurate settings really benefit all media, IMO.
 

rofif

Can’t Git Gud
I was interested in this when buying a new monitor. I've watched a ton of "dev logs" to see how games are created.
For example, in the Sekiro dev logs you can see that everyone in the office uses IPS monitors from Dell and some Eizo. They also had one TV for testing, but I am not sure what it was.
I myself just play on a 4K IPS monitor set somewhere around the default 6500K +-200, since I do not have calibration equipment. No warm presets or anything like that.
 
Last edited:
Warm 1 is a great stepping stone to adjust to a more accurate colour temperature (for some sets it's actually closer to the correct setting than warm 2).

With games it's harder to see what's correct compared to a movie, where you have real life for reference. Snow being truly white and skin tones are where accurate settings really benefit all media, IMO.

Yeah, warm looks very good.
I noticed that on Nintendo consoles, though, warm 1-2 make the games look too saturated, while on PS4 they look great.
 

Shifty

Member
For anyone else who's having trouble taking the OP's preachy "I'm doing it as the devs intended and you probably aren't" tone at face value, the actual specification in question here is Illuminant D65.

CIE said:
[D65] is intended to represent average daylight and has a correlated colour temperature of approximately 6500 K. CIE standard illuminant D65 should be used in all colorimetric calculations requiring representative daylight, unless there are specific reasons for using a different illuminant. Variations in the relative spectral power distribution of daylight are known to occur, particularly in the ultraviolet spectral region, as a function of season, time of day, and geographic location.

D65 corresponds roughly to the average midday light in Western Europe / Northern Europe (comprising both direct sunlight and the light diffused by a clear sky), hence it is also called a daylight illuminant.

TL;DR 6500K is the standard endorsed by the International Commission on Illumination (CIE) to which visual media is created because it represents an average 'daylight' look.
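
For anyone who wants to see where that white point actually sits in numbers, here's a small Python sketch using the published D65 chromaticity coordinates (the values are just the standard ones, not measured off any particular TV):

# CIE 1931 chromaticity of illuminant D65
x, y = 0.3127, 0.3290

# Convert to XYZ tristimulus values, normalising luminance to Y = 1
X = x / y                # ~0.9505
Y = 1.0
Z = (1.0 - x - y) / y    # ~1.0891

print(f"D65 white point (XYZ): {X:.4f} {Y:.4f} {Z:.4f}")
# Cooler presets shift the white point toward a larger Z (more blue),
# which is why factory defaults look blue next to a D65-calibrated screen.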

Whether that's the setting you personally should be setting your TV to is another matter, since the colour temperature of your TV room is going to factor in here. If your room lighting isn't at 6500K, the screen is going to look either too warm or too cool regardless of whether it is 'correct' in terms of illumination standards.

So if you truly want to obsess over it, get ready to buy some new lightbulbs. Otherwise, just match the white point of your TV room and it'll be 'good enough' without intentionally contradicting its lighting environment.
 
Last edited:

Hostile_18

Banned
For anyone else who's having trouble taking the OP's preachy "I'm doing it as the devs intended and you probably aren't" tone at face value, the actual specification in question here is Illuminant D65.





TL;DR 6500K is the standard endorsed by the International Commission on Illumination (CIE) to which visual media is created because it represents an average 'daylight' look.

Whether that's the setting you personally should be setting your TV to is another matter, since the colour temperature of your TV room is going to factor in here. If your room lighting isn't at 6500K, the screen is going to look either too warm or too cool regardless of whether it is 'correct' in terms of illumination standards.

So if you truly want to obsess over it, get ready to buy some new lightbulbs. Otherwise, just match the white point of your TV room and it'll be 'good enough' without intentionally contradicting its lighting environment.

Not sure how advocating accurate image quality comes across as preachy. It's explaining set standards that some users might not be aware of.

You don't have to worry about room daylight other than the luminance setting (higher in a bright room). If users get, say, 90% of the way there, that's still a HUGE improvement over the manufacturer defaults in most cases.
 

Virex

Banned
Does everyone have their displays on warm 2 with no post processing, correct colour space etc to get the image as intended by the developers?

I have a feeling even amongst hardcore gamers many use a cooler/default setting along with dynamic contrast etc.

Which do you prefer the image as: Reference (warm 2, no post-processing) or Preference (default/cooler temp + post-processing etc.)? And is this the same setting you use for both movies and games?

(Obviously for 100% accuracy you'd have to get your individual TV calibrated by an expert with specialised equipment but that's not practical for most people).
Warm 2 and no post-processing all the way, baby. I've had it so many times: when I calibrate people's TVs and monitors to perfection, after I'm done they are stunned by how much better it looks.
 

Shifty

Member
Not sure how advocating accurate image quality comes across as preachy. It's explaining set standards that some users might not be aware of.
It comes off as preachy because you went straight in with "hope you're doing it right like i am guys" without stopping to establish what "right" is in the context past "use warm 2 trust me it's what the devs intend".

Given that I had to go and research Illuminant D65 myself in order to actually understand the reasoning, I don't think you did a great job of explaining it as a standard or making other users aware of it.

I don’t need a professional color grading setup to get the best image.

Sometimes you have to believe in yourself.
Research indicates that TV calibrations performed as part of a Rocky-style training montage yield superior results to ones made in a common-or-garden setting.
 
Last edited:

Hostile_18

Banned
It comes off as preachy because you went straight in with "hope you're doing it right like i am guys" without stopping to establish what "right" is in the context past "use warm 2 trust me it's what the devs intend".

Given that I had to go and research Illuminant D65 myself in order to actually understand the reasoning, I don't think you did a great job of explaining it as a standard or making other users aware of it.


Research indicates that TV calibrations performed as part of a Rocky-style training montage yield superior results to ones made in a common-or-garden setting.

I did say no post-processing as well. To clarify, by that I mean no dynamic contrast, no "live" colour, no black frame insertion, no sharpening etc., as well as warm 2 or whatever setting is closest to a 6500K white point.
 
Last edited:

Whitecrow

Banned
One thing you can do is look up your TV on rtings.com and see its recommended settings after calibration.

RGB gain may vary from set to set, but things like color mode, contrast, brightness, color space and color temperature are accurate.
 
Last edited:

Hostile_18

Banned
Warm 1 is definitely one of the better options. On some TVs I've owned warm 2 was indeed too warm.

It's amazing how your eyes adapt, though. You can change it to a warmer colour at first and think it's too yellow, and then after a week it becomes "normal" and, going back, everything is super blue on the presets you used before.
 

cireza

Banned
It's amazing how your eyes adapt, though. You can change it to a warmer colour at first and think it's too yellow, and then after a week it becomes "normal" and, going back, everything is super blue on the presets you used before.
Using warm is good for resting your eyes. That's the case on a computer screen, if you use it for work or to browse etc...

However, I believe that when a developer uses white as a color, he intends for it to be white and for us to see white. So I am never going to accept warm screens or options when playing video games. It feels totally wrong to me, and I hate it. I also hate how, depending on the console, you might get a perfectly white screen or a yellowish one with Nintendo products.

I use a warm filter all day for work. Not for my consoles.
 
Last edited:

Tesseract

Banned
in all seriousness just use whatever looks good to you with various source images (calibrators, games, movies, tv)

but for competitive gaming, i'd have to be a fool to use warm colors when identifying whites matters (especially outdoors where atmosphere is rendered / simulated)
 
Last edited: