
The TV industry decides what's good for gaming.

Skyzard

Banned
I'm not against 4K; for me it's a toss-up between framerate and resolution.

I already want it for the way you can see into the distance with games. I remember being blown away downsampling Far Cry 4 from 5K on a 4K screen and just how much further and clearer things in the distance were. It was like my monitor was a window I was looking through, and my eyesight got better.

The games I often enjoy benefit from a higher framerate slightly more ... or at least as much.

If TV companies pushing 4K puts more pressure and builds demand for graphical power leaps, awesome :) I'll make the choice how I want to spend it.

Hopefully devs keep giving the option on consoles too.

Looking forward to when 4K reaches 60/90/120 etc framerates!
 

Inotti

Member
I get that pushing new tech sells better than perfecting an old standard. But CRT still beats anything on the market right now in terms of black levels, response times and motion. I really wish companies (that includes game companies) would perfect 1080p as a standard and then move on to whatever the next thing is, but that's expecting too much.

Ever heard of OLED TVs? Black level is pretty much perfect.
I can agree with input lag though.
 

Cleve

Member
One of the highest rated and most praised games of all time just came out bucking the 1080p trend, let alone the 4K trend, and it was still targeting (almost) 30 fps. 60 fps just isn't as important to the mainstream as it is to enthusiasts. I play at 144+ fps any time I can on PC, but I fully realize I'm a very small market. There's no cabal by the TV industry; it's just a matter of what the majority prefers.


It's embarrassing that CRTs still have better motion clarity and less input lag. TVs today are only meant for the lowest common denominator who only care about buzzwords.

CRTs were definitely the better enthusiast tech for quite a while, but people want bigger screens more than they want motion clarity and low input lag. CRTs also start to struggle with more issues as they get larger. I'm not saying we couldn't have improved the technology to deal with the tremendous lag introduced by having to sweep the entire physical surface of a 65", 1000 lb tube, or with the EM interference created by the flyback, but comparing modern panels one to one against maybe 35" TVs or 21" monitors (about as big as high-end PC CRTs got) isn't really fair.
 

Madness

Member
The TV industry tried to make 3D a thing and that didn't work.

It sure as shit got the console makers to at least initially support it. But 4K is not even close to 3D. 4K is the natural future due to its massive increase in resolution and visual fidelity. 3D worked well in the few games that utilized it well, but slow tech development and a theatre industry hellbent on gouging customers for shoddy post-production 3D movies made widespread adoption tough. Remember how costly the glasses were initially? Glasses-free technology also didn't come in time.

Even if 4K TVs become mainstream, that doesn't mean devs can or will hit that mark. We barely even have 1080p/60 gaming as a base right now. Also, they have almost always chosen 30 or sub-30 fps to get greater visual fidelity than 60 fps allows. 4K TV adoption will skyrocket this year and the next.
 

patapuf

Member
I don't think this has happened at all. The new consoles were designed to sell to people with new 4K TVs. Console makers, game designers, Netflix, GPU vendors: all steered by what the TV industry decides it's going to push.

Then why do AMD and Nvidia push their G-Sync/FreeSync technologies so heavily, along with other technologies, if only TV matters? I'm also not aware that PC monitors have high latency. At least gaming monitors do not.

Resolutions like 1080p were also a thing on PC long before they were mainstream on TVs.
 

True Fire

Member
To be fair, 4K is absolutely necessary for hiding aliasing. Anti-aliasing only goes so far.

I imagine gaming will go 12K/IMAX as soon as it's feasible.
 
It's embarrassing that CRTs still have better motion clarity and less input lag. TVs today are only meant for the lowest common denominator who only care about buzzwords.

Good luck getting a non-projection set bigger than 40" without cutting a hole in your roof and using a crane to deliver it. OLED or quantum-dot sets using direct emission will solve the motion clarity issues relatively soon; plasma sets were great with motion clarity but had other drawbacks. Input lag on TVs could be a non-issue today if manufacturers cared enough, and it would still likely be an issue if manufacturers were making CRTs, since a lot of lag comes from needless image processing and cheaping out on TCONs.

I fail to see what is embarrassing. CRTs had too many engineering issues to get much bigger than 40" while staying affordable and practical. Even with their oversights, TV manufacturers have been tremendously successful in making large-scale displays affordable and practical. On top of this, modern displays are superior when it comes to geometric distortion, sharpness, and efficiency. When you look into the work and innovation that has been required to make this a reality, it's pretty amazing; it turns out that engineering and mass-producing affordable large-scale displays is far harder than shitposting.
 

nkarafo

Member
I fail to see what is embarrassing.
Maybe the fact that after a decade or more, CRTs still have things going for them over modern displays that are actually very important for gaming. You shouldn't have to think twice about throwing a CRT TV or monitor away, but these days I still keep my CRT PC monitor because browsing the Internet on anything else looks atrocious (unless I disable the smooth scrolling option in the browser), and I don't want to. Having smooth scrolling on a CRT is a treat for the eyes, and you can still read even the tiniest text perfectly while it scrolls because there is no motion blur.
 
I love my 4K TV and the games I play in 4K; been on this shit since October and I ain't looking back. That 4K makes my PS4 Pro go, and I really don't care about a 4K Blu-ray player yet because the content isn't there yet. I can wait on that until it becomes abundantly clear that this shit will take off in the movie industry. DirecTV offers a couple of 4K channels with shit programming, mainly music and some tourism garbage.
 
Whoa, businesses try to sell novelty and new tech and ride that momentum for sales? Holy shit, did not know!

It's like you just revealed the bare heart of the economy. Write a book on this, please.
 
Maybe the fact that after a decade or more, CRTs still have things going for them over modern displays that are actually very important for gaming. You shouldn't have to think twice about throwing a CRT TV or monitor away, but these days I still keep my CRT PC monitor because browsing the Internet on anything else looks atrocious (unless I disable the smooth scrolling option in the browser), and I don't want to. Having smooth scrolling on a CRT is a treat for the eyes, and you can still read even the tiniest text perfectly while it scrolls because there is no motion blur.

Making non-CRT displays has been, and still is, a non-trivial problem, and massive strides have been made over the last decade and a half. We will likely start to see flat-panel displays that meet and exceed the remaining metrics where CRTs are superior by the close of this decade.
 

Auto_aim1

MeisaMcCaffrey
Obviously, consoles using new advancements in TV technology is great. Gotta keep up with the new shiny stuff. Sony being a TV maker can influence its hardware design as well. I don't really see what the problem is. It's not like you're forced to buy a 4K HDR TV.
 
4K is a natural progression for screens and it's not really a stretch that people will want higher resolution as time goes on. It's not like it's some weird gimmick that TV manufacturers are trying to push on us. HDR is a bit more of a stretch, but it's still a fairly natural evolution of display technology. Same for high frame rate.
 
The thing is that for current GPUs the time for 4K isn't really here yet; to hit native 4K you need lower image quality than most games' budgets would actually allow, so we get this upscaled stuff, which doesn't look as good as a "3K" TV would.

In that way I agree that the TV industry dictates the gaming industry, because they don't make 3K TVs; for anything but gaming, 3K TVs don't make sense.
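The arithmetic behind this trade-off is easy to sketch. The snippet below compares per-frame pixel counts; the "3K" resolution of 2880x1620 is a hypothetical intermediate target chosen for illustration, and shading cost only scales roughly (not exactly) with pixel count:

```python
# Per-frame pixel counts relative to 1080p; shading cost scales
# roughly (not exactly) with the number of pixels rendered.
# "3K" (2880x1620) is a hypothetical target used for illustration.
resolutions = {
    "1080p": (1920, 1080),
    "3K": (2880, 1620),
    "4K": (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")
```

Native 4K pushes 4x the pixels of 1080p, while a "3K" target would only cost about 2.25x, which is the gap the post is pointing at.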



That HDR thing is somewhat mutually beneficial. For games it's easy to do HDR; for TVs it's still not really easy. HDR demand and display tech (the push for OLED and quantum dot) are mostly driven by games at the moment.
 

Stop It

Perfectly able to grasp the inherent value of the fishing game.
Whatever the TV industry decides it needs to sell TVs is what decides the direction of gaming. If more pixels are the easiest way to get a bunch of people who don't want to replace their TV to replace their TV, then you can be sure gaming is going to trend towards those increased pixel counts. No one in gaming has made the conscious decision that 4K is good for gaming; this is all the TV industry and what makes sense for movie watchers.


  • TV industry decide 4K is the easiest thing to market to consumers to push TV sales
  • Consumers buy 4K sets
  • Consumers demand 4K from content providers/console makers to justify their purchases
  • Console makers toe the line with boxes capable of 4K at only 30fps (weak CPUs)
  • Console games scaled back to hit those high pixel counts
  • PC gamers with high end systems have nothing left to put increasing GPU power into besides 4K

Combine this with the obsession over how games look while stationary/in screenshots, plus the atrocious motion clarity of modern displays, and it doesn't look good. Huge static resolution, awful motion resolution. What's next, 8K at 30fps?

Oh well, at least there's still VR. Can't ruin motion there without making people vomit on themselves.
Wat.

I paid £500 for a Panasonic 4K TV. It is glorious. You can't seriously expect the TV industry to stay static because it would affect gaming?

The last bit I do agree with, though; motion resolution is not as good as it should be. Still, plasma aside, a decent 4K TV has better motion resolution than a 1080p one. Interpolation just introduces too much processing lag.
 
🤔

4K is good for gaming tho. It is objectively better than 1080p gaming.

Weird thread

Weird thread, indeed.

I'm planning on buying my first 4K TV this year, it will be the first TV I've owned in almost a decade (I've been using computer monitors)... And I'm frankly kind of excited about the upres.

30fps, checkerboard rendering and raw upscaling from lower resolutions are nothing new for the industry. By the time I was finally upgrading to a 1080p display, my Xbox 360/PS3 were very, very seldom achieving native 1080p. Hell, I didn't even own a gaming PC that could hit native 1080p in a lot of titles until 2012 or so. Even as someone who feels like 1080p was only recently achieved, I never felt that the range of compromises made to get games performing at an acceptable level strongly detracted from my enjoyment, even if I have always felt that a locked 60+ fps is the ideal way to play.

So yeah, the notion that the TV industry is somehow determining what's best for the game industry seems a bit like projected resistance to 4K. I get that buying a new TV or monitor is expensive, even with 4K prices having come down, but I see no reason why it should be frowned upon, or turned into some negatively-bent conspiracy about how video games suffer from it when they're really only gaining from it.
 
4K is the next progression. We're not there yet. Right now we're kind of in a beta phase but we'll get there. Everyone is eventually gonna have 4K when the prices come down possibly in the next 5 years.
 
I'm right there with you in that I would rather games pursue better visuals and a steady 60 fps at 'lower' resolutions than keep pushing more and more pointless pixels for merely a slightly sharper image.

But I read recently that, with the way GPUs are structured, sometimes it's just 'easier' to keep pushing resolution than to do new advanced graphics techniques or get that framerate to be extra buttery.

But it's all down to the individual game. 1080p/60fps might seem like a good baseline we've had since before the PS3, but how many developers have actually used it this last decade? They will usually lock to 30fps and choose to improve fidelity instead. And that's fine; a solid 30fps is perfectly acceptable in a slower-paced game.

Some don't. Overwatch, recently. People might enjoy the 'feel' of the game and its smoothness more, perhaps without knowing the technical 30/60 distinction. If 60fps affects popularity enough, more devs will make moves to do it, even if it's tougher. Free market forces and all.

So asking for 120Hz support on consoles seems a bit bananas when we aren't really even getting 60fps most of the time today, HDMI 2.1 to enable it isn't even out yet, and Scorpio won't even have it. We will have to wait a while.

When it finally arrives and is supported, guess what, hardly any games are going to use it! It's going to be like 1080/60 on the PS3 all over again - hardly ever used, but good headroom for many years into the future. We will only get the odd specialist high paced game going there.

But even more than that, I think going 120fps over 60fps is a tough sell to 95%+ of people. More so than 1080p was, more than 60fps still is. The game to demo the difference would need to have F-Zero GX levels of speed, even then it will be a very subtle almost subconscious difference.

For now it's the territory of uncommon and expensive PC monitors, used mainly in a fraction of fast paced titles that are low enough fidelity to support 120hz at any kind of resolution.

VR is a much, much more suitable application for mainstream 120hz any time soon.
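The per-frame time budget makes it clear why each step up the framerate ladder is a harder sell than the last. A quick sketch (plain arithmetic, nothing engine-specific):

```python
# Per-frame time budget at each target framerate: doubling the
# framerate halves the time available to simulate and render a frame.
for fps in (30, 60, 120):
    budget_ms = 1000 / fps
    print(f"{fps} fps -> {budget_ms:.2f} ms per frame")
```

Going from 30 to 60 fps cuts the budget from ~33 ms to ~17 ms, and 120 fps leaves barely 8 ms per frame, which is why it mostly suits low-fidelity, fast-paced titles.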

  • Console makers toe the line with boxes capable of 4K at only 30fps (weak CPUs)


  • When some 60fps console games exist, I'd say it's more a resolution/fidelity decision concerning the GPU than the CPU in most cases.

    • PC gamers with high end systems have nothing left to put increasing GPU power into besides 4K

    I don't get this either. It's not a world where most PCs are breezing through 'console ports' at 120Hz on Ultra and then reluctantly putting horsepower into resolution because they can't do anything else.

    If that were the case, some of the best-looking games on PC wouldn't be console ports; they would be PC exclusives. That's almost never the case.
 

Lord Error

Insane For Sony
Agree with the OP pretty much completely, and yeah, it's crappy, but what can you do. The game industry doesn't really have as much clout as the movie industry to pull manufacturers in the desired direction.

🤔

4K is good for gaming tho. It is objectively better than 1080p gaming.

Weird thread
It's a gigantic drain on processing power for a comparatively pretty small benefit. Games would benefit far more from TV technology that supports variable refresh rates (the G-Sync kind of thing) and very low input lag, both of which would require chump change in investment compared to 4K panels. Motion clarity is still mostly a travesty on today's TVs compared to CRT, which is again something games would benefit from more than high resolution. Also, if the TV industry had stuck with 1080p longer, we'd see a far bigger push towards 60fps games, rather than just increased resolution.

The only thing I'm looking forward to that should hopefully pan out thanks to very high-res TV displays is the eventual ability to support auto-stereoscopy in full 1080p (half of the vertical 2160/4K res).
 

Lord Error

Insane For Sony
So the OP's post feels like an 'I want 60fps' thread?
No, it's about TV tech. At the very least, manufacturers should have jumped on the variable refresh train long ago, something that would precisely ease the burden on game developers targeting higher framerates.
 

HStallion

Now what's the next step in your master plan?
The only thing they've done wrong is not make 1080p sets with HDR and wide color gamut.

Sony is releasing a 1080p set with HDR this year. It's still up in the air how good a set it will actually be.
 
Next gen with 4K is gonna be fun... And not in a good way, because if devs are still struggling with 1080p at 30fps, we might barely get a fidelity boost at 4K next gen.

Of course, devs could potentially just do sub-4K, because 1080p is pretty much good enough and they care more about bells and whistles than resolution and framerate.

Honestly, while 4K is nice and all, it's not necessary yet for non-VR purposes and shouldn't be standard for gaming for another 2 gens or so.
 
It's embarrassing that CRTs still have better motion clarity and less input lag. TVs today are only meant for the lowest common denominator who only care about buzzwords.

yup

I will rock Sony XBR960s till there aren't any left. Even at 4K, digital displays are shit for gaming.
 
Of course, devs could potentially just do sub-4K, because 1080p is pretty much good enough and they care more about bells and whistles than resolution and framerate.

Honestly, while 4K is nice and all, it's not necessary yet for non-VR purposes and shouldn't be standard for gaming for another 2 gens or so.

Just fucking LOL at the bolded

I'ma guess, 1080p TV owner? :>
 
Just fucking LOL at the bolded

I'ma guess, 1080p TV owner? :>

Two 1080p monitors for PC and console gaming, actually (they're both AOC G2460VQ6 FreeSync monitors; they're shockingly good quality for $200 AUD, probably even cheaper in the US, and I would absolutely recommend them for budget 1080p multi-monitor setups). I find that's more than satisfactory most of the time. Don't get me wrong, 4K and HDR are neat, but the former is too expensive and likely won't have the mass adoption needed to justify standard usage anytime soon, IMO. And HDR isn't worth getting new monitors for anytime soon, either; I'd rather wait until 4K monitors get a lot cheaper.
 