
Q4 2017: Monitors are a solved problem (Computex announcements)

Leonidas

Member
It's not solved for me till there are 32" 4K HDR OLED monitors. Hopefully these become a thing within the next 2 years...
 

PFD

Member
I don't really expect any of these to actually be available in Q4 at this point, but after waiting a decade-plus for the perfect gaming monitor, what's a few more months?

I've never been more ready to drop a couple Ks on a product. I'd buy one of these so fast
 
I still can't understand why these companies announce products when they have zero idea when they're going to ship... even worse, announce products which may come out a year later.

I'm guessing that they get advance info on panels that will be available in x months and announce products based on those panels.
 

dr_rus

Member
Philips to Showcase 43" HDR Display, 8K monitor and 49" Ultrawide

...

436M6VBPAB – part of the upcoming Philips Momentum line and to be seen in the entertainment zone – a 4K HDR screen with the unique Ambiglow feature, this is an all-round multitasking and entertainment solution, allowing USB-C docking and simultaneous notebook charging. This screen can act as the ultimate docking station, allowing you to easily use it with anything – from the latest 4K gaming consoles to any high end graphics processing designer computer.
http://www.tftcentral.co.uk/news_archive/38.htm#philips_ifa2017

First 40"+ monitor with HDR. Unfortunately, this one seems to be 60Hz with no x-sync of any sort. But it's possible that another version of the same panel will be able to reach 144Hz in time.
 

ookami

Member
Adaptive-Sync is a VESA standard.

Freesync is the name AMD gives to its compliance with the standard, so monitors that are compliant with the standard are typically called Freesync monitors. There's no proprietary hardware (with regard to GPU vendors) in the monitors, but they are referred to as "Freesync."

Freesync is the de facto name for compliance with the Adaptive-Sync standard simply because NV refuses to comply.

So, when someone refers to a Freesync monitor, they are talking about monitors that are compliant with the VESA standard. You may not agree with this, but that's the reality of the situation.

G-sync is not compliant with the VESA standard. That's why it needs expensive proprietary hardware on desktop PC monitors. Funnily enough, NV supports the VESA standard on laptops... They just refuse to implement the standard on the driver level with their desktop GPUs, making them only work with their proprietary modules and not the VESA standard modules with regard to adaptive refresh rates.

The reason Freesync monitors are cheaper than G-Sync ones is because "Freesync" monitors are just monitors with industry-standard modules that NV could allow the use of if they wanted to...

I'm sure you knew all this, though...
Nope, knew some of it, but thanks for the info. For me it was just two kinds of implementation, with AMD using VESA Adaptive-Sync as a base and extending it. Good to know; it also made me discover that HDMI now handles this standard too.
As for NVIDIA's module in monitors, they won't ever drop it, since it's part of their solution's design and handles functionality it couldn't without.
Getting back to "standard": what I mean is that VESA Adaptive-Sync is a standard, no doubt about it. But AMD's FreeSync is not, IMO, since it requires an AMD driver+card just as NVIDIA's G-Sync needs an NVIDIA card+driver. Well, that's how I see things.
 

gypsygib

Member
Since I'm not swimming in cash, I'll just settle for a 27" 2K HDR 144Hz G-SYNC monitor. While it'll still cost a lot of money, at least it won't be super exorbitant. Can do without ultra-wide or curved.

Same, the ideal for me would be 30 inches, so I'd have a large screen without having to use more GPU power for ultrawide.
 

coolasj19

Why are you reading my tag instead of the title of my post?
It seems like the Acer Predator X35 is confirmed for next year.
Source.
Glad I didn't wait to grab my first Ultra-wide then. I settled into a semi-permanent spot and I wanted to nest in it. Maybe this PG348Q will hold its value really well.
 

ookami

Member
Glad I didn't wait to grab my first Ultra-wide then. I settled into a semi-permanent spot and I wanted to nest in it. Maybe this PG348Q will hold its value really well.
Yeah, the only thing that stopped me from going for this one is that I wanted to get a curved 35" HDR monitor with G-Sync. I'll just wait and put money aside to better digest the cost.
 
I'm fine if they actually take the time and have good QC on the new ultrawides. My 144Hz/1440p GSync is going strong, no rush to upgrade
 

Chinbo37

Member
This sounds incredible. Let me know when they get down to sub-$500. The size is way too big for me tho; 27 inch is fine for me.

I'll roll along with my 1080p Asus 27-inch 144Hz non-sync for the next couple of years. It looks incredible to me and I don't play enough to justify a new one anytime soon.
 

Paragon

Member
Glad I didn't wait to grab my first Ultra-wide then. I settled into a semi-permanent spot and I wanted to nest in it. Maybe this PG348Q will hold its value really well.
I'm really happy with my PG348Q too.
I was excited for HDR, but there are so few games which support it across all platforms, and there are weird situations where some games support HDR on one platform but not another, or different HDR formats depending on the system.
The list of games with HDR support on PC right now is pretty disappointing.
It will get better, but for a feature that was said to require minimal effort from the developers to implement, I'm surprised that the rate of adoption has been so low.

I've no interest in going from an IPS panel back to a VA panel despite the higher contrast ratio, or from a 34" UW to a 27" 16:9 panel.
My TV has a 5000:1 native VA panel and local dimming, and I haven't used it for gaming since getting the PG348Q.
Contrast is nice to have, but for me, it's not as important as good viewing angles (image stability) and fast response times.

I think what would get me to upgrade is if there's an IPS panel with the same 512-zone backlight, and it's used to combine G-Sync with ULMB and the option to single-strobe at any refresh rate. That 85Hz lower-limit for ULMB needs to go.
Playing a game like Sonic Mania sucks on an LCD or OLED display without backlight scanning/BFI. There's so much motion blur since it's capped at 60 FPS.
 

dr_rus

Member
It will get better, but for a feature that was said to require minimal effort from the developers to implement, I'm surprised that the rate of adoption has been so low.
It's relatively easy to implement, but there's like no HDR hardware (monitors) on the PC market right now, so devs just don't bother - mostly because it's hard to implement something which you can't even test yourself due to the lack of h/w.

I've no interest in going from an IPS panel back to a VA panel despite the higher contrast ratio, or from a 34" UW to a 27" 16:9 panel.
My TV has a 5000:1 native VA panel and local dimming, and I haven't used it for gaming since getting the PG348Q.
Contrast is nice to have, but for me, it's not as important as good viewing angles (image stability) and fast response times.
Yeah, IPS is generally better for gaming than xVA because it's generally a lot faster at switching in a black-white-black pattern, which results in significantly less noticeable blurring of high-contrast edges. It will be interesting to see what these 35" UW 200Hz xVAs will be capable of though, as it makes little sense to rate the panel at 200Hz if it will still have lots of edge blurring.
 

Paragon

Member
It's relatively easy to implement, but there's like no HDR hardware (monitors) on the PC market right now, so devs just don't bother - mostly because it's hard to implement something which you can't even test yourself due to the lack of h/w.
Adoption seems very slow even for console games, it's not just a problem on PC.
There's nothing stopping you from connecting a PC to an HDR TV though. That's how most devs are implementing HDR right now anyway, not with HDR monitors.
It's frustrating because it seems very unlikely that devs will go back and patch in support for HDR to existing games like Deus Ex: Mankind Divided, Gears of War 4, Forza Horizon 3, and others, once HDR does become more widespread.

Yeah, IPS is generally better for gaming than xVA because it's generally a lot faster at switching in a black-white-black pattern, which results in significantly less noticeable blurring of high-contrast edges. It will be interesting to see what these 35" UW 200Hz xVAs will be capable of though, as it makes little sense to rate the panel at 200Hz if it will still have lots of edge blurring.
These are not the first 200Hz VA panels - Acer's Z35 uses one too, it's just 2560x1080 rather than 3440x1440 due to DisplayPort 1.2 bandwidth limitations. Motion handling is... not great.
VA panels have really bad response times for red, and near black, so you get a lot of smearing of the image in dark content.
That and the gamma shifting / loss of contrast with viewing angle really negates a lot of the contrast advantage that they claim to have. A VA panel may have ~3000:1 contrast in the center of the image, but <1000:1 contrast at the edges, while IPS keeps its ~1200:1 across the whole screen.
Of course local dimming will make a huge difference, but I just don't like the lack of image stability that VA panels have. The edges of the display look different from the center. Shift your head even a little and the image changes.

For many people, contrast is the most important thing in a display though, and it is especially important with HDR.
But as an owner of a 34" UW IPS G-Sync monitor (and a very high contrast VA TV), motion handling is the main thing which would have me thinking about upgrading, not HDR support or higher contrast - unless it was natively higher contrast like those IPS panels from Panasonic which achieve 1,000,000:1 contrast without using local dimming.
 

dr_rus

Member
Adoption seems very slow even for console games, it's not just a problem on PC.
There's nothing stopping you from connecting a PC to an HDR TV though. That's how most devs are implementing HDR right now anyway, not with HDR monitors.
It's frustrating because it seems very unlikely that devs will go back and patch in support for HDR to existing games like Deus Ex: Mankind Divided, Gears of War 4, Forza Horizon 3, and others, once HDR does become more widespread.

Devs usually don't develop PC games on TVs. Low adoption on consoles may actually be related to the absence of HDR monitors too, as even console games aren't exactly developed on TVs either - although in their case they are at least tested on TVs, while that's rarely the case for PC versions.
 

Paragon

Member
Devs usually don't develop PC games on TVs. Low adoption on consoles may actually be related to the absence of HDR monitors too, as even console games aren't exactly developed on TVs either - although in their case they are at least tested on TVs, while that's rarely the case for PC versions.
I know devs at a few studios where the people working on that sort of thing all have HDR TVs at their desks, and even some of the other artists/developers with dual monitor setups (often 27") are being given the option of a 40" 4K HDR TV to replace their monitors now.
Whatever setup Eidos Montréal used to develop HDR for the PS4 version of Deus Ex, I'm sure it could also have been used to implement HDR in the PC version of the game without having to wait for G-Sync HDR monitors to be released.
We know that Nixxes developed HDR support for Rise of the Tomb Raider, but it hasn't been released for some reason. Perhaps that's being held back for the release of these monitors.
 

dr_rus

Member
Whatever setup Eidos Montréal used to develop HDR for the PS4 version of Deus Ex, I'm sure it could also have been used to implement HDR in the PC version of the game without having to wait for G-Sync HDR monitors to be released.

The catch is that the PC version of DXMD was developed by Nixxes and not Eidos Montreal.
 

Smokey

Member
I know devs at a few studios where the people working on that sort of thing all have HDR TVs at their desks, and even some of the other artists/developers with dual monitor setups (often 27") are being given the option of a 40" 4K HDR TV to replace their monitors now.
Whatever setup Eidos Montréal used to develop HDR for the PS4 version of Deus Ex, I'm sure it could also have been used to implement HDR in the PC version of the game without having to wait for G-Sync HDR monitors to be released.
We know that Nixxes developed HDR support for Rise of the Tomb Raider, but it hasn't been released for some reason. Perhaps that's being held back for the release of these monitors.

What are these 40" HDR TVs, and are they shit?
 

Paragon

Member
The catch is that the PC version of DXMD was developed by Nixxes and not Eidos Montreal.
True, but as I linked to above, Nixxes worked on HDR support for Rise of the Tomb Raider - even though that hasn't been released. So it's not like they're unfamiliar with, or ill-equipped to implement, HDR on the PC.

What are these 40" HDR TVs, and are they shit?
I don't know the specifics, only that they were mainly Sony TVs. They might have been 43" rather than 40".
I think the people actually implementing HDR support in the engine had higher-end displays for reference, but again, I don't know the specifics.
The point is that at least some studios are starting to give their artists HDR displays, and everything I've heard about so far from people working in the industry has been ~40" HDR TVs hooked up to a PC, typically replacing their existing monitors - not waiting for HDR monitors to be released. Cost is probably a big factor in using TVs rather than monitors too.
So they won't be the most high-end HDR TVs being used, but that's the TV market for you. High-end displays have gradually been moving to larger and larger sizes, with most starting around 55" now.
Whatever the reason is for that, it's not a technical limitation because Dolby's own reference monitor is 42" with 1500 LED zones - more than any TV. Sony's reference OLED monitor is 30" in size. Even these G-Sync HDR monitors with 384/512 zones have more zones than most big TVs, and at much smaller sizes.

It's just frustrating that with HDR adoption being so low as it is, there are several games which only support HDR on specific platforms, rather than all versions of the game supporting it. There's no way I'd be playing the console version of Deus Ex for its HDR support, since it only runs at 1080p30 on consoles.
Maybe things will be better next year, but the rate of HDR adoption in games has been so low, without even supporting all platforms the game is being released on, that it's not really something which would affect my decision to buy a display now.
Better motion handling would improve all games, and be a far more meaningful difference to me.
With a 1000 nit backlight and 384/512 zones (which I would guess is 24x16 / 32x16) these monitors could have amazing motion handling performance in SDR if ULMB was implemented right, but with all the current restrictions that ULMB has, I'm not convinced that it will be.
 

mhayze

Member
Like most, I feel like perfect gaming monitors are always just over the horizon, and never available (especially at an affordable price).

I currently have a ROG swift 27" WQHD G-sync 144hz monitor. It's nice. But it's still got issues - not OLED, only 27", not 4K, not HDR, etc. I run a multi-monitor setup, and I'm considering a 4K TV as a 2nd monitor. Even though ultra-wides are pretty cool to mess around with for a while, at the end of the day, I prefer the extra vertical real estate. I'd consider buying an ultra-wide, but unlike some here, I'd be perfectly happy with a 'normal' wide monitor.

At work I have a 32" IPS 4K work monitor, and I realize that 32" is still not a good size for a 4K monitor (never mind 27", which is ridiculously low for 4K), around 37"-42" would be ideal, if you have the space for it. Keep in mind, 32" 4k is like a 1080p 15.5" laptop but at 2' away (typical seating distance for most with desks). I have used a 40" 4K monitor and it is great. Those who have tried the 35" ultra-wide monitors, it's a lot like that, just taller.

While this monitor is not good for gaming, for those that want a 4K OLED for the contrast/PQ and resolution rather than high refresh rates, this is available in the US:
http://www.dell.com/en-us/shop/accessories/apd/210-aiei
Dell UltraSharp UP3017Q 30" 4K@60Hz OLED monitor, ~$1550

I'm still waiting for an HDR/G-Sync/120Hz+ version, the closer to 37" the better.
 
I will gladly sell a kidney for that.

Also, yeah, the price is still an unsolved problem. I ain't paying over $1000 for a monitor; hell, I'd have trouble buying a monitor for $500.
 

Paragon

Member
At work I have a 32" IPS 4K work monitor, and I realize that 32" is still not a good size for a 4K monitor (never mind 27", which is ridiculously small for 4K)
Smaller 4K screens are great; but anything smaller than 40" is intended to use display scaling. I would even consider using 125% scaling at 40". A 4K screen should really be 46" to use 100% scaling.
A 27" 4K screen is not about giving you more workspace than a 1440p screen, it's about rendering everything at a higher resolution. 150% scaling displays everything the same size as a 1440p monitor - but at 50% higher resolution. 175% scaling is closer to Windows' intended size though. (based around 96 PPI)
5K and 8K do handle scaling of legacy applications better than 4K screens do though, since standard monitor sizes end up using integer scales (2x / 3x) rather than non-integer scales. (1.25x, 1.50x, 1.75x)
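To put rough numbers on all of that (just a sketch; the 96 PPI baseline is Windows' classic design assumption, and the sizes/scales are the ones from this post):

```python
import math

def ppi(w_px: int, h_px: int, diagonal_in: float) -> float:
    """Pixels per inch of a panel."""
    return math.hypot(w_px, h_px) / diagonal_in

# Effective workspace of a 4K panel at common Windows scale factors:
# 150% really does land exactly on 1440p-equivalent workspace.
for scale in (1.00, 1.25, 1.50, 1.75, 2.00):
    w, h = round(3840 / scale), round(2160 / scale)
    print(f"{scale:.0%} scaling: {w}x{h} effective workspace")

# The scale factor Windows "wants" for a given 4K panel size, relative to 96 PPI:
for size in (27, 32, 40, 46):
    density = ppi(3840, 2160, size)
    print(f'{size}" 4K: {density:.0f} PPI, ideal scale ~{density / 96:.2f}x')
```

The 46" row comes out at ~96 PPI (1.00x) and the 27" row at ~163 PPI (~1.70x), which is why 175% is closer to the intended size than 150%.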
 

btrboyev

Member
I don't wanna spend as much on a computer monitor as I would on a TV...but that's going to be a thing going forward isn't it.
 

dr_rus

Member
Smaller 4K screens are great; but anything smaller than 40" is intended to use display scaling. I would even consider using 125% scaling at 40". A 4K screen should really be 46" to use 100% scaling.
A 27" 4K screen is not about giving you more workspace than a 1440p screen, it's about rendering everything at a higher resolution. 150% scaling displays everything the same size as a 1440p monitor - but at 50% higher resolution. 175% scaling is closer to Windows' intended size though. (based around 96 PPI)
5K and 8K do handle scaling of legacy applications better than 4K screens do though, since standard monitor sizes end up using integer scales (2x / 3x) rather than non-integer scales. (1.25x, 1.50x, 1.75x)

The biggest issue with HiDPI displays is the fact that a lot of games do not scale their HUDs / menus with Windows scaling settings or by themselves, and as such would be nearly unplayable on such displays. DXHR is a prime example of such a game from not-so-recent times, but there are many more, even from recent years.
 

Kambing

Member
OLED would be a terrible idea for a desktop; you'd get burn-in from desktop/browser tabs.

Though OLEDs do burn in (these instances can be localized), surprisingly desktop use does not seem to cause it. If you peruse an OLED burn-in thread on AVSForum, you'll be able to come to your own conclusion. Anecdotally, I own two OLEDs (2017 models), both of which have had 300+ hours each of desktop usage and don't have burn-in. Same thing for the 2016 model before I upgraded; each of those OLEDs had 500+ hours of desktop use.

It will be interesting to read RTINGS' burn-in results on these OLEDs.
 
I have an X34 and I'm really excited for that Acer X35 or Asus PG35VQ.

I haven't used a VA panel though; what are the major concerns?
 

m29a

Neo Member
Color shift when you move your head. The viewing angles aren't as good as IPS.

I'm pretty sure VA viewing angles are still good. I have one and it's barely noticeable when you go above/below/to the sides. The differences between IPS and VA are overstated. I'm sure IPS is still slightly better, but a VA panel really is nice. I've also read from various review sites that VA is better than IPS when it comes to contrast. Blacks are far darker, which really shows during movies.
 

Paragon

Member
Throw out g-sync and lower the cost. While they're at it, have a non-curved version.
G-Sync is half the appeal. I wouldn't buy another display without variable refresh rate support after using a G-Sync monitor.

The biggest issue with HiDPI displays is the fact that a lot of games do not scale their HUDs / menus with Windows scaling settings or by themselves, and as such would be nearly unplayable on such displays. DXHR is a prime example of such a game from not-so-recent times, but there are many more, even from recent years.
That's true, but there are many people who don't just want a display for gaming, and HiDPI displays are a huge step forward for everything else. (and gaming too, for games which scale up the UI or where it doesn't matter)
40-46" 4K displays are a bit of a compromise too, because there are no gaming monitors in that size, and none of the TVs in that size are high-end.
Even 40" is really pushing it for a lot of people, so I doubt that a high-end 55" TV would be an option at a desk for most.

The main issue with scaling is actually that 4K is not enough resolution.
If you had a 5K display in the 27-32" size range, you could render 2560x1440 at a pixel-perfect 2x scale for those older games which don't scale their UI with resolution. For 8K, it would be 3x.
Using pixel-perfect scaling would look just as though you were using a 1440p native display.
Render 1440p on a 4K display and you have to use blurry non-integer scaling. For 4K you have to drop down to 1080p for pixel-perfect scaling, which looks rough on a 27-32" monitor.
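A tiny sketch of that arithmetic (the panel and render resolutions are the ones from this post):

```python
# Which render resolutions divide evenly into which native panel resolutions?
panels = {"4K": (3840, 2160), "5K": (5120, 2880), "8K": (7680, 4320)}
renders = {"1080p": (1920, 1080), "1440p": (2560, 1440)}

for p_name, (pw, ph) in panels.items():
    for r_name, (rw, rh) in renders.items():
        s = pw / rw  # same as ph / rh here, since everything is 16:9
        kind = "pixel-perfect" if s.is_integer() else "blurry non-integer"
        print(f"{r_name} on {p_name}: {s:g}x ({kind})")
```

1440p comes out at a clean 2x on 5K and 3x on 8K, but a non-integer 1.5x on 4K, where only 1080p scales cleanly.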
 

Herne

Member
I'm hoping I'll be able to get a 4K curved ultra-wide monitor with FreeSync and 144Hz refresh rate in two years time for a reasonable price. Am I dreaming?
 

nomis

Member
Though OLEDs do burn in (these instances can be localized), surprisingly desktop use does not seem to cause it. If you peruse an OLED burn-in thread on AVSForum, you'll be able to come to your own conclusion. Anecdotally, I own two OLEDs (2017 models), both of which have had 300+ hours each of desktop usage and don't have burn-in. Same thing for the 2016 model before I upgraded; each of those OLEDs had 500+ hours of desktop use.

It will be interesting to read RTINGS' burn-in results on these OLEDs.

Do you use the feature that shifts the image periodically? If so, is it perceptible? Thanks from a prospective OLED buyer.
 

dr_rus

Member
That's true, but there are many people who don't just want a display for gaming, and HiDPI displays are a huge step forward for everything else. (and gaming too, for games which scale up the UI or where it doesn't matter)
40-46" 4K displays are a bit of a compromise too, because there are no gaming monitors in that size, and none of the TVs in that size are high-end.
Even 40" is really pushing it for a lot of people, so I doubt that a high-end 55" TV would be an option at a desk for most.
We are talking about gaming displays here. I'm very cautious of any display whose DPI won't be usable without scaling, as chances are that a lot of older games will be unplayable on such a display because of scaling issues.

40" 16:9 is probably the biggest monitors will ever get although they may extend horizontally from there for some 21:9 aspect. It's not a compromise, you're getting great PPI with 4K 40" which you can use without scaling. It's true that so far there were only a couple of workstation offerings of such size but I think that we'll eventually get a 40" gaming screen.

The main issue with scaling is actually that 4K is not enough resolution.
If you had a 5K display in the 27-32" size range, you could render 2560x1440 at a pixel-perfect 2x scale for those older games which don't scale their UI with resolution. For 8K, it would be 3x.
Using pixel-perfect scaling would look just as though you were using a 1440p native display.
Render 1440p on a 4K display and you have to use blurry non-integer scaling. For 4K you have to drop down to 1080p for pixel-perfect scaling, which looks rough on a 27-32" monitor.
While what you're saying would make sense on a very high DPI screen, there is no way of achieving it at the moment - no monitor and no GPU has any kind of support for such "pixel-perfect" scaling. Running a game at a low resolution on a HiDPI screen will result in the image being blurry, because it will be upscaled with the usual bilinear filtering, in the same way 1280x720 content is upscaled on modern 2560x1440 screens.
 

Kambing

Member
Do you use the feature that shifts the image periodically? If so, is it perceptible? Thanks from a prospective OLED buyer.

Yes, definitely, and it's 100% not noticeable. The only time I become aware that the screen has shifted is when I use FRAPS: a very, very small portion of the FPS counter becomes clipped. The screen literally moves 1/6th of an inch to the left or right. My Pioneer plasma was way worse; when it shifted I could tell and see it occur.
 

Paragon

Member
We are talking about gaming displays here. I'm very cautious of any display whose DPI won't be usable without scaling, as chances are that a lot of older games will be unplayable on such a display because of scaling issues.
40" 16:9 is probably the biggest monitors will ever get although they may extend horizontally from there for some 21:9 aspect. It's not a compromise, you're getting great PPI with 4K 40" which you can use without scaling. It's true that so far there were only a couple of workstation offerings of such size but I think that we'll eventually get a 40" gaming screen.
Well there are no 40" gaming displays. The largest gaming monitors at ~110 pixels per inch are the 34" Ultrawides.
I don't necessarily agree with 40" 4K being "great PPI" though, as 110 PPI monitors look rough these days compared to phones, tablets, and notebooks that are all in the region of 220-400 PPI now.
I'm often finding myself increasing the zoom level in applications to 120% or so with a 3440x1440 display just to make the text look that little bit crisper, without being so large that it's uncomfortable to read.

I'd also argue that if you want a "standard DPI" display, you should be looking for something that is closer to 96 PPI rather than 110. That is what Windows is actually built for. It's just that pixel density gradually started creeping upwards before companies committed to making proper "High DPI" displays. Notebooks even hit around 130 PPI rather than 110.
That being said, a 58" a 5160x2160 (or 10320x4320) ultrawide would be an amazing display if someone were to build a gaming monitor that size. (which is not as large as it sounds, compared to a 40" 4K screen)

While what you're saying would make sense on a very high DPI screen, there is no way of achieving it at the moment - no monitor and no GPU has any kind of support for such "pixel-perfect" scaling. Running a game at a low resolution on a HiDPI screen will result in the image being blurry, because it will be upscaled with the usual bilinear filtering, in the same way 1280x720 content is upscaled on modern 2560x1440 screens.
Yes, it can be problematic right now. What we really need is driver-level support for "integer scaling".
The GPU should have no trouble using Nearest Neighbor scaling instead of Bilinear or Lanczos scaling, but we need to convince NVIDIA / AMD that it's necessary.
It looks like there may be some preliminary support in the Linux driver now, so hopefully that means someone is working on it, and it will end up in the Windows driver too.
Ideally this would pre-scale to the largest integer your display can support, and then either present it with borders or perform the last scaling step to fit the screen using Bilinear filtering, depending on the scaling mode you have selected.
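Roughly like this, for illustration (a hypothetical numpy/Pillow sketch of that two-step idea - not what any actual driver does, just the logic):

```python
import numpy as np
from PIL import Image

def integer_prescale(frame: np.ndarray, out_w: int, out_h: int) -> Image.Image:
    """Nearest-neighbor upscale by the largest integer factor that fits the
    screen, then bilinear-stretch the remainder to fill it (presenting it
    with borders instead would keep the result fully pixel-perfect)."""
    h, w = frame.shape[:2]
    k = min(out_w // w, out_h // h)  # largest integer scale that still fits
    # np.repeat on both axes == nearest-neighbor scaling by an integer factor
    prescaled = np.repeat(np.repeat(frame, k, axis=0), k, axis=1)
    img = Image.fromarray(prescaled)
    if img.size != (out_w, out_h):   # only filter the final non-integer step
        img = img.resize((out_w, out_h), Image.BILINEAR)
    return img

# A 1440p frame on a 4K panel: 1x integer step, then a bilinear stretch.
# On a 5K (5120x2880) panel the same frame would get a clean 2x, no filtering.
frame = np.zeros((1440, 2560, 3), dtype=np.uint8)
print(integer_prescale(frame, 3840, 2160).size)  # -> (3840, 2160)
```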

There are some workarounds for it though.
If a game supports running in Windowed Mode at a fixed resolution, tools like Borderless Gaming can scale that to fill the screen without applying any filtering. It doesn't always work, which is why we really need driver-level support, but it does work for a lot of games.

OLED would be a terrible idea for a desktop; you'd get burn-in from desktop/browser tabs.
LG apparently have an OLED panel for monitors due to be released some time next year.
I don't know if that means they think they've solved the burn-in and brightness problems, or if they are just going to release it anyway.
MicroLED seems like it would be more suited for desktop displays than OLED, but that seems like it's going to be a few more years away.
 