I don't really expect any of these to actually be available in Q4 at this point, but after waiting a decade+ for the perfect gaming monitor, what's a few more months?
I still can't understand why these companies announce products when they have zero idea when they are going to ship... even worse, announce products which may only come out a year later.
> Got a link?

http://www.joesav.com/Sony-OLED-HDR...T4WSXntFcZ17ON07LtrDS0fzLmyY3QFBoC-7QQAvD_BwE
I'd certainly bite at $2k for a 65" 4K (HDR?) OLED.
http://www.tftcentral.co.uk/news_archive/38.htm#philips_ifa2017
Philips to Showcase 43" HDR Display, 8K monitor and 49" Ultrawide
...
436M6VBPAB – part of the upcoming Philips Momentum line and to be seen in the entertainment zone – a 4K HDR screen with the unique Ambiglow feature, this is an all-round multitasking and entertainment solution, allowing USB-C docking and simultaneous notebook charging. This screen can act as the ultimate docking station, allowing you to easily use it with anything – from the latest 4K gaming consoles to any high end graphics processing designer computer.
> Adaptive-Sync is a VESA standard.
> Freesync is the name AMD gives to their compliance with the standard, so monitors that are compliant with the standard are typically called Freesync monitors. There's no proprietary hardware (with regard to GPU vendors) in the monitors, but they are referred to as "Freesync."
> Freesync is the de facto name for compliance with the Adaptive-Sync standard simply because NV refuses to comply.
> So, when someone refers to a Freesync monitor, they are talking about monitors that are compliant with the VESA standard. You may not agree with this, but that's the reality of the situation.
> G-Sync is not compliant with the VESA standard. That's why it needs expensive proprietary hardware on desktop PC monitors. Funnily enough, NV supports the VESA standard on laptops... They just refuse to implement the standard at the driver level with their desktop GPUs, making them only work with their proprietary modules and not the VESA standard modules with regard to adaptive refresh rates.
> The reason Freesync monitors are cheaper than G-Sync ones is that "Freesync" monitors are just monitors with industry-standard modules that NV could allow the use of if they wanted to...
> I'm sure you knew all this, though...

Nope, knew some of it, but thanks for the info. For me it was just two kinds of implementation, with AMD using VESA Adaptive-Sync as a base and extending it. Good to know - and it made me discover that HDMI now handles this standard too.
Since I'm not swimming in cash, I'll just settle for a 27" 2K HDR 144Hz G-SYNC monitor. While it'll still cost a lot of money, at least it won't be super exorbitant. I can do without ultra-wide or curved.
> It seems like the Acer Predator X35 is confirmed for next year. Source.

Glad I didn't wait to grab my first Ultra-wide then. I settled into a semi-permanent spot and I wanted to nest in it. Maybe this PG348Q will hold its value really well.
> Glad I didn't wait to grab my first Ultra-wide then. I settled into a semi-permanent spot and I wanted to nest in it. Maybe this PG348Q will hold its value really well.

Yeah, the only thing that stopped me from going for this one is that I want a curved 35" HDR monitor with G-Sync. I'll just wait and put money aside to better digest the cost.
This sounds incredible. Let me know when they get down to sub 500.
> Glad I didn't wait to grab my first Ultra-wide then. I settled into a semi-permanent spot and I wanted to nest in it. Maybe this PG348Q will hold its value really well.

I'm really happy with my PG348Q too.
> It will get better, but for a feature that was said to require minimal effort from the developers to implement, I'm surprised that the rate of adoption has been so low.

It's relatively easy to implement, but there's basically no HDR hardware (monitors) on the PC market right now, so devs just don't bother - mostly because it's hard to implement something you can't even test yourself due to the lack of hardware.
> I've no interest in going from an IPS panel back to a VA panel despite the higher contrast ratio, or from a 34" UW to a 27" 16:9 panel.
> My TV has a 5000:1 native VA panel and local dimming, and I haven't used it for gaming since getting the PG348Q.
> Contrast is nice to have, but for me, it's not as important as good viewing angles (image stability) and fast response times.

Yeah, IPS is generally better for gaming than xVA because it's generally a lot faster at black-white-black transitions, which results in significantly less noticeable blurring of high-contrast edges. It will be interesting to see what these 35" UW 200Hz xVAs are capable of though, as it makes little sense to rate a panel at 200Hz if it still has lots of edge blurring.
> It's relatively easy to implement, but there's basically no HDR hardware (monitors) on the PC market right now, so devs just don't bother - mostly because it's hard to implement something you can't even test yourself due to the lack of hardware.

Adoption seems very slow even for console games, it's not just a problem on PC.
> Yeah, IPS is generally better for gaming than xVA because it's generally a lot faster at black-white-black transitions, which results in significantly less noticeable blurring of high-contrast edges. It will be interesting to see what these 35" UW 200Hz xVAs are capable of though, as it makes little sense to rate a panel at 200Hz if it still has lots of edge blurring.

These are not the first 200Hz VA panels - Acer's Z35 uses one too, it's just 2560x1080 rather than 3440x1440 due to DisplayPort 1.2 bandwidth limitations. Motion handling is... not great.
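The DisplayPort 1.2 limitation mentioned above can be sanity-checked with quick arithmetic. A rough sketch (my own illustration, not from the thread): DP 1.2 (HBR2) carries about 17.28 Gbit/s of video payload after 8b/10b coding, and real timings also need blanking overhead, which is ignored here.

```python
# Rough data-rate check for the DP 1.2 limitation mentioned above.
# DP 1.2 (HBR2) carries ~17.28 Gbit/s of video data after 8b/10b coding;
# real modes also need blanking overhead, ignored in this sketch.
DP12_DATA_RATE = 17.28e9  # bits per second

def video_bitrate(width: int, height: int, hz: int, bpp: int = 24) -> float:
    """Minimum bit rate for an uncompressed video mode (no blanking)."""
    return width * height * hz * bpp

print(video_bitrate(3440, 1440, 200) / 1e9)  # ~23.8 Gbit/s -> too much for DP 1.2
print(video_bitrate(2560, 1080, 200) / 1e9)  # ~13.3 Gbit/s -> fits
```

Even before blanking overhead, 3440x1440 at 200Hz needs more than DP 1.2 can carry, while 2560x1080 at 200Hz squeezes through.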
> Adoption seems very slow even for console games, it's not just a problem on PC.
There's nothing stopping you connecting a PC to an HDR TV though. That's how most devs are implementing HDR right now anyway, not with HDR monitors.
It's frustrating because it seems very unlikely that devs will go back and patch in support for HDR to existing games like Deus Ex: Mankind Divided, Gears of War 4, Forza Horizon 3, and others, once HDR does become more widespread.
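For context on what "implementing HDR" involves at the output end: HDR10 encodes luminance with the SMPTE ST 2084 PQ curve. A minimal sketch of the encode function, using the standard's published constants (my own illustration, not from the thread):

```python
# Sketch of the SMPTE ST 2084 (PQ) encode used by HDR10.
# Constants are the ones published in the standard.
def pq_encode(nits: float) -> float:
    """Map absolute luminance (cd/m^2, up to 10000) to a 0..1 PQ signal."""
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

print(round(pq_encode(100), 3))  # SDR-ish white (100 nits) lands near mid-signal
print(pq_encode(10000))          # peak luminance maps to exactly 1.0
```

The point of the curve is that signal precision is distributed perceptually: 100 nits already uses about half the code range, leaving the rest for highlights up to 10,000 nits.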
> Devs usually don't develop PC games on TVs. Low adoption on consoles may actually be related to the absence of HDR monitors too, as even console games aren't exactly developed on TVs either - although in their case they are at least tested on TVs, while that's rarely the case for PC versions.

I know devs at a few studios where the people working on that sort of thing all have HDR TVs at their desks, and even some of the other artists/developers with dual monitor setups (often 27") are being given the option of a 40" 4K HDR TV to replace their monitors now.
Whatever setup Eidos Montréal used to develop HDR for the PS4 version of Deus Ex, I'm sure it could also have been used to implement HDR in the PC version of the game without having to wait for G-Sync HDR monitors to be released.
> I know devs at a few studios where the people working on that sort of thing all have HDR TVs at their desks, and even some of the other artists/developers with dual monitor setups (often 27") are being given the option of a 40" 4K HDR TV to replace their monitors now.
> Whatever setup Eidos Montréal used to develop HDR for the PS4 version of Deus Ex, I'm sure it could also have been used to implement HDR in the PC version of the game without having to wait for G-Sync HDR monitors to be released.
We know that Nixxes developed HDR support for Rise of the Tomb Raider, but it hasn't been released for some reason. Perhaps that's being held back for the release of these monitors.
> The catch is that the PC version of DXMD was developed by Nixxes and not Eidos Montréal.

True, but as I linked to above, Nixxes worked on HDR support for Rise of the Tomb Raider - even though that hasn't been released. So it's not like they're unfamiliar with HDR, or ill-equipped to implement it on the PC.
> What are these 40" HDR TVs, and are they shit?

I don't know the specifics, only that they were mainly Sony TVs. They might have been 43" rather than 40".
21:9, curved
Then they are still an "unsolved" problem
> Monitors are not a solved problem

Yeah I'm not sure I want to be on the curved screen bandwagon.
> At work I have a 32" IPS 4K work monitor, and I realize that 32" is still not a good size for a 4K monitor (never mind 27", which is ridiculously low for 4K)

Smaller 4K screens are great; but anything smaller than 40" is intended to use display scaling. I would even consider using 125% scaling at 40". A 4K screen should really be 46" to use 100% scaling.
> Smaller 4K screens are great; but anything smaller than 40" is intended to use display scaling. I would even consider using 125% scaling at 40". A 4K screen should really be 46" to use 100% scaling.
A 27" 4K screen is not about giving you more workspace than a 1440p screen, it's about rendering everything at a higher resolution. 150% scaling displays everything the same size as a 1440p monitor - but at 50% higher resolution. 175% scaling is closer to Windows' intended size though. (based around 96 PPI)
5K and 8K do handle scaling of legacy applications better than 4K screens do though, since standard monitor sizes end up using integer scales (2x / 3x) rather than non-integer scales. (1.25x, 1.50x, 1.75x)
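The scaling arithmetic above can be sketched in a few lines (my own illustration, not from the thread): at a given Windows scale factor a panel gives you the workspace of a lower "effective" resolution, and only integer factors map source pixels cleanly.

```python
# Effective workspace at a given Windows scale factor (illustrative only).
def effective_resolution(native_w: int, native_h: int, scale: float):
    """The workspace you actually get at a given scale factor."""
    return round(native_w / scale), round(native_h / scale)

# 150% on a 4K panel = the workspace of a 1440p monitor, at higher detail:
print(effective_resolution(3840, 2160, 1.50))  # (2560, 1440)
# 5K at 200% is an exact 2x of 1440p, so legacy content maps 2x2 per pixel:
print(effective_resolution(5120, 2880, 2.00))  # (2560, 1440)
# 125% on 4K gives a non-integer mapping (each logical pixel = 1.25 physical):
print(effective_resolution(3840, 2160, 1.25))  # (3072, 1728)
```

This is why 5K/8K handle legacy apps better: 200% and 300% are whole-pixel mappings, while 125%/150%/175% on 4K are not.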
It's not solved for me till there are 32" 4K HDR OLED monitors. Hopefully these become a thing within the next 2 years...
OLED would be a terrible idea for a desktop; you'd get burn-in from desktop/browser tabs.
I have an X34 and I'm really excited for that Acer X35 or Asus PG35VQ.
I haven't used a VA panel though - what are the major concerns?
Color shift when you move your head. The viewing angles aren't as good as on IPS.
> Throw out G-Sync and lower the cost. While they're at it, have a non-curved version.

G-Sync is half the appeal. I wouldn't buy another display without variable refresh rate support after using a G-Sync monitor.
> The biggest issue with HiDPI displays is the fact that a lot of games do not scale their HUDs / menus with Windows scaling settings or by themselves, and as such would be nearly unplayable on such displays. DXHR is a prime example of such a game from not-so-recent times, but there are many more, even from recent years.

That's true, but there are many people who don't just want a display for gaming, and HiDPI displays are a huge step forward for everything else. (And gaming too, for games which scale up the UI or where it doesn't matter.)
Though OLEDs do burn in (these instances can be localized), surprisingly desktop use does not seem to cause it. If you peruse an OLED burn-in thread on AVSforum, you'll be able to come to your own conclusion. Anecdotally, I own two OLEDs (2017 models), both of which have had 300+ hours each of desktop usage and don't have burn-in. Same thing for the 2016 models before I upgraded; each of those OLEDs had 500+ hours of desktop use.
It will be interesting to read RTINGS' burn-in results on these OLEDs.
> That's true, but there are many people who don't just want a display for gaming, and HiDPI displays are a huge step forward for everything else. (And gaming too, for games which scale up the UI or where it doesn't matter.)

We are talking about gaming displays here. I'm very cautious of any display whose DPI won't be usable without scaling, as chances are that a lot of older games will be unplayable on such a display because of scaling issues.
40-46" 4K displays are a bit of a compromise too, because there are no gaming monitors in that size, and none of the TVs in that size are high-end.
Even 40" is really pushing it for a lot of people, so I doubt that a high-end 55" TV would be an option at a desk for most.
> The main issue with scaling is actually that 4K is not enough resolution.
> If you had a 5K display in the 27-32" size range, you could render 2560x1440 at a pixel-perfect 2x scale for those older games which don't scale their UI with resolution. For 8K, it would be 3x.
> Using pixel-perfect scaling would look just as though you were using a 1440p native display.
> Render 1440p on a 4K display and you have to use blurry non-integer scaling. For 4K you have to drop down to 1080p for pixel-perfect scaling, which looks rough on a 27-32" monitor.

While what you're saying would make sense on a very high DPI screen, there is no way of achieving it at the moment - no monitor and no GPU has any kind of support for such "pixel-perfect" scaling. Running a game at a low resolution on a HiDPI screen will result in a blurry image, because it will be upscaled with the usual bilinear filtering, in the same way 1280x720 content is upscaled on modern 2560x1440 screens.
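The "pixel-perfect" scaling discussed above is just nearest-neighbour duplication: every source pixel becomes an N x N block, so no new blended colours appear, unlike bilinear filtering. A toy sketch of the idea (my own illustration; on a 5K panel, a factor of 2 turns 2560x1440 into exactly 5120x2880):

```python
# Toy sketch of integer ("pixel-perfect") upscaling: each source pixel is
# duplicated into a factor x factor block, with no colour blending at all.
def integer_upscale(frame, factor):
    """frame is a list of rows of pixel values; returns the upscaled frame."""
    out = []
    for row in frame:
        scaled_row = [px for px in row for _ in range(factor)]
        out.extend([list(scaled_row) for _ in range(factor)])
    return out

frame = [["A", "B"],
         ["C", "D"]]          # a 2x2 toy "frame"
doubled = integer_upscale(frame, 2)
for row in doubled:
    print(row)                # each pixel becomes a 2x2 block of itself
```

A GPU driver offering this mode would only have to replicate pixels, which is why people keep asking for it; bilinear filtering instead averages neighbours and smears high-contrast edges.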
Do you use the feature that shifts the image periodically? If so, is it perceptible? Thanks, from a prospective OLED buyer.
> We are talking about gaming displays here. I'm very cautious of any display whose DPI won't be usable without scaling, as chances are that a lot of older games will be unplayable on such a display because of scaling issues.

Well, there are no 40" gaming displays. The largest gaming monitors at ~110 pixels per inch are the 34" Ultrawides.
40" 16:9 is probably the biggest monitors will ever get, although they may extend horizontally from there for a 21:9 aspect ratio. It's not a compromise - you're getting great PPI with a 40" 4K which you can use without scaling. It's true that so far there have only been a couple of workstation offerings at that size, but I think we'll eventually get a 40" gaming screen.
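The PPI figures above are easy to verify (my own illustration, not from the thread): pixels per inch is just the diagonal pixel count divided by the diagonal size.

```python
import math

# Quick PPI check for the sizes discussed above (illustrative only).
def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3440, 1440, 34)))  # 110 - the 34" ultrawide
print(round(ppi(3840, 2160, 40)))  # 110 - a 40" 4K lands in the same place
print(round(ppi(3840, 2160, 27)))  # 163 - why 27" 4K needs scaling
```

A 40" 4K panel really does match the ~110 PPI of today's 34" ultrawides, which is why it is usable without scaling.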
> While what you're saying would make sense on a very high DPI screen, there is no way of achieving it at the moment - no monitor and no GPU has any kind of support for such "pixel-perfect" scaling. Running a game at a low resolution on a HiDPI screen will result in a blurry image, because it will be upscaled with the usual bilinear filtering, in the same way 1280x720 content is upscaled on modern 2560x1440 screens.

Yes, it can be problematic right now. What we really need is driver-level support for "integer scaling".
> OLED would be a terrible idea for a desktop; you'd get burn-in from desktop/browser tabs.

LG apparently have an OLED panel for monitors due to be released some time next year.