I got the Acer (for $400), and only because I was getting a new PC whether I liked it or not: my old one broke down, and I couldn't stand the color on my TN panel anymore. Fact is, the HDR is not real HDR. Does it still look great when a game actually lets you calibrate it? Yes. However, a lot of games only use automatic calibration, which doesn't work well within such a limited brightness range. Real HDR pretty much only exists in TVs over $1500 right now.
But say you decide HDR doesn't matter and you just want some 10-bit color going at least. There you'll find better options at reasonable prices, and much better panels are coming out soon. Also, 4K is an absolute waste of money at anything less than 32", and you need to be fully aware of that.
However, at 32" the difference of motion artifacts between 60hz and over 120hz is much more apparent. Is that only really seen in super competitive games? Well, yes and no. It will only help your gameplay in really competitive games, but is still noticed elsewhere. I'd say the loss of clarity in motion is great enough that it makes a drop to 1440p superfluous, which brings us to the next complication.
Say you go for the 144Hz monitor with motion blur reduction for that sweet, sweet clarity that makes the 4K benefit over 1440p apparent all the time. Are you playing many games at 4K at 144fps? HELL NO. And you won't be. For a very long time. So long, in fact, that by the time it's affordable, a 4K 144Hz truly-HDR monitor that meets the full FreeSync 2 spec will be your average everyday monitor at reasonable prices. 144Hz can still improve clarity somewhat for 60fps games if you run borderless windowed, but that approach doesn't give the full benefit and is incompatible with monitor sync technologies.
So what did I do? I didn't chase the 4K dream, dropping an extra $1000 on a 1080 Ti and another $1000 on a monitor to support it. Instead, I got the most bang for my buck with this Acer and a 1070 Ti, which I use to play most games at 1440p@60fps, and tossed some of the extra money into an Xbox One X, which lets me experience the HDR a bit more often than I otherwise would. If I ever miss the clarity of my 144Hz BenQ, I have it hooked up at the same time and could play any game on it, but the image quality loss of a 1080p 8-bit TN panel isn't worth it to me.
If you want to move past 1080p 8-bit panels, I completely understand. Games these days are simply produced for a higher level of fidelity, and if you don't keep up, the returns diminish greatly. However, don't fall prey to the marketing of top-tier tech pretending it's all reasonably obtainable. The tech just isn't fully formed yet, so you'll be paying out the ass for features that only partly work, and without the complete feature set the benefits don't really show up in your end-user experience. Wait until you can get 4K HDR at 120Hz and 120fps on a budget that's reasonable to you.
In the meantime, I recommend doing what I did and settling for the awkward middle, especially since you said image quality is important to you. You'll run into some annoyances when developers don't provide HDR adjustment options, you'll probably sometimes find 60fps isn't clear enough, and you'll sometimes wrestle with the choice between a solid 60fps at 1440p and an inconsistent framerate at 4K. Overall, though, the upgrade over old panels is worth it; the massive extra investment to step past those little issues isn't.