Will we ever see Variable Rate Displays? [EDIT: YES.]

I'm just wondering...why have we not seen or heard of any variable rate displays in the PC/tech space? I'm talking about a display that, instead of working at a constant rate (60Hz, 120Hz, etc...), would be able to work at any rate given a certain range (24Hz - 120Hz).

So, what are the benefits of such a display?

- Being able to run a game at any framerate without tearing or stuttering.
- Games could run at a variable framerate. No need to lock a game at 60Hz or 30Hz...though you still could from within the engine if you want to keep a consistent framerate.
- No input lag. Frames are displayed as soon as they're drawn, which means no buffering (which causes input lag).
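To illustrate the stuttering point above with a toy sketch (purely illustrative Python, not any real display API): on a fixed 60Hz display, a 50fps source can't map evenly onto refresh ticks, so some frames are held for one tick and others for two, while a 30fps source divides evenly.

```python
REFRESH_HZ = 60  # fixed display refresh rate

def ticks_per_frame(source_fps, n_frames):
    """For each source frame, count how many 1/REFRESH_HZ refresh ticks it
    stays on screen when shown at the first tick at or after its ideal time."""
    # integer ceiling of (i * REFRESH_HZ / source_fps)
    tick = lambda i: (i * REFRESH_HZ + source_fps - 1) // source_fps
    return [tick(i + 1) - tick(i) for i in range(n_frames)]

print(ticks_per_frame(50, 10))  # mix of 1- and 2-tick frames -> judder
print(ticks_per_frame(30, 10))  # every frame held exactly 2 ticks -> smooth
```

On a variable-rate display the whole question goes away: every 50fps frame would simply be held for 1/50th of a second.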

What are the downsides/hurdles?

- Possible incompatibility with current sources. Displays currently receive frames from the display adapter (video card, console, etc...) at a set rate (60Hz).
- You need a display that can hold a single frame for a variable (non-discrete) length of time. I believe OLED can accomplish this. I'm not sure if any current HDTV display technology can, however.

A new system would have to be designed. Instead of having the display poll for new frames, the display adapter would have to send them to the display as they come in. The best way to go about this would probably be to create a new interface, and maintain legacy interfaces for compatibility's sake.
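As a rough sketch of that push model (hypothetical names, not a real interface), the latency difference comes down to whether a finished frame has to wait for the next scanout tick:

```python
import math

REFRESH_HZ = 60
TICK = 1.0 / REFRESH_HZ   # fixed scanout interval (~16.7 ms)
MIN_INTERVAL = 1.0 / 120  # assumed fastest refresh the panel allows

def fixed_rate_present(render_done):
    """Fixed refresh: a finished frame waits for the next scanout tick."""
    return math.ceil(render_done / TICK) * TICK

def variable_rate_present(render_done, last_present=0.0):
    """Variable refresh: the adapter pushes the frame as soon as it is
    ready, limited only by the panel's minimum refresh interval."""
    return max(render_done, last_present + MIN_INTERVAL)

# A frame that finishes rendering 10 ms into the interval:
t = 0.010
print(fixed_rate_present(t) - t)     # ~6.7 ms of added latency
print(variable_rate_present(t) - t)  # 0 ms -- pushed immediately
```

In the fixed-rate case the only way to avoid that wait is tearing (swapping mid-scanout) or buffering, which is exactly the input lag mentioned above.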

Has anyone ever looked into this? It seems like an obvious step forward at this point.
Interesting idea. On the whole, I would bet it's a lot easier to fix the frame rate in software than to create a hardware scheme for varying the refresh rate, but I'm just guessing.

EDIT: I am very slow today. doh!
Correct me if I'm wrong, but 120Hz and above displays have no issues displaying content at a lower framerate?
EDIT: It would appear that I am wrong, and this was bumped as a result of the nvidia announcement. It still seems to me like a variable framerate display would be akin to the soap opera effect you get in other variable frametime scenarios.
TheExodu5 is like those crazy science fiction writers that write about silly shit like man visiting other worlds. Turns out he was on to something.

Tell us all your secrets.
It was a thread. I was so sad when no one replied. :(
I wasn't a member back then. I'm free of any guilt.

Like people say here, you're having the last laugh, but here's something that might make you sadder:
Was your idea patentable at the time? (Well, they might have filed patents for this tech years earlier, but still!)
It was a thread. I was so sad when no one replied. :(
I'm sorry bro. I would totally have answered if I had seen it (and been horribly wrong).

I wonder if this saves any noticeable amount of energy. I would guess not or nvidia would have mentioned it.

How long has nvidia been developing this? Did they say? How hard will it be for AMD to replicate, I wonder? What kind of lead does nVidia have now?

This is obviously way better than mantle. In another league, really.
This is one of those situations where I knew this was a bump because of the NVIDIA thing and spent a while slowly scrolling from the bottom up trying to find the actual bump.

My face when it ended up being the first reply.
Such an insightful post - and not a single reply. The life of a prophet is harsh. His voice cries out in the wilderness but is not heard.