
Quest 2 gets render-saving tech that boosts performance by up to 70%

Romulus

Member
ASW can give apps roughly 70% more power to work with compared to rendering at full framerate. That’s the kind of jump usually seen between hardware generations. It could lead to noticeably higher fidelity graphics or even new games that wouldn’t otherwise be possible on standalone VR.

That means apps using ASW on Quest 1 will be able to render at 36 FPS, while on Quest 2 developers can choose between 36 FPS, 45 FPS, or 60 FPS by changing the refresh rate mode.
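The arithmetic behind those numbers is just a halving of the refresh-rate mode. A back-of-the-envelope sketch (my own illustration, not Meta's code):

```python
REFRESH_MODES_HZ = [72, 90, 120]  # Quest 2's selectable refresh rate modes

def asw_render_rate(refresh_hz):
    """App-side render rate when SpaceWarp synthesizes every second frame."""
    return refresh_hz // 2

for hz in REFRESH_MODES_HZ:
    print(f"{hz} Hz display -> app renders at {asw_render_rate(hz)} FPS")
```

So the 36/45/60 FPS options correspond directly to the 72/90/120 Hz display modes.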

Very interesting for Quest, but also for future standalone headsets as they become more powerful.
 

Fredrik

Gold Member
I’m so confused, who’s Meta? Is it former Oculus or Facebook or is it a new company?
And is this update made in software or for a new hardware iteration?
 

Elysion

Member
Could something like this be used for non-VR games too? Since it’s a form of framerate interpolation I don’t see why not. If something like this could be used in conjunction with DLSS, the possible performance gains should be quite remarkable.
 

Fbh

Member
Won't this look similarly awful to those "Motion Pro" modes on TVs?

If it actually feels like "native" performance that would be massive, and not just for VR.

I’m so confused, who’s Meta? Is it former Oculus or Facebook or is it a new company?
And is this update made in software or for a new hardware iteration?
Facebook, the company at large that owns Facebook.com, Instagram, WhatsApp, Oculus, etc., is now called Meta.

Facebook the app/site still exists under the Facebook name.
 

SF Kosmo

...please disperse...
This feature has actually been available (and on by default) on the PC side of the Oculus platform for some time. Steam has its own frame interpolation, but Oculus' is definitely better. I think it's more helpful on the PC side, because PC games are not generally optimized for your specific hardware, and framerates bounce up and down more compared to Quest, where games are made with that platform's limits in mind and it's rare to see frame drops.

Holy hell that's an upgrade
It's also super misleading. ASW isn't generally something you want to have on all the time, because it leaves visible artifacts and there are limits to what it can do effectively.

What it's REALLY GOOD for is covering up when the framerate dips for a few seconds. And it lets devs push their games closer to the needle because they know that if there are some spots where the framerate dips for a few seconds it's not going to ruin the immersion. It keeps latency from tanking in those situations too (at least with regard to head movement), because it pulls from tracking data rather than simple frame tweening.
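The rotational part of that reprojection can be pictured with a toy sketch (hypothetical function and numbers, nothing from the actual SDK): reuse the last rendered frame but shift it by however far the head has turned since it was rendered.

```python
def reproject_yaw(pixel_x, fov_deg, width_px, yaw_at_render_deg, yaw_now_deg):
    """Shift a pixel column to compensate for head yaw since the frame was rendered."""
    deg_per_px = fov_deg / width_px            # angular size of one pixel column
    yaw_delta = yaw_now_deg - yaw_at_render_deg
    return pixel_x - yaw_delta / deg_per_px    # turning right shifts the image left
```

Because the shift comes straight from fresh tracking data, head motion stays responsive even while the app itself is a frame behind.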

Could something like this be used for non-VR games too? Since it’s a form of framerate interpolation I don’t see why not. If something like this could be used in conjunction with DLSS, the possible performance gains should be quite remarkable.
Yes, but you wouldn't want to. It's far from perfect. You'll see some warping around foreground objects that are moving.
 

intbal

Member
Andreev is probably kicking himself for not getting a patent on this technology for gaming purposes a decade ago.
 
I'm a bit confused on how this is different from asynchronous spacewarp that's already present in Quest 2 and works some black magic there.
For example, this shit doesn't look jerky on the HMD at all. The only problem is, it does eat some of the inputs since game logic still runs on a pretty low framerate.
 

Alexios

Cores, shaders and BIOS oh my!
Insert the DmC "feel of 60fps" meme here. If it's so good, why even give devs the choice? Just have it cap out at 30fps so they utilize all the remaining horsepower for other stuff by default (or, in other cases, save on battery life for low-fi/simple-physics/AI games by not utilizing the CPU/GPU much).

This type of tech may mitigate the visual result of fps drops with regard to head tracking vs the static environments around you, but actual moving objects (which don't just travel in straight lines, so you can't simply infer their next position from their velocity in the previous frames), including your own two moving hands and held/used objects, simply have no magical way to properly convey and display their positioning and animation in between frames, with jarring results.

Maybe this is so they can bring games to older models when newer, more powerful versions are around and the games are made primarily for those, so owners of the old stuff don't feel left behind so soon. Even if the experience is far from great, they can technically claim to support them. VR should generally stick with the original 90fps minimum standards; concessions in that core regard suck.
 

SScorpio

Member
Won't this look similarly awful to those "Motion Pro" modes on TVs?

If it actually feels like "native" performance that would be massive, and not just for VR.

On a TV it's just looking at the flat image and applying an algorithm to interpolate the motion.

With this, the algorithm will have the depth of everything, as well as motion vectors. It won't be as good as a native render. But it should be better than what's on a TV.
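The difference can be pictured with a toy sketch (hypothetical code, not any real API): with engine-supplied motion vectors you push each pixel along its known motion, instead of guessing that motion from the flat image the way a TV does. Frames here are just dicts mapping pixel coordinates to colors.

```python
def extrapolate_frame(frame, motion_vectors):
    """Predict the next frame by moving each pixel along its motion vector."""
    next_frame = {}
    for (x, y), color in frame.items():
        dx, dy = motion_vectors[(x, y)]    # per-pixel motion from the engine
        next_frame[(x + dx, y + dy)] = color
    return next_frame
```

A TV has to estimate those (dx, dy) values by comparing images; a game engine already knows them exactly, which is why the artifacts should be milder.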
 

Fafalada

Fafracer forever
I'm a bit confused on how this is different from asynchronous spacewarp that's already present in Quest 2
That operates only on frame-buffer data (no depth, and the only motion information comes from the headset itself), and requires no integration with the application (it works on everything as a post-process).
This will require per-application integration, much like most temporal algorithms in 'flat' land, but it offers far more potential for quality (ASW has a ton of things it does poorly or just completely fails at).
For a reference point, think of it as going from DLSS 1.x to DLSS 2.x in terms of the improvements it can provide.

Could something like this be used for non-VR games too?
Yes, this is research space that's decades old:

It's a fair bit 'easier' to make this work on VR though because camera/head (and hand) motion input sampling is both guaranteed at certain precision/frequency and has highly accurate prediction (it's reliable up to 50ms into the future), something that can't be said/done for '2d' input methods like controlling camera with mouse, joystick or keyboard.
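A minimal illustration of why tracked head motion predicts so well (a made-up dead-reckoning helper, not the actual sensor-fusion code): with orientation, angular velocity, and angular acceleration sampled at high frequency, extrapolating ~50 ms ahead is simple kinematics.

```python
def predict_yaw(yaw_deg, yaw_vel_deg_s, yaw_accel_deg_s2, horizon_s):
    """Constant-acceleration extrapolation of head yaw a short time ahead."""
    return yaw_deg + yaw_vel_deg_s * horizon_s + 0.5 * yaw_accel_deg_s2 * horizon_s ** 2

# Predicting 50 ms ahead while turning steadily at 100 deg/s:
print(predict_yaw(10.0, 100.0, 0.0, 0.05))  # 15.0
```

Mouse or joystick input gives you no equivalent velocity signal to extrapolate from, which is one reason this is harder in 'flat' games.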

Games do exist that have successfully done motion interpolation without anyone being the wiser, but it's probably better that way: if people were told, their perception of those games' framerate and responsiveness would immediately change.
 

SF Kosmo

...please disperse...
I'm a bit confused on how this is different from asynchronous spacewarp that's already present in Quest 2 and works some black magic there.
For example, this shit doesn't look jerky on the HMD at all. The only problem is, it does eat some of the inputs since game logic still runs on a pretty low framerate.
It is a version of the same thing, but it works on headset-native games, which the current implementation does not. And where the Rift ASW is a driver-level feature that doesn't have to be supported by the game, this one relies on motion-vector data (the way TAA and DLSS do), which supposedly reduces the appearance of artifacts and allows tracking to feel more accurate/lower latency, but also means a game has to code in support for this feature.
 

mrcroket

Member
Why do people post without reading the link in the OP? It literally explains why this is not the same as the current ASW on PCVR or Quest.
 

Romulus

Member
This feature has actually been available (and on by default) on the PC side of the Oculus platform for some time. Steam has its own frame interpolation, but Oculus' is definitely better. I think it's more helpful on the PC side, because PC games are not generally optimized for your specific hardware, and framerates bounce up and down more compared to Quest, where games are made with that platform's limits in mind and it's rare to see frame drops.


It's also super misleading. ASW isn't generally something you want to have on all the time, because it leaves visible artifacts and there are limits to what it can do effectively.

What it's REALLY GOOD for is covering up when the framerate dips for a few seconds. And it lets devs push their games closer to the needle because they know that if there are some spots where the framerate dips for a few seconds it's not going to ruin the immersion. It keeps latency from tanking in those situations too (at least with regard to head movement), because it pulls from tracking data rather than simple frame tweening.


Yes, but you wouldn't want to. It's far from perfect. You'll see some warping around foreground objects that are moving.


Is this really the exact same thing as on PCVR? Sounds like it's more geared toward closed-box optimization.
 

CamHostage

Member
On a TV it's just looking at the flat image and applying an algorithm to interpolate the motion.

With this, the algorithm will have the depth of everything, as well as motion vectors. It won't be as good as a native render. But it should be better than what's on a TV.

Also, on a TV you're taking a real image stream and making fake frames of it, and it just feels weird. (It's also interpolating the motion to display at 60FPS or more, and for some reason our eyes do not enjoy 60+ FPS of photographed reality; in videogames, we love all the tweening we can get.)
 

BadBurger

Gold Member
Jesus. And I think the Quest 2 performance in stand alone mode is fine already (obviously games that make use of your PC and the fiber cable are better). Edit: why did I type fiber?
 

ZehDon

Member
To clarify, because the OP didn't: ASW here doesn't mean Asynchronous SpaceWarp; it's Application SpaceWarp.

From the article:
There are usually two major side effects when using technologies like ASW: latency almost doubles as input is only sampled half as often, and visual artifacts can often be seen as the extrapolation is not perfect. But Meta claims it has solutions to both.

The company notes that while the PC-based Oculus Rift’s SpaceWarp estimated the motion vectors itself, Quest’s SpaceWarp will require the game engine to provide the true motion vectors instead. This “significantly” higher quality input, Meta claims, results in “little to no” visible artifacts in most cases. For details on the edge cases where artifacts will be visible, watch the full Connect ASW presentation.

To combat the latency increase, Meta is also releasing Positional TimeWarp. While SpaceWarp will be a developer choice, TimeWarp is always enabled at the system level. The current TimeWarp – on Quest since launch – reduces rotational latency by skewing each finished frame by the angle your head rotated since rendering began. Positional TimeWarp will use the same depth buffer provided for ASW to also reproject each frame in the direction your head moved, reducing translational latency too. Meta claims Positional TimeWarp is so significant that even apps using Application SpaceWarp will have lower head latency than any app build currently on a Quest today.

So, ASW will work with PTW, and in combination they will actually allow Quest headsets to generate every second frame synthetically from accumulated data, not just to smooth over frame hiccups. This appears to be because it can now extrapolate full translational and rotational vectors, allowing the headset to supply entirely new frames with full movement applied, whereas previously this was not available. I suspect its usefulness would be limited by the amount of difference between each frame. In a racing game, the translation difference between two frames could be absolutely enormous, in which case I wouldn't expect this technique to give good results.
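The parallax idea behind Positional TimeWarp can be sketched roughly (illustrative pinhole-camera math, not Meta's implementation): with a depth buffer available, each pixel can be shifted by an amount inversely proportional to its depth when the head translates, so nearer objects move more than distant ones.

```python
def positional_shift_px(head_translation_m, depth_m, focal_length_px):
    """Parallax shift for a pixel at a given depth after a sideways head move."""
    return focal_length_px * head_translation_m / depth_m

# A 1 cm head movement shifts a 1 m-away pixel twice as far as a 2 m-away pixel:
near = positional_shift_px(0.01, 1.0, 1000.0)
far = positional_shift_px(0.01, 2.0, 1000.0)
print(near, far)
```

Without depth, the old rotational TimeWarp could only skew the whole frame by the same angle; this per-depth shift is what lets translational latency drop too.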

It's worth noting that this isn't universally supported, and requires specific APIs to be used by the applications:
Meta says Application SpaceWarp and Positional TimeWarp will be included in the next Oculus SDK release, which should launch November 8. Both will require apps to use the OpenXR and Vulkan APIs – legacy Oculus API and OpenGL are not supported. Unity apps will also need to use a Scriptable Render Pipeline like URP as the legacy built-in pipeline isn’t supported either.

Overall, this is an interesting approach, however, if you've ever played a game at 60FPS movement, but with animations locked to 30FPS, it can be rather distracting, and I suspect it wouldn't be too dissimilar to the end result here.
 

GymWolf

Gold Member
Yeah I don't know about that... I already felt the difference between 90 frames on PC and 72 on Quest 1; even less framerate would be puke-inducing imo...
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
This is brilliant news. Can’t wait to see the first games that take advantage of it.
 

Majukun

Member
Yeah I don't know about that... I already felt the difference between 90 frames on PC and 72 on Quest 1; even less framerate would be puke-inducing imo...
I think you missed the point of the entire thing.. the idea is the game generating only 36 fps and then the new system duplicating/simulating the rest.. so the games would still run at 72 fps for the user's eyes
 
To clarify, because the OP didn't, ASW here doesn't mean Asynchronous SpaceWarp, here it's Application SpaceWarp.

From the article:


So, ASW will work with PTW, and in combination they will actually allow Quest headsets to generate every second frame synthetically from accumulated data, not just to smooth over frame hiccups. This appears to be because it can now extrapolate full translational and rotational vectors, allowing the headset to supply entirely new frames with full movement applied, whereas previously this was not available. I suspect its usefulness would be limited by the amount of difference between each frame. In a racing game, the translation difference between two frames could be absolutely enormous, in which case I wouldn't expect this technique to give good results.

It's worth noting that this isn't universally supported, and requires specific APIs to be used by the applications:


Overall, this is an interesting approach, however, if you've ever played a game at 60FPS movement, but with animations locked to 30FPS, it can be rather distracting, and I suspect it wouldn't be too dissimilar to the end result here.
I think you’re generally right about some of the limitations, but even the artifact-laden reprojection techniques that have been around for a few years on PC are good enough for most people with most types of experiences. Because of the depth-buffer/translation additions to ASW, this is likely going to be even better. Coupled with the job developers have of getting ~4K VR out of a Snapdragon chip, I’m guessing most games will be designed around taking advantage of this going forward.
 

stranno

Member
still facebook so no thanks.
What is going on with the "Facebook thing"?

I got the Q2 one year ago, logged for the first time and never got anything from FB again. Q2 works wonderfully, Windows client is pretty lightweight (and not as ugly as SteamVR) and everything works as intended.
 

SF Kosmo

...please disperse...
Is this really the exact same thing as pcvr? Sounds like it's more geared toward closed box optimization.
No, it's not exactly the same, it has new optimizations. But I doubt the use cases will be wildly different.
 
I bought my daughter a quest 2 for her birthday. She really wanted it. Anything I need to know? I believe it needs a Facebook account right? Can she only use their store or can she use other ones?
 

SF Kosmo

...please disperse...
I bought my daughter a quest 2 for her birthday. She really wanted it. Anything I need to know? I believe it needs a Facebook account right? Can she only use their store or can she use other ones?
There are alternative stores like sidequest and app lab, but they're slightly more complicated to access. On the PC side, you can buy content wherever, as long as it's SteamVR, OpenXR, or Oculus compatible. It's really not as much of a walled garden as people like to pretend.
 

chixdiggit

Gold Member
still facebook so no thanks.


 
I'm considering getting one of these. Just not sure there are enough games for it.

I don't have a PC that would help. I really want VR to include wireless, and controllers and for a low price. Only other thing I'd be considering is PSVR2, but if that's some $500 thing that's wired with no controllers, I might as well just skip it.

Games I've heard that are worth it:
  1. RE4 VR
  2. The Walking Dead: Saints & Sinners
  3. Moss
  4. ???
Not really a crushing line-up just yet, considering I'm really only interested in RE4 currently.
 

SF Kosmo

...please disperse...
I'm considering getting one of these. Just not sure there are enough games for it.

I don't have a PC that would help. I really want VR to include wireless, and controllers and for a low price. Only other thing I'd be considering is PSVR2, but if that's some $500 thing that's wired with no controllers, I might as well just skip it.

Games I've heard that are worth it:
  1. RE4 VR
  2. The Walking Dead: Saints & Sinners
  3. Moss
  4. ???
Not really a crushing line-up just yet, considering I'm really only interested in RE4 currently.
Add Beat Saber, Superhot, EchoVR, Myst, Arizona Sunshine, Pixel Ripped, Thumper, Vader Immortal, Rez...

Having a PC definitely adds a lot (including better graphics in all of the above games), but there's still quite a lot to play on Quest.
 
Add Beat Saber, Superhot, EchoVR, Myst, Arizona Sunshine, Pixel Ripped, Thumper, Vader Immortal, Rez...

Having a PC definitely adds a lot (including better graphics in all of the above games), but there's still quite a lot to play on Quest.
PC is a non-starter for me. I'm only going to get into VR if it's cheap. Been waiting for Nintendo to put out a bundle similar to Quest honestly. Being wireless and stand-alone is what makes it an attractive option for me.

Thanks for the game suggestions
 