I have experienced a level of AI interactivity in VR that can only exist in VR, even though it was not crazy complex. It was just NPCs reacting to certain gestures, or handing me objects and vice versa, but these direct interactive experiences only work in VR. It's also possible right now to use eye-tracking and face-tracking add-ons (while we wait for Project Cambria later this year, which has built-in support) to control game mechanics. In Neos VR, I have seen people raise their eyebrows to lift themselves into the air, make an angry face to gain a dark aura, or open their mouth to breathe fire. Point is, these actions can be represented today, which means we can already get AI to react to them.
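To make the idea concrete, here is a minimal sketch of how expression-driven mechanics like the Neos VR examples above could be wired up. The blendshape names, thresholds, and action strings are all illustrative assumptions, not any real tracking SDK's API:

```python
# Minimal sketch: map hypothetical face-tracking values to game actions.
# Blendshape names and thresholds are illustrative, not from a real SDK.

def react_to_face(blendshapes: dict) -> list:
    """Return the game actions triggered by expression values in [0, 1]."""
    actions = []
    if blendshapes.get("brow_raise", 0.0) > 0.7:
        actions.append("levitate")       # raised eyebrows lift the avatar
    if blendshapes.get("brow_furrow", 0.0) > 0.7:
        actions.append("dark_aura")      # angry face spawns an aura effect
    if blendshapes.get("jaw_open", 0.0) > 0.8:
        actions.append("breathe_fire")   # open mouth breathes fire
    return actions

print(react_to_face({"jaw_open": 0.9, "brow_raise": 0.2}))  # ['breathe_fire']
```

An NPC AI could consume the same action list each frame, which is all "reacting to expressions" needs at the mechanical level.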
Having realistic VR physics interactions does not mean the games are getting inherently less fun. If anything, people love the freedom they can get. People mess around like crazy in Boneworks - it has arguably the most versatile physics in VR - the closest to simulating the real world in terms of player agency, and yet people goof around and have a blast.
You may have some circumstances where depth perception actually limits a game's design - Superliminal, for example, works precisely because you don't have true depth cues - but in many if not most cases, I struggle to see why depth perception would make people lazy. Take Hellblade VR as an example. I actually got more into it because the combat was more visceral when it's happening right in front of you. I was contorting my face and really getting into certain actions. VR depth perception could, however, cause design issues if we start relying on depth and a stereoblind person plays the game, so that is a valid point. Accounting for physical body types is more about games that use room-scale movement. One of the main competitors in a VR e-sport (Echo VR) is actually a wheelchair user. If we have games that allow you to play seated, or simply gamepad games in VR, then it's pretty workable.
For motion sickness, we know that there are two distinct causes. There is sickness from latency and optics limitations, where someone can wear a headset, do nothing, and still get sick; this just requires iterative hardware improvements. Then there is sickness from the sensory disconnect of moving virtually while your inner ear senses no motion. Better latency and optics help here too, but a true solution would require tricking the inner ear. There is some promising research on this (Sony is also working on the same technique):
https://www.roadtovr.com/researchers-head-mounted-haptics-combat-vr-discomfort-walkingvibe/
Hard to say if that will truly pan out, but there is promise.