I think it would be very cool if Dagon supported VR out of the box: basically some wrapper calls to check if an HMD is present, to fetch input from the controllers and HMD, and to render and output the game stereoscopically to the VR compositor.
I'm currently porting OpenVR to D, and it's actually fairly easy to get things running in VR. For testing, Steam also offers debugging tools that let you run VR basically like an FPS game. After a few VR initialization calls you get a context pointer to use. Then you render the scene twice, once for the left eye and once for the right eye. Afterwards you submit the render results as (framebuffer) textures (DirectX, OpenGL, Vulkan, IOSurface, DirectX 12, DXGI shared handle or Metal) to the VR compositor, which handles lens distortion (optional) and other presentation details.
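To make that concrete, here is a rough per-frame sketch in D. It assumes a hypothetical binding (imported as `openvr` below) that mirrors the names from the official `openvr.h`, and an OpenGL renderer; the real module and symbol names will depend on how the port turns out.

```d
import openvr; // hypothetical D binding mirroring openvr.h

// One VR frame: render each eye into its own framebuffer, then hand the
// resulting color textures to the compositor for lens distortion and display.
// leftEyeTexGL/rightEyeTexGL are the OpenGL color texture names of the
// per-eye framebuffers; renderEye is whatever renders the scene for one eye.
void submitVRFrame(uint leftEyeTexGL, uint rightEyeTexGL,
                   void delegate(EVREye eye) renderEye)
{
    renderEye(EVREye.Eye_Left);
    renderEye(EVREye.Eye_Right);

    Texture_t left = {
        handle: cast(void*)cast(size_t)leftEyeTexGL,
        eType: ETextureType.TextureType_OpenGL,
        eColorSpace: EColorSpace.ColorSpace_Gamma
    };
    Texture_t right = {
        handle: cast(void*)cast(size_t)rightEyeTexGL,
        eType: ETextureType.TextureType_OpenGL,
        eColorSpace: EColorSpace.ColorSpace_Gamma
    };

    // The compositor applies distortion and presents the images on the HMD.
    VRCompositor().Submit(EVREye.Eye_Left, &left);
    VRCompositor().Submit(EVREye.Eye_Right, &right);
}
```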
Before each frame you can either wait for the VR runtime to report the headset pose, or use an API call to immediately obtain the last known pose or a pose predicted for when the frame will actually be displayed.
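Both pose paths could look roughly like this (same hypothetical `openvr` binding as above; `k_unMaxTrackedDeviceCount` and the other names mirror the constants in `openvr.h`):

```d
import openvr; // hypothetical D binding mirroring openvr.h

// Option A: block until the compositor is ready for the next frame and get
// the render poses in the same call (the usual path for the render pose).
HmdMatrix34_t waitForHmdPose()
{
    TrackedDevicePose_t[k_unMaxTrackedDeviceCount] poses;
    VRCompositor().WaitGetPoses(poses.ptr, cast(uint)poses.length, null, 0);
    return poses[k_unTrackedDeviceIndex_Hmd].mDeviceToAbsoluteTracking;
}

// Option B: ask immediately, optionally predicting the pose for when the
// frame will actually reach the display (secondsToPhotons > 0).
HmdMatrix34_t predictHmdPose(IVRSystem vrSystem, float secondsToPhotons)
{
    TrackedDevicePose_t[k_unMaxTrackedDeviceCount] poses;
    vrSystem.GetDeviceToAbsoluteTrackingPose(
        ETrackingUniverseOrigin.TrackingUniverseStanding,
        secondsToPhotons, poses.ptr, cast(uint)poses.length);
    return poses[k_unTrackedDeviceIndex_Hmd].mDeviceToAbsoluteTracking;
}
```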
The reported pose is a transformation matrix encoding the rotation and translation of the user's head. You then offset this head matrix by the per-eye matrices before each render to get each eye's transform. The head matrix's translation is usually around [0, height, 0], where height is the height of the HMD above the floor. The user can freely move around in space, usually only a few meters from the origin; this should only be treated as movement for the physics once the user moves with a joystick on the controller or passes some movement threshold.
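The per-eye math itself is just a compose-and-invert. A sketch using dlib's `Matrix4x4f` (Dagon already depends on dlib), assuming the OpenVR 3x4 pose matrices have already been converted to 4x4:

```d
import dlib.math.matrix;

// worldFromHead is the reported HMD pose converted to a 4x4 matrix;
// headFromEye is the per-eye offset (what OpenVR returns from
// GetEyeToHeadTransform for that eye).
Matrix4x4f eyeViewMatrix(Matrix4x4f worldFromHead, Matrix4x4f headFromEye)
{
    // Compose the head pose with the eye offset to get the eye's transform
    // in world space, then invert it to get the view matrix for that eye.
    Matrix4x4f worldFromEye = worldFromHead * headFromEye;
    return worldFromEye.inverse;
}
```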
The VR compositor or the user may change the per-eye resolution at any time, so this should be accounted for as well by recreating the eye framebuffers.
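Something like this would cover it (again the hypothetical `openvr` binding; the actual framebuffer recreation is left to whatever render target API Dagon exposes):

```d
import openvr; // hypothetical D binding mirroring openvr.h

uint eyeWidth = 0;
uint eyeHeight = 0;

// Poll the recommended per-eye render target size (e.g. once per frame) and
// recreate the eye framebuffers when it changes. The recreation itself is
// passed in as a delegate, since it depends on Dagon's render target API.
void updateEyeFramebuffers(IVRSystem vrSystem,
                           void delegate(uint w, uint h) recreate)
{
    uint w, h;
    vrSystem.GetRecommendedRenderTargetSize(&w, &h);
    if (w != eyeWidth || h != eyeHeight)
    {
        eyeWidth = w;
        eyeHeight = h;
        recreate(w, h);
    }
}
```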
As a user of the API, I would expect it to be opt-in at startup, with some way to query whether an HMD is present and whether controllers are present. There would then be some kind of VR camera instead of a normal camera, and it should be configurable what is visible in the normal game window: for example, the window might show just the left or right eye view, a third view from an entirely different viewpoint, nothing at all, or any other arbitrary content. Additionally, when the VR view is active, it should be trivial to add 2D UI elements to the desktop window for recording purposes.
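Purely as a strawman, the user-facing side could be shaped like this; none of these names exist in Dagon today:

```d
// Hypothetical sketch of an opt-in VR API for Dagon.
enum VRMirrorMode
{
    leftEye,    // desktop window shows the left eye view
    rightEye,   // desktop window shows the right eye view
    spectator,  // desktop window shows a separate third-person camera
    hidden      // desktop window shows nothing (or arbitrary custom content)
}

interface VRSupport
{
    bool hmdPresent();                  // query before enabling VR rendering
    bool controllersPresent();
    void mirrorMode(VRMirrorMode mode); // what the desktop window displays
    // 2D UI added through the usual pipeline would target the desktop
    // window only, e.g. for recording overlays.
}
```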
Adding this kind of VR API might also be a good point to think about AR functionality, i.e. additionally supporting the display of game objects inside the real world through cameras and some kind of calculated origin points coming from an API. And instead of only OpenVR, this should ideally also work with OpenXR and maybe other runtimes once there are D APIs for them.
This is very interesting, VR support in Dagon would indeed be cool. I don't have a full-featured headset though, only a cardboard viewer, so I can't test the API right now.
I would really like to support OpenXR too, but my headset is only SteamVR compatible, and SteamVR currently only supports OpenXR through Vulkan, not through OpenGL. I could try to get it working with OpenVR (SteamVR) through OpenGL, as that works fine, but I think Vulkan would be better because high performance is essential for VR games.