Two Ways to Get Device Poses in OpenVR
阿新 • Published: 2019-01-26
OpenVR offers two ways to obtain device poses: WaitGetPoses and GetDeviceToAbsoluteTrackingPose. WaitGetPoses blocks until the runtime has a fresh device pose available, then returns it. GetDeviceToAbsoluteTrackingPose instead takes a prediction time and extrapolates where the device will be at that moment.
Both functions return device poses, but they play different roles. WaitGetPoses does more than fetch poses: it also handles frame synchronization with the compositor. If it is never called, the Vive will not display anything, even after the left- and right-eye textures have been submitted via Submit. GetDeviceToAbsoluteTrackingPose, by contrast, takes a time parameter and predicts the device's pose at that future instant, giving the rendering system more accurate device information for the frame it is about to draw.
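Either way, the result arrives as an array of TrackedDevicePose_t, whose mDeviceToAbsoluteTracking member is a 3x4 device-to-tracking-universe matrix; the device's position is simply that matrix's translation column. A minimal sketch (the structs below mirror the openvr.h layout for illustration rather than including the real header):

```cpp
#include <cassert>

// Mirrors of the openvr.h structs used here (assumption: the real
// definitions come from openvr.h; only the needed fields are reproduced).
struct HmdMatrix34_t { float m[3][4]; };
struct HmdVector3_t  { float v[3]; };
struct TrackedDevicePose_t {
    HmdMatrix34_t mDeviceToAbsoluteTracking;
    bool bPoseIsValid;
};

// The device's position in the tracking universe is the translation
// column (last column) of the 3x4 device-to-absolute-tracking matrix.
HmdVector3_t GetDevicePosition(const TrackedDevicePose_t& pose) {
    HmdVector3_t p;
    p.v[0] = pose.mDeviceToAbsoluteTracking.m[0][3];
    p.v[1] = pose.mDeviceToAbsoluteTracking.m[1][3];
    p.v[2] = pose.mDeviceToAbsoluteTracking.m[2][3];
    return p;
}
```

Remember to check bPoseIsValid before using a pose; devices that are off or occluded report invalid poses.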
WaitGetPoses: update device pose information
/** Scene applications should call this function to get poses to render with
 * (and optionally poses predicted an additional frame out to use for gameplay).
 * This function will block until "running start" milliseconds before the start
 * of the frame, and should be called at the last moment before needing to
 * start rendering.
 *
 * Return codes:
 * - IsNotSceneApplication (make sure to call VR_Init with VRApplication_Scene)
 * - DoNotHaveFocus (some other app has taken focus - this will throttle the
 *   call to 10hz to reduce the impact on that app)
 */
virtual EVRCompositorError WaitGetPoses(
    VR_ARRAY_COUNT(unRenderPoseArrayCount) TrackedDevicePose_t* pRenderPoseArray, uint32_t unRenderPoseArrayCount,
    VR_ARRAY_COUNT(unGamePoseArrayCount) TrackedDevicePose_t* pGamePoseArray, uint32_t unGamePoseArrayCount ) = 0;
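The per-frame ordering this implies (WaitGetPoses first, render with the fetched poses, then Submit each eye) can be sketched as follows. The StubCompositor and the simplified types below are stand-ins for vr::IVRCompositor and the openvr.h definitions, so the sketch compiles without the OpenVR runtime; the stub models just one rule from the text, namely that textures submitted without a prior WaitGetPoses call in that frame are not displayed:

```cpp
#include <cassert>
#include <cstdint>

// Simplified stand-ins for the openvr.h types used below; the real
// definitions (and the real compositor) come from the OpenVR runtime.
struct TrackedDevicePose_t { bool bPoseIsValid = false; };
enum EVRCompositorError { VRCompositorError_None = 0 };
static const uint32_t k_unMaxTrackedDeviceCount = 64;

// Toy compositor: Submit only displays if WaitGetPoses ran this frame.
struct StubCompositor {
    bool posesFetchedThisFrame = false;

    EVRCompositorError WaitGetPoses(TrackedDevicePose_t* renderPoses, uint32_t count,
                                    TrackedDevicePose_t* /*gamePoses*/, uint32_t /*gameCount*/) {
        for (uint32_t i = 0; i < count; ++i)
            renderPoses[i].bPoseIsValid = (i == 0); // pretend only the HMD (index 0) is tracked
        posesFetchedThisFrame = true;               // frame sync done; Submit may display now
        return VRCompositorError_None;
    }

    // Returns whether the submitted eye texture would actually be shown.
    bool Submit() { return posesFetchedThisFrame; }

    void EndFrame() { posesFetchedThisFrame = false; } // next frame must wait again
};

// One frame in the required order: wait for poses, render, submit both eyes.
bool RenderFrame(StubCompositor& compositor) {
    TrackedDevicePose_t renderPoses[k_unMaxTrackedDeviceCount];
    compositor.WaitGetPoses(renderPoses, k_unMaxTrackedDeviceCount, nullptr, 0);
    // ... render left/right eye using renderPoses[0] (the HMD pose) ...
    bool leftShown  = compositor.Submit();
    bool rightShown = compositor.Submit();
    compositor.EndFrame();
    return leftShown && rightShown && renderPoses[0].bPoseIsValid;
}
```

In a real application the call would be vr::VRCompositor()->WaitGetPoses(...) followed by vr::VRCompositor()->Submit(...) for each eye.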
GetDeviceToAbsoluteTrackingPose: predict device poses
/** The pose that the tracker thinks that the HMD will be in at the specified number of seconds into the
 * future. Pass 0 to get the state at the instant the method is called. Most of the time the application should
 * calculate the time until the photons will be emitted from the display and pass that time into the method.
 *
 * This is roughly analogous to the inverse of the view matrix in most applications, though
 * many games will need to do some additional rotation or translation on top of the rotation
 * and translation provided by the head pose.
 *
 * For devices where bPoseIsValid is true the application can use the pose to position the device
 * in question. The provided array can be any size up to k_unMaxTrackedDeviceCount.
 *
 * Seated experiences should call this method with TrackingUniverseSeated and receive poses relative
 * to the seated zero pose. Standing experiences should call this method with TrackingUniverseStanding
 * and receive poses relative to the Chaperone Play Area. TrackingUniverseRawAndUncalibrated should
 * probably not be used unless the application is the Chaperone calibration tool itself, but will provide
 * poses relative to the hardware-specific coordinate system in the driver.
 */
virtual void GetDeviceToAbsoluteTrackingPose(
    ETrackingUniverseOrigin eOrigin, float fPredictedSecondsToPhotonsFromNow,
    VR_ARRAY_COUNT(unTrackedDevicePoseArrayCount) TrackedDevicePose_t *pTrackedDevicePoseArray,
    uint32_t unTrackedDevicePoseArrayCount ) = 0;
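The comment above says to pass the time until photons are emitted from the display. The usual way to compute that, per the OpenVR documentation, is the time remaining in the current frame plus the panel's vsync-to-photons latency. A sketch with the runtime queries replaced by plain parameters; in a real application the inputs come from IVRSystem::GetTimeSinceLastVsync and IVRSystem::GetFloatTrackedDeviceProperty with Prop_DisplayFrequency_Float and Prop_SecondsFromVsyncToPhotons_Float:

```cpp
#include <cassert>
#include <cmath>

// fPredictedSecondsToPhotonsFromNow = remaining frame time + vsync-to-photons
// latency. Inputs are parameters here so the sketch runs without the runtime:
//   displayFrequencyHz     - e.g. 90 Hz for the original Vive
//   secondsSinceLastVsync  - from IVRSystem::GetTimeSinceLastVsync
//   secondsVsyncToPhotons  - Prop_SecondsFromVsyncToPhotons_Float
float PredictedSecondsToPhotons(float displayFrequencyHz,
                                float secondsSinceLastVsync,
                                float secondsVsyncToPhotons) {
    float frameDuration = 1.0f / displayFrequencyHz;
    return frameDuration - secondsSinceLastVsync + secondsVsyncToPhotons;
}
```

The result is then passed as fPredictedSecondsToPhotonsFromNow; the later in the frame you call it, the smaller the prediction interval, so poses predicted this way track the display's actual scan-out.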