
Developers

your ideas to bend the path

Information for Developers

Overview:
striderVR is a hardware platform that allows for natural movement in virtual reality environments. The user steps onto a platform that can apply a rotation and a linear displacement to him at the same time. As the user walks on the striderVR platform, it responds with rotation and linear displacement so that the user never leaves the platform and his body orientation always stays the same. This gives the user a near-perfect illusion of walking in a virtual world, while in the physical world he neither leaves the striderVR platform nor turns his body away.

Controlling striderVR platform motion:
In the following, “physical” user movement refers to the actual movement of the user in the physical world, while “virtual” user movement refers to the user’s perceived movement in the virtual world. The platform rotation and linear displacement mentioned above are both “physical” (rather than “virtual”).

Motor control:
The physical rotation and linear displacement are produced by two sensorless brushed DC motors embedded in striderVR. Both motors are controlled by a Cytron SmartDriveDuo-60 motor driver circuit, which allows the speed of each motor to be set individually through serial commands sent over a USB connection. A .NET library (along with C# source code) has been developed that simplifies setting the motor speeds between -100% (full reverse) and +100% (full forward) of the maximum motor speed (determined by the maximum voltage supplied to the hardware). This library can easily be ported to other platforms that allow serial communication over USB ports.
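
To illustrate the interface, here is a minimal C# sketch of such a speed-setting wrapper. It assumes the SmartDriveDuo-60 is operated in a simplified serial mode; the exact command byte layout used here is an assumption and should be checked against the Cytron documentation, and the class and port names are illustrative only.

    using System;
    using System.IO.Ports;

    // Sketch of a motor speed wrapper. The one-byte command layout (motor select bit,
    // direction bit, 6-bit speed) is an assumption about the driver's simplified
    // serial mode -- consult the SmartDriveDuo-60 datasheet for the exact format.
    public class StriderMotorDriver : IDisposable
    {
        private readonly SerialPort port;

        public StriderMotorDriver(string portName, int baudRate = 9600)
        {
            port = new SerialPort(portName, baudRate);
            port.Open();
        }

        // motorIndex: 0 or 1; speedPercent: -100 (full reverse) .. +100 (full forward).
        public void SetSpeed(int motorIndex, double speedPercent)
        {
            speedPercent = Math.Max(-100.0, Math.Min(100.0, speedPercent));
            int magnitude = (int)Math.Round(Math.Abs(speedPercent) / 100.0 * 63);
            byte command = (byte)((motorIndex == 0 ? 0x00 : 0x80)    // motor select
                                | (speedPercent >= 0 ? 0x00 : 0x40)  // direction
                                | magnitude);                        // 6-bit speed
            port.Write(new[] { command }, 0, 1);
        }

        public void Dispose() => port.Close();
    }

With such a wrapper, stopping both motors amounts to calling SetSpeed(0, 0) and SetSpeed(1, 0) before closing the port.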

User position detection:
To supply adequate motor input, the user’s physical position on the platform, namely linear front/back displacement and rotation around the platform center, must be reliably determined in near real time. This is done using a Microsoft Kinect 2 (“Kinect”) sensor and averaging over the positions of various body parts. Due to the limitations of the Kinect tracking system, the user must not turn his body away from the tracking camera.
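
As a sketch of how such a measurement could look with the standard Kinect 2 SDK (Microsoft.Kinect), the following helper averages a few torso joints for the front/back position and derives the body rotation from the shoulder line. The choice of joints, the axis alignment and the calibration constant are assumptions, not necessarily the exact method used in our application.

    using System;
    using Microsoft.Kinect;

    public static class UserTracking
    {
        // Estimates the user's front/back displacement (meters) and body rotation
        // (degrees) relative to the platform center, assuming the Kinect looks along
        // the platform's forward axis. Bodies are obtained elsewhere via a BodyFrameReader.
        public static void Estimate(Body body, out float displacement, out float rotation)
        {
            // Average a few stable torso joints for the linear front/back position.
            JointType[] torso = { JointType.SpineBase, JointType.SpineMid, JointType.SpineShoulder };
            float z = 0f;
            foreach (var jointType in torso)
                z += body.Joints[jointType].Position.Z;
            z /= torso.Length;

            // Assumed calibration constant: distance from the camera to the platform center.
            const float platformCenterZ = 2.0f;
            displacement = z - platformCenterZ;

            // Body yaw from the shoulder line (angle of the left-to-right shoulder vector).
            CameraSpacePoint l = body.Joints[JointType.ShoulderLeft].Position;
            CameraSpacePoint r = body.Joints[JointType.ShoulderRight].Position;
            rotation = (float)(Math.Atan2(r.Z - l.Z, r.X - l.X) * 180.0 / Math.PI);
        }
    }

In a real setup the estimate would additionally be smoothed, and joints with a poor tracking state would be skipped.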

Platform response:
The user’s detected linear and rotational movements are then used to calculate a platform rotation and linear transport response that compensates for his body rotation and moves him back to the center of the platform. striderVR itself does not mandate any particular algorithm for calculating the platform response to a user’s displacement. Instead, we have found a combination of two PID (proportional-integral-derivative) controllers with dynamically adjusted target values to be well suited for this task. A C# application (along with source code) that implements these PID algorithms has been developed; it allows for parameter adjustments and also gives visual feedback of measured positions and generated response values. The PID algorithm can easily be ported to other platforms that allow interfacing with the Kinect hardware.
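
As a minimal sketch of one such controller (striderVR uses two of them, one for linear transport and one for rotation), the following plain C# class shows the standard PID update step; the clamping to the -100..+100 motor range and all gains are placeholders.

    using System;

    // Standard PID controller. One instance is fed the measured displacement and
    // drives the linear transport motor; a second instance is fed the measured
    // rotation and drives the rotation motor.
    public class PidController
    {
        public double Kp, Ki, Kd;
        public double Target;             // set point, adjusted dynamically at runtime
        private double integral, lastError;

        public PidController(double kp, double ki, double kd)
        {
            Kp = kp; Ki = ki; Kd = kd;
        }

        // measured: current displacement or rotation; dt: time step in seconds.
        // Returns a motor command clamped to -100..+100 percent.
        public double Update(double measured, double dt)
        {
            double error = Target - measured;
            integral += error * dt;
            double derivative = (error - lastError) / dt;
            lastError = error;

            double output = Kp * error + Ki * integral + Kd * derivative;
            return Math.Max(-100.0, Math.Min(100.0, output));
        }
    }

The output of each controller can then be passed directly to the motor driver library described above.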

Virtual movement:
For a realistic user experience in the virtual environment, the user’s physical movement as detected by the Kinect must be transformed into virtual movement. Virtual movement is not equal to physical movement, since part of the physical movement is caused by striderVR’s response, which transports the user back to the center of the platform. This transport response has to be subtracted from the user’s physical movement as detected by the Kinect to obtain the virtual movement that is presented visually to the user, for example through an Oculus Rift or HTC Vive headset. A Unity-based C# example application (along with source code) that calculates the correct virtual movement from the physical input has been developed and can easily be ported to other platforms.
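
The core of that calculation can be sketched as a small Unity component. The per-frame inputs (the user’s physical movement from the Kinect and the movement the platform applied in the same interval) are assumed to be supplied by other components, and the field names are illustrative.

    using UnityEngine;

    public class VirtualLocomotion : MonoBehaviour
    {
        // Per-frame physical movement of the user as measured by the Kinect, and the
        // movement the platform itself applied during the same interval (both assumed
        // to be filled in by other components each frame).
        public Vector3 physicalDeltaPosition;
        public float physicalDeltaYaw;
        public Vector3 platformDeltaPosition;
        public float platformDeltaYaw;

        void Update()
        {
            // The platform's compensation is contained in the measured physical movement,
            // so it is subtracted to obtain the movement presented in the headset.
            Vector3 virtualDelta = physicalDeltaPosition - platformDeltaPosition;
            float virtualYaw = physicalDeltaYaw - platformDeltaYaw;

            transform.Translate(virtualDelta, Space.World);
            transform.Rotate(0f, virtualYaw, 0f, Space.World);
        }
    }

Attaching such a component to the virtual camera rig moves the user through the virtual world only by the part of his movement that the platform did not compensate.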

Body avateering:
The Microsoft Kinect 2 detects the body posture of the user. This information can be used to adjust the body posture of the user’s avatar in a virtual environment accordingly. A Unity-based C# example application (along with source code) that implements body avateering has been developed and can easily be ported to other platforms.
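
A minimal avateering sketch could look as follows, assuming the Kinect 2 Unity plugin (Windows.Kinect namespace) and a mapping from Kinect joints to avatar bone Transforms built by setup code. The coordinate conversion between the Kinect and Unity coordinate systems is simplified here and would need adjustment for a real rig.

    using System.Collections.Generic;
    using UnityEngine;
    using Windows.Kinect;

    public class KinectAvatar : MonoBehaviour
    {
        // Mapping from Kinect joints to the avatar's bone Transforms, filled by setup code.
        public Dictionary<JointType, Transform> boneMap = new Dictionary<JointType, Transform>();

        // Apply the Kinect's per-joint orientations to the mapped avatar bones.
        public void ApplyPose(Body body)
        {
            foreach (var entry in boneMap)
            {
                var o = body.JointOrientations[entry.Key].Orientation;
                // Simplified right-handed (Kinect) to left-handed (Unity) conversion.
                entry.Value.rotation = new Quaternion(-o.X, -o.Y, o.Z, o.W);
            }
        }
    }

In practice the avatar’s bone hierarchy and rest pose also have to be taken into account, which is what the full example application does.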

Leap Motion:
For realistic hand posture in the virtual environment, a Leap Motion device can be added to the striderVR setup. This allows precise measurement of the user’s hand configuration, which can then be applied to the user’s avatar in the virtual world. Standard libraries are available from the Leap Motion homepage; we have developed an example Unity-based C# application as well.
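
For illustration, the following is a minimal sketch of reading hand data through the Leap Motion C# API (Leap.Controller). API details vary between SDK versions, the millimeter-to-meter scaling and the axis mapping are assumptions, and the Leap Unity assets also provide higher-level hand prefabs that can be used instead.

    using Leap;
    using UnityEngine;

    public class LeapHandReader : MonoBehaviour
    {
        private Controller controller;

        void Start()
        {
            controller = new Controller();
        }

        void Update()
        {
            Frame frame = controller.Frame();
            foreach (Hand hand in frame.Hands)
            {
                // Leap reports positions in millimeters in its own right-handed frame;
                // convert to a Unity vector in meters (assumed axis mapping).
                Vector palm = hand.PalmPosition;
                Vector3 unityPalm = new Vector3(palm.x, palm.y, -palm.z) * 0.001f;
                // ...drive the avatar's hand and finger bones from unityPalm and hand.Fingers.
            }
        }
    }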