
Q: What makes VSDK different from VRTK v3.3?

A: VSDK Unity began life as a fork of VRTK version 3.3 but has grown in different directions due to differing priorities. Our initial goals were driven by our own need to support a wide variety of devices for immersive training simulations and first-person experiences; later goals centered on making our own development on top of VRTK easier. Here are the changes we’ve made in more detail:

  • Support for additional device types and devices: The VSDK development team has access to a VR lab with many of the most advanced HMDs and peripherals. Some of these devices lack widespread developer support because their user base is small or niche, or because the device is still in development, but we still needed to evaluate and use them, so we added support for them ourselves.

    • Hand SDK: Our Hand SDK allows developers to make use of hand-tracking devices, whether they are gloves (e.g. ManusVR), optical trackers (e.g. Ultraleap), or even controllers (Knuckles hand SDK support coming soon!). Our system generalizes hand input to behave similarly to controller input and provides hooks for our gesture system (see the first sketch after this list).

    • Haptics SDK: Haptic feedback, used well, is a great way to immerse players in virtual environments. However, many different devices include haptics, from controllers to full-body haptic suits, and we needed a convenient way to build haptic effects once and reuse them across devices, so we made one! Our SDK maps effects to Body Coordinates, which individual device SDKs then map to actuators or groups of actuators (see the second sketch after this list). This includes support for both UX haptic feedback and location/physics-driven haptics.

    • SteamVR 2.0: Many of the latest and greatest HMDs require SteamVR 2.0; unfortunately, VRTK lacked SteamVR 2.0 support when we forked, so we added our own integration in order to support those HMDs.

  • The Reaction System: While developing our first few programs in VSDK, we liked working with Interactable Objects but found it difficult to rapidly prototype specialized objects, so we created the Reaction System as a way of tying object events to effects. If you’re familiar with Playmaker or Bolt, it provides similar utility without the visual scripting.

    • Interaction Areas: Interaction Areas came about because we found that while Interactable Objects are great, sometimes you need an action to occur if and only if an object is in a specific location. Interaction Areas have since grown to allow for more complex spatial interactions.

    • Reactors: Reactors are objects that tie events to reactions, and they form the glue of the Reaction System. Reactors are what allow us to prototype interactions without writing a single line of code.

    • Reactions: Reactions are small pieces of code that allow developers to trigger effects via a Reactor. A Reaction could change the color of an object, turn a particle system on or off, etc. (see the third sketch after this list).

  • Gesture System: The Gesture System evolved naturally once we had created the Hand SDK. Our Gesture System does not provide hand tracking on its own; rather, it leverages existing systems and lets developers specify gestures based on the state of each finger on each hand. We further evolved it with Advanced Gestures, which allow additional conditions (e.g. the hand must face a certain direction) on top of the hand state to trigger gesture events (see the final sketch after this list).

  • Removed VRTK’s 3D Control Utilities: We found that all of the functionality provided by these controls was superseded by Interaction Areas and the Reaction System, so we removed them to reduce confusion. The scripts remain compatible and can be imported if a project needs them.
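
To illustrate the Hand SDK idea of generalizing tracked-hand data so it can drive controller-style interactions, here is a minimal C# sketch. Every name in it (IHandSource, HandAsController, the curl threshold) is an illustrative assumption, not VSDK’s actual API:

```csharp
// Hypothetical sketch only: generalizing a hand-tracking source so it can be
// consumed like controller input. These type names are not part of VSDK.
using UnityEngine;

// A device-agnostic view of one tracked hand: per-finger curl in [0, 1].
public interface IHandSource
{
    float GetFingerCurl(int finger); // 0 = thumb .. 4 = pinky
    Vector3 PalmPosition { get; }
    Quaternion PalmRotation { get; }
}

// Adapter that exposes hand data as if it were a controller button, so
// interaction code written against "grab pressed" works unchanged whether
// the source is a glove, an optical tracker, or a controller.
public class HandAsController
{
    private readonly IHandSource hand;
    private const float GrabThreshold = 0.7f; // assumed tuning value

    public HandAsController(IHandSource hand) { this.hand = hand; }

    // Treat a fist (all non-thumb fingers curled) as the grab button.
    public bool GrabPressed()
    {
        for (int finger = 1; finger <= 4; finger++)
        {
            if (hand.GetFingerCurl(finger) < GrabThreshold)
                return false;
        }
        return true;
    }
}
```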
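
The Haptics SDK’s build-once, reuse-everywhere idea can be sketched as a routing layer: effects are authored against Body Coordinates, and each device SDK declares which coordinates it covers. The types below (BodyCoordinate, HapticEffect, IHapticDevice, HapticManager) are hypothetical stand-ins, not VSDK’s real classes:

```csharp
// Hypothetical sketch of "effects map to body coordinates, devices map
// coordinates to actuators". Names are illustrative only.
using System.Collections.Generic;

public enum BodyCoordinate { LeftPalm, RightPalm, Chest, UpperBack }

// One device-agnostic effect, authored once.
public struct HapticEffect
{
    public BodyCoordinate Location;
    public float Intensity;   // 0..1
    public float Duration;    // seconds
}

// Each device SDK implements this to translate body coordinates into
// whatever actuators the hardware actually has.
public interface IHapticDevice
{
    bool Covers(BodyCoordinate location);
    void Play(HapticEffect effect);
}

public class HapticManager
{
    private readonly List<IHapticDevice> devices = new List<IHapticDevice>();

    public void Register(IHapticDevice device) => devices.Add(device);

    // Route the same authored effect to every device that covers the
    // requested body location (controller, vest, glove, ...).
    public void Play(HapticEffect effect)
    {
        foreach (var device in devices)
        {
            if (device.Covers(effect.Location))
                device.Play(effect);
        }
    }
}
```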
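
A Reaction, as described above, is a small piece of code a Reactor can trigger. The following sketch shows the general shape with a color-changing example; the base class and React method signature are assumptions about the pattern, not VSDK’s actual API:

```csharp
// Hypothetical shape of a Reaction: a small component a Reactor invokes
// when its triggering event fires. Names and signatures are assumed.
using UnityEngine;

public abstract class ReactionSketch : MonoBehaviour
{
    // A Reactor would call this when its event fires.
    public abstract void React(GameObject source);
}

// Example reaction: tint the target object's material.
public class ChangeColorReaction : ReactionSketch
{
    public Renderer Target;
    public Color NewColor = Color.red;

    public override void React(GameObject source)
    {
        if (Target != null)
            Target.material.color = NewColor;
    }
}
```

Because each Reaction is a self-contained component like this, a Reactor configured in the inspector can chain them to an event without any per-object glue code, which is what makes the no-code prototyping described above possible.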
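
Finally, a minimal sketch of per-finger gesture matching, plus the Advanced Gesture variant that adds an orientation condition. Again, the types and thresholds are illustrative assumptions rather than VSDK’s real Gesture System API:

```csharp
// Hypothetical sketch of finger-state gesture matching with an "advanced"
// extra condition (palm direction). Illustrative names only.
using UnityEngine;

public enum FingerState { Extended, Curled, Any }

public class GestureSketch
{
    // One required state per finger: thumb, index, middle, ring, pinky.
    public FingerState[] Fingers = new FingerState[5];

    // Matches if every finger's curl agrees with its required state.
    public bool Matches(float[] fingerCurls, float curledThreshold = 0.7f)
    {
        for (int i = 0; i < 5; i++)
        {
            bool curled = fingerCurls[i] >= curledThreshold;
            if (Fingers[i] == FingerState.Extended && curled) return false;
            if (Fingers[i] == FingerState.Curled && !curled) return false;
        }
        return true;
    }
}

// "Advanced" variant: hand shape plus an orientation condition, e.g. a
// flat hand only counts as "stop" when the palm faces away from the user.
public class AdvancedGestureSketch : GestureSketch
{
    public Vector3 RequiredPalmDirection = Vector3.forward;
    public float MaxAngle = 45f;

    public bool Matches(float[] fingerCurls, Vector3 palmDirection)
    {
        return Matches(fingerCurls)
            && Vector3.Angle(palmDirection, RequiredPalmDirection) <= MaxAngle;
    }
}
```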
