Description

MyVirtualGrasp is a public script that inherits from VG_MainScript, which encodes the main functionality of VirtualGrasp. It is the main component that you need to add and configure in your project to enable VirtualGrasp.

In Unity, VG_MainScript inherits from MonoBehaviour, so you can use it as a component on a GameObject.
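As a point of reference, the shipped script essentially boils down to the following. Treat this as a hedged sketch: the namespace and any lifecycle overrides may differ between SDK versions, so check the MyVirtualGrasp.cs shipped with your SDK.

```csharp
using UnityEngine;
using VirtualGrasp; // assumed namespace; verify against MyVirtualGrasp.cs in your SDK

// Sketch: MyVirtualGrasp derives from VG_MainScript, which is a MonoBehaviour,
// so you attach it to a GameObject and configure it in the Inspector.
public class MyVirtualGrasp : VG_MainScript
{
    // If you extend this class, keep any base lifecycle calls intact
    // (e.g. base.Awake()) so VirtualGrasp initializes and updates correctly.
}
```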

On this page, we describe the major configuration options covered in MyVirtualGrasp.cs.

AutoSetup & Sensors

In VirtualGrasp we use the terms sensor and controller interchangeably, since a controller is essentially a sensing device for hand poses.

VirtualGrasp allows you to configure multiple sensors in an application.

This allows developers to combine two sensors to control the avatar's hands. For example, you can use a data glove to control the avatar's finger poses and grasp triggers, while using an Oculus Touch controller to control wrist position and orientation. Though this is not the most common setup for today's development use cases, the feature can be useful, especially for research and development of new hand controllers.

In the majority of use cases, only a single sensor is used.

As you can see in MyVirtualGrasp, Sensors is a list in the interface, with the first sensor listed as Element 0. All sensor elements share the same interface, so the descriptions below apply to each sensor element.

Sensor configuration options in Unity.

AutoSetup

AutoSetup will auto-configure a number of controller-related settings and thereby allow you to quickly switch between different controller inputs, such as UnityXR (e.g. supporting Quest), LeapMotion, Mouse, and others.

Select an AutoSetup option from the dropdown menu, and it will automatically adjust the various Sensor options.

Pay attention to the Console in case there is anything you may need to take care of manually to complete the auto-setup process.

The integer value 1 (Sensors -> Size in the image) is the number of sensor elements (maximum 2) that you want to auto-configure; when there is only one sensor, Element 0 indicates the first sensor element. In most cases you will have only one avatar in your scene, controlled by a single sensor. However, if you want to use multiple sensor elements, you can increase the Size of Sensors (maximum 2) and auto-configure each element respectively.

Avatars

For each sensor, you can assign multiple avatars, though in most cases you will have only one avatar per sensor.

| Option | Description | Supported VG Version |
| --- | --- | --- |
| Model | Should be HUMANOID_HAND in most application use cases. Robotic hand options also exist, but they are not discussed here. | All versions |
| SkeletalMesh | Provides a reference to the SkinnedMeshRenderer of the avatar that you have imported into your scene and that should be controlled by VG at runtime. | All versions |
| Replay | Enable this if you want to use this avatar to replay recorded sensor data, as explained in Sensor Record and Replay, or via the VG_Recorder component. This feature is not available in the free version. | Pro version |
| Remote | A placeholder for the advanced multiplayer support for hand interaction that we are working to bring to a stable version. This feature is not available in the free version. | Pro version |
| Physical | Enable this if you want VG to create colliders for this avatar and enable the hand for physical interactions. NOTE: at the moment, this option is experimental and should only be used for testing. | All versions |
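The SkeletalMesh reference is normally assigned in the Inspector. If you wire avatars up from a script instead, a plain Unity lookup like the following can locate the renderer; only the Unity calls here are certain, and how the reference is then handed to VG depends on your SDK version.

```csharp
using UnityEngine;

// Hedged sketch: find the SkinnedMeshRenderer on an imported avatar so it
// can serve as the SkeletalMesh reference described above.
public class AvatarMeshFinder : MonoBehaviour
{
    public GameObject avatarRoot; // the imported avatar in your scene

    void Awake()
    {
        // Imported humanoid avatars usually keep the renderer on a child object.
        SkinnedMeshRenderer skeletalMesh =
            avatarRoot.GetComponentInChildren<SkinnedMeshRenderer>();
        if (skeletalMesh == null)
            Debug.LogError("No SkinnedMeshRenderer found under " + avatarRoot.name);
        // Assign skeletalMesh to the avatar's SkeletalMesh slot (via the
        // Inspector or your own setup code, depending on your workflow).
    }
}
```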

Sensor

| Option | AutoSetup | Description |
| --- | --- | --- |
| Sensor | supported | Will always be External Controller for the VG SDK. |
| External | supported | The name of the external controller, as a string; this makes it possible to write your own external controller. |
| Finger Control Type | supported | Specifies how the sensor controls finger motion. See Finger Control Type. |
| Control | not supported | Specifies what this sensor element controls. For example, if you add two sensors, one could control wrist position, rotation, and haptics, while the other controls fingers and grasp. |
| Origin | supported | The origin relative to which the sensor data is interpreted. |
| Offset | supported | When the virtual hands do not match the position or rotation of your real hands holding the controllers, you can adjust this offset to synchronize them (see the sketch after this table). Note that the hand coordinate system's axes (X, Y, Z) are defined as if you stretch out three axes with the thumb, index, and middle finger of each hand (X is thumb up, Y is index forward, Z is middle inward). In other words, with a fully flat hand, all fingers point along the positive Y axis and your palm faces the positive Z axis. |
LHS/RHS
The offset is applied in LHS (left-handed system) for the left hand and RHS (right-handed system) for the right hand.
Source: Original by PrimalShell, Creative Commons Attribution-Share Alike 3.0 Unported license.
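To make the Offset axes concrete, here is a small illustration using plain Unity math (no VG API involved); the numbers are placeholders that you would tune in the Sensor -> Offset fields until the virtual and real hands align.

```csharp
using UnityEngine;

// Illustration only: hand-local axes are X = thumb, Y = index, Z = middle.
// With a flat hand, the fingers point along +Y and the palm faces +Z.
public static class HandOffsetExample
{
    // Hypothetical correction: the virtual hand sits 2 cm behind the real hand
    // along the finger direction and is pitched 15 degrees around the thumb axis.
    public static readonly Vector3 positionOffset = new Vector3(0f, 0.02f, 0f);       // +2 cm along index (Y)
    public static readonly Quaternion rotationOffset = Quaternion.Euler(15f, 0f, 0f); // 15 deg around thumb (X)
}
```

Because the offset is interpreted in LHS for the left hand and RHS for the right hand (see the note above), one set of values serves both hands in mirrored form.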

Global Grasp Interaction Settings

VG global grasp interaction settings.

After you have set up how your avatar's hands are controlled, you can use this interface to specify the Grasp Button globally for all sensors.

You can set the default grasp interaction parameters for all objects in the scene globally in Global Grasp Interaction Settings. See the detailed explanation of the parameters on the Grasp Interaction page.

Note that Synthesis Method, Interaction Type, Throw Velocity Scale, and Throw Angular Velocity Scale can be set locally for each object by attaching a VG_Interactable component to the object. These local settings override the global settings for that object.
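If you prefer to apply such a local override from code, a sketch along these lines should work. VG_Interactable is the component named above, but the serialized field names in the comments are assumptions for illustration; check the component in the Inspector for the exact names in your SDK version.

```csharp
using UnityEngine;

// Hedged sketch: give one object local grasp-interaction settings that
// override the global ones.
public class LocalInteractionOverride : MonoBehaviour
{
    void Awake()
    {
        // Reuse an existing VG_Interactable or add one to this object.
        VG_Interactable interactable = GetComponent<VG_Interactable>();
        if (interactable == null)
            interactable = gameObject.AddComponent<VG_Interactable>();

        // The component exposes the four settings listed above (Synthesis
        // Method, Interaction Type, Throw Velocity Scale, Throw Angular
        // Velocity Scale). The field names below are hypothetical:
        // interactable.m_throwVelocityScale = 1.5f;
        // interactable.m_throwAngularVelocityScale = 1.0f;
    }
}
```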

Debug Settings

VG debug settings.
| Parameter | Description |
| --- | --- |
| Export Scene in Runtime | Enabling this and running the application will create a vg_tmp subdirectory in your project and save resources that are used for different purposes (see create debug files). |
| Export Scene in Editor | As an alternative to checking "Export Scene in Runtime", pressing Export Scene in Editor will simulate a launch of the VG plugin from the Unity Editor, without the need to launch the scene. This option is provided for convenience, but objects that are not yet in your scene will not be included (see create debug files). |