Description

MyVirtualGrasp is a public script that inherits from VG_MainScript, which encodes the main functionality of VirtualGrasp. It is the main component that you need to add and configure in your project to enable VirtualGrasp.

In Unity, VG_MainScript inherits from MonoBehaviour, so you can use it as a component on a GameObject.
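As a rough illustration of this, the sketch below looks up (or adds) the MyVirtualGrasp component from another script. Only standard Unity calls are used; the helper class name is made up for the example, and the actual configuration of the component is done in the Inspector as described in the rest of this page.

```csharp
using UnityEngine;

// Hypothetical helper that makes sure a MyVirtualGrasp component exists on this GameObject.
// MyVirtualGrasp comes from the VirtualGrasp SDK; everything else here is plain Unity.
public class EnsureVirtualGrasp : MonoBehaviour
{
    void Awake()
    {
        // Since MyVirtualGrasp derives from VG_MainScript (a MonoBehaviour),
        // it can be queried and added like any other Unity component.
        var vg = GetComponent<MyVirtualGrasp>();
        if (vg == null)
            vg = gameObject.AddComponent<MyVirtualGrasp>();

        Debug.Log("MyVirtualGrasp present on " + gameObject.name);
    }
}
```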

On this page, we describe all the major configuration options covered in MyVirtualGrasp.cs.

AutoSetup & Sensors

In VirtualGrasp we use the terms sensor and controller interchangeably, since a controller is essentially a sensing device for hand poses.

VirtualGrasp allows you to configure multiple sensors in an application.

This allows developers to combine two sensors to control an avatar's hands. For example, you can use a data glove to control the avatar's finger poses and grasp triggers, while using an Oculus Touch controller to control wrist position and orientation. Though this is not the most common setup for today's development use cases, this feature can be useful, especially for research and development of new hand controllers.

In the majority of use cases, only a single sensor is used.

As you can see in MyVirtualGrasp, Sensors is a list in the interface, with the first sensor listed as Element 0. All sensor elements share the same interface, so the descriptions below apply to each Sensor element.

Sensor configuration options in Unity.

AutoSetup

AutoSetup will auto-configure a number of controller-related settings and thereby allow you to quickly switch between different controller inputs, such as UnityXR (e.g. supporting Quest), LeapMotion, Mouse, and others.

Select a VG_AutoSetup option from the dropdown menu, then click “Setup” to automatically adjust the Sensor options and settings (such as the Finger Control Type).

Pay attention to the Console, as it may list anything you need to take care of manually to complete the auto-setup process.

The integer value (0 in the image) is the index of the element in the Sensors list that you want to auto-configure. In most cases you will have only one avatar in your scene, controlled by a single sensor, so 0 is the default. However, if you use multiple sensor elements, you can quickly auto-configure each of them by changing this integer value.

Avatars

For each sensor, you can assign multiple avatars, though in most cases you will have only one avatar per sensor.

| Option | Description |
|---|---|
| Model | Should be HUMANOID_HAND in most application use cases. There are also robotic hand options, but they are not discussed here. |
| SkeletalMesh | Provides a reference to the SkinnedMeshRenderer of the avatar that you have imported into your scene and that should be controlled by VG during runtime. |
| Replay | Enable this if you want to use this avatar to replay recorded sensor data, as explained in Sensor Record and Replay, or the VG_Recorder component. This feature is not available in the free version. |
| Remote | A placeholder for advanced multiplayer support for hand interaction, which we are working towards a stable version of. This feature is not available in the free version. |
| Physical | Enable this if you want VG to create colliders for this avatar and enable the hand for physical interactions. NOTE: at the moment, this option is experimental and should not be used apart from testing. |
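If you prefer to locate the avatar's SkinnedMeshRenderer from a script rather than browsing the hierarchy, the sketch below shows how; it uses only standard Unity API, and the class and field names are made up for the example. The actual assignment to the SkeletalMesh field is still done in the Inspector, since that field's programmatic name is not covered on this page.

```csharp
using UnityEngine;

// Hypothetical helper that finds the avatar's SkinnedMeshRenderer, i.e. the object
// you would drag into the SkeletalMesh field of the Avatars list.
public class FindAvatarSkeletalMesh : MonoBehaviour
{
    [SerializeField] GameObject avatarRoot; // the imported avatar in the scene

    void Start()
    {
        var skinnedMesh = avatarRoot.GetComponentInChildren<SkinnedMeshRenderer>();
        if (skinnedMesh == null)
            Debug.LogWarning("No SkinnedMeshRenderer found under " + avatarRoot.name);
        else
            Debug.Log("SkeletalMesh candidate: " + skinnedMesh.name);
    }
}
```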

Sensor

| Option | AutoSetup | Description |
|---|---|---|
| Sensor | supported | Will always be External Controller for the VG SDK. |
| External | supported | Name of the external controller, as a string, so you can write your own external controller. |
| Finger Control Type | supported | Specifies how the sensor controls the finger motion. See Finger Control Type. |
| Control | not supported | Specifies what this sensor element controls. If you add two sensors, one could for example control wrist position, rotation and haptics, while the other controls fingers and grasp. |
| Origin | supported | The origin of the sensor, where the sensor data is interpreted. |
| Offset | supported | When the virtual hands do not match the position or rotation of your real hands holding the controllers, you can adjust the offset to synchronize them. Note that each hand's coordinate axes, XYZ, are defined as if you stretch out the thumb, index, and middle finger (i.e. X is thumb up, Y is index forward, and Z is middle inward). In other words, with a fully flat hand, all fingers point along the positive Y axis and your palm faces the positive Z axis. See the illustrative snippet after the LHS/RHS note below. |
LHS/RHS
The offset is applied in LHS (left-hand system) for the left hand and RHS (right-hand system) for the right hand.
Source: Original by PrimalShell, Creative Commons Attribution-Share Alike 3.0 Unported license.
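As a rough sketch of the axis convention described above (illustrative only, not the VG API), the snippet below expresses the hand-frame directions and an example offset in Unity vector terms. The concrete values are made up; the real offset is entered in the Offset fields in the Inspector.

```csharp
using UnityEngine;

// Illustrative only: shows what the hand-frame axes described above mean in Unity terms.
// It does not touch the VG sensor Offset fields, which are configured in the Inspector.
public static class HandOffsetExample
{
    // With a fully flat hand, all fingers point along +Y and the palm faces +Z.
    public static readonly Vector3 FingerDirection = Vector3.up;      // +Y (index forward)
    public static readonly Vector3 PalmNormal      = Vector3.forward; // +Z (middle inward)

    // Example rotational offset: tilt the hand 15 degrees around the thumb axis (+X).
    public static readonly Quaternion ExampleRotationOffset = Quaternion.Euler(15f, 0f, 0f);

    // Example positional offset: move the virtual hand 2 cm along the finger direction (+Y).
    public static readonly Vector3 ExamplePositionOffset = new Vector3(0f, 0.02f, 0f);
}
```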

Sensor Settings

Sensor settings in Unity.

After you have set up how your avatar's hands are controlled, you can use the Sensor Settings interface to specify the Trigger Button globally for all of the sensors.

Global Grasp Interaction Settings

VG global grasp interaction settings.

You can set the default grasp interaction parameters for all objects in the scene globally in Global Grasp Interaction Settings. See a detailed explanation of the parameters on the Grasp Interaction page.

Note that Synthesis Method, Interaction Type, Throw Velocity Scale, and Throw Angular Velocity Scale can be set locally for each object by attaching a VG_Interactable component to the object. These local settings override the global settings for that object.
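As a minimal sketch, assuming VG_Interactable behaves like any other MonoBehaviour component (as the paragraph above implies), the snippet below attaches it to an object at runtime. The per-object parameter values themselves are set in the Inspector, since their exact property names are not covered on this page.

```csharp
using UnityEngine;

// Hypothetical helper: gives an object its own VG_Interactable so that local
// grasp interaction settings override the global ones for this object only.
public class MakeLocallyInteractable : MonoBehaviour
{
    void Awake()
    {
        // VG_Interactable is a regular component from the VG SDK, so AddComponent works.
        // Configure its Synthesis Method, Interaction Type and throw scales in the Inspector.
        if (GetComponent<VG_Interactable>() == null)
            gameObject.AddComponent<VG_Interactable>();
    }
}
```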

Debug Settings

VG debug settings.
| Parameter | Description |
|---|---|
| Export Scene in Runtime | Enabling this and running the application will create a vg_tmp subdirectory in your project and save resources that are used for different purposes (see create debug files). |
| Export Scene in Editor | As an alternative to checking “Export Scene in Runtime”, pressing this button simulates a launch of the VG plugin from the Unity Editor, i.e. without the need of launching the scene. This option is provided for convenience, but objects that are not yet in your scene will not be included (see create debug files). |