MyVirtualGrasp is a public script that inherits from VG_MainScript, which encodes the main functionality of VirtualGrasp. It is the main component that you need to add and configure in your project to enable VirtualGrasp.
In Unity, VG_MainScript inherits from MonoBehaviour, so you can use MyVirtualGrasp as a component on a GameObject.
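For orientation, here is a minimal sketch of what such a subclass looks like. The `using VirtualGrasp;` namespace is an assumption; check the MyVirtualGrasp.cs that ships with the SDK, which may override lifecycle methods or contain additional project-specific code.

```csharp
using VirtualGrasp; // assumed SDK namespace; verify against the shipped script

// Minimal sketch: deriving from VG_MainScript is what exposes the
// VirtualGrasp configuration (AutoSetup, Sensors, etc.) in the Inspector.
public class MyVirtualGrasp : VG_MainScript
{
    // The shipped MyVirtualGrasp.cs may override Unity lifecycle methods
    // and forward to the base class, e.g.:
    // public override void Awake() { base.Awake(); }
}
```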
In the VirtualGrasp SDK, you can find the prefab “GleechiLib” in the ThirdParty/VirtualGrasp/Resources directory, already configured with the MyVirtualGrasp component. You can simply drag and drop it into your scene. In your Hierarchy view, you will then see the instantiated GameObject, GleechiLib.
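If you prefer to set this up from code rather than dragging the prefab into the scene, a standard Resources.Load call should work, since the prefab resides in a Resources folder. This is a convenience sketch, not an SDK requirement:

```csharp
using UnityEngine;

// Sketch: instantiates the GleechiLib prefab at startup. Assumes the
// prefab lives directly under a Resources folder, as in the SDK layout
// described above.
public class VGBootstrap : MonoBehaviour
{
    void Awake()
    {
        GameObject prefab = Resources.Load<GameObject>("GleechiLib");
        if (prefab != null)
            Instantiate(prefab);
        else
            Debug.LogError("GleechiLib prefab not found in a Resources folder.");
    }
}
```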
On this page, we describe the major configuration options covered in MyVirtualGrasp.cs.
AutoSetup & Sensors
VirtualGrasp allows you to configure multiple sensors in an application.
This allows developers to combine two sensors to control an avatar’s hands. For example, you can choose to use a data glove to control the avatar’s finger poses and grasp triggers, while using an Oculus Touch controller to control wrist position and orientation. Though this is not the most common setup for today’s development use cases, the feature may become useful, especially for research and development of new hand controllers.
In the majority of use cases, only a single sensor is used.
As you can see in MyVirtualGrasp, Sensors is a list in the interface; the first sensor element is listed as Element 0. All sensor elements share the same interface, so the descriptions below explain what each setting means for a single Sensor.
AutoSetup will auto-configure a number of controller-related settings and thereby allow you to quickly switch between different controller inputs, such as UnityXR (e.g. supporting Quest), LeapMotion, Mouse, and others.
The integer value (0 in the image) refers to the element of the Sensors list that you want to auto-configure. In most cases, you will have only one avatar in your scene, controlled by a single sensor, so 0 is the default. However, if you use multiple sensor elements, you can quickly auto-configure each of them by changing the integer value.
For each sensor, you can assign multiple avatars, though in most cases you will have only one avatar per sensor.
| Setting | Description |
|---|---|
| Model | Should be HUMANOID_HAND in most application use cases. There are also robotic hand options, but they are not discussed here. |
| SkeletalMesh | Provides a reference to the SkinnedMeshRenderer of the avatar that you have imported into your scene and that should be controlled by VG during runtime (a sketch for locating this renderer follows after the table). |
| Remote | Enable this if you want this avatar to reflect networked data (i.e. listening to another client over the network in a multiplayer scenario), as explained in Multiplayer Interaction, or see the VG_Networking component. |
| Physical | Enable this if you want VG to create colliders for this avatar and enable the hand for physical interactions. NOTE: at the moment, this option is experimental and should not be used apart from testing. |
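Since imported avatars often nest their renderer several levels deep in the hierarchy, the following sketch shows one way to locate the SkinnedMeshRenderer that the SkeletalMesh field expects. The component and field names here are plain Unity API; only the helper class itself is illustrative.

```csharp
using UnityEngine;

// Sketch: finds the SkinnedMeshRenderer under an imported avatar so you
// know which object to assign to the SkeletalMesh field.
public class FindSkeletalMeshExample : MonoBehaviour
{
    [SerializeField] private GameObject m_avatarRoot; // assign your imported avatar

    void Start()
    {
        SkinnedMeshRenderer smr =
            m_avatarRoot.GetComponentInChildren<SkinnedMeshRenderer>();
        if (smr == null)
            Debug.LogError("No SkinnedMeshRenderer found under the avatar root.");
        else
            Debug.Log($"Assign '{smr.name}' to the SkeletalMesh field.", smr);
    }
}
```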
| Setting | AutoSetup | Description |
|---|---|---|
| Sensor | supported | Will always be External Controller for the VG SDK. |
| External | supported | The name of the external controller, as a string, so you can write your own external controller. |
| Finger Control Type | supported | Specifies how the sensor controls finger motion. See Finger Control Type. |
| Control | not supported | Specifies what this sensor element controls. If you added two sensors, one could control wrist position, rotation, and haptics, while the other controls fingers and grasp, for example. |
| Origin | supported | The origin relative to which the sensor data is interpreted. |
| Offset | supported | When the virtual hands do not match the position or rotation of your real hands holding the controllers, you can adjust the offset to synchronize them (see the pose sketch below). Note that the hand coordinate system’s axes, XYZ, are defined as if you stretch out three axes with the thumb, index, and middle finger of each hand (i.e. X is thumb up, Y is index forward, and Z is middle inward). In other words, with a fully flat hand, all fingers point along the positive Y axis, and your palm faces the positive Z axis. |
Image source: original by PrimalShell, Creative Commons Attribution-Share Alike 3.0 Unported license.
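To build intuition for what the Offset does, the sketch below shows how a position/rotation offset is conventionally composed onto a raw sensor pose in Unity. This is a conceptual illustration only, not the VG API; VG may compose the offset differently internally.

```csharp
using UnityEngine;

// Conceptual illustration (not the VG API): composing an offset, expressed
// in the sensor's local hand frame, onto a raw sensor wrist pose so the
// virtual hand lines up with the physical controller.
public static class PoseOffsetExample
{
    public static void Apply(
        Vector3 sensorPos, Quaternion sensorRot,   // raw sensor wrist pose
        Vector3 posOffset, Quaternion rotOffset,   // values tuned in the Offset field
        out Vector3 handPos, out Quaternion handRot)
    {
        handPos = sensorPos + sensorRot * posOffset; // offset rotated into world space
        handRot = sensorRot * rotOffset;             // rotation applied in the hand frame
    }
}
```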
After you have set up how your avatar’s hands are controlled, you can use the Sensor Settings interface to specify the Trigger Button globally for all sensors.
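For background, the snippet below shows the kind of raw input a trigger-button setting ultimately maps to, using plain UnityXR polling. It is illustrative only; once configured, VG reads the trigger for you.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Background illustration (plain UnityXR, not the VG API): polling the
// right-hand controller's trigger button each frame.
public class TriggerPollExample : MonoBehaviour
{
    void Update()
    {
        InputDevice right = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (right.TryGetFeatureValue(CommonUsages.triggerButton, out bool pressed) && pressed)
            Debug.Log("Right trigger pressed.");
    }
}
```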
VirtualGrasp uses names to identify which objects are marked as interactable. You can customize component and layer names in MyVirtualGrasp → Object Identifiers. The VG_Articulation component is the default entry, but this mechanism also allows you to quickly adapt your project if you already have a layer or a component that marks your interactable objects.
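Because VG_Articulation is the default object identifier, marking an object as interactable from code amounts to adding that component. A minimal sketch, assuming VG_Articulation lives in the VirtualGrasp namespace:

```csharp
using UnityEngine;
using VirtualGrasp; // assumed namespace for VG_Articulation

// Sketch: marks this GameObject as interactable by adding the default
// identifier component, equivalent to adding VG_Articulation in the
// Inspector.
public class MakeInteractableExample : MonoBehaviour
{
    void Start()
    {
        if (GetComponent<VG_Articulation>() == null)
            gameObject.AddComponent<VG_Articulation>();
    }
}
```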
Grasp Interaction Settings
Debug settings will show up when “Show Advanced” is checked.
| Setting | Description |
|---|---|
| Save Debug Files | Enabling this and running the application will create a vg_tmp subdirectory in your project and save sources that are used for different purposes (see debug files). |
| Physics Default Contact Offset | Overwrites Unity’s physics contact offset for more accurate collision detection. Currently only relevant for the experimental push-with-physics feature (see the sketch after this table). |
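For reference, Unity exposes this global value as `Physics.defaultContactOffset`, which is presumably what this setting writes to. The sketch below shows the equivalent manual call; the value used is illustrative, and in practice you would simply set the field in the MyVirtualGrasp Inspector.

```csharp
using UnityEngine;

// Sketch of what overriding the contact offset amounts to in plain Unity.
// The 0.001f value is illustrative, not a recommendation from the SDK.
public class ContactOffsetExample : MonoBehaviour
{
    [SerializeField] private float m_contactOffset = 0.001f;

    void Awake()
    {
        Physics.defaultContactOffset = m_contactOffset;
    }
}
```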