Affordance
An object affordance determines which kind of hand interaction can be done with the object, and how the object’s JointState reacts to the hand interaction. See object affordances.
ApproachDirection
The direction along which the PushAgent approaches the object to be pushed; a PushPivot assigned to a pushable object specifies the preferred approach direction for the PushAgent.
Bounce
One of an object's StateAffordance options, relevant only for 1-dof joints; when the object is released from hand interaction, it bounces back to the lowest value of its DiscreteStates. See object affordances.
CABVG
Refers to VirtualGrasp’s baking server, to which you can connect through VG_BakingClient – a GUI interface that is part of the VG SDK; it is used for object baking.
Cone
A constrained 3-dof rotational JointType; the object rotates around a pivot point (JointCenter) within a cone limit, parameterized by a swing limit angle that determines the cone size and a twist limit angle that determines how much the object can rotate around the axis (the center axis of the cone).
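As a rough illustration of this swing/twist parameterization (not VG's own implementation; the class and parameter names below are hypothetical), a Unity C# sketch could decompose a rotation relative to the joint's zero pose into swing and twist and check both against the cone limits:

```csharp
using UnityEngine;

public static class ConeLimitExample
{
    // Hypothetical helper: decompose a rotation (relative to the joint's zero
    // pose) into swing and twist around 'axis' (the JointAxis, i.e. the cone's
    // center axis), then test both angles against the cone limits.
    public static bool IsWithinConeLimit(Quaternion rotation, Vector3 axis,
                                         float swingLimitDeg, float twistLimitDeg)
    {
        axis.Normalize();

        // Twist: keep only the rotation component around 'axis'.
        Vector3 r = new Vector3(rotation.x, rotation.y, rotation.z);
        Vector3 proj = Vector3.Project(r, axis);
        Quaternion twist = Normalize(new Quaternion(proj.x, proj.y, proj.z, rotation.w));

        // Swing: what remains after removing the twist.
        Quaternion swing = rotation * Quaternion.Inverse(twist);

        float swingAngle = Quaternion.Angle(Quaternion.identity, swing);
        float twistAngle = Quaternion.Angle(Quaternion.identity, twist);
        return swingAngle <= swingLimitDeg && twistAngle <= twistLimitDeg;
    }

    static Quaternion Normalize(Quaternion q)
    {
        float m = Mathf.Sqrt(q.x * q.x + q.y * q.y + q.z * q.z + q.w * q.w);
        return m < 1e-6f ? Quaternion.identity
                         : new Quaternion(q.x / m, q.y / m, q.z / m, q.w / m);
    }
}
```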
Controller
Refers to a hand sensing device such as a hand-held VR controller like Oculus Touch or a finger tracking device like LeapMotion; it can sense wrist position, orientation and optionally finger poses; in VG a controller is also referred to as a sensor.
DiscreteStates
An object articulation parameter relevant only for 1-dof joints (PRISMATIC and REVOLUTE): a set of N (N >= 2) joint state values within the JointLimit range. They are required by the State Affordances (BOUNCE, TWO_STAGE, SNAPS), which pin the object to one of these discrete states when the hand releases the object.
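As a minimal sketch of what pinning to a discrete state means (illustrative only, not VG API; the helper names are hypothetical), the SNAPS and BOUNCE behaviors reduce to picking a value from the discrete state set:

```csharp
using System.Linq;
using UnityEngine;

public static class DiscreteStateExample
{
    // Hypothetical helper: under SNAPS, the released object is pinned to the
    // discrete state closest to its current joint state.
    public static float SnapToClosest(float jointState, float[] discreteStates)
    {
        return discreteStates.OrderBy(s => Mathf.Abs(s - jointState)).First();
    }

    // Hypothetical helper: under BOUNCE, the released object returns to the
    // lowest discrete state.
    public static float BounceToLowest(float[] discreteStates)
    {
        return discreteStates.Min();
    }
}
```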
DynamicGrasp
Or DG, a grasp synthesis method that computes the grasp configuration at the moment the grasp is triggered. See grasp interaction.
FingerControlType
Describes how the sensor controls the finger movement of the avatar's hands; the different finger control types are explained in detail in sensor settings.
Fixed
A completely constrained JointType; the object cannot move relative to its parent object.
Floating
An unconstrained JointType; an object with this joint type can be moved freely by the hand.
GameObject
In Unity, any node in the Hierarchy is called a GameObject.
Gesture
A hand pose formed in the air without contacting or interacting with an object. Currently VG only supports the index finger push gesture, which is defined on a single frame as a “static” gesture; more “dynamic” gestures such as waving are not yet supported.
Grasp
A hand grasps and holds an object.
GraspConfiguration
Describes the hand pose when grasping an object, including the wrist position and orientation and the rotation angles of all finger bones; it can be synthesized at runtime by VG's GraspSynthesis algorithms.
GraspSpeed
A parameter (in seconds) specifying how long it takes for the hand to move from the SensorPose to the synthesized grasp pose in the TriggerGrasp interaction type, and how long it takes for the object to jump to the hand in the JumpGrasp interaction type.
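A minimal Unity C# sketch of what a duration-style parameter like this implies (illustrative only; the component and field names are hypothetical, and VG performs this blending internally):

```csharp
using UnityEngine;

public class GraspBlendExample : MonoBehaviour
{
    // Hypothetical component: blend a wrist transform from the sensor pose to
    // a target grasp pose over 'graspSpeed' seconds.
    public Transform hand;           // rendered wrist (hypothetical)
    public Transform sensorPose;     // wrist pose reported by the sensor
    public Transform graspPose;      // wrist pose of the synthesized grasp
    public float graspSpeed = 0.2f;  // seconds to reach the grasp pose

    float t;

    void Update()
    {
        t = Mathf.Min(t + Time.deltaTime / Mathf.Max(graspSpeed, 1e-4f), 1f);
        hand.SetPositionAndRotation(
            Vector3.Lerp(sensorPose.position, graspPose.position, t),
            Quaternion.Slerp(sensorPose.rotation, graspPose.rotation, t));
    }
}
```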
GraspSynthesis
A runtime process that creates the hand grasp configuration – the wrist and finger poses w.r.t. the object – when a user triggers a grasp with a VR controller.
GraspSynthesisMethod
Describes the method used for GraspSynthesis. See synthesis method.
GraspType
The type of grasp, describing the shape of the hand and fingers when holding an object, for example the Pinch and Power grasp types.
InteractionAffordance
An object’s interaction affordance determines what kind of action the hand can perform on the object, for example, graspable means the object can be grasped by a hand. See object affordances.
InteractionSegment
Refers to a segment of recorded sensor data that contains continuous frames for hand interaction with an object, or without any object. See sensor recording and replay.
InteractionSequence
Refers to a succession of two or more recorded InteractionSegments, each containing continuous frames of hand interaction with an object, or without any object. See sensor recording and replay.
InteractionType
Describes how the hand and object move when the VR user selects an object or triggers a grasp to start interacting with it. See interaction type.
InteractiveBehaviors
Refers to how an object behaves (moves) when one or both hands interact with it; this is determined by the object's joint and affordance setup.
Joint
A joint is where two bones or GameObjects meet; in VirtualGrasp it defines how a child object moves w.r.t. its parent (not the other way around). Unlike the physics-based joint systems in game engines such as Unity, VG's joint system is non-physical (kinematic).
JointAxis
The axis of the joint, specified by the Pivot (Anchor) transform's z-axis; for a Revolute joint it is the axis around which the object rotates; for a Prismatic joint, the axis along which the object translates; for a Planar joint, the normal of the plane; for a Cone joint, the center axis of the cone, representing zero swing angle.
JointAxis2
The secondary axis of the joint, specified by the Pivot (Anchor) transform's x-axis; it is the x-axis of the PLANAR joint's rectangular limit.
JointCenter
The center point of an object's joint, around which the object rotates, specified by the Pivot transform's position.
JointLimit
An object articulation parameter for any joint type. For 1-dof joints (PRISMATIC and REVOLUTE) the limit is defined by a range of scalar values [Min, Max]. For the 3-dof rotational joint (CONE) the limit is defined by [swing, twist] angles.
JointState
The state of a 1-dof joint (PRISMATIC or REVOLUTE), or the state along the anchor's x-axis of the PLANAR joint, represented by a single scalar value in the JointLimit range; the value 0 corresponds to the zero pose of the joint, defined by the initial pose of the object.
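For illustration only (not VG API; the helper and parameter names are hypothetical), applying a clamped 1-dof joint state relative to the object's zero pose could look like this in Unity C#:

```csharp
using UnityEngine;

public static class JointStateExample
{
    // Hypothetical helpers: apply a clamped 1-dof joint state relative to the
    // object's zero pose (its initial pose), using the pivot's z-axis as the
    // JointAxis and the pivot's position as the JointCenter.

    // PRISMATIC: translate 'state' units along the joint axis.
    public static void ApplyPrismatic(Transform obj, Transform pivot,
                                      Vector3 zeroWorldPos, float state,
                                      float min, float max)
    {
        state = Mathf.Clamp(state, min, max);
        obj.position = zeroWorldPos + pivot.forward * state;
    }

    // REVOLUTE: rotate 'state' degrees around the joint axis through the JointCenter.
    public static void ApplyRevolute(Transform obj, Transform pivot,
                                     Vector3 zeroWorldPos, Quaternion zeroWorldRot,
                                     float state, float min, float max)
    {
        state = Mathf.Clamp(state, min, max);
        obj.SetPositionAndRotation(zeroWorldPos, zeroWorldRot);
        obj.RotateAround(pivot.position, pivot.forward, state);
    }
}
```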
JointType
The type of a VG Joint; it can be Floating, Fixed, Revolute, Prismatic, Planar or Cone. See object articulation.
JumpGrasp
A grasp interaction type where, when the user triggers a grasp, the object jumps to the grasped position in the avatar's hand. See interaction type.
JumpPrimaryGrasp
A grasp interaction type where, when the user triggers a grasp, the object jumps into the avatar's hand using one of the grasps labeled as primary in the grasp DB. See interaction type.
ObjectBaking
Refers to the preprocessing step of expensive object analysis needed for DynamicGrasp synthesis at runtime; also referred to as Grasp Baking.
ObjectHierarchy
Refers to the parent-child relations between objects in a game scene; VG synchronizes its internal object hierarchy with that of the game scene at runtime.
ObjectSelection
The runtime process of the user selecting one of the many objects in the scene to interact with (by push or grasp); note that only one object can be selected at a time.
PhysicalAvatar
Refers to the avatar whose hands are physical, currently with colliders attached to the palm and fingers.
PhysicalObject
Refers to an object with dynamic properties assigned; in Unity, a GameObject with a non-kinematic Rigidbody or an Articulation Body component is a physical object.
Pivot
Pivot or Anchor is a transform used to specify the JointCenter (position) and JointAxis (z-axis) of an object's Joint; for the PLANAR joint, the (x-axis, y-axis) define the rectangular limit area on the plane.
Planar
A constrained 2-dof translational JointType; the object moves along a defined planar surface, limited to a rectangular area.
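A minimal Unity C# sketch of such a planar limit (illustrative only, not VG API; it assumes for illustration a rectangular limit symmetric around the pivot): the pivot's z-axis is the plane normal, and its x- and y-axes span the rectangle:

```csharp
using UnityEngine;

public static class PlanarLimitExample
{
    // Hypothetical helper: constrain a world-space point to the joint plane
    // (normal = pivot z-axis) and to a rectangular limit spanned by the
    // pivot's x- and y-axes, assumed symmetric around the pivot.
    public static Vector3 ClampToPlanarLimit(Vector3 worldPoint, Transform pivot,
                                             float xLimit, float yLimit)
    {
        Vector3 local = pivot.InverseTransformPoint(worldPoint);
        local.z = 0f;                                     // stay on the plane
        local.x = Mathf.Clamp(local.x, -xLimit, xLimit);  // JointState direction
        local.y = Mathf.Clamp(local.y, -yLimit, yLimit);  // SecondaryJointState direction
        return pivot.TransformPoint(local);
    }
}
```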
PreviewGrasp
A grasp interaction type where, once the user has selected an object, the grasp configuration is previewed on the object so that the user can press the trigger button to pick up the object if the grasp is satisfactory. See interaction type.
PreviewOnly
A grasp interaction type where, once the user has selected an object, the grasp configuration is previewed on the object, but the grasp trigger will not pick up the object. See interaction type.
PrimaryGrasp
A label for a grasp entry in the grasp DB indicating that it is one of the primarily used grasps on an object for a given hand; it is used only by the JumpPrimaryGrasp interaction type. See interaction type.
Prismatic
A constrained 1-dof translational JointType; the object moves linearly along an axis, limited by a distance range.
PushAgent
Refers to what is performing the push action; for example, if the index finger tip is pushing, the index finger tip is the push agent.
PushPivot
Push Pivot or Push Direction is a transform specifying along which direction (its z-axis) the hand can approach and push an object that has the push affordance; this direction is not the direction of object movement, but is used only for object selection when pushing without physics.
ReleaseSpeed
A parameter (in seconds) specifying how long it takes for the hand to move back to the SensorPose once a grasp is released.
ReplayAvatar
Refers to an avatar whose hands are controlled by pre-recorded sensor signals to move and interact with the objects.
Revolute
A constrained 1-dof rotational JointType; the object rotates around an axis through a pivot point (JointCenter), limited by an angle range.
ScrewJoint
A Revolute joint whose ScrewRate parameter is > 0; the object translates by ScrewRate (cm/degree) along the joint axis for every degree of rotation.
ScrewRate
A parameter for the Revolute joint describing how far the object moves linearly along the axis for every degree of rotation, in cm/degree; if > 0, it turns a Revolute joint into a ScrewJoint.
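The coupling is a simple linear relation. As a hedged sketch (the helper name is hypothetical, and the cm-to-meter conversion is an assumption for Unity's units):

```csharp
using UnityEngine;

public static class ScrewJointExample
{
    // Hypothetical helper: a ScrewRate of 'screwRate' cm/degree couples rotation
    // and translation along the joint axis. Unity positions are in meters, so
    // the cm -> m conversion here is an assumption of this sketch.
    public static float TranslationMeters(float rotationDegrees, float screwRate)
    {
        float translationCm = rotationDegrees * screwRate; // cm = degrees * (cm/degree)
        return translationCm * 0.01f;
    }
}
```

For example, a ScrewRate of 0.01 cm/degree moves the object 3.6 cm along the axis over a full 360-degree turn.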
SecondaryJointState
The state along the anchor's y-axis of the PLANAR joint; the value 0 corresponds to the zero pose of the joint, defined by the initial pose of the object.
SelectionWeight
Refers to a weight specifying the likelihood of an object being selected for interaction (grasp or push) with the hands; by default all objects have a selection weight of 1; 0 means the object is completely hidden from interaction, and there is no upper bound on this weight.
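One way to picture the effect of this weight (illustrative only; VG's actual selection scoring is not shown here, and the helper is hypothetical) is as a multiplier on a per-object selection score:

```csharp
using System.Collections.Generic;

public static class SelectionWeightExample
{
    // Hypothetical helper: each candidate object has some base selection score
    // (e.g. derived from its distance and direction relative to the hand).
    // Multiplying by the weight makes weight 0 unselectable and larger weights
    // proportionally more likely to win.
    public static int PickCandidate(IList<float> baseScores, IList<float> weights)
    {
        int best = -1;
        float bestScore = 0f;
        for (int i = 0; i < baseScores.Count; i++)
        {
            float score = baseScores[i] * weights[i];
            if (score > bestScore) { bestScore = score; best = i; }
        }
        return best; // -1 if nothing is selectable
    }
}
```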
Sensor
Refers to a hand sensing device such as a VR controller like Oculus Touch or a finger tracking device like LeapMotion; it can sense wrist position, orientation and optionally finger poses; since sensors are used to control avatars in VR, a sensor can also be referred to as a controller.
SensorAvatar
Refers to the avatar whose hands are controlled by a sensor signal to move and interact with the objects.
SensorData
Refers to the signals provided by a hand sensing device such as VR controllers or finger tracking devices; it contains the wrist position, orientation, and optionally finger poses, as well as the grasp trigger. See sensor record and replay.
SensorOffset
When the virtual hands do not match the position or rotation of your real hands holding the controllers, you can adjust an offset to synchronize them.
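A minimal sketch of applying such an offset to the sensor's wrist pose (illustrative only; the helper name and the choice of expressing the offset in the sensor's frame are assumptions):

```csharp
using UnityEngine;

public static class SensorOffsetExample
{
    // Hypothetical helper: apply a positional and rotational offset to the wrist
    // pose reported by the sensor so the virtual hand lines up with the real one.
    public static void ApplyOffset(Vector3 sensorPos, Quaternion sensorRot,
                                   Vector3 posOffset, Quaternion rotOffset,
                                   out Vector3 wristPos, out Quaternion wristRot)
    {
        wristRot = sensorRot * rotOffset;               // rotate in the sensor's frame
        wristPos = sensorPos + sensorRot * posOffset;   // offset expressed in the sensor's frame
    }
}
```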
SensorPose
The 3D position and orientation of an avatar's wrist determined by a sensor (VR controller or finger tracking device); note that this can differ from the actual rendered pose of the avatar's wrist. See grasp interaction.
SensorRecordAndReplay
Refers to the functionality of VirtualGrasp that allows recording SensorData during runtime interaction, and later replaying it to reproduce those interactions. See sensor record and replay.
Snaps
One of an object's StateAffordance options, relevant only for 1-dof joints or the 2-dof Planar joint; when the object is released from hand interaction, it snaps to the closest value of its DiscreteStates. See object affordances.
StateAffordance
An object’s state affordance determines how the object’s JointState reacts to the hand interaction. See object affordances.
StaticGrasp
Or SG, a grasp synthesis method that takes the grasp configuration from one of N grasps stored in a grasp database. See grasp interaction.
StickyHand
A fallback grasp interaction type used when the object is not baked; the grasp configuration is taken directly from the hand pose at the moment of grasp triggering, as if the hand sticks to the object. See interaction type.
SwingAngle
For the 3-dof Cone joint, the angle describing how far the object swings away from the joint axis (the center axis of the cone-shaped joint limit boundary).
TriggerGrasp
A grasp interaction type where, when the user triggers a grasp, the avatar hand moves to the wrist pose of the synthesized grasp configuration around the object.
TwistAngle
For the 3-dof Cone joint, the angle describing how much the object rotates around its swung-away axis.
TwoStage
One of an object's StateAffordance options, relevant only for 1-dof joints; when the object is released from hand interaction, it bounces to the highest and lowest values of its DiscreteStates in alternating order. See object affordances.
VGInteractable
An object described as VG-interactable can be grasped or pushed by VG-supported hands; adding a VG_Articulation component makes an object interactable.
VGInternalScript
Refers to a non-customizable component that is part of the fundamental communication functions of the VirtualGrasp API; it is a Unity script that inherits from MonoBehaviour, so it can be added as a component to a GameObject in Unity.
VGPublicScript
Refers to a customizable component that exemplifies the use of certain communication functions of the VirtualGrasp API; it is a Unity script that inherits from MonoBehaviour, so it can be added as a component to a GameObject in Unity.
VGSceneFiles
Refers to VirtualGrasp’s JSON representation of the current interaction configuration of the game scene.
VirtualGrasp
Gleechi’s software for hand interaction.