Scene: GestureDemo_Grap.unity
This scene demonstrates the gesture-recognition feature developed for the Xvisio AR glasses. The fisheye cameras on both sides of the glasses capture 25 feature points on each hand, and the scene displays the position and rotation of those points as two 25-joint hand models.
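The per-frame flow above can be sketched as a small Unity component. Note that `XvHandTracking.GetJointPoses` and the `Hand` enum below are hypothetical placeholders standing in for whatever joint-pose accessor the Xvisio Unity SDK actually exposes; this is a sketch of the pattern, not the SDK's real API.

```csharp
using UnityEngine;

// Sketch: driving a 25-joint hand model from per-frame tracking data.
public class HandJointVisualizer : MonoBehaviour
{
    public const int JointCount = 25;   // 25 feature points per hand
    public Transform[] jointSpheres = new Transform[JointCount];

    void Update()
    {
        // Hypothetical SDK call: returns position + rotation per joint.
        Pose[] joints = XvHandTracking.GetJointPoses(Hand.Left);
        if (joints == null || joints.Length < JointCount) return;

        for (int i = 0; i < JointCount; i++)
        {
            // Mirror each tracked joint onto the visual joint model.
            jointSpheres[i].SetPositionAndRotation(
                joints[i].position, joints[i].rotation);
        }
    }
}
```

The same loop would run once per hand, with a second component instance bound to the right-hand joint model.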
At the same time, the user can drag, scale, pinch, and otherwise manipulate models in the scene through gestures, and click buttons in space. A dotted ray is shown when the palm faces a three-dimensional object. When the ray points at an object with a collider, a solid dot appears at the end of the ray; gestures and pinch operations can then remotely manipulate the virtual object through the ray.
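The palm-ray interaction above can be approximated with a standard Unity raycast. `GetPalmPose` and `IsPinching` are hypothetical stand-ins for the SDK's palm-pose and pinch-state queries; the raycast, line, and end-dot logic are ordinary Unity code.

```csharp
using UnityEngine;

// Sketch: cast a ray from the palm, show a solid end dot when it hits
// a collider, and let a pinch grab the hit object.
public class PalmRayInteractor : MonoBehaviour
{
    public LineRenderer dottedLine;   // rendered with a dotted material
    public Transform endDot;          // solid dot shown at the hit point
    Transform grabbed;

    void Update()
    {
        Pose palm = XvHandTracking.GetPalmPose(Hand.Right); // hypothetical
        Ray ray = new Ray(palm.position, palm.rotation * Vector3.forward);

        if (Physics.Raycast(ray, out RaycastHit hit, 10f))
        {
            endDot.gameObject.SetActive(true);
            endDot.position = hit.point;
            dottedLine.SetPosition(0, ray.origin);
            dottedLine.SetPosition(1, hit.point);

            if (XvHandTracking.IsPinching(Hand.Right)) // hypothetical
                grabbed = hit.transform;   // begin remote manipulation
        }
        else
        {
            endDot.gameObject.SetActive(false);
        }

        // Release the object when the pinch ends.
        if (grabbed != null && !XvHandTracking.IsPinching(Hand.Right))
            grabbed = null;
    }
}
```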
A virtual object can be scaled by dragging a vertex of its bounding box, and rotated by dragging the midpoint of a bounding-box edge.
The user can enlarge or shrink an object by grasping it with both hands and moving the hands apart or together.
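Two-hand scaling is typically implemented as a distance ratio: the object's scale is multiplied by the current hand-to-hand distance divided by the distance when the grasp began. The sketch below assumes the same hypothetical `XvHandTracking` queries as above.

```csharp
using UnityEngine;

// Sketch: scale an object by the ratio of the current two-hand distance
// to the distance captured when both hands first grasped it.
public class TwoHandScaler : MonoBehaviour
{
    float startDistance;
    Vector3 startScale;
    bool scaling;

    void Update()
    {
        bool bothGrasping = XvHandTracking.IsGrasping(Hand.Left)     // hypothetical
                         && XvHandTracking.IsGrasping(Hand.Right);
        if (!bothGrasping) { scaling = false; return; }

        float distance = Vector3.Distance(
            XvHandTracking.GetPalmPose(Hand.Left).position,
            XvHandTracking.GetPalmPose(Hand.Right).position);

        if (!scaling)
        {
            // Grasp just started: remember the reference distance and scale.
            scaling = true;
            startDistance = distance;
            startScale = transform.localScale;
        }
        else
        {
            // Moving hands apart enlarges; moving them together shrinks.
            transform.localScale = startScale * (distance / startDistance);
        }
    }
}
```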
Buttons in the scene can be clicked through the ray or directly by gesture.
Once an object is captured, it moves and rotates with the gesture.
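A common way to make a captured object follow the hand is to cache the object's pose relative to the hand at grab time and reapply that offset every frame. Again, the `XvHandTracking` calls are hypothetical placeholders for the SDK's actual grasp and pose queries.

```csharp
using UnityEngine;

// Sketch: while grasped, the object keeps its grab-time offset from the
// hand, so it moves and rotates with the gesture.
public class GrabFollow : MonoBehaviour
{
    bool held;
    Vector3 localOffset;
    Quaternion localRotation;

    void Update()
    {
        Pose hand = XvHandTracking.GetPalmPose(Hand.Right);   // hypothetical

        if (XvHandTracking.IsGrasping(Hand.Right))            // hypothetical
        {
            if (!held)
            {
                // Grab just started: store the object's pose in hand space.
                held = true;
                localOffset = Quaternion.Inverse(hand.rotation)
                              * (transform.position - hand.position);
                localRotation = Quaternion.Inverse(hand.rotation) * transform.rotation;
            }
            // Reapply the cached offset so the object tracks the hand.
            transform.SetPositionAndRotation(
                hand.position + hand.rotation * localOffset,
                hand.rotation * localRotation);
        }
        else held = false;
    }
}
```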
As shown in the figure below, we provide a simple gesture development scenario.