Exoskeleton Haptic Interface for Bimanual Manipulation of Virtual Objects


Dexterous manipulation of surrounding objects involves coping with external sensory perturbations that manifest through the objects' spatial and mechanical properties. Standard research approaches typically estimate hemispheric specialization with unimanual tasks, or with bimanual tasks in which the two hands perform distinct functions. However, the most common Activities of Daily Living require the coordinated use of both arms, and hence functional interaction between the two cerebral hemispheres to achieve bimanual coordination. Given the importance of haptic feedback and coupled bimanual interaction for practical applications, and thanks to currently available technologies, our approach aims to show that haptic control and virtual reality are viable methods for analysing bimanual coordination strategies. To this end, our work consists of (i) implementing an exoskeleton-based haptic interface with virtual objects that provide customized stimuli; (ii) identifying motor-control strategies, i.e., the distribution of functional roles between the two limbs in a bimanual coupled-manipulation scenario; and (iii) characterizing how subjects' performance changes with the perception of simulated objects of variable compliance.

 

Following this rationale, we designed a task in which subjects perform bimanual manipulation/reaching actions using the 6-DoF exoskeleton ALEx-RS (Figure 1A). Participants were asked to grasp virtual objects with both arms, lift them, and move them across a 3D workspace. Four visuo-haptic feedback conditions were implemented to replicate objects of different dynamics and, therefore, different compliances, with the associated weight and inertia (Figure 1B).

Figure1_A.png
Figure1_B.png

(A) (B)

 

Figure 1: (A) A subject performing the experiment while wearing the ALEx-RS exoskeleton (http://www.wearable-robotics.com/kinetek/). The virtual scenario includes the virtual object (VO, red cube), the god-objects (GOl and GOr, the left and right black spheres), the reaching targets (yellow shapes: left (L), central (C), and right (R)), and a target reference that ensures correct 3D visualization (green shape). (B) Physical features of the four virtual objects (VOs): HSHB (High Stiffness, High Breaking point), MSHB (Medium Stiffness, High Breaking point), LSHB (Low Stiffness, High Breaking point) and HSLB (High Stiffness, Low Breaking point).
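The four VO conditions can be thought of as points in a two-parameter space (stiffness, breaking point). The sketch below illustrates one way to encode them; the numeric values are placeholders, as the actual stiffness and breaking-point values used in the experiment are not reported here.

```python
# Illustrative parameterization of the four virtual objects (VOs).
# Stiffness and breaking-point values are placeholders, not the
# experimental values, which are not given in this text.
from dataclasses import dataclass


@dataclass(frozen=True)
class VirtualObject:
    label: str
    stiffness: float       # N/m: resistance to grip-induced deformation
    breaking_point: float  # m: maximum deformation before the VO "breaks"


VOS = {
    "HSHB": VirtualObject("High Stiffness, High Breaking point", 1000.0, 0.05),
    "MSHB": VirtualObject("Medium Stiffness, High Breaking point", 500.0, 0.05),
    "LSHB": VirtualObject("Low Stiffness, High Breaking point", 250.0, 0.05),
    "HSLB": VirtualObject("High Stiffness, Low Breaking point", 1000.0, 0.02),
}


def is_broken(vo: VirtualObject, deformation: float) -> bool:
    """A VO breaks when grip deformation exceeds its breaking point."""
    return deformation > vo.breaking_point
```

With such a parameterization, the HSLB condition is the only one where a moderate squeeze (e.g., 3 cm of deformation under these placeholder values) would break the object, which is what distinguishes it from HSHB.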

 

The controller’s architecture was designed to provide stable force feedback during manipulation of virtual objects and includes three main components (Figure 2): a Virtual Haptic Unit (VHU), an Impedance Controller, and a Torque Computation module.

Figure2.png

 

Figure 2: Control Architecture Scheme. (A) The user wearing the ALEx-RS exoskeleton. (B) High-level controller (Virtual Haptic Unit, VHU): a visual representation of the interactive scenario, generated with Unity 3D, in which the physics-based response to the user’s movements in the virtual environment is computed. (C) Impedance Controller module: this block computes the haptic feedback. The feedback force is generated by elastic (K) and viscous (C) components, proportional, respectively, to the distance between each End Effector sphere (x_EE: x_EEl or x_EEr) and the corresponding God-Object (x_GO: x_GOl or x_GOr), and to the difference between their velocities (ẋ_EE: ẋ_EEl or ẋ_EEr; ẋ_GO: ẋ_GOl or ẋ_GOr). (D) Torque Computation module: the force values at each EE (F_EEr, F_EEl) are sent back to the device and converted into torques at the joints (τ) of each exoskeleton arm by means of the transpose of the Jacobian (Jᵀ).
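The spring-damper law of block (C) and the Jacobian-transpose mapping of block (D) can be sketched as follows. The gains K and C and the 3x6 Jacobian below are illustrative placeholders; in the real system the computation runs per arm (left/right) at the exoskeleton control rate.

```python
# Minimal sketch of the impedance law and torque mapping of Figure 2.
# Gains and Jacobian are placeholders, not the experimental values.
import numpy as np


def impedance_force(x_ee, xd_ee, x_go, xd_go, K=800.0, C=20.0):
    """Spring-damper force pulling the End Effector toward its God-Object:
    F_EE = K (x_GO - x_EE) + C (ẋ_GO - ẋ_EE)."""
    return K * (np.asarray(x_go) - np.asarray(x_ee)) \
         + C * (np.asarray(xd_go) - np.asarray(xd_ee))


def joint_torques(J, F_ee):
    """Map a Cartesian EE force to joint torques: τ = Jᵀ F_EE."""
    return np.asarray(J).T @ np.asarray(F_ee)


# Example: one arm, EE displaced 1 cm from its God-Object along x, at rest.
F = impedance_force(x_ee=[0.01, 0.0, 0.0], xd_ee=[0.0, 0.0, 0.0],
                    x_go=[0.0, 0.0, 0.0], xd_go=[0.0, 0.0, 0.0])
# Placeholder 3x6 Jacobian standing in for the arm's kinematics.
J = np.random.default_rng(0).standard_normal((3, 6))
tau = joint_torques(J, F)  # six joint torques for one 6-DoF arm
```

Here the 1 cm offset with K = 800 N/m yields an 8 N restoring force along x; the Jacobian transpose then distributes that Cartesian force over the arm's six joints.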

 

Our results on healthy subjects show the potential of the implemented haptic interface: the designed control represents a starting point for a fully customizable and measurable haptic environment. The current setup could be used in several teleoperated applications, providing stable haptic feedback over the whole upper limbs.