Real-time vibrotactile pattern generation and identification using discrete event-driven feedback.
Somatosens Mot Res; : 1-13, 2023 Feb 07.
Article
in English
| MEDLINE
| ID: mdl-36751096
This study assesses human identification of vibrotactile patterns using real-time discrete event-driven feedback. Previously acquired force and bend sensor data from a robotic hand were used to predict movement-type (stationary, flexion, contact, extension, release) and object-type (no object, hard object, soft object) states with decision tree (DT) algorithms implemented on a field-programmable gate array (FPGA). Six able-bodied humans performed a 2- and 3-step sequential pattern recognition task in which state transitions were signaled as vibrotactile feedback. The stimuli were generated according to the predicted classes, represented by two frequencies (F1: 80 Hz, F2: 180 Hz) and two magnitudes (M1: low, M2: high) calibrated psychophysically for each participant, and were applied by two actuators (Haptuators) placed on the upper arms. In the left actuator, a soft/hard object was mapped to F1/F2, and manipulating it with low/high force was assigned to M1/M2. In the right actuator, flexion/extension movement was mapped to F1/F2, with movement in air as M1 and movement during object manipulation as M2. The DT algorithm performed better for object-type (97%) than for movement-type (88%) classification in real time. Participants could recognize feedback associated with 14 discrete-event sequences with low-to-medium accuracy. Performance was higher (76 ± 9% recall, 76 ± 17% precision, 78 ± 4% accuracy) for recognizing any one event in the sequences. The results show that FPGA implementation of classification for discrete event-driven vibrotactile feedback can be feasible in haptic devices with additional cues in the physical context.
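As a rough illustration of the stimulus mapping described in the abstract, the sketch below encodes the stated frequency and magnitude assignments for the two actuators in Python. It is not the authors' implementation (the classification ran as decision trees on an FPGA); the function names, data structure, and string labels are assumptions introduced only for illustration.

```python
# Illustrative sketch (not the authors' code): map predicted object-type and
# movement-type states to the vibrotactile stimulus parameters described in
# the abstract. F1/F2 and M1/M2 follow the stated mapping; everything else
# (names, labels) is a hypothetical placeholder.

from dataclasses import dataclass

F1_HZ = 80    # soft object (left actuator) / flexion (right actuator)
F2_HZ = 180   # hard object (left actuator) / extension (right actuator)

@dataclass
class Stimulus:
    actuator: str      # "left" (object cues) or "right" (movement cues)
    frequency_hz: int  # F1 or F2
    magnitude: str     # "M1" (low) or "M2" (high), calibrated per participant

def object_cue(object_type: str, force_level: str) -> Stimulus:
    """Left actuator: object type -> frequency, manipulation force -> magnitude."""
    freq = F1_HZ if object_type == "soft" else F2_HZ    # soft -> F1, hard -> F2
    mag = "M1" if force_level == "low" else "M2"         # low -> M1, high -> M2
    return Stimulus("left", freq, mag)

def movement_cue(movement_type: str, manipulating_object: bool) -> Stimulus:
    """Right actuator: flexion/extension -> frequency, context -> magnitude."""
    freq = F1_HZ if movement_type == "flexion" else F2_HZ  # flexion -> F1, extension -> F2
    mag = "M2" if manipulating_object else "M1"             # in air -> M1, manipulation -> M2
    return Stimulus("right", freq, mag)

if __name__ == "__main__":
    # Example: flexion while gripping a soft object with high force
    print(object_cue("soft", "high"))      # left actuator, 80 Hz, M2
    print(movement_cue("flexion", True))   # right actuator, 80 Hz, M2
```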
Full text: 1
Collection: 01-internacional
Database: MEDLINE
Study type: Diagnostic_studies
Language: English
Journal: Somatosens Mot Res
Journal subject: NEUROLOGY / PSYCHOPHYSIOLOGY
Year: 2023
Document type: Article
Affiliation country: United States
Publication country: United Kingdom