Results 1 - 4 of 4

1.
Article in English | MEDLINE | ID: mdl-38231812

ABSTRACT

Neural architecture search (NAS) has shown great promise in automatically designing neural network models. Recently, block-wise NAS has been proposed to alleviate the deep coupling between architectures and weights that exists in the well-known weight-sharing NAS, by training the huge weight-sharing supernet block-wise. However, the existing block-wise NAS methods, which resort to either a supervised distillation scheme or a self-supervised contrastive learning scheme to enable block-wise optimization, incur massive computational cost: the former introduces an external high-capacity teacher model, while the latter involves a supernet-scale momentum model and requires a long training schedule. Considering this, in this work we propose a resource-friendly deeply supervised block-wise NAS (DBNAS) method. In DBNAS, we construct a lightweight deeply supervised module after each block to enable a simple supervised learning scheme and leverage ground-truth labels to indirectly supervise the optimization of each block progressively. In addition, the deeply supervised module is specifically designed as a structural and functional condensation of the supernet, which establishes global awareness for progressive block-wise optimization and helps the search find promising architectures. Experimental results show that DBNAS takes less than 1 GPU day to search for promising architectures on the ImageNet dataset, with a smaller GPU memory footprint than other block-wise NAS works. The best-performing model in the searched DBNAS family achieves 75.6% Top-1 accuracy on ImageNet, which is competitive with state-of-the-art NAS models. Moreover, our DBNAS family models also achieve good transfer performance on CIFAR-10/100, as well as on two downstream tasks: object detection and semantic segmentation.
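
The key mechanism is attaching a small auxiliary classifier after each supernet block so that ground-truth labels, rather than an external teacher or a momentum supernet, supervise block-wise training. The following is a minimal PyTorch sketch of that idea; the class names, candidate operations, and progressive training loop are illustrative assumptions, not the authors' implementation.

```python
# Sketch of block-wise supernet training with lightweight deeply-supervised
# heads. SupernetBlock, AuxHead, and all hyperparameters are assumptions.
import torch
import torch.nn as nn

class SupernetBlock(nn.Module):
    """One searchable block: a few candidate operations, one sampled per step."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.candidates = nn.ModuleList([
            nn.Sequential(nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU()),
            nn.Sequential(nn.Conv2d(in_ch, out_ch, 5, padding=2), nn.ReLU()),
        ])

    def forward(self, x, op_idx):
        return self.candidates[op_idx](x)

class AuxHead(nn.Module):
    """Lightweight deeply-supervised head attached after a block."""
    def __init__(self, in_ch, num_classes):
        super().__init__()
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(in_ch, num_classes))

    def forward(self, x):
        return self.head(x)

blocks = nn.ModuleList([SupernetBlock(3, 16), SupernetBlock(16, 32)])
heads = nn.ModuleList([AuxHead(16, 10), AuxHead(32, 10)])
criterion = nn.CrossEntropyLoss()

x = torch.randn(4, 3, 32, 32)              # dummy image batch
y = torch.randint(0, 10, (4,))             # dummy ground-truth labels

# Train blocks progressively: each block is optimized against the labels
# through its own auxiliary head, so no external teacher model is needed.
for i, (block, head) in enumerate(zip(blocks, heads)):
    opt = torch.optim.SGD(list(block.parameters()) + list(head.parameters()), lr=0.1)
    op_idx = torch.randint(len(block.candidates), (1,)).item()   # sampled candidate
    with torch.no_grad():
        feat = x
        for prev_block in blocks[:i]:      # earlier blocks are kept frozen
            feat = prev_block(feat, 0)
    logits = head(block(feat, op_idx))
    loss = criterion(logits, y)
    opt.zero_grad(); loss.backward(); opt.step()
```

In an actual search, the trained blocks would then be used to score and select candidate operations per block, a step this sketch omits.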

2.
Front Neurorobot; 15: 627157, 2021.
Article in English | MEDLINE | ID: mdl-33574748

ABSTRACT

In this paper, an adaptive locomotion control approach for a hexapod robot is proposed. Inspired by biological neural control systems, a 3D two-layer artificial central pattern generator (CPG) network is adopted to generate the locomotion of the robot. The first layer of the CPG is responsible for generating several basic locomotion patterns, and the functional configuration of this layer is determined through kinematic analysis. The second layer of the CPG controls the limb behavior of the robot to adapt to environmental changes within a specific locomotion pattern. To enable the adaptability of the limb-behavior controller, a reinforcement learning (RL)-based approach is employed to tune the CPG parameters. Owing to the symmetrical structure of the robot, only two parameters need to be learned iteratively, which makes the approach practical to use. Finally, both simulations and experiments are conducted to verify the effectiveness of the proposed control approach.
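
As a concrete illustration of the pattern-generating layer, the sketch below couples six Hopf oscillators (one per leg) with Kuramoto-style phase coupling to produce a tripod gait. The oscillator model, gains, and gait offsets are illustrative assumptions; the paper's second layer and RL-based parameter tuning are not shown.

```python
# Sketch of a coupled-oscillator CPG layer for hexapod gait generation.
import numpy as np

def cpg_step(state, phase_offsets, mu=1.0, omega=2*np.pi, k=2.0, dt=0.01):
    """Advance six coupled Hopf oscillators (one per leg) by one time step."""
    x, y = state[:, 0], state[:, 1]
    r2 = x**2 + y**2
    phase = np.arctan2(y, x)
    # Kuramoto-style coupling: speed up or slow down each oscillator so that
    # its phase settles at the desired offset relative to oscillator 0.
    omega_eff = omega + k * np.sin(phase[0] + phase_offsets - phase)
    dx = (mu - r2) * x - omega_eff * y
    dy = (mu - r2) * y + omega_eff * x
    return state + dt * np.stack([dx, dy], axis=1)

# Tripod gait: alternating legs are half a cycle out of phase.
tripod_offsets = np.array([0, np.pi, 0, np.pi, 0, np.pi])
state = np.random.randn(6, 2) * 0.1

for _ in range(2000):                       # let the limit cycle settle
    state = cpg_step(state, tripod_offsets)

joint_commands = state[:, 0]                # x-component drives each hip joint
print(joint_commands)
```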

3.
Front Robot AI; 6: 113, 2019.
Article in English | MEDLINE | ID: mdl-33501128

ABSTRACT

Soft robots have recently received much attention for their infinite degrees of freedom and continuously deformable structures, which allow them to adapt well to unstructured environments. A new type of soft actuator, namely the dielectric elastomer actuator (DEA), which has several excellent properties such as large deformation and high energy density, is investigated in this study. Furthermore, a DEA-based soft robot is designed and developed. Because accurate modeling is difficult owing to nonlinear electromechanical coupling and viscoelasticity, the iterative learning control (ILC) method is employed for motion trajectory tracking with an uncertain model of the DEA. A D²-type ILC algorithm is proposed for the task. Furthermore, a knowledge-based model framework with kinematic analysis is explored to prove the convergence of the proposed ILC. Finally, both simulations and experiments are conducted to demonstrate the effectiveness of the ILC, and the results show that excellent tracking performance can be achieved by the soft crawling robot.
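
The core of a D²-type ILC scheme is the trial-to-trial update u_{k+1}(t) = u_k(t) + Γ·ë_k(t), where ë_k is the second time-derivative of the tracking error from trial k. The sketch below applies that update to a toy mass-damper plant; the plant, reference, and learning gain are illustrative assumptions, not the paper's DEA model or convergence analysis.

```python
# Sketch of a D²-type iterative learning control update on a toy plant.
import numpy as np

dt, T = 0.01, 2.0
t = np.arange(0.0, T, dt)
ref = 0.01 * (1.0 - np.cos(np.pi * t))      # desired displacement trajectory (m)

def plant(u, m=0.05, c=0.1):
    """Toy mass-damper: m*x'' + c*x' = u, simulated with explicit Euler."""
    x = np.zeros_like(u)
    v = 0.0
    for i in range(1, len(u)):
        a = (u[i - 1] - c * v) / m
        v += a * dt
        x[i] = x[i - 1] + v * dt
    return x

u = np.zeros_like(t)                        # start each experiment from zero input
gamma = 0.045                               # learning gain on d²e/dt²
for trial in range(30):                     # ILC trials
    y = plant(u)
    e = ref - y
    # D²-type update: the next trial's input adds the second time-derivative
    # of this trial's tracking error, scaled by the learning gain.
    e_dd = np.gradient(np.gradient(e, dt), dt)
    u = u + gamma * e_dd
    print(f"trial {trial:2d}  RMS error = {np.sqrt(np.mean(e**2)):.2e}")
```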

4.
Bioinspir Biomim; 13(6): 066005, 2018 Oct 9.
Article in English | MEDLINE | ID: mdl-30221628

ABSTRACT

Soft actuators have played an indispensable role in generating compliant motions of soft robots. Among the various soft actuators explored for soft robotic applications, dielectric elastomer actuators (DEAs) have caught the eye with their intriguing attributes, which are similar to those of biological muscles. However, the control challenge posed by their strongly nonlinear behavior has hindered the development of DEA-based soft robots. To overcome this challenge, this paper proposes a bioinspired control approach for DEAs. A three-dimensional muscle-like DEA, capable of large forces and giant deformation, is fabricated and adopted as the control platform. To facilitate the controller design, the dynamic model of the DEA is developed through experimental analysis, taking electromechanical coupling, viscoelastic effects, and dynamic uncertainties into consideration. Motivated by the proprioception of biological muscles, the self-sensing capability of the actuator is explored and exhibits good accuracy; this self-sensing is therefore used to provide sensory feedback in the control loop without the need for additional external sensors. Inspired by the role of the cerebellum in motor learning, a cerebellar model articulation nonlinear controller is proposed to compensate for the dynamic uncertainties and provide motion correction. Finally, the effectiveness of the proposed control approach is verified by both simulations and experiments.
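
A cerebellar model articulation controller (CMAC) is essentially a tile-coded function approximator whose weights are adjusted online from the tracking error, typically layered on top of a conventional feedback loop. The sketch below shows that structure on a toy first-order actuator with self-sensed position feedback; the plant, tiling sizes, and gains are illustrative assumptions, not the paper's DEA model or controller.

```python
# Sketch of a CMAC-style learning compensator added to PD feedback.
import numpy as np

class CMAC:
    """Tile-coded function approximator over a 1-D input (the reference)."""
    def __init__(self, n_tilings=8, n_tiles=16, lo=0.0, hi=1.0, lr=0.1):
        self.n_tilings, self.n_tiles, self.lo, self.hi, self.lr = n_tilings, n_tiles, lo, hi, lr
        self.w = np.zeros((n_tilings, n_tiles + 1))

    def _active(self, x):
        # Each tiling is offset by a fraction of a tile width.
        frac = (x - self.lo) / (self.hi - self.lo) * self.n_tiles
        offsets = np.arange(self.n_tilings) / self.n_tilings
        return np.clip((frac + offsets).astype(int), 0, self.n_tiles)

    def predict(self, x):
        idx = self._active(x)
        return self.w[np.arange(self.n_tilings), idx].sum()

    def update(self, x, err):
        # Distribute the correction across the active cell of every tiling.
        idx = self._active(x)
        self.w[np.arange(self.n_tilings), idx] += self.lr * err / self.n_tilings

dt = 0.01
cmac = CMAC(lo=-1.0, hi=1.0)
kp, kd = 4.0, 0.2
x, prev_e = 0.0, 0.0

for step in range(3000):
    t = step * dt
    ref = 0.5 * np.sin(2 * np.pi * 0.5 * t)
    e = ref - x
    de = (e - prev_e) / dt
    prev_e = e
    u_fb = kp * e + kd * de                 # PD feedback on self-sensed position
    u_ff = cmac.predict(ref)                # CMAC feedforward compensation
    u = u_fb + u_ff
    cmac.update(ref, e)                     # adjust CMAC from the tracking error
    # Toy first-order actuator with an unmodeled nonlinearity to compensate.
    x += dt * (-2.0 * x + u + 0.3 * np.tanh(3.0 * x))
```

The CMAC here gradually absorbs the residual error left by the fixed feedback gains, which mirrors the paper's idea of cerebellum-inspired motion correction at a schematic level.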


Subject(s)
Biomimetics/methods, Musculoskeletal System/physiopathology, Robotics/methods, Cerebellum/physiology, Elastomers/chemistry, Feedback, Motion (Physics), Motor Neurons/physiology, Nonlinear Dynamics