Results 1 - 4 of 4
1.
Sci Rep; 12(1): 15938, 2022 Sep 24.
Article in English | MEDLINE | ID: mdl-36153413

ABSTRACT

Floor cleaning robots are widely used in public places such as food courts, hospitals, and malls to perform frequent cleaning tasks. However, frequent cleaning adversely impacts the robot's performance and consumes more cleaning accessories (such as the brush, scrubber, and mopping pad). This work proposes a novel selective area cleaning/spot cleaning framework for indoor floor cleaning robots using an RGB-D vision sensor-based Closed Circuit Television (CCTV) network, deep learning algorithms, and an optimal complete waypoint path planning method. In this scheme, the robot cleans only dirty areas instead of the whole region. The selective area cleaning/spot cleaning region is identified by combining two strategies: tracing human traffic patterns and detecting stains and trash on the floor. Here, a deep Simple Online and Real-time Tracking (SORT) human tracking algorithm was used to trace high human traffic regions, and a Single Shot Detector (SSD) MobileNet object detection framework was used to detect dirty regions. Further, optimal shortest-waypoint coverage path planning based on evolutionary optimization was incorporated to navigate the robot efficiently to the designated selective area cleaning/spot cleaning regions. The experimental results show that the SSD MobileNet algorithm achieved 90% accuracy for stain and trash detection on the floor. Furthermore, compared to conventional methods, the evolutionary optimization-based path planning scheme reduces navigation time by 15% and energy consumption by 10%.
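The abstract does not specify which evolutionary optimizer is used for the waypoint ordering, so the following is only a minimal sketch of the general idea: treating the dirty-spot waypoints flagged by the detector as a tour-ordering problem and evolving a short visiting order with a simple genetic algorithm. The function name evolve_waypoint_order, the population size, generation count, and mutation rate are illustrative assumptions, not values from the paper.

import random, math

def tour_length(order, pts):
    # Total closed-tour distance over the waypoints in the given visiting order
    return sum(math.dist(pts[order[i]], pts[order[(i + 1) % len(order)]])
               for i in range(len(order)))

def evolve_waypoint_order(pts, pop_size=60, generations=300, mutation_rate=0.2):
    # Elitist genetic algorithm with order crossover and swap mutation
    n = len(pts)
    population = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=lambda order: tour_length(order, pts))
        survivors = population[: pop_size // 2]          # keep the fittest half
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, n)
            child = a[:cut] + [g for g in b if g not in a[:cut]]   # order crossover
            if random.random() < mutation_rate:
                i, j = random.sample(range(n), 2)
                child[i], child[j] = child[j], child[i]            # swap mutation
            children.append(child)
        population = survivors + children
    return min(population, key=lambda order: tour_length(order, pts))

# Example: hypothetical dirty-spot waypoints (metres) reported by the detector
dirty_spots = [(0.5, 1.2), (3.0, 0.8), (2.2, 4.1), (5.0, 2.5), (1.1, 3.3)]
print(evolve_waypoint_order(dirty_spots))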


Subject(s)
Deep Learning, Robotics, Algorithms, Floors and Floorcoverings, Humans, Robotics/methods
2.
Sensors (Basel); 22(1), 2021 Dec 21.
Article in English | MEDLINE | ID: mdl-35009556

ABSTRACT

Vibration is an indicator of performance degradation or operational safety issues in mobile cleaning robots. Therefore, predicting the source of vibration at an early stage will help to avoid functional losses and hazardous operating environments. This work presents an artificial intelligence (AI)-enabled predictive maintenance framework for mobile cleaning robots to identify performance degradation and operational safety issues through vibration signals. A four-layer 1D CNN framework was developed and trained with a vibration signal dataset generated from the in-house developed autonomous steam mopping robot 'Snail' under different health conditions and hazardous operating environments. The vibration signals were collected using an IMU sensor and categorized into five classes: normal operational vibration, hazardous-terrain-induced vibration, collision-induced vibration, loose-assembly-induced vibration, and structure-imbalance vibration signals. The performance of the trained predictive maintenance framework was evaluated through various real-time field trials using statistical measurement metrics. The experimental results indicate that the proposed predictive maintenance framework accurately predicted performance degradation and operational safety issues by analyzing the vibration signal patterns produced by the cleaning robot in different test scenarios. Finally, a predictive maintenance map was generated by fusing the vibration signal classes onto the 2D environment map generated by the Cartographer SLAM algorithm.
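As a rough illustration of what a four-layer 1D CNN over IMU vibration windows could look like, the sketch below defines such a model in Keras with a softmax output over the five vibration classes. The window length, channel count, filter counts, and kernel sizes are assumptions made for illustration; the abstract does not give the paper's exact architecture or hyperparameters.

import tensorflow as tf
from tensorflow.keras import layers, models

WINDOW = 256    # assumed samples per vibration window
CHANNELS = 6    # assumed 3-axis accelerometer + 3-axis gyroscope from the IMU

model = models.Sequential([
    layers.Input(shape=(WINDOW, CHANNELS)),
    layers.Conv1D(32, 5, activation="relu"),
    layers.MaxPooling1D(2),
    layers.Conv1D(64, 5, activation="relu"),
    layers.MaxPooling1D(2),
    layers.Conv1D(128, 3, activation="relu"),
    layers.MaxPooling1D(2),
    layers.Conv1D(128, 3, activation="relu"),
    layers.GlobalAveragePooling1D(),
    layers.Dense(5, activation="softmax"),   # the five vibration classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()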


Subject(s)
Artificial Intelligence, Robotics, Algorithms, Vibration
3.
Sensors (Basel); 22(1), 2021 Dec 30.
Article in English | MEDLINE | ID: mdl-35009802

ABSTRACT

Periodic inspection of false ceilings is mandatory to ensure building and human safety. Generally, false ceiling inspection includes identifying structural defects, degradation in Heating, Ventilation, and Air Conditioning (HVAC) systems, electrical wire damage, and pest infestation. Human-assisted false ceiling inspection is a laborious and risky task. This work presents a false ceiling deterioration detection and mapping framework using a deep-neural-network-based object detection algorithm and the teleoperated 'Falcon' robot. The object detection algorithm was trained with our custom false ceiling deterioration image dataset composed of four classes: structural defects (spalling, cracks, pitted surfaces, and water damage), degradation in HVAC systems (corrosion, molding, and pipe damage), electrical damage (frayed wires), and infestation (termites and rodents). The efficiency of the trained CNN algorithm and the deterioration mapping was evaluated through various experiments and real-time field trials. The experimental results indicate that the framework detected and mapped deterioration accurately in a real false ceiling environment, achieving 89.53% detection accuracy.
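The abstract does not name the specific detection network, so the sketch below only illustrates one plausible mapping step under stated assumptions: running a trained torchvision-style detector over camera frames from the teleoperated robot and tagging each confident detection with the robot pose it was observed from, which is the kind of record a deterioration map would need. The function name, class list, and pose format are hypothetical, not taken from the paper.

import torch

CLASSES = ["background", "structural_defect", "hvac_degradation",
           "electrical_damage", "infestation"]   # index 0 reserved for background

def build_deterioration_map(frames_with_pose, detector, score_thresh=0.5):
    # Collect confident detections together with the robot pose they were seen from
    detector.eval()
    deterioration_map = []
    with torch.no_grad():
        for frame, pose in frames_with_pose:     # frame: CxHxW float tensor, pose: (x, y)
            preds = detector([frame])[0]         # torchvision output dict: boxes, labels, scores
            for label, score in zip(preds["labels"], preds["scores"]):
                if float(score) >= score_thresh:
                    deterioration_map.append({
                        "pose": pose,
                        "class": CLASSES[int(label)],
                        "score": float(score),
                    })
    return deterioration_map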


Subject(s)
Deep Learning, Robotics, Algorithms, Animals, Neural Networks (Computer), Rodents
4.
Sensors (Basel); 20(18), 2020 Sep 15.
Article in English | MEDLINE | ID: mdl-32942750

ABSTRACT

Insect detection and control at an early stage are essential to the built environment (human-made physical spaces such as homes, hotels, camps, hospitals, parks, pavement, food industries, etc.) and agricultural fields. Currently, such insect control measures are manual, tedious, unsafe, and time-consuming labor-dependent tasks. With the recent advancements in Artificial Intelligence (AI) and the Internet of Things (IoT), several maintenance tasks can be automated, which significantly improves productivity and safety. This work proposes a real-time remote insect trap monitoring system and insect detection method using IoT and Deep Learning (DL) frameworks. The remote trap monitoring system is constructed using IoT and the Faster RCNN (Region-based Convolutional Neural Network) ResNet50 (Residual Neural Network 50) unified object detection framework. The Faster RCNN ResNet50 object detection framework was trained with built environment insect and farm field insect images and deployed on the IoT framework. The proposed system was tested in real time using a four-layer IoT framework with built environment insect images captured through sticky trap sheets. Further, farm field insects were tested using a separate insect image database. The experimental results show that the proposed system can automatically identify built environment insects and farm field insects with an average accuracy of 94%.
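The Faster RCNN ResNet50 detector described here corresponds to the standard torchvision fasterrcnn_resnet50_fpn model, and a common way to adapt such a model to custom insect classes is to swap the box-predictor head, as in the sketch below. The class count is illustrative (the paper's exact label set is not given in the abstract), and a recent torchvision (0.13 or later, for the weights argument) is assumed; this is not claimed to be the authors' exact training setup.

import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

NUM_CLASSES = 1 + 2   # background + (built environment insect, farm field insect); illustrative

# Start from the COCO-pretrained Faster R-CNN ResNet50-FPN detector
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")

# Replace the classification head so it predicts the insect classes instead of COCO classes
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)

# The model can now be fine-tuned on the insect image dataset with a standard
# torchvision detection training loop.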


Subject(s)
Deep Learning, Insects, Internet of Things, Pest Control, Animals, Neural Networks (Computer)