1.
Dev Cogn Neurosci; 57: 101140, 2022 Oct.
Article in English | MEDLINE | ID: mdl-35926469

ABSTRACT

Event-related potential (ERP) designs are a common method for interrogating neurocognitive function with electroencephalography (EEG). However, the traditional method of preprocessing ERP data is manual editing, a subjective and time-consuming process. A number of automated pipelines have recently been created to address the need for standardization, automation, and quantification of EEG data preprocessing; however, few are optimized for ERP analyses (especially in developmental or clinical populations). We propose and validate the HAPPE plus Event-Related (HAPPE+ER) software, a standardized and automated preprocessing pipeline optimized for ERP analyses across the lifespan. HAPPE+ER processes event-related potential data from raw files through preprocessing and the generation of event-related potentials for statistical analyses. HAPPE+ER also includes post-processing reports of both data-quality and pipeline-quality metrics to facilitate the evaluation and reporting of data processing in a standardized manner. Finally, HAPPE+ER includes post-processing scripts for validating HAPPE+ER performance and/or comparing it to the performance of other preprocessing pipelines on users' own data via simulated ERPs. We describe multiple approaches using simulated and real ERP data to optimize pipeline performance and to compare it with other methods and pipelines. The HAPPE+ER software is freely available under the terms of the GNU General Public License at https://www.gnu.org/licenses/#GPL.
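To make the workflow concrete, the sketch below traces the ERP path the abstract describes (filter, segment around events, reject bad segments, average) in MNE-Python. It is an illustrative stand-in only: HAPPE+ER itself is distributed as MATLAB code, and the file name, filter band, and rejection threshold here are assumptions, not values taken from the pipeline.

    import mne

    # Load raw EEG; "sub-01.set" is a placeholder file name
    raw = mne.io.read_raw_eeglab("sub-01.set", preload=True)

    # Band-pass filter to a range commonly used for ERP analyses (assumed values)
    raw.filter(l_freq=0.1, h_freq=30.0)

    # Segment the recording around stimulus events, with baseline correction
    events, event_id = mne.events_from_annotations(raw)
    epochs = mne.Epochs(raw, events, event_id=event_id,
                        tmin=-0.2, tmax=0.8, baseline=(None, 0.0),
                        reject={"eeg": 150e-6},  # drop segments with extreme amplitudes
                        preload=True)

    # Average the surviving segments to obtain the event-related potential
    erp = epochs.average()

A pipeline like HAPPE+ER wraps steps of this kind, adds artifact correction suited to developmental and clinical data, and emits the data-quality and pipeline-quality reports the abstract mentions.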

2.
Neuroimage; 260: 119390, 2022 Oct 15.
Article in English | MEDLINE | ID: mdl-35817295

ABSTRACT

Lower-density electroencephalography (EEG) recordings (from 1 to approximately 32 electrodes) are widely used in research and clinical practice and enable scalable brain-function measurement across a variety of settings and populations. Though a number of automated pipelines have recently been proposed to standardize and optimize EEG preprocessing for high-density systems with state-of-the-art methods, few solutions have emerged that are compatible with lower-density systems. However, lower-density data often involve long recording times and/or large sample sizes that would benefit from similar standardization and automation with contemporary methods. To address this need, we propose the HAPPE In Low Electrode Electroencephalography (HAPPILEE) pipeline, a standardized, automated pipeline optimized for EEG recordings with lower-density channel layouts of any size. HAPPILEE processes task-free (e.g., resting-state) and task-related EEG (including event-related potential data, by interfacing with the HAPPE+ER pipeline) from raw files through a series of processing steps, including filtering, line noise reduction, bad channel detection, artifact correction of continuous data, segmentation, and bad segment rejection, all optimized for lower-density data. HAPPILEE also includes post-processing reports of data- and pipeline-quality metrics to facilitate the evaluation and reporting of data quality and processing-related changes to the data in a standardized manner. Here we describe the HAPPILEE steps and their optimization with both recorded and simulated EEG data, and then compare HAPPILEE's performance with other artifact correction and rejection strategies. The HAPPILEE pipeline is freely available as part of the HAPPE 2.0 software under the terms of the GNU General Public License at: https://github.com/PINE-Lab/HAPPE.


Subject(s)
Electroencephalography; Signal Processing, Computer-Assisted; Artifacts; Electrodes; Electroencephalography/methods; Software
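For orientation, the sketch below walks through the step sequence listed in the abstract (filtering, line noise reduction, bad channel handling, segmentation, bad segment rejection) with MNE-Python on a low-density recording. It is a conceptual sketch under assumed parameters, not HAPPILEE's actual MATLAB implementation; in particular, the simple notch filter and hand-picked bad-channel list stand in for the pipeline's optimized line-noise and channel-detection methods.

    import mne

    # Placeholder file name; any low-density recording MNE can read would do
    raw = mne.io.read_raw_fif("lowdensity-raw.fif", preload=True)

    # Filtering and line noise reduction (assumed band and 60 Hz mains)
    raw.filter(l_freq=1.0, h_freq=45.0)
    raw.notch_filter(freqs=[60.0])

    # Bad channel handling; a real pipeline would detect these automatically
    raw.info["bads"] = ["Fp1"]  # illustrative, chosen by hand here
    raw.interpolate_bads()

    # Segment task-free data into fixed-length epochs, then reject bad segments
    epochs = mne.make_fixed_length_epochs(raw, duration=2.0, preload=True)
    epochs.drop_bad(reject={"eeg": 200e-6})
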
3.
Appl Spectrosc; 72(9): 1322-1340, 2018 Sep.
Article in English | MEDLINE | ID: mdl-29855196

ABSTRACT

Spectral preprocessing is frequently required to render Raman spectra useful for further processing and analyses. The various preprocessing steps, individually and sequentially, are increasingly being automated to cope with large volumes of data from, for example, hyperspectral imaging studies. Full automation of preprocessing is especially desirable when it produces consistent results and requires minimal user input. It is therefore essential to evaluate the "quality" of such preprocessed spectra. However, relatively few methods exist to evaluate preprocessing quality, and fully automated methods for doing so are virtually non-existent. Here we provide a brief overview of fully automated spectral preprocessing and fully automated quality assessment of preprocessed spectra. We follow this with the introduction of fully automated methods to establish figures-of-merit that encapsulate preprocessing quality. By way of illustration, these quantitative methods are applied to simulated and real Raman spectra. Quality factor and quality parameter figures-of-merit resulting from individual preprocessing step quality tests, as well as overall figures-of-merit, were found to be consistent with the quality of preprocessed spectra.


Subject(s)
Algorithms; Laboratory Automation/methods; Laboratory Automation/standards; Spectrum Analysis, Raman/methods; Spectrum Analysis, Raman/standards; Animals; CHO Cells; Cricetinae; Cricetulus; Signal Processing, Computer-Assisted
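The abstract's central idea, a number that summarizes how well a preprocessing step worked, can be pictured with a toy figure-of-merit. The sketch below is not the paper's method: it computes one assumed quality factor, the ratio of point-to-point noise after versus before preprocessing, on a simulated spectrum.

    import numpy as np

    def noise_quality_factor(raw_spectrum, preprocessed_spectrum):
        # Toy figure-of-merit: values near 0 mean preprocessing removed most
        # point-to-point noise; values near 1 mean little was removed.
        noise_raw = np.std(np.diff(raw_spectrum))
        noise_pre = np.std(np.diff(preprocessed_spectrum))
        return noise_pre / noise_raw

    # Simulated Raman-like spectrum: one Gaussian band, a sloping baseline, noise
    x = np.linspace(0.0, 2000.0, 1024)
    band = np.exp(-0.5 * ((x - 1000.0) / 20.0) ** 2)
    rng = np.random.default_rng(0)
    raw = band + 0.0005 * x + 0.05 * rng.normal(size=x.size)
    clean = band  # pretend preprocessing recovered the noiseless band

    print(noise_quality_factor(raw, clean))  # near 0 for good preprocessing

A full evaluation along the paper's lines would score each preprocessing step (e.g., despiking, baseline removal, smoothing) separately and combine the results into overall figures-of-merit.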
4.
Front Neurosci; 12: 236, 2018.
Article in English | MEDLINE | ID: mdl-29692705

ABSTRACT

Existing tools for the preprocessing of EEG data provide a large choice of methods to suitably prepare and analyse a given dataset. Yet it remains a challenge for the average user to integrate methods for batch processing of the increasingly large datasets of modern research, and to compare methods in order to choose an optimal approach across the many possible parameter configurations. Additionally, many tools still require a high degree of manual decision making for, e.g., the classification of artifacts in channels, epochs, or segments. This introduces extra subjectivity, is slow, and is not reproducible. Batching and well-designed automation can help to regularize EEG preprocessing, and thus reduce human effort, subjectivity, and consequent error. The Computational Testing for Automated Preprocessing (CTAP) toolbox facilitates (i) batch processing that is easy for experts and novices alike, and (ii) testing and comparison of preprocessing methods. Here we demonstrate the application of CTAP to high-resolution EEG data in three modes of use. First, a linear processing pipeline with mostly default parameters illustrates ease of use for naive users. Second, a branching pipeline illustrates CTAP's support for comparing competing methods. Third, a pipeline with built-in parameter sweeping illustrates CTAP's capability to support data-driven method parameterization. CTAP extends the existing functions and data structure of the well-known Matlab-based EEGLAB toolbox and produces extensive quality-control outputs. CTAP is available under the MIT open-source licence from https://github.com/bwrc/ctap.
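The branching and parameter-sweep ideas are easy to picture with a toy example. The sketch below is plain Python written only to illustrate the concept; CTAP itself is a Matlab/EEGLAB toolbox, and the step functions and parameter names here are invented for the illustration.

    from itertools import product

    # Stand-in pipeline steps; in CTAP these would be EEGLAB-based operations
    def highpass(data, cutoff):
        return {**data, "highpass_hz": cutoff}

    def reject_artifacts(data, threshold):
        return {**data, "reject_uv": threshold}

    # Each combination of swept parameters defines one branch of the pipeline
    cutoffs = [0.1, 0.5, 1.0]      # high-pass cutoffs to sweep (Hz)
    thresholds = [100.0, 150.0]    # rejection thresholds to sweep (microvolts)

    results = {}
    for cutoff, threshold in product(cutoffs, thresholds):
        data = {"subject": "sub-01"}   # placeholder input
        data = highpass(data, cutoff)
        data = reject_artifacts(data, threshold)
        results[(cutoff, threshold)] = data

    # A real run would now compare branches on quality-control metrics
    for params, out in sorted(results.items()):
        print(params, out)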
