ABSTRACT
The conceptual expansion, fast development, and general acceptance of flow analysis are consequences of its adherence to the principles of green and white analytical chemistry, and chemical derivatization plays an essential role in this context. Throughout the development of flow analysis, however, some of its potentialities and limitations have been overlooked. This is most evident when the modifications involved in flow rates, timing, and/or manifold architecture deteriorate the analytical signals. These aspects have not always been systematically investigated, and are addressed here in relation to flow analyzers with UV-Vis spectrophotometric detection. Novel strategies for solution handling, guidance for dealing with the aforementioned deterioration of analytical signals, and an alternative possibility for exploiting differential aspiration are presented. The concept of a blank reagent carrier stream is proposed.
ABSTRACT
A recent paper shows that in gene expression space the manifold spanned by normal tissues and the manifold spanned by the corresponding tumors are disjoint. The statement is based on a two-dimensional projection of gene expression data. In the present paper, we show that, for the multi-dimensional vectors defining the centers of the sample clouds: 1. The closest tumor to a given normal tissue is the tumor developed in that tissue; 2. Two normal tissues define quasi-orthogonal directions; 3. A tumor may have a projection onto its corresponding normal tissue, but it is quasi-orthogonal to all other normal tissues; and 4. The cancer manifold is roughly obtained by translating the normal tissue manifold along an orthogonal direction defined by a global cancer progression axis. These geometrical properties add a new characterization of normal tissues and tumors and may have biological significance. Indeed, normal tissues at the vertices of a high-dimensional simplex could indicate genotype optimization for given tissue functions and a way of avoiding errors in embryonic development. On the other hand, the cancer progression axis could define relevant pan-cancer genes and seems to be consistent with the atavistic theory of tumors.
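The quasi-orthogonality property described above can be illustrated with a minimal NumPy sketch. The vectors below are synthetic stand-ins for tissue-centroid expression vectors, not the paper's data: in high dimension, independent directions have cosine similarity near zero.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for the centers of two normal-tissue sample clouds
# in a high-dimensional gene expression space (not real expression data).
n_genes = 5000
tissue_a = rng.normal(size=n_genes)
tissue_b = rng.normal(size=n_genes)

def cosine(u, v):
    """Cosine of the angle between two expression vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Independent high-dimensional vectors are nearly orthogonal (cosine ~ 0),
# which is the geometric picture the abstract describes for normal tissues.
c = cosine(tissue_a, tissue_b)
print(f"cosine(tissue_a, tissue_b) = {c:.4f}")
```

For real data, each centroid would be the mean log-expression vector over the samples of one tissue; the same cosine test then quantifies "quasi-orthogonal directions".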
Subject(s)
Neoplasms , Humans , Neoplasms/genetics , Neoplasms/pathology , Gene Expression Regulation, Neoplastic , Algorithms , Gene Expression Profiling/methods , Disease Progression
ABSTRACT
Single-cell transcriptomics (scRNA-seq) is revolutionizing biological research, yet it faces challenges such as inefficient transcript capture and noise. To address these challenges, methods like neighbor averaging or graph diffusion are used. These methods often rely on k-nearest neighbor graphs from low-dimensional manifolds. However, scRNA-seq data suffer from the 'curse of dimensionality', leading to the over-smoothing of data when using imputation methods. To overcome this, sc-PHENIX employs a PCA-UMAP diffusion method, which enhances the preservation of data structures and allows for a refined use of PCA dimensions and diffusion parameters (e.g., k-nearest neighbors, exponentiation of the Markov matrix) to minimize noise introduction. This approach enables a more accurate construction of the exponentiated Markov matrix (cell neighborhood graph), surpassing methods like MAGIC. sc-PHENIX significantly mitigates over-smoothing, as validated through various scRNA-seq datasets, demonstrating improved cell phenotype representation. Applied to a multicellular tumor spheroid dataset, sc-PHENIX identified known extreme phenotype states, showcasing its effectiveness. sc-PHENIX is open-source and available for use and modification.
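The core diffusion step described above (row-normalizing a cell-cell affinity graph into a Markov matrix and exponentiating it) can be sketched with NumPy. This is a deliberate simplification: sc-PHENIX builds its neighborhood graph on a PCA-UMAP embedding with k-nearest neighbors, whereas the toy below uses a plain Gaussian kernel on synthetic counts.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy expression matrix: 100 cells x 20 genes (synthetic counts).
X = rng.poisson(2.0, size=(100, 20)).astype(float)

# Cell-cell affinity from pairwise squared distances via a Gaussian kernel.
# (sc-PHENIX would compute neighborhoods on a PCA-UMAP embedding instead.)
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
sigma = np.median(d2)
A = np.exp(-d2 / sigma)

# Row-normalize to a Markov (diffusion) matrix; raising it to the power t
# diffuses information over t steps of the cell neighborhood graph.
M = A / A.sum(axis=1, keepdims=True)
t = 3
M_t = np.linalg.matrix_power(M, t)

# Imputed (denoised) expression: each cell becomes a weighted average of
# its diffusion neighborhood. Large t over-smooths, which is the failure
# mode the abstract says sc-PHENIX mitigates.
X_imputed = M_t @ X
```

The exponent `t` plays the role of the "exponentiation of the Markov matrix" parameter mentioned in the abstract.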
ABSTRACT
The use of sensors to improve the monitoring of a process and its variables is increasingly required, as it enables information to be obtained directly from the process while ensuring its quality. This is now possible thanks to advances in sensor fabrication and the development of equipment with high processing capability. These elements enable the development of portable smart systems that can be used directly to monitor a process and test its variables, which, in some cases, must otherwise be evaluated by laboratory tests to ensure high-accuracy measurement results. One such process is taste recognition and, in general, the classification of liquids, where electronic tongues have shown advantages over traditional monitoring: shorter analysis times, the possibility of online monitoring, and the use of artificial intelligence strategies for data analysis. Nevertheless, although some methods and strategies have been developed, strategies that improve the analysis of data from electrochemical sensors are still needed. Accordingly, this paper explores the application of an electronic tongue system to the classification of liquor beverages, applied directly to an alcoholic beverage found in specific regions of Colombia. The system uses eight commercial sensors and a data acquisition system, together with a machine-learning-based methodology developed for this aim. Results show the advantages of the system and its accuracy in the analysis and classification of this kind of alcoholic beverage.
Subject(s)
Electronic Nose , Taste , Artificial Intelligence , Beverages , Alcoholic Beverages , Tongue
ABSTRACT
The causes of ventricular fibrillation (VF) are not yet elucidated, and it has been proposed that different mechanisms might exist. Moreover, conventional analysis methods do not seem to provide time or frequency domain features that allow for recognition of different VF patterns in electrode-recorded biopotentials. The present work aims to determine whether low-dimensional latent spaces could exhibit discriminative features for different mechanisms or conditions during VF episodes. For this purpose, manifold learning using autoencoder neural networks was analyzed based on surface ECG recordings. The recordings covered the onset of the VF episode as well as the next 6 min, and comprised an experimental database based on an animal model with five situations, including control, drug intervention (amiodarone, diltiazem, and flecainide), and autonomic nervous system blockade. The results show that latent spaces from unsupervised and supervised learning schemes yielded moderate though quite noticeable separability among the different types of VF according to their type or intervention. In particular, unsupervised schemes reached a multi-class classification accuracy of 66%, while supervised schemes improved the separability of the generated latent spaces, providing a classification accuracy of up to 74%. Thus, we conclude that manifold learning schemes can provide a valuable tool for studying different types of VF while working in low-dimensional latent spaces, as the machine-learning generated features exhibit separability among different VF types. This study confirms that latent variables are better VF descriptors than conventional time or frequency domain features, making this technique useful in current VF research on elucidation of the underlying VF mechanisms.
Subject(s)
Electrocardiography , Ventricular Fibrillation , Animals , Electrocardiography/methods , Neural Networks, Computer
ABSTRACT
Chemical derivatization for improving selectivity and/or sensitivity is a common practice in analytical chemistry. It is particularly attractive in flow analysis in view of its highly reproducible reagent addition(s) and controlled timing. Then, measurements without attaining the steady state, kinetic discrimination, exploitation of unstable reagents and/or products, as well as strategies compliant with Green Analytical Chemistry, have been efficiently exploited. Flow-based chemical derivatization has been accomplished by different approaches, most involving flow and manifold programming. Solid-phase reagents, novel strategies for sample insertion and reagent addition, and strategies to increase sample residence time have also been exploited. However, the required alterations in flow rates and/or manifold geometry may lead to spurious signals (e.g., Schlieren effect) resulting in distorted peaks and a noisy/drifty baseline. These anomalies can be circumvented by a proper flow system design. In this review, these aspects are critically discussed mostly in relation to spectrophotometric and luminometric detection.
ABSTRACT
We used a finger force matching task to explore the role of efferent signals in force perception. Healthy, young participants performed accurate force production tasks at different force levels with the index and middle fingers of one hand (task-hand). They received visual feedback during an early part of each trial only. After the feedback was turned off, the force drifted toward lower magnitudes. After 5 s of the drift, the participants matched the force with the same finger pair of the other hand (match-hand). The match-hand consistently overshot the task-hand force by a magnitude invariant over the initial force levels. During force matching, both hands were lifted and lowered smoothly to estimate their referent coordinate (RC) and apparent stiffness values. These trials were performed without muscle vibration and under vibration applied to the finger/hand flexors or extensors of the task-hand or match-hand. Effects of vibration were seen in the match-hand only; they were the same during vibration of flexors and extensors. We interpret the vibration-induced effects as consequences of using distorted copies of the central commands to the task-hand during force matching. In particular, using distorted copies of the RC for the antagonist muscle group could account for the differences between the task-hand and match-hand. We conclude that efferent signals may be distorted before their participation in the perceptual process. Such distortions emerge spontaneously and may be amplified by the response of sensory endings to muscle vibration combined over both agonist and antagonist muscle groups.
Subject(s)
Psychomotor Performance , Vibration , Feedback, Sensory , Fingers , Humans , Muscles , Perception
ABSTRACT
Significance: Solutions for group-level analysis of connectivity from fNIRS observations exist, but groupwise explorative analysis with classical solutions is often cumbersome. Manifold-based solutions excel at data exploration, but there are infinite surfaces crossing the observations cloud of points. Aim: We aim to provide a systematic choice of surface for a manifold-based analysis of connectivity at group level with small surface interpolation error. Approach: This research introduces the interpolated functional manifold (IFM). IFM builds a manifold from reconstructed changes in concentrations of oxygenated (ΔcHbO2) and reduced (ΔcHbR) hemoglobin species by means of radial basis functions (RBF). We evaluate the root mean square error (RMSE) associated with four families of RBF. We validated our model against psychophysiological interactions (PPI) analysis using the Jaccard index (JI). We demonstrate the usability on an experimental dataset of surgical neuroergonomics. Results: The lowest interpolation RMSE was 1.26e-4 ± 1.32e-8 [A.U.] for ΔcHbO2 and 4.30e-7 ± 2.50e-13 [A.U.] for ΔcHbR. Agreement with classical group analysis was JI = 0.89 ± 0.01 for ΔcHbO2. Agreement with PPI analysis was JI = 0.83 ± 0.07 for ΔcHbO2 and JI = 0.77 ± 0.06 for ΔcHbR. IFM successfully decoded group differences [ANOVA: ΔcHbO2: F(2,117) = 3.07, p < 0.05; ΔcHbR: F(2,117) = 3.35, p < 0.05]. Conclusions: IFM provides a pragmatic solution to the problem of choosing the manifold associated with a cloud of points, facilitating the use of manifold-based solutions for the group analysis of fNIRS datasets.
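The RBF surface-fitting step at the heart of IFM can be sketched with SciPy. Everything below is synthetic (hypothetical channel positions and ΔcHbO2 values, not the paper's data), and the kernel named here is just one of the RBF families such a comparison would cover.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(2)

# Hypothetical layout: 2-D positions of 24 fNIRS channels and one
# synthetic ΔcHbO2 value per channel.
positions = rng.uniform(0, 1, size=(24, 2))
dhbo2 = np.sin(positions[:, 0] * 3) * np.cos(positions[:, 1] * 3)

# Fit a radial-basis-function surface through the observations; the kernel
# family is the kind of choice the abstract says is compared by RMSE.
rbf = RBFInterpolator(positions, dhbo2, kernel='thin_plate_spline')

# With zero smoothing the surface interpolates exactly, so the RMSE at the
# fitted points is ~0; held-out channels would give the meaningful error.
rmse = np.sqrt(np.mean((rbf(positions) - dhbo2) ** 2))
print(f"RMSE at fitted points: {rmse:.2e}")
```

Cross-validated RMSE over several `kernel` values (e.g., `'gaussian'`, `'multiquadric'`) would mirror the four-family comparison described above.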
ABSTRACT
A nonlinear feature extraction-based approach using manifold learning algorithms is developed to improve the classification accuracy of an electronic tongue sensor array. The signal processing methodology comprises four stages: data unfolding, scaling, feature extraction, and classification. This study compares seven manifold learning algorithms: Isomap, Laplacian Eigenmaps, Locally Linear Embedding (LLE), modified LLE, Hessian LLE, Local Tangent Space Alignment (LTSA), and t-Distributed Stochastic Neighbor Embedding (t-SNE), to find the best classification accuracy in a multifrequency large-amplitude pulse voltammetry electronic tongue. A sensitivity study of the parameters of each manifold learning algorithm is also included. A data set of seven different aqueous matrices is used to validate the proposed data processing methodology. Leave-one-out cross-validation was employed on the 63 samples. The best accuracy (96.83%) was obtained when the methodology used Mean-Centered Group Scaling (MCGS) for data normalization, the t-SNE algorithm for feature extraction, and k-nearest neighbors (kNN) as the classifier.
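The winning pipeline (scaling, t-SNE feature extraction, kNN with leave-one-out validation) can be sketched with scikit-learn. The data here are synthetic blobs standing in for the 63 voltammetric samples, and plain standardization stands in for Mean-Centered Group Scaling, which is specific to the paper.

```python
from sklearn.datasets import make_blobs
from sklearn.manifold import TSNE
from sklearn.model_selection import LeaveOneOut
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler

# Stand-in for the data set: 63 samples, 7 classes, many features.
X, y = make_blobs(n_samples=63, centers=7, n_features=50, random_state=0)

# Scaling, then t-SNE feature extraction down to 2 dimensions.
X_scaled = StandardScaler().fit_transform(X)
X_2d = TSNE(n_components=2, perplexity=10, random_state=0).fit_transform(X_scaled)

# Leave-one-out kNN on the embedded coordinates. Note: t-SNE has no
# out-of-sample transform, so the whole set is embedded before the split,
# as is common in such pipelines.
correct = 0
for train, test in LeaveOneOut().split(X_2d):
    knn = KNeighborsClassifier(n_neighbors=3).fit(X_2d[train], y[train])
    correct += int(knn.predict(X_2d[test])[0] == y[test][0])
print(f"LOO accuracy: {correct / len(y):.2%}")
```

Swapping `TSNE` for `Isomap` or `LocallyLinearEmbedding` reproduces the kind of algorithm comparison the abstract describes.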
ABSTRACT
We explored the origin of the impaired control of action stability in Parkinson's disease (PD) by testing levodopa-naïve PD patients, to disambiguate effects of PD from possible effects of long-term exposure to levodopa. Thirteen levodopa-naïve PD patients and 13 controls performed single- and multi-finger force production tasks, including producing a self-paced quick force pulse into a target. A subgroup of patients (n = 10) was re-tested about 1 h after their first dose of levodopa. Compared to controls, PD patients showed lower maximal forces and lower synergy indices stabilizing total force (reflecting a higher inter-trial variance component affecting total force). In addition, PD patients showed a trend toward shorter anticipatory synergy adjustments (a drop in the synergy index in preparation for a quick action) and larger non-motor-equivalent finger force deviations. Lower maximal force, higher unintentional force production (enslaving), and higher inter-trial variance indices were seen in PD patients after the single dose of levodopa. We conclude that impairment in synergies is present in levodopa-naïve patients, mainly in indices reflecting stability (synergy index) but not agility (anticipatory synergy adjustments). A single dose of levodopa, however, did not improve synergy indices, as it did in PD patients on chronic anti-PD medication, suggesting a different mechanism of action. The results suggest that indices of force-stabilizing synergies may be used as an early behavioral sign of PD, although they may not be sensitive to acute drug effects in drug-naïve patients.
Subject(s)
Antiparkinson Agents/pharmacology , Fingers/physiopathology , Levodopa/pharmacology , Motor Activity/physiology , Parkinson Disease/physiopathology , Psychomotor Performance/physiology , Aged , Aged, 80 and over , Female , Humans , Male , Middle Aged , Motor Activity/drug effects , Parkinson Disease/drug therapy , Psychomotor Performance/drug effects
ABSTRACT
DIMSUB is a computer program that complements a decision support tool (DST) for the effective study of different hydraulic design alternatives in microirrigation systems. We developed Visual Basic for Applications (VBA) environments for Microsoft Excel® that allow specific step-by-step functions to be created for the design of irrigation subunits. Different alternatives can be considered, such as emitter types, lateral and submain pipe sizes, different feeding points, irregular subunit shapes, and topographic slopes. Furthermore, specific uniformity criteria need to be considered to achieve efficient water application and properly designed systems. Lengths of lateral and submain pipe runs, the position of the hydrant connection, pressure head, and head loss in pipes or pressure-compensating emitters can be assigned to evaluate the results and choose the best design alternative. This user-friendly tool for studying hydraulic variables is expected to be a valuable aid in the decision-making process of designing irrigation systems. Examples of practical cases for designing drip irrigation subunits under specific crop conditions are given using DIMSUB.
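DIMSUB's internal formulas are not given in the abstract, but the head-loss evaluation it mentions for laterals can be sketched with the standard Hazen-Williams equation (SI form) plus a Christiansen multiple-outlet correction. The pipe dimensions, C factor, and emitter counts below are illustrative assumptions, not values from the paper.

```python
def hazen_williams_headloss(length_m, flow_m3s, c_factor, diameter_m):
    """Friction head loss (m) in a pipe, Hazen-Williams equation in SI units."""
    return 10.67 * length_m * flow_m3s**1.852 / (c_factor**1.852 * diameter_m**4.87)

# Hypothetical drip lateral: 50 m of 16 mm PE pipe (C ~ 150) feeding
# 25 emitters of 4 L/h each, i.e. 100 L/h entering the lateral.
q_inlet = 100 / 1000 / 3600          # L/h -> m^3/s
hf_full = hazen_williams_headloss(50, q_inlet, 150, 0.016)

# Discharge decreases along the lateral as emitters draw flow; the
# Christiansen factor (~0.37 for 25 outlets with exponent m = 1.852)
# corrects the full-flow head loss for this.
hf_lateral = 0.37 * hf_full
print(f"Approximate lateral head loss: {hf_lateral:.3f} m")
```

Comparing this head loss against an allowed pressure-variation budget is the kind of uniformity criterion a subunit design tool evaluates for each alternative.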
Subject(s)
Agricultural Irrigation/instrumentation , Software
ABSTRACT
We used force-matching tasks between the two hands to test predictions of the recently introduced scheme of perception based on the concept of iso-perceptual manifold (IPM) in the combined afferent-efferent space of neural signals. The main hypothesis was that accuracy and variability of individual finger force matching would be worse in a four-finger task compared to one-finger tasks. The subjects produced accurate force levels under visual feedback by pressing with either all four fingers or one of the fingers of a hand (task-hand). They tried to match the total four-finger force or individual finger forces by pressing with the other hand (match-hand, no visual feedback). The match-hand consistently overshot the task-hand force during single-finger matching episodes. It showed higher inter-trial force variability during single-finger matching when the task-hand performed the four-finger task compared to trials when the task-hand performed single-finger tasks. These findings confirm our main hypothesis by showing that perception of individual finger forces can vary in multi-finger tasks within a space (IPM) corresponding to veridical perception of total force. Matching hypothetical commands to fingers, rather than finger forces, could be responsible for the consistent force overshoots. Indices of inter-trial variance affecting and not affecting total force showed strong stabilization of total force in the task-hand, but not in the match-hand, in support of an earlier hypothesis on the importance of visual feedback for force stabilization. No differences were seen between the right and left hands, suggesting that the dynamic dominance hypothesis may not be generalizable to perceptual phenomena.
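The variance decomposition behind the "indices of inter-trial variance affecting and not affecting total force" can be sketched with NumPy. For total force F = Σf_i the Jacobian is a row of ones; variance in its null space (the uncontrolled manifold, UCM) leaves F unchanged, variance along it changes F. The forces below are synthetic, constructed so that total force is stabilized.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic four-finger forces over trials: finger shares of a fixed total
# co-vary negatively (total ~ constant), mimicking a stabilized task-hand.
n_trials, n_fingers = 30, 4
target = 12.0
shares = rng.dirichlet(np.ones(n_fingers), size=n_trials) * target
forces = shares + rng.normal(0, 0.05, size=(n_trials, n_fingers))

# UCM analysis for F = sum(f_i): Jacobian J = [1, 1, 1, 1].
J = np.ones((1, n_fingers))
basis_ort = J.T / np.linalg.norm(J)        # unit vector spanning the ORT space
dev = forces - forces.mean(axis=0)
proj_ort = dev @ basis_ort                 # components that change total force

# Per-dimension variances: ORT space is 1-D, UCM space is (n_fingers - 1)-D.
v_ort = np.mean(proj_ort**2)
v_ucm = np.mean((dev**2).sum(axis=1) - (proj_ort**2).ravel()) / (n_fingers - 1)

# Synergy index: positive when most variance is force-stabilizing.
v_tot = np.mean((dev**2).sum(axis=1)) / n_fingers
delta_v = (v_ucm - v_ort) / v_tot
print(f"V_UCM={v_ucm:.4f}, V_ORT={v_ort:.4f}, deltaV={delta_v:.2f}")
```

With these synthetic data V_UCM dominates V_ORT, i.e., a strongly positive synergy index, the signature of total-force stabilization reported for the task-hand.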
Subject(s)
Fingers/physiology , Motor Skills/physiology , Perception/physiology , Psychomotor Performance/physiology , Adult , Feedback, Sensory , Female , Functional Laterality , Hand Strength , Humans , Kinesthesis , Male , Muscle Contraction , Proprioception , Touch Perception
ABSTRACT
Consider a probability measure µ and let Pµ be the set of µ-equivalent strictly positive probability densities. To endow Pµ with the structure of a C∞-Banach manifold we use the φ-connection by an open arc, where φ is a deformed exponential function which is zero up to a certain point and strictly increasing from then on. This deformed exponential function has as particular cases the q-deformed exponential and κ-exponential functions. Moreover, we find the tangent space of Pµ at a point p and, as a consequence, the tangent bundle of Pµ. We define a divergence using the q-exponential function and prove that this divergence is related to the q-divergence already known from the literature. We also show that the q-exponential and κ-exponential functions can be used to generalize the Rényi divergence.
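For reference, the two deformed exponentials named above as particular cases have the following standard forms (the Tsallis q-exponential and the Kaniadakis κ-exponential; the notation is conventional, not taken from the paper):

```latex
% q-deformed exponential; recovers e^x as q -> 1
\exp_q(x) = \bigl[\,1 + (1-q)\,x\,\bigr]_{+}^{\frac{1}{1-q}}, \qquad q \neq 1

% kappa-exponential; recovers e^x as kappa -> 0
\exp_{\kappa}(x) = \Bigl(\kappa x + \sqrt{1 + \kappa^{2} x^{2}}\,\Bigr)^{1/\kappa}, \qquad \kappa \neq 0
```

Both are strictly increasing where positive, which is the property the φ-connection construction above relies on.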
ABSTRACT
The transition from occasional to obligate bipedalism is a milestone in human evolution. However, because the fossil record is fragmentary and reconstructing behaviour from fossils is difficult, changes in the motor control strategies that accompanied this transition remain unknown. Quadrupedal primates that adopt a bipedal stance while using percussive tools provide a unique reference point to clarify one aspect of this transition, which is maintaining bipedal stance while handling massive objects. We found that while cracking nuts using massive stone hammers, wild bearded capuchin monkeys (Sapajus libidinosus) produce hammer trajectories with highly repeatable spatial profiles. Using an uncontrolled manifold analysis, we show that the monkeys used strong joint synergies to stabilize the hammer trajectory while lifting and lowering heavy hammers. The monkeys stringently controlled the motion of the foot. They controlled the motion of the lower arm and hand rather loosely, showing a greater variability across strikes. Overall, our findings indicate that while standing bipedally to lift and lower massive hammers, an arboreal quadrupedal primate must control motion in the joints of the lower body more stringently than motion in the joints of the upper body. Similar changes in the structure of motor variability required to accomplish this goal could have accompanied the evolutionary transition from occasional to obligate bipedalism in ancestral hominins.
Subject(s)
Cebinae/physiology , Joints/physiology , Posture , Tool Use Behavior , Animals , Biomechanical Phenomena , Brazil , Nuts
ABSTRACT
In this paper, we investigate the mixture arc on generalized statistical manifolds. We ensure that the generalization of the mixture arc is well defined, and we provide a generalization of the open exponential arc and its properties. We consider the model of a φ-family of distributions to describe our general statistical model.
ABSTRACT
Continuous veno-venous hemofiltration (CVVH; HFVVC in the Spanish original) is a technique performed in patients recovering from cardiac surgery. It consists of extracting plasma water from the circulatory system through a semipermeable membrane (hemoconcentrator) by convection and pressure gradient, with the aim of reducing hypervolemia, improving renal function, eliminating mediators of the inflammatory response, and optimizing the patients' hemodynamic state. This procedure is performed by the perfusion nursing staff in the intensive care unit with a circuit that is assembled and designed in a clean ("white") area. One of the main advantages attributed to CVVH therapies is the hemodynamic tolerance to the gradual, controlled extraction of large fluid volumes in critically ill patients.