ABSTRACT
PURPOSE OF REVIEW: The goal of this article is to present a concise review of digital twin (DT) methodology and its application in food processing. We aim to identify the building blocks, current state and bottlenecks, and to discuss future developments of this approach. RECENT FINDINGS: DT methodology appears to be a powerful approach for the digital transformation of food production, via the integration of modelling and simulation tools, sensors, actuators and communication platforms. This methodology enables the development of virtual environments for real-time monitoring and control of processes, and provides actionable metrics for decision-making that cannot be obtained from physical sensors alone. So far, the main applications have focused on refrigerated transport and storage of fresh produce, and on thermal processes such as cooking and drying. DT methodology can provide the food industry with useful solutions for productivity and sustainability, but it requires multidisciplinary effort. Wide and effective implementation of this approach will largely depend on developing high-fidelity digital models with real-time simulation capability.
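To make the sensor-model-decision loop described above concrete, here is a minimal Python sketch of a digital-twin cycle for a refrigerated storage room: a simulated temperature probe feeds a first-order cooling model that is re-anchored at every reading and used to forecast 30 minutes ahead, a metric a physical sensor alone cannot provide. All constants, names and the toy cooling model are illustrative assumptions, not taken from the reviewed studies.

```python
import math
import random

# Minimal digital-twin loop for a refrigerated storage room (illustrative only).
# The physical asset is mimicked by a noisy "sensor"; the twin is a first-order
# cooling model dT/dt = -k * (T - T_set) re-anchored to every reading.

K = 0.02           # assumed cooling rate constant of the twin (1/min)
T_SET = 2.0        # refrigeration setpoint (degC)
ALARM_LIMIT = 8.0  # product-safety threshold (degC)

def read_sensor(true_temp: float) -> float:
    """Stand-in for a real temperature probe: truth plus measurement noise."""
    return true_temp + random.gauss(0.0, 0.2)

def twin_forecast(temp_now: float, horizon_min: float) -> float:
    """Analytical solution of the first-order cooling model."""
    return T_SET + (temp_now - T_SET) * math.exp(-K * horizon_min)

true_temp = 25.0   # warm product just loaded
for minute in range(0, 30, 5):
    measurement = read_sensor(true_temp)
    forecast = twin_forecast(measurement, 30.0)
    action = "increase cooling power" if forecast > ALARM_LIMIT else "no action"
    print(f"t={minute:2d} min  T={measurement:5.2f} degC  "
          f"30-min forecast={forecast:5.2f} degC  -> {action}")
    # crude stand-in for the real asset cooling between readings (5 min elapse)
    true_temp = T_SET + (true_temp - T_SET) * math.exp(-0.03 * 5)
```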
Subject(s)
Food Handling, Food Handling/methods, Humans, Cooking/methods, Computer Simulation
ABSTRACT
While the estimation of hospital costs concerns the past, their planning focuses on the future. However, in many low- and middle-income countries, public hospitals do not have robust health accounting systems to evaluate and project their expenses. In Brazil, public hospitals are funded based on government estimates of available hospital infrastructure, historical expenditures and population needs. However, these pieces of information are not always readily available for all hospitals. To address this challenge, we propose a flexible simulation-based optimisation algorithm that integrates this dual task: estimating and planning hospital costs. The method was applied to a network of 17 public hospitals in Brazil to produce the estimates. Setting the model parameters for population needs and future hospital infrastructure allows the method to be used as a cost-projection tool for divestment, maintenance, or investment. Results show that the method can aid health managers in hospitals' global budgeting and policymakers in improving fairness in hospitals' financing.
Subject(s)
Hospital Costs, Hospitals, Public, Hospitals, Public/economics, Brazil, Hospital Costs/statistics & numerical data, Humans, Algorithms
ABSTRACT
Antibiotics, as a class of environmental pollutants, pose a significant challenge due to their persistent nature and resistance to easy degradation. This study delves into modeling and optimizing the conventional Fenton degradation of the antibiotic sulfamethoxazole (SMX) and total organic carbon (TOC) under varying levels of H2O2, Fe2+ concentration, pH, and temperature using statistical and artificial intelligence techniques, including Multiple Regression Analysis (MRA), Support Vector Regression (SVR) and Artificial Neural Network (ANN). In statistical metrics, the ANN model demonstrated superior predictive accuracy compared to its counterparts, with the lowest RMSE values of 0.986 and 1.173 for SMX and TOC removal, respectively. Sensitivity analysis identified the H2O2/Fe2+ ratio, time and pH as pivotal for SMX degradation, while for simultaneous SMX and TOC reduction, fine-tuning the time, pH, and temperature was essential. Leveraging a Hybrid Genetic Algorithm-Desirability Optimization approach, the trained ANN model revealed an optimal desirability of 0.941 among 1000 solutions, which yielded 91.18% SMX degradation and 87.90% TOC removal under the following conditions: treatment time of 48.5 min, Fe2+: 7.05 mg L⁻¹, H2O2: 128.82 mg L⁻¹, pH: 5.1, initial SMX: 97.6 mg L⁻¹, and temperature: 29.8 °C. LC/MS analysis revealed multiple intermediates with higher m/z values (242, 270 and 288) and lower m/z values (98, 108, 156 and 173); however, no aliphatic hydrocarbon was isolated because of the low mineralization performance of the Fenton process. Furthermore, some inorganic fragments such as NH4⁺ and NO3⁻ were also detected in solution. This comprehensive research enriches AI modeling for intricate Fenton-based contaminant degradation, advancing sustainable antibiotic removal strategies.
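The abstract does not report the network architecture, training data or GA settings, so the following sketch only illustrates the ANN-plus-genetic-algorithm workflow under stated assumptions: a scikit-learn MLPRegressor is trained on synthetic removal data and a hand-rolled GA then maximises a Derringer-style geometric-mean desirability of the predicted SMX and TOC removals. Variable bounds, responses and GA parameters are placeholders, not the paper's values.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Columns: time (min), Fe2+ (mg/L), H2O2 (mg/L), pH, initial SMX (mg/L), T (degC)
BOUNDS = np.array([[10, 60], [1, 10], [50, 200], [3, 7], [50, 150], [20, 40]])

# Placeholder training data; in the study these would be measured removals (%).
X = rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(120, 6))
y = np.c_[60 + 0.4 * X[:, 0] - 2.0 * abs(X[:, 3] - 5),    # fake SMX removal
          40 + 0.5 * X[:, 0] - 1.5 * abs(X[:, 3] - 5)]    # fake TOC removal

ann = MLPRegressor(hidden_layer_sizes=(8, 8), max_iter=5000, random_state=0).fit(X, y)

def desirability(removals):
    """Geometric mean of two 0-1 desirabilities (target: 100 % removal)."""
    d = np.clip(removals / 100.0, 0.0, 1.0)
    return float(np.sqrt(d[0] * d[1]))

# Minimal genetic algorithm over the operating conditions.
pop = rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(40, 6))
for generation in range(60):
    fitness = np.array([desirability(ann.predict(ind[None, :])[0]) for ind in pop])
    parents = pop[np.argsort(fitness)[-20:]]                       # truncation selection
    # arithmetic crossover of two random parents, then Gaussian mutation
    children = (parents[rng.integers(0, 20, 40)] + parents[rng.integers(0, 20, 40)]) / 2
    children += rng.normal(0, 0.05, children.shape) * (BOUNDS[:, 1] - BOUNDS[:, 0])
    pop = np.clip(children, BOUNDS[:, 0], BOUNDS[:, 1])

best = pop[np.argmax([desirability(ann.predict(ind[None, :])[0]) for ind in pop])]
print("best conditions:", np.round(best, 2),
      "| predicted removals:", np.round(ann.predict(best[None, :])[0], 1))
```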
Subject(s)
Anti-Bacterial Agents, Artificial Intelligence, Hydrogen Peroxide, Iron, Neural Networks, Computer, Sulfamethoxazole, Sulfamethoxazole/chemistry, Hydrogen Peroxide/chemistry, Anti-Bacterial Agents/chemistry, Iron/chemistry, Water Pollutants, Chemical/chemistry, Water Pollutants, Chemical/analysis, Hydrogen-Ion Concentration, Temperature
ABSTRACT
Electroporation is a vital process that facilitates the use of modern recombineering and other high-throughput techniques in a wide array of microorganisms, including non-model bacteria such as plant growth-promoting bacteria (PGPB). These microorganisms play a significant role in plant health by colonizing plants and promoting growth through nutrient exchange and hormonal regulation. In this study, we introduce a sequential Design of Experiments (DOE) approach to obtain highly competent cells swiftly and reliably for electroporation. Our method focuses on optimizing the three stages of the electroporation procedure (preparing competent cells, applying the electric pulse field, and recovering transformed cells) separately. We utilized a split-plot fractional design with five factors and a covariate to optimize the first step, response surface methodology (RSM) for the second step, and a Plackett-Burman design with two categorical factors and one continuous factor for the final step. Following the experimental sequence with three bacterial models, we achieved efficiencies 10 to 100 times higher, reaching the order of 10⁵ to 10⁶ CFU/µg of circular plasmid DNA. These results highlight the significant potential for enhancing electroporation protocols for non-model bacteria.
Subject(s)
DNA, Transformation, Bacterial, Plasmids, Electroporation/methods, Plants, Bacteria/genetics
ABSTRACT
Intensity modulated radiation therapy (IMRT) is one of the most widely used techniques for cancer treatment. Using a linear accelerator, it delivers radiation directly to the cancerous cells in the tumour, reducing the impact of the radiation on the organs surrounding the tumour. The complexity of the IMRT problem forces researchers to subdivide it into three sub-problems that are addressed sequentially. Using this sequential approach, we first need to find a beam angle configuration, that is, the set of irradiation points (beam angles) over which the radiation is delivered to the tumour. This first problem is called the Beam Angle Optimisation (BAO) problem. Then, we must optimise the radiation intensity delivered from each angle to the tumour. This second problem is called the Fluence Map Optimisation (FMO) problem. Finally, we need to generate a set of apertures for each beam angle, making the intensities computed in the previous step deliverable. This third problem is called the Sequencing problem. Solving these three sub-problems sequentially allows clinicians to obtain a treatment plan that is physically deliverable. However, the obtained treatment plans generally have too many apertures, resulting in long delivery times. One strategy to avoid this problem is the Direct Aperture Optimisation (DAO) problem. In the DAO problem, the idea is to merge the FMO and Sequencing problems, so that the optimisation of the radiation intensities takes into account the physical constraints of the delivery process. The DAO problem is usually modelled as a mixed-integer optimisation problem and aims to determine the aperture shapes and their corresponding radiation intensities, considering the physical constraints imposed by the Multi-Leaf Collimator device. By solving the DAO problem, it is possible to generate clinically acceptable treatments that can be delivered to patients without additional sequencing steps. In this work, we propose to solve the DAO problem using the well-known Particle Swarm Optimisation (PSO) algorithm. Our approach integrates mathematical programming to optimise the intensities and uses PSO to optimise the aperture shapes. Additionally, we introduce a reparation heuristic to improve aperture shapes with minimal impact on the treatment plan. We apply the proposed algorithm to prostate cancer cases and compare our results with those obtained with the sequential approach. Results show that PSO obtains competitive results compared to the sequential approach, requiring less radiation time (beam-on time) and using the available apertures more efficiently.
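Since the abstract does not give the particle encoding or swarm parameters, the sketch below only demonstrates the standard PSO velocity and position updates on a toy objective; a comment marks where the MLC aperture encoding and the reparation heuristic would plug in. Every constant here is an assumed textbook value, not one from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy objective standing in for the treatment-plan score; in the DAO setting each
# particle would encode MLC leaf positions per aperture, and the score would come
# from the dose model after a repair heuristic enforces deliverability.
def objective(x: np.ndarray) -> float:
    return float(np.sum((x - 0.3) ** 2))      # minimise

N_PARTICLES, DIM = 30, 10
W, C1, C2 = 0.72, 1.49, 1.49                  # inertia, cognitive and social weights

pos = rng.uniform(-1, 1, (N_PARTICLES, DIM))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([objective(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)].copy()

for it in range(200):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = W * vel + C1 * r1 * (pbest - pos) + C2 * r2 * (gbest - pos)
    pos = pos + vel
    # (here a repair step would round/clip pos back to deliverable leaf positions)
    vals = np.array([objective(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print("best value after 200 iterations:", objective(gbest))
```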
ABSTRACT
Traditional methods for designing concrete mixtures provide good results; however, they do not guarantee the optimum composition. Consequently, the application of operational research techniques is motivated by designers' increasing need to proportion the concrete's raw materials so that the mixture satisfies performance requirements such as mechanical properties, chemical properties, workability, sustainability, and cost. For this reason, many authors have been looking to mathematical programming and machine learning solutions to predict concrete mix properties and optimise concrete mixtures. Therefore, a comprehensive review of operational research techniques for the design and proportioning of concrete mixtures, together with a classification framework, is presented herein.
ABSTRACT
The extraction of total lipids and tocopherol compounds from Patagonian squid (Doryteuthis gahi) by-products (viscera, heads, skin, etc.), resulting from squid mantle commercialisation, was studied. A simplex-lattice optimisation design employing low-toxicity solvents (ethanol, acetone, and ethyl acetate) was carried out, taking into account their relative concentrations. The analysis of variance showed that the quadratic model was statistically significant (p < 0.05); empirical coded equations were obtained as a function of the low-toxicity solvent ratios. The optimised lipid extraction was obtained with a 0.642/0.318/0.040 (ethanol/acetone/ethyl acetate) solvent ratio, leading to an 84% recovery of the total lipids extracted by the traditional procedure. In all extracting systems tested, the presence of α-, γ-, and δ-tocopherol was detected, α-tocopherol being the most abundant. For the α-, γ-, and δ-tocopherol compounds, the optimisation process showed that acetone extraction led to the highest concentrations in the lipid extract obtained (2736.5, 36.8, and 2.8 mg·kg⁻¹ lipids, respectively). Taking into account the recovery yield on a by-product basis, the values obtained for the three tocopherols fell within the 88.0-97.7%, 80.0-95.0%, and 25-75% ranges, respectively, when compared to the traditional extraction. This study provides a novel and valuable route for α-tocopherol extraction from marine by-products.
ABSTRACT
Research background: Research into bacterial cellulose production has been growing rapidly in recent years, as it has a potential use in various applications, such as in the medical and food industries. Previous studies have focused on optimising the production process through various methods, such as using different carbon sources and manipulating environmental conditions. However, further research is still needed to optimise the production process and understand the underlying mechanisms of bacterial cellulose synthesis. Experimental approach: We used Plackett-Burman and Box-Behnken experimental designs to analyse the effect of different factors on bacterial cellulose production. The fermentation kinetics of the optimised medium was analysed, and the produced cellulose was characterised. This approach was used because it allows the identification of significant factors influencing bacterial cellulose growth, the optimisation of the culture medium and the characterisation of the produced cellulose. Results and conclusions: The results showed that higher sucrose concentrations, higher kombucha volume fractions and a smaller size of the symbiotic culture of bacteria and yeast were the most important factors for the improvement of bacterial cellulose production, while the other factors had no relevant influence. The optimised medium showed an increase in the concentrations of total phenolic compounds and total flavonoids as well as significant antioxidant activity. The produced pure bacterial cellulose had a high water absorption capacity as well as high crystallinity and thermal stability. Novelty and scientific contribution: The study makes an important scientific contribution by optimising the culture medium to produce bacterial cellulose more productively and efficiently. The optimised medium can be used for the production of a kombucha-like beverage with a high content of bioactive compounds and for the production of bacterial cellulose with high crystallinity and thermal stability. Additionally, the study highlights the potential of bacterial cellulose as a highly water-absorbent material with applications in areas such as packaging and biomedical engineering.
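As an illustration of the screening step, the sketch below builds a 12-run Plackett-Burman design from the classical generator row and estimates main effects by the usual high-minus-low contrast. The factor names, the simulated cellulose yields and the resulting effect sizes are invented placeholders, not the study's data, and the generator row is an assumption taken from standard DOE references.

```python
import numpy as np

# 12-run Plackett-Burman design from the classical generator row (11 columns).
# Assumed generator; rows 1-11 are cyclic shifts, row 12 is all low (-1).
gen = np.array([+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1])
design = np.array([np.roll(gen, i) for i in range(11)] + [-np.ones(11, dtype=int)])

factors = ["sucrose", "kombucha_vol", "scoby_size"]   # real factors of interest
# columns 3..10 act as dummy factors usable for error estimation

rng = np.random.default_rng(2)
# Fake cellulose yields (g/L): positive sucrose/kombucha effects, negative size effect.
yield_gl = (5.0 + 1.2 * design[:, 0] + 0.9 * design[:, 1] - 0.8 * design[:, 2]
            + rng.normal(0, 0.2, 12))

# For a coded +/-1 orthogonal design, each main effect is the mean difference
# between the high and low settings, i.e. X^T y * 2 / n.
effects = design.T @ yield_gl * 2 / 12
for name, eff in zip(factors, effects[:3]):
    print(f"{name:>12s}: estimated main effect {eff:+.2f} g/L")
```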
ABSTRACT
In many households, the preparation of food proves problematic even in normal times, particularly when parents endeavour to keep their children on a balanced diet. The COVID-19 pandemic further exacerbated this problem by imposing social distancing requirements, which led to disruptions in the food supply chain and multiplied the responsibilities faced by families with children. The present study revisits the standard "Diet Problem" to address these challenges and to develop a participatory approach that provides a diversified weekly meal plan that is easy and fun but simultaneously complies with the unique requirements of each participant. This is done through a novel framework which combines linear optimisation with the Parsimonious Analytic Hierarchy Process, a method for individual choices. This novel approach to participatory modelling is tested within two young-family settings in Brazil. The model produced through this framework provides a weekly menu that best meets the expectations of the members of a young family in the context of the COVID-19 pandemic.
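A minimal sketch of the underlying diet linear programme, assuming scipy.optimize.linprog as the solver: the foods, prices and nutrient requirements are made-up placeholders, and the Parsimonious-AHP preference weights enter only as a crude rescaling of the cost coefficients, which is one simple way (not necessarily the paper's) of blending preferences into the objective.

```python
import numpy as np
from scipy.optimize import linprog

# Foods (columns): rice, beans, chicken, milk -- placeholder data, per 100 g.
cost = np.array([0.20, 0.35, 0.90, 0.30])        # currency units per 100 g (illustrative)
protein = np.array([2.7, 9.0, 27.0, 3.3])        # g per 100 g
energy = np.array([130, 140, 165, 60])           # kcal per 100 g

# Parsimonious-AHP-style preference weights (higher = more liked); in the paper
# these come from pairwise comparisons made by the family members.
preference = np.array([0.8, 0.9, 0.6, 1.0])

# Objective: minimise cost discounted by preference.
c = cost / preference

# Constraints: at least 60 g protein and 2000 kcal per day (>= becomes <= with sign flip).
A_ub = -np.vstack([protein, energy])
b_ub = -np.array([60.0, 2000.0])
bounds = [(0, 10)] * 4                            # at most 1 kg of any single food

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print("servings of 100 g per day:", np.round(res.x, 2),
      "| daily cost proxy:", round(res.fun, 2))
```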
ABSTRACT
Trub, a brewing by-product, can be used as an alternative ingredient for the nutritional enrichment of foods once its bitter compounds have been extracted. This study presents the optimisation of bitter-compound extraction from trub using a Box-Behnken design, and the use of debittered trub (DT) as a new ingredient to enrich pasta. The bitterness extraction process was evaluated at different pH levels, extraction times and numbers of extraction steps, and the physical-chemical properties of DT (obtained under optimal conditions) were evaluated. Pasta was enriched with DT (5%, 10% and 15%) and its physical-chemical and quality properties were evaluated. The protein structure and chemical composition of trub were altered by the process, which also modified its technological properties. Pasta with 10% DT showed a 33.51% increase in protein content. The interaction of DT with wheat proteins resulted in a more compact structure, and the water absorption capacity of DT changed the pasta texture. The use of DT improved the nutritional and quality properties of the pasta, enabling trub valorisation and its use as an alternative source of vegetable protein.
Subject(s)
Flour, Triticum, Triticum/chemistry, Flour/analysis, Cooking, Quality Improvement, Plant Proteins, Dietary, Water
ABSTRACT
Machine learning research has been able to solve problems in multiple domains, and machine learning remains an open area of research for solving optimisation problems. Optimisation problems can be solved using a metaheuristic algorithm, which can find a solution in a reasonable amount of time. However, finding an appropriate metaheuristic algorithm, with a configuration suited to solving a given set of optimisation problems properly, is itself time-consuming. The proposal described in this article is an approach that automatically creates metaheuristic algorithms for a given set of optimisation problems. These metaheuristic algorithms are created by modifying their logical structure via the execution of an evolutionary process. This process employs an extension of the reinforcement learning approach that considers multiple agents in their environment, together with a learning agent composed of an analysis process and an algorithm-modification process. In the experiments performed, the approach succeeded in creating a metaheuristic algorithm that solved different continuous-domain optimisation problems. The implications of this work are immediate, because it provides a basis for the generation of metaheuristic algorithms through online evolution.
ABSTRACT
Piper sarmentosum is a herbaceous shrub with numerous pharmacological benefits. However, the presence of two toxic phenylpropanoids (α- and β-asarone) limits the medicinal usage of the plant. In this study, the extraction of three asarone isomers, namely α-, β-, and γ-asarone, was optimised using supercritical carbon dioxide extraction (SC-CO2) combined with a Box-Behnken experimental design. The asarone contents of different conventional solvent extracts of P. sarmentosum leaves were compared prior to and after SC-CO2 extraction. The SC-CO2 method successfully maximised the extraction of α-, β-, and γ-asarone at P = 81.16 bar, T = 50.11 °C, and t = 80.90 min, yielding 13.91% α-asarone, 3.43% β-asarone, and 14.95% γ-asarone. The SC-CO2 residue of the leaves re-extracted with conventional solvents showed a significant decrease in asarone, ranging from 45% to 100% (p < 0.001), compared to their counterparts without SC-CO2 treatment. α-, β-, and γ-asarone were completely removed in the ethanol extract of the residue. These findings suggest that the optimised SC-CO2 extraction parameters may serve as a quick treatment step for the selective removal of asarone from P. sarmentosum to develop safer extracts for applications in the food and nutraceutical industries.
ABSTRACT
Green extraction was applied to Argentinean shortfin squid (Illex argentinus) viscera, consisting of a wet pressing method that included a drying step, mechanical pressing, centrifugation of the resulting slurry, and oil collection. To maximise the oil yield and ω3 fatty acid content and to minimise the degree of oil damage, a response surface methodology (RSM) design was developed focused on the drying temperature (45-85 °C) and time (30-90 min). In general, an increase in drying time and temperature led to an increase in the lipid yield recovered from the viscera. The strongest drying conditions gave a recovery higher than 50% of that obtained with the traditional chemical method. The docosahexaenoic and eicosapentaenoic acid contents of the extracted oil showed little dependence on the drying conditions, remaining within valuable ranges (149.2-166.5 and 88.7-102.4 g·kg⁻¹ oil, respectively). Furthermore, the values of free fatty acids, peroxides, conjugated dienes, and the ω3/ω6 ratio did not show extensive differences when comparing oils obtained under the different drying conditions. In contrast, a decrease in the polyene index (PI) was detected with increasing drying time and temperature. The RSM analysis indicated that the optimised drying time (41.3 min) and temperature (85 °C) would lead to an oil yield of 74.73 g·kg⁻¹, a PI of 1.87, and a peroxide value of 6.72, with a desirability value of 0.67.
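To illustrate the RSM step in outline, the sketch below fits a full quadratic model in drying temperature and time to invented yield data by least squares and then maximises the fitted surface over the experimental region; the design points, responses and optimum are placeholders, and the multi-response desirability aggregation used in the study is omitted.

```python
import numpy as np
from scipy.optimize import minimize

# Central-composite-style design points (temperature degC, time min) -- illustrative.
T = np.array([45, 45, 85, 85, 65, 65, 65, 45, 85])
t = np.array([30, 90, 30, 90, 60, 30, 90, 60, 60])
yield_obs = np.array([38, 45, 60, 72, 58, 50, 66, 42, 68])   # fake oil yields (g/kg)

def quad_terms(T, t):
    """Full quadratic model terms: 1, T, t, T^2, t^2, T*t."""
    return np.column_stack([np.ones_like(T, dtype=float), T, t, T**2, t**2, T * t])

beta, *_ = np.linalg.lstsq(quad_terms(T, t), yield_obs, rcond=None)

def predicted_yield(x):
    return (quad_terms(np.array([x[0]]), np.array([x[1]])) @ beta).item()

# Maximise the predicted yield inside the experimental region.
res = minimize(lambda x: -predicted_yield(x), x0=[65, 60],
               bounds=[(45, 85), (30, 90)])
print("optimum (T, t):", np.round(res.x, 1),
      "| predicted yield:", round(predicted_yield(res.x), 1))
```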
Subject(s)
Decapodiformes, Fatty Acids, Omega-3/chemistry, Animals, Aquatic Organisms, Green Chemistry Technology, Viscera/chemistry
ABSTRACT
Dental implants are widely used as a long-term treatment solution for missing teeth. A titanium implant is inserted into the jawbone, acting as a replacement for the lost tooth root and can then support a denture, crown or bridge. This allows discreet and high-quality aesthetic and functional improvement, boosting patient confidence. The use of implants also restores normal functions such as speech and mastication. Once an implant is placed, the surrounding bone will fuse to the titanium in a process known as osseointegration. The success of osseointegration is dependent on stress distribution within the surrounding bone and thus implant geometry plays an important role in it. Optimisation analyses are used to identify the geometry which results in the most favourable stress distribution, but the traditional methodology is inefficient, requiring analysis of numerous models and parameter combinations to identify the optimal solution. A proposed improvement to the traditional methodology includes the use of Design of Experiments (DOE) together with Response Surface Methodology (RSM). This would allow for a well-reasoned combination of parameters to be proposed. This study aims to use DOE, RSM and finite element models to develop a simplified optimisation analysis method for dental implant design. Drawing on data and results from previous studies, two-dimensional finite element models of a single Branemark implant, a multi-unit abutment, two prosthetic screws, a prosthetic crown and a region of mandibular bone were built. A small number of combinations of implant diameter and length were set based on the DOE method to analyse the influence of geometry on stress distribution at the bone-implant interface. The results agreed with previous studies and indicated that implant length is the critical parameter in reducing stress on cortical bone. The proposed method represents a more efficient analysis of multiple geometrical combinations with reduced time and computational cost, using fewer than a third of the models required by the traditional methods. Further work should include the application of this methodology to optimisation analyses using three-dimensional finite element models.
Subject(s)
Dental Implants, Biomechanical Phenomena, Bone and Bones, Computer Simulation, Cortical Bone, Dental Prosthesis Design, Dental Stress Analysis, Finite Element Analysis, Humans, Imaging, Three-Dimensional, Osseointegration, Stress, Mechanical
ABSTRACT
More than ever, COVID-19 is putting pressure on health systems worldwide, especially in Brazil. In this study, we propose a method based on statistics and machine learning that uses blood lab exam data from patients to predict whether they will require special care (hospitalization in regular or special-care units). We also predict the number of days the patients will stay under such care. The two-step procedure developed uses Bayesian optimisation to select the best model among several candidates. This leads to final models that achieve an area under the ROC curve of 0.94 for the first target and a root mean squared error of 1.87 for the second target (a 77% improvement over the mean baseline), making our model ready to be deployed as a decision-support system that could be made available to everyone interested. The analytical approach can be applied to other diseases and can help to plan hospital resources in other contexts.
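The abstract does not name the candidate models or the Bayesian-optimisation library, so the sketch below uses Optuna's default TPE sampler (a Bayesian-optimisation-style method) as a stand-in to jointly select between two scikit-learn classifiers and tune their hyperparameters on synthetic data, maximising cross-validated AUC as for the first target. Models, ranges and data are assumptions, not the paper's setup.

```python
import numpy as np
import optuna
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the blood-exam table (the real features are lab results).
X, y = make_classification(n_samples=600, n_features=15, random_state=0)

def objective(trial):
    model_name = trial.suggest_categorical("model", ["logreg", "rf"])
    if model_name == "logreg":
        clf = LogisticRegression(C=trial.suggest_float("C", 1e-3, 1e2, log=True),
                                 max_iter=2000)
    else:
        clf = RandomForestClassifier(
            n_estimators=trial.suggest_int("n_estimators", 50, 300),
            max_depth=trial.suggest_int("max_depth", 2, 12),
            random_state=0)
    # Cross-validated AUC is the quantity being maximised.
    return cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()

study = optuna.create_study(direction="maximize")   # TPE sampler by default
study.optimize(objective, n_trials=40)
print("best AUC:", round(study.best_value, 3), "| best configuration:", study.best_params)
```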
ABSTRACT
Connectionist and dynamic field models consist of a set of coupled first-order differential equations describing the evolution in time of different units. We compare three numerical methods for the integration of these equations: the Euler method, and two methods we have developed and present here: a modified version of the fourth-order Runge-Kutta method and a semi-analytical method. We apply them to solve a well-known nonlinear connectionist model of retrieval in single-digit multiplication, and show that, in many regimes, the semi-analytical and modified Runge-Kutta methods outperform the Euler method, in some regimes by more than three orders of magnitude. Given the outstanding difference in the execution time of the methods, and the fact that the Euler method is widely used, we conclude that researchers in the field can benefit greatly from our analysis and the developed methods.
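The following self-contained comparison applies the explicit Euler method and the classical fourth-order Runge-Kutta method to a small coupled system of the connectionist form du/dt = -u + W f(u) + I; the weights, input and step sizes are illustrative, not those of the multiplication-retrieval model, and the semi-analytical and modified RK4 methods of the paper are not reproduced here.

```python
import numpy as np

# Coupled first-order system typical of connectionist / dynamic-field models:
#   du/dt = -u + W f(u) + I,   with a logistic nonlinearity f.
rng = np.random.default_rng(3)
N = 20
W = rng.normal(0, 0.3, (N, N))
I = rng.normal(0, 0.5, N)
f = lambda u: 1.0 / (1.0 + np.exp(-u))
rhs = lambda u: -u + W @ f(u) + I

def euler(u0, h, steps):
    u = u0.copy()
    for _ in range(steps):
        u = u + h * rhs(u)
    return u

def rk4(u0, h, steps):
    u = u0.copy()
    for _ in range(steps):
        k1 = rhs(u)
        k2 = rhs(u + 0.5 * h * k1)
        k3 = rhs(u + 0.5 * h * k2)
        k4 = rhs(u + h * k3)
        u = u + h / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
    return u

u0 = np.zeros(N)
reference = rk4(u0, 1e-4, 100_000)                 # fine-step solution up to t = 10
for h in (0.5, 0.1, 0.02):
    steps = int(10 / h)
    err_euler = np.max(np.abs(euler(u0, h, steps) - reference))
    err_rk4 = np.max(np.abs(rk4(u0, h, steps) - reference))
    print(f"h={h:<5}: max error Euler {err_euler:.2e} | RK4 {err_rk4:.2e}")
```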
ABSTRACT
This paper focuses on mode I delamination of a unidirectional glass fibre reinforced polymer (GFRP) composite. The aim is to propose an accurate and simple characterisation of three cohesive zone models (CZM), namely bilinear, trilinear, and potential, from the measurement of the load-displacement curve during a double cantilever beam test. To that end, a framework based on the equivalent linear elastic fracture mechanics (LEFM) R-curve is proposed here, which had never before been developed for a bilinear or a potential CZM. In addition, to validate this strategy, an optimisation algorithm for solving an inverse problem is also implemented. It is shown that parameter identification using the equivalent LEFM R-curve achieves the same accuracy while reducing the numerical effort by 72% with respect to a "blind fitting" (i.e., the optimisation algorithm). Therefore, even though optimisation techniques are currently popular due to their easy numerical implementation, strategies founded on physical models remain better solutions, especially when evaluating the objective function is expensive, as in mechanical problems.
ABSTRACT
In this study, we examined endophytic fungi in leaves of Mandevilla catimbauensis, an endemic plant species found in the Brazilian dry forest (Caatinga), and their potential to produce L-asparaginase (L-ASNase). In total, 66 endophytes were isolated, and the leaf-fragment colonisation rate was 11.78%. Based on morphology and on sequencing of the internal transcribed spacer (ITS) and partial large subunit (LSU) of ribosomal DNA, the endophytic fungi isolated belonged to six Ascomycota orders (Botryosphaeriales, Capnodiales, Diaporthales, Eurotiales, Marthamycetales, and Pleosporales). Phyllosticta species were the most frequently isolated endophytes (23 isolates [45.1%] from two species). The average Shannon-Wiener and Fisher alpha index values were 0.56 and 3.26, respectively. Twenty endophytes were randomly selected for the L-ASNase production test, of which fourteen isolates showed potential to produce the enzyme (0.48-2.22 U g⁻¹), especially Phyllosticta catimbauensis URM 7672 (2.22 U g⁻¹) and Cladosporium sp. G45 (2.11 U g⁻¹). Phyllosticta catimbauensis URM 7672 was selected for the partial optimisation of L-ASNase production because of its ability to generate considerable amounts of the enzyme. We obtained the highest L-ASNase activity (3.47 U g⁻¹), representing an increase of 36.02% in enzymatic production, under the following experimental conditions: a pH of 4.2, 1.0% inoculum concentration, and 2.5% L-asparagine concentration. Our study shows that M. catimbauensis harbours an important diversity of endophytic fungi with biotechnological potential for L-ASNase production.
Subject(s)
Apocynaceae, Ascomycota, Asparaginase/biosynthesis, Apocynaceae/microbiology, Ascomycota/classification, Ascomycota/metabolism, Asparaginase/genetics, Biodiversity, Cladosporium, DNA, Fungal/genetics, Endophytes/classification, Endophytes/metabolism, Phylogeny, Plant Leaves/microbiology
ABSTRACT
The necessity of incorporating a resilience-informed approach into urban planning and its decision-making is felt now more than ever, particularly in low- and middle-income countries. In order to achieve a successful transition to sustainable, resilient and cost-effective cities, growing attention is being given to the more effective integration of nature-based solutions, such as Sustainable Drainage Systems (SuDS), with other urban components. In developed cities, the integration of SuDS with urban planning has proven to be an effective strategy with a wide range of advantages and lower costs. The effective design and implementation of SuDS requires a multi-objective approach in which all four pillars of SuDS design (i.e., water quality, water quantity, amenity and biodiversity) are considered in connection with other urban, social, and economic aspects and constraints. This study develops a resilience-driven multi-objective optimisation model aiming to provide a Pareto front of optimised solutions for the effective incorporation of SuDS into (peri)urban planning, applied to a case study in Brazil. The model adopts the two SuDS pillars of water quality and water quantity as the optimisation objectives, with their level of spatial distribution as the decision variables. In addition, an improved quality of life index (iQoL) is developed to re-evaluate the optimal engineering solutions so as to encompass the amenity and biodiversity pillars of SuDS. Rain barrels, green roofs, bio-retention tanks, vegetated grass swales and permeable pavements are the suitable SuDS options identified in this study. The findings show that the most resilient solutions are costly, but this does not guarantee higher iQoL values. Bio-retention tanks and grass swales play effective roles in promoting water-quality resilience, but at a considerable increase in cost. Permeable pavements and green roofs are effective strategies when flood resilience is a priority. Rain barrels are a preferred solution due to the dominance of residential areas in the study area and their lower cost.
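To make the multi-objective idea concrete, the sketch below enumerates simple portfolios of the SuDS options listed above, scores them against two invented objectives (flood-volume reduction and pollutant-load reduction) plus cost, and extracts the non-dominated (Pareto) set. A real application would rely on a drainage simulation model and an evolutionary optimiser rather than enumeration, and all numbers are placeholders, not values from the Brazilian case study.

```python
from itertools import product

# Per-unit scores for each SuDS option: (flood reduction, quality improvement, cost).
# Numbers are illustrative placeholders only.
OPTIONS = {
    "rain_barrel":        (2.0, 0.5, 1.0),
    "green_roof":         (3.0, 1.0, 4.0),
    "bioretention_tank":  (2.5, 3.5, 3.5),
    "grass_swale":        (1.5, 3.0, 2.0),
    "permeable_pavement": (3.5, 1.5, 3.0),
}

def evaluate(levels):
    """Portfolio = implementation level (0, 1 or 2 units) of each option."""
    flood = sum(l * OPTIONS[k][0] for l, k in zip(levels, OPTIONS))
    quality = sum(l * OPTIONS[k][1] for l, k in zip(levels, OPTIONS))
    cost = sum(l * OPTIONS[k][2] for l, k in zip(levels, OPTIONS))
    return flood, quality, cost

portfolios = [(lv, evaluate(lv)) for lv in product(range(3), repeat=len(OPTIONS))]

def dominates(a, b):
    """a dominates b if it is no worse on all criteria and strictly better on one
    (maximise flood and quality reduction, minimise cost)."""
    better = (a[0] >= b[0], a[1] >= b[1], a[2] <= b[2])
    strictly = (a[0] > b[0], a[1] > b[1], a[2] < b[2])
    return all(better) and any(strictly)

pareto = [p for p in portfolios
          if not any(dominates(q[1], p[1]) for q in portfolios if q is not p)]
print(f"{len(pareto)} non-dominated portfolios out of {len(portfolios)}")
```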
Subject(s)
Quality of Life, Rain, Brazil, Cities, Floods
ABSTRACT
Governments, researchers, and health professionals have been challenged to model, forecast, and evaluate pandemic time series (e.g. the new coronavirus SARS-CoV-2, COVID-19). The main difficulty is the level of novelty imposed by these phenomena: information from previous epidemics is only partially relevant. Further, the spread is locally dependent, reflecting a number of dynamic social, political, economic, and environmental factors. The present paper aims to provide a relatively simple way to model, forecast, and evaluate the time incidence of a pandemic. The proposed framework makes use of the non-central beta (NCB) probability density function. Specifically, a probabilistic optimisation algorithm searches for the best NCB model of the pandemic according to the mean square error metric. The resulting model allows one to infer, among other things, the general peak date, the ending date, and the total number of cases, as well as to compare the level of difficulty imposed by the pandemic across territories. Case studies involving COVID-19 incidence time series from countries around the world suggest the usefulness of the proposed framework in comparison with some of the main epidemic models from the literature (e.g. SIR, SIS, SEIR) and established time series formalisms (e.g. exponential smoothing, ETS; autoregressive integrated moving average, ARIMA).
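A sketch of the curve-fitting idea under stated assumptions: the non-central beta density is evaluated through its Poisson-weighted beta-mixture series, scaled by an assumed epidemic window and total case count, and its parameters are recovered by minimising the mean square error with SciPy's differential evolution, used here as a stand-in for the paper's probabilistic optimiser. The incidence series is synthetic.

```python
import numpy as np
from scipy.optimize import differential_evolution
from scipy.special import betaln, gammaln

def ncbeta_pdf(x, a, b, lam, terms=60):
    """Non-central beta density via its Poisson-weighted beta-mixture series."""
    x = np.clip(x, 1e-12, 1 - 1e-12)
    j = np.arange(terms)[:, None]                                   # series index
    log_w = -lam / 2 + j * np.log(lam / 2) - gammaln(j + 1)         # Poisson weights (log)
    log_beta = (a + j - 1) * np.log(x) + (b - 1) * np.log(1 - x) - betaln(a + j, b)
    return np.exp(log_w + log_beta).sum(axis=0)

# Synthetic daily-incidence curve over a 180-day window (placeholder data).
days = np.arange(180)
true_curve = 5000 * ncbeta_pdf(days / 180, 3.0, 6.0, 8.0) / 180
rng = np.random.default_rng(4)
observed = np.maximum(true_curve + rng.normal(0, 20, days.size), 0)

def mse(params):
    a, b, lam, total = params
    model = total * ncbeta_pdf(days / 180, a, b, lam) / 180
    return np.mean((model - observed) ** 2)

result = differential_evolution(mse, bounds=[(0.5, 20), (0.5, 20), (0.1, 50), (100, 2e4)],
                                seed=0, tol=1e-8)
a, b, lam, total = result.x
peak_day = days[np.argmax(ncbeta_pdf(days / 180, a, b, lam))]
print("fitted (a, b, lambda, total cases):", np.round(result.x, 2),
      "| estimated peak day:", int(peak_day))
```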