Results 1 - 20 of 29
1.
Entropy (Basel) ; 25(12)2023 Nov 25.
Article in English | MEDLINE | ID: mdl-38136464

ABSTRACT

We describe boson sampling of interacting atoms from the noncondensed fraction of Bose-Einstein-condensed (BEC) gas confined in a box trap as a new platform for studying computational #P-hardness and quantum supremacy of many-body systems. We calculate the characteristic function and statistics of atom numbers via the newly found Hafnian master theorem. Using Bloch-Messiah reduction, we find that interatomic interactions give rise to two equally important entities, eigen-squeeze modes and eigen-energy quasiparticles, whose interplay with sampling atom states determines the behavior of the BEC gas. We infer that two necessary ingredients of #P-hardness, squeezing and interference, are self-generated in the gas and that, contrary to Gaussian boson sampling in linear interferometers, external sources of squeezed bosons are not required.
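The hafnian invoked by this abstract is the matchings-sum at the heart of Gaussian-boson-sampling statistics. A minimal sketch (not the authors' Hafnian master theorem; just the definition, computed by naive recursive pairing, so it is only usable for tiny matrices):

```python
def hafnian(A):
    """Hafnian of an even-dimensional symmetric matrix A, computed by
    recursively pairing index 0 with each other index:
    haf(A) = sum_j A[0][j] * haf(A with rows/cols 0 and j removed).
    Exponential-time enumeration; illustrative only (#P-hard in general)."""
    n = len(A)
    if n == 0:
        return 1
    if n % 2:
        return 0  # no perfect matching on an odd number of indices
    total = 0
    for j in range(1, n):
        keep = [k for k in range(1, n) if k != j]
        minor = [[A[r][c] for c in keep] for r in keep]
        total += A[0][j] * hafnian(minor)
    return total
```

For the all-ones 4x4 matrix this returns 3, the number of perfect matchings of K4.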

2.
Philos Trans A Math Phys Eng Sci ; 381(2241): 20210417, 2023 Jan 23.
Article in English | MEDLINE | ID: mdl-36463923

ABSTRACT

In this review, after providing the basic physical concept behind quantum annealing (or adiabatic quantum computation), we present an overview of recent theoretical as well as experimental developments, highlighting the issues that are still debated. With a brief discussion of the fundamental ideas of continuous and discontinuous quantum phase transitions, we discuss the Kibble-Zurek scaling of defect generation following a ramping of a quantum many-body system across a quantum critical point. In the process, we discuss associated models, both pure and disordered, and shed light on implementations and some recent applications of quantum annealing protocols. Furthermore, we discuss the effect of environmental coupling on quantum annealing. Some possible ways to speed up the annealing protocol in closed systems are elaborated upon: we especially focus on recipes to avoid the discontinuous quantum phase transitions occurring in some models, where energy gaps vanish exponentially with the system size. This article is part of the theme issue 'Quantum annealing and computation: challenges and perspectives'.
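For reference, the Kibble-Zurek scaling mentioned in this abstract takes, for a linear ramp of duration \tau_Q across a continuous quantum critical point, the standard textbook form (exponents depend on the universality class of the transition):

```latex
% d: spatial dimension, \nu: correlation-length exponent,
% z: dynamical critical exponent, \tau_Q: ramp (quench) time
n_{\text{def}} \;\sim\; \tau_Q^{-\,d\nu/(z\nu + 1)}
```

Slower ramps (larger \tau_Q) thus produce fewer defects, but only polynomially so; near a discontinuous transition with an exponentially small gap, this favorable scaling breaks down, which is why the abstract emphasizes avoiding such transitions.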

3.
Philos Trans A Math Phys Eng Sci ; 381(2241): 20210419, 2023 Jan 23.
Article in English | MEDLINE | ID: mdl-36463926

ABSTRACT

In the introductory article of this theme issue, we provide an overview of quantum annealing and computation, with a very brief summary of the individual contributions to this issue made by experts as well as a few young researchers. We hope readers will get a sense of the excitement and the perspectives in this unusually active field, and of the important developments taking place in it. This article is part of the theme issue 'Quantum annealing and computation: challenges and perspectives'.

4.
Entropy (Basel) ; 24(12)2022 Dec 03.
Article in English | MEDLINE | ID: mdl-36554176

ABSTRACT

We propose a multi-qubit Bose-Einstein-condensate (BEC) trap as a platform for studies of quantum statistical phenomena in many-body interacting systems. In particular, it could facilitate testing atomic boson sampling of the excited-state occupations and its quantum advantage over classical computing in a full, controllable and clear way. Contrary to a linear interferometer enabling Gaussian boson sampling of non-interacting non-equilibrium photons, the BEC trap platform pertains to an interacting equilibrium many-body system of atoms. We discuss a basic model and the main features of such a multi-qubit BEC trap.

5.
Evol Intell ; : 1-16, 2022 Oct 23.
Article in English | MEDLINE | ID: mdl-36312203

ABSTRACT

Quantum-inspired metaheuristics emerged by combining quantum mechanics principles with metaheuristic algorithm concepts. These algorithms extend the diversity of the population, a primary key to proper global search, which is guaranteed by the probabilistic representation of quantum bits. In this work, we review recent quantum-inspired metaheuristics, cover the merits of linking quantum mechanics notions with optimization techniques, and survey their many applications in real-world problems and industry. Moreover, we report the improvements and modifications of the proposed algorithms and identify the field's open challenges. We gathered algorithms proposed between 2017 and 2022 and classified them by source of inspiration. The source of inspiration for most quantum-inspired metaheuristics is genetic and evolutionary algorithms, followed by swarm-based algorithms, and applications range from image processing to computer networks and even multidisciplinary fields such as flight control and structural design. The promising results of quantum-inspired metaheuristics give hope that more conventional algorithms can be combined with quantum mechanics principles in the future to tackle optimization problems in numerous disciplines.
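The probabilistic Q-bit representation this abstract refers to is commonly implemented as amplitude pairs (alpha, beta) with alpha^2 + beta^2 = 1: observing a Q-bit collapses it to a classical bit, and a rotation gate pulls amplitudes toward the best solution found so far. A hedged sketch of these two standard operations (parameter names and the fixed rotation angle are illustrative choices, not from any specific paper):

```python
import math
import random

def observe(qbits):
    """Collapse a Q-bit individual [(alpha, beta), ...] to a binary
    string: each bit is 1 with probability beta**2."""
    return [1 if random.random() < b * b else 0 for _, b in qbits]

def rotate(qbits, best, bits, delta=0.05 * math.pi):
    """Rotation-gate update: steer each Q-bit toward the corresponding
    bit of the best-known solution. Preserves alpha^2 + beta^2 = 1."""
    out = []
    for (a, b), x, bx in zip(qbits, bits, best):
        theta = delta if bx > x else (-delta if bx < x else 0.0)
        out.append((a * math.cos(theta) - b * math.sin(theta),
                    a * math.sin(theta) + b * math.cos(theta)))
    return out
```

Diversity comes from the fact that, until amplitudes saturate, every observation can yield a different binary individual.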

6.
Ann Oper Res ; : 1-37, 2022 Jun 16.
Article in English | MEDLINE | ID: mdl-35729981

ABSTRACT

The location planning of relief distribution centres (DCs) is crucial in humanitarian logistics, as it directly influences the disaster response and the service to affected victims. In light of the critical role of facility location in humanitarian logistics planning, the study proposes a two-stage relief distribution location problem. The first stage of the model determines the minimum number of relief DCs, and the second stage finds the optimal locations of these DCs to minimize the total cost. To address a more realistic situation, restrictions are imposed on the coverage area and capacity of each DC. In addition, to solve this complex NP-hard problem optimally, a novel two-phase algorithm with exploration and exploitation phases is developed in the paper. The first phase (exploration) identifies a near-optimal solution, while the second phase (exploitation) enhances the solution quality through a close circular-proximity investigation. Furthermore, a comparative analysis of the proposed algorithm with other well-known algorithms such as the genetic algorithm, pattern search, fmincon, multistart and hybrid heuristics is also reported and computationally tested on small to large data sets. The results reveal that the proposed two-phase algorithm is more efficient and effective than the conventional metaheuristic methods.
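The stage-one subproblem (fewest DCs whose coverage areas together serve every demand point) is a set-cover instance. The abstract does not specify how the paper solves it; as an illustration only, here is the classic greedy approximation, with a hypothetical input format (site name mapped to the set of demand points it can cover):

```python
def min_cover_dcs(coverage, demand_points):
    """Greedy set cover: repeatedly pick the candidate DC site that
    covers the most still-uncovered demand points.
    coverage: dict site -> set of demand points within its coverage area."""
    uncovered = set(demand_points)
    chosen = []
    while uncovered:
        site = max(coverage, key=lambda s: len(coverage[s] & uncovered))
        gained = coverage[site] & uncovered
        if not gained:
            raise ValueError("some demand points cannot be covered")
        chosen.append(site)
        uncovered -= gained
    return chosen
```

Greedy set cover carries a ln(n) approximation guarantee, which is why it is a common baseline for such minimum-facility stages.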

7.
Entropy (Basel) ; 24(5)2022 May 03.
Article in English | MEDLINE | ID: mdl-35626526

ABSTRACT

As a non-deterministic polynomial hard (NP-hard) problem, the shortest common supersequence (SCS) problem is normally solved by heuristic or metaheuristic algorithms. One type of metaheuristic algorithm that has relatively good performance for solving SCS problems is the chemical reaction optimization (CRO) algorithm. Several CRO-based proposals exist; however, they face such problems as unstable molecular population quality, uneven distribution, and local optimum (premature) solutions. To overcome these problems, we propose a new approach for the search mechanism of CRO-based algorithms. It combines the opposition-based learning (OBL) mechanism with the previously studied improved chemical reaction optimization (IMCRO) algorithm. This upgraded version is dubbed OBLIMCRO. In its initialization phase, the opposite population is constructed from a random population based on OBL; then, the initial population is generated by selecting molecules with the lowest potential energy from the random and opposite populations. In the iterative phase, reaction operators create new molecules, where the final population update is performed. Experiments show that the average running time of OBLIMCRO is more than 50% less than the average running time of CRO_SCS and its baseline algorithm, IMCRO, for the deoxyribonucleic acid (DNA) and protein datasets.
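The opposition-based initialization described above (random population, its opposite, keep the lowest-energy molecules) can be sketched in a few lines. This assumes a continuous representation with the usual OBL opposite x -> lo + hi - x; OBLIMCRO's actual molecule encoding for SCS is sequence-based, so this is only the generic mechanism:

```python
import random

def obl_init(pop_size, dim, lo, hi, energy, rng=random):
    """Opposition-based initialization: generate a random population,
    build its opposite population, and keep the pop_size molecules
    with the lowest potential energy."""
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    opposite = [[lo + hi - x for x in mol] for mol in pop]
    return sorted(pop + opposite, key=energy)[:pop_size]
```

The intuition: a candidate and its opposite are unlikely to both be far from the optimum, so seeding from the better half of both populations improves the starting quality at negligible cost.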

8.
Synthese ; 198(6): 5749-5784, 2021.
Article in English | MEDLINE | ID: mdl-34720224

ABSTRACT

Many compelling examples have recently been provided in which people can achieve impressive epistemic success, e.g. draw highly accurate inferences, by using simple heuristics and very little information. This is possible by taking advantage of the features of the environment. The examples suggest an easy and appealing naturalization of rationality: on the one hand, people clearly can apply simple heuristics, and on the other hand, they intuitively ought to do so when this brings them high accuracy at little cost. The 'ought-can' principle is satisfied, and rationality is meaningfully normative. We show, however, that this naturalization program is endangered by a computational wrinkle in the adaptation process taken to be responsible for this heuristics-based ('ecological') rationality: for the adaptation process to guarantee even minimal rationality, it requires astronomical computational resources, making the problem intractable. We consider various plausible auxiliary assumptions in an attempt to remove this obstacle, and show that they do not succeed; intractability is a robust property of adaptation. We discuss the implications of our findings for the project of naturalizing rationality.

9.
Entropy (Basel) ; 23(11)2021 Oct 28.
Article in English | MEDLINE | ID: mdl-34828120

ABSTRACT

We present a finite-order system of recurrence relations for the permanent of circulant matrices containing a band of k arbitrary-value diagonals on top of a uniform matrix (for k = 1, 2 and 3), together with the method for deriving such recurrence relations, which is based on the permanents of matrices with defects. The proposed system of linear recurrence equations with variable coefficients provides a powerful tool for the analysis of circulant permanents, their fast, linear-time computation, and the finding of their asymptotics in the large-matrix-size limit. The latter is an open fundamental problem; its solution would be tremendously important for a unified analysis of a wide range of nature's #P-hard problems, including problems in the physics of many-body systems, critical phenomena, quantum computing, quantum field theory, the theory of chaos, fractals, graph theory, number theory, combinatorics, cryptography, etc.
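For contrast with the linear-time recurrences this abstract derives for banded circulants: the best generic exact algorithm for a permanent is Ryser's inclusion-exclusion formula, still exponential because the general problem is #P-hard. A compact reference implementation:

```python
def permanent(A):
    """Permanent of an n x n matrix via Ryser's formula:
    perm(A) = (-1)^n * sum over nonempty column subsets S of
              (-1)^{|S|} * prod_i (sum_{j in S} A[i][j]).
    O(2^n * n^2); far better than the n! definition, but exponential."""
    n = len(A)
    total = 0
    for mask in range(1, 1 << n):
        prod = 1
        for i in range(n):
            prod *= sum(A[i][j] for j in range(n) if mask >> j & 1)
        total += (-1) ** bin(mask).count("1") * prod
    return (-1) ** n * total
```

For the all-ones n x n matrix this yields n!, e.g. 6 for n = 3, which is a handy sanity check.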

10.
Comput Biol Chem ; 88: 107327, 2020 Oct.
Article in English | MEDLINE | ID: mdl-32688010

ABSTRACT

The shortest common supersequence (SCS) problem is a classical NP-hard problem, which is normally solved by heuristic algorithms. One important heuristic that is inspired by the process of chemical reactions in nature is the chemical reaction optimization (CRO) and its algorithm known as CRO_SCS. In this paper we propose a novel CRO algorithm, dubbed IMCRO, to solve the SCS problem efficiently. Two new operators are introduced in two of the four reactions of the CRO: a new circular shift operator is added to the decomposition reaction, and a new two-step crossover operator is included in the inter-molecular ineffective collision reaction. Experimental results show that IMCRO achieves better performance on random and real sequences than well-known heuristic algorithms such as the ant colony optimization, deposition and reduction, enhanced beam search, and CRO_SCS. Additionally, it outperforms its baseline CRO_SCS for DNA instances, averaging an SCS length reduction of 1.02, with a maximum length reduction of up to 2.1.
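The two new operators named in this abstract can be sketched generically. The circular shift is unambiguous; the "two-step crossover" below is only a plausible two-point variant for illustration, since the abstract does not define IMCRO's exact operator:

```python
import random

def circular_shift(mol, k):
    """Decomposition-reaction perturbation: rotate the molecule by k
    positions, preserving all symbols."""
    k %= len(mol)
    return mol[k:] + mol[:k]

def two_step_crossover(a, b, rng=random):
    """A hypothetical two-point crossover for the inter-molecular
    ineffective collision: swap the middle segment between two molecules."""
    i, j = sorted(rng.sample(range(1, len(a)), 2))
    return a[:i] + b[i:j] + a[j:], b[:i] + a[i:j] + b[j:]
```

Both operators preserve molecule length, so every offspring remains a valid candidate encoding.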


Subject(s)
Algorithms
11.
Algorithmica ; 82(5): 1410-1433, 2020.
Article in English | MEDLINE | ID: mdl-32214575

ABSTRACT

We consider the two-sided stable matching setting in which there may be uncertainty about the agents' preferences due to limited information or communication. We consider three models of uncertainty: (1) lottery model-for each agent, there is a probability distribution over linear preferences, (2) compact indifference model-for each agent, a weak preference order is specified and each linear order compatible with the weak order is equally likely and (3) joint probability model-there is a lottery over preference profiles. For each of the models, we study the computational complexity of computing the stability probability of a given matching as well as finding a matching with the highest probability of being stable. We also examine more restricted problems such as deciding whether a certainly stable matching exists. We find a rich complexity landscape for these problems, indicating that the form uncertainty takes is significant.
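Under the lottery model described above, one simple (if inexact) way to estimate the stability probability of a given matching is Monte Carlo: sample a linear preference profile per agent, test for blocking pairs, and average. This is only an illustrative estimator, not one of the paper's exact complexity results:

```python
import random

def is_stable(matching, men_pref, women_pref):
    """matching: dict man -> woman; prefs: dict agent -> ordered list.
    Returns False iff some (m, w) blocking pair exists."""
    husband = {w: m for m, w in matching.items()}
    for m, prefs in men_pref.items():
        for w in prefs:
            if w == matching[m]:
                break  # reached m's own partner: no blocking pair via m
            if women_pref[w].index(m) < women_pref[w].index(husband[w]):
                return False  # (m, w) block the matching
    return True

def stability_probability(matching, men_lottery, women_lottery,
                          samples=2000, rng=random):
    """Monte Carlo estimate: each agent's lottery is a list of
    (preference_order, probability) pairs, sampled independently."""
    def draw(lottery):
        return {agent: rng.choices([o for o, _ in dist],
                                   [p for _, p in dist])[0]
                for agent, dist in lottery.items()}
    hits = sum(is_stable(matching, draw(men_lottery), draw(women_lottery))
               for _ in range(samples))
    return hits / samples
```

The paper's point is precisely that exact computation can be hard depending on the uncertainty model, which is what makes such sampling estimators attractive in practice.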

12.
J Theor Biol ; 489: 110144, 2020 03 21.
Article in English | MEDLINE | ID: mdl-31911141

ABSTRACT

Phylogenetics is a field that studies and models the evolutionary history of currently living species. The rooted phylogenetic network is an important approach for modelling non-tree-like events between currently living species. Rooted triplets are one type of input for constructing rooted phylogenetic networks. Constructing an optimal rooted phylogenetic network that contains all given rooted triplets is an NP-hard problem. To meet this challenge efficiently, a novel heuristic method called NCHB is introduced in this paper. NCHB produces an optimal rooted phylogenetic network that covers all given rooted triplets. The NCHB optimality criteria in building a rooted phylogenetic network are minimizing the number of reticulation nodes and minimizing the level of the final network. In NCHB, two concepts, the height function and the binarization of a network, are used in a novel way. To study the performance of NCHB, our proposed method is compared with three state-of-the-art algorithms, LEV1ATHAN, SIMPLISTIC and TripNet, in two scenarios. In the first scenario, triplet sets are generated under biological presumptions and our proposed method is compared with SIMPLISTIC and TripNet. The results show that NCHB outperforms TripNet and SIMPLISTIC according to the optimality criteria. In the second scenario, we designed software for generating level-k networks. All triplets consistent with each network are then obtained and used as input for NCHB, LEV1ATHAN, SIMPLISTIC, and TripNet. LEV1ATHAN is applicable only to level-1 networks, while the other algorithms can produce higher-level networks. The results show that the NCHB and LEV1ATHAN outputs are almost the same when restricted to level-1 networks. The results also show that NCHB outperforms TripNet and SIMPLISTIC. Moreover, NCHB outputs are very close to the generated networks (which are optimal) with respect to the criteria.


Subject(s)
Algorithms , Software , Biological Evolution , Heuristics , Genetic Models , Phylogeny
13.
Biosystems ; 184: 103997, 2019 Oct.
Article in English | MEDLINE | ID: mdl-31369836

ABSTRACT

DNA computing, a potential means to solve complicated computational problems, is a new field of interdisciplinary research spanning computational mathematics, parallel algorithms, and bioinformatics. The capacitated vehicle routing problem is a famous NP-hard problem: it consists of determining the routes of a group of identical vehicles serving a set of clients while minimizing the total transportation cost. Based on the bio-heuristic computing model and DNA molecular manipulations, parallel biocomputing algorithms for solving the capacitated vehicle routing problem are proposed in this paper. We use different biological chains to represent vertices, edges, and weights, and adopt appropriate biological operations to search for solutions of the problem with O(n^2) time complexity. We enrich the application scope of biocomputing, reduce computational complexity, and verify the practicability of DNA parallel algorithms through simulations.


Subject(s)
Algorithms , Computational Biology/methods , Computational Heuristics , Molecular Computers , Theoretical Models , Problem Solving , Traffic Accidents/prevention & control , Cities , Computer Simulation , Humans , Motor Vehicles , Transportation/methods
14.
Neural Netw ; 117: 191-200, 2019 Sep.
Article in English | MEDLINE | ID: mdl-31174047

ABSTRACT

Graph partitioning, a classical NP-hard combinatorial optimization problem, is widely applied to industrial and management problems. In this study, an approximate solution of the graph partitioning problem is obtained using a deterministic annealing neural network algorithm. The algorithm is a continuation method that attempts to obtain a high-quality solution by following a path of minimum points of a barrier problem as the barrier parameter is reduced from a sufficiently large positive number to 0. With the barrier parameter assumed to be any positive number, one minimum solution of the barrier problem can be found by the algorithm in a feasible descent direction. With a globally convergent iterative procedure, the feasible descent direction is obtained by updating the Lagrange multipliers. A distinctive feature of the algorithm is that the upper and lower bounds on the variables are automatically satisfied, provided the step length is a value between 0 and 1. Four well-known algorithms are compared with the proposed one on 100 test samples. Simulation results show the effectiveness of the proposed algorithm.


Subject(s)
Neural Networks (Computer)
15.
Algorithms Mol Biol ; 14: 5, 2019.
Article in English | MEDLINE | ID: mdl-30899321

ABSTRACT

BACKGROUND: Network connectivity problems are abundant in computational biology research, where graphs are used to represent a range of phenomena: from physical interactions between molecules to more abstract relationships such as gene co-expression. One common challenge in studying biological networks is the need to extract meaningful, small subgraphs out of large databases of potential interactions. A useful abstraction for this task turned out to be the Steiner Network problems: given a reference "database" graph, find a parsimonious subgraph that satisfies a given set of connectivity demands. While this formulation proved useful in a number of instances, the next challenge is to account for the fact that the reference graph may not be static. This can happen, for instance, when studying protein measurements in single cells or at different time points, whereby different subsets of conditions can have different protein milieu. RESULTS AND DISCUSSION: We introduce the condition Steiner Network problem in which we concomitantly consider a set of distinct biological conditions. Each condition is associated with a set of connectivity demands, as well as a set of edges that are assumed to be present in that condition. The goal of this problem is to find a minimal subgraph that satisfies all the demands through paths that are present in the respective condition. We show that introducing multiple conditions as an additional factor makes this problem much harder to approximate. Specifically, we prove that for C conditions, this new problem is NP-hard to approximate to a factor of C - ϵ , for every C ≥ 2 and ϵ > 0 , and that this bound is tight. Moving beyond the worst case, we explore a special set of instances where the reference graph grows monotonically between conditions, and show that this problem admits substantially improved approximation algorithms. We also developed an integer linear programming solver for the general problem and demonstrate its ability to reach optimality with instances from the human protein interaction network. CONCLUSION: Our results demonstrate that in contrast to most connectivity problems studied in computational biology, accounting for multiplicity of biological conditions adds considerable complexity, which we propose to address with a new solver. Importantly, our results extend to several network connectivity problems that are commonly used in computational biology, such as Prize-Collecting Steiner Tree, and provide insight into the theoretical guarantees for their applications in a multiple condition setting.

16.
Entropy (Basel) ; 21(2)2019 Jan 24.
Article in English | MEDLINE | ID: mdl-33266824

ABSTRACT

The densest k-subgraph (DkS) maximization problem is to find a set of k vertices with maximum total weight of the edges in the subgraph induced by this set. This problem is NP-hard in general. In this paper, two relaxation methods for solving the DkS problem are presented. One is a doubly nonnegative relaxation, and the other is a semidefinite relaxation that is tighter than the standard semidefinite relaxation. The two relaxation problems are equivalent under suitable conditions. Moreover, the corresponding approximation-ratio results are given for these relaxation problems. Finally, some numerical examples are tested to compare these relaxation problems; the numerical results show that the doubly nonnegative relaxation is more promising than the semidefinite relaxation for solving some DkS problems.
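To make the DkS objective concrete (and to show why relaxations are needed), here is the problem solved exactly by exhaustive enumeration. This is feasible only for tiny instances, which is exactly the point: the search space grows combinatorially, motivating the convex relaxations the paper studies:

```python
from itertools import combinations

def densest_k_subgraph(weights, k):
    """Exact DkS by brute force. weights: dict {(u, v): w} with one key
    per undirected edge. Returns (best vertex set, induced edge weight)."""
    nodes = sorted({v for edge in weights for v in edge})

    def induced_weight(S):
        # Total weight of edges with both endpoints inside S.
        return sum(w for (u, v), w in weights.items() if u in S and v in S)

    best = max(combinations(nodes, k), key=lambda S: induced_weight(set(S)))
    return set(best), induced_weight(set(best))
```

A relaxation replaces the binary "vertex in S" indicators with continuous (semidefinite or doubly nonnegative) variables, then rounds; the tighter the relaxation, the smaller the gap to this exact optimum.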

17.
Sensors (Basel) ; 18(7)2018 Jun 29.
Article in English | MEDLINE | ID: mdl-29966286

ABSTRACT

With the development of science and technology, modern communication scenarios have placed higher requirements on passive location technology. However, current location systems still use manual scheduling methods and cannot handle today's mission-intensive and widely distributed scenarios, resulting in inefficient task completion. To address this issue, this paper proposes a method called multi-objective, multi-constraint and improved genetic algorithm-based scheduling (MMIGAS), contributing a centralized combinatorial optimization model with multiple objectives and multiple constraints together with an improved genetic algorithm. First, we establish a basic mathematical framework based on the structure of a passive location system. Furthermore, to balance performance with respect to multiple measures and avoid low efficiency, we propose a multi-objective optimization function including location accuracy, completion rate and resource utilization. Moreover, to enhance its practicability, we formulate multiple constraints on frequency, resource capability and task cooperation. For model solving, we propose an improved genetic algorithm with better convergence speed and global optimization ability, introducing constraint-proof initialization, a penalty function and a modified genetic operator. Simulations indicate the good convergence, steady time complexity and satisfactory location accuracy of MMIGAS. Moreover, compared with manual scheduling, MMIGAS improves efficiency while maintaining high location precision.
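The penalty function mentioned in this abstract is a standard way to fold hard constraints (here: frequency, resource capability, task cooperation) into a GA fitness. The abstract does not give MMIGAS's exact penalty form, so the static quadratic penalty below is only one common, illustrative choice:

```python
def penalized_fitness(objective, constraints, rho=1e3):
    """Static penalty wrapper for a minimizing GA. Each constraint g is
    feasible when g(x) <= 0; a violation adds rho * max(0, g(x))**2,
    so infeasible candidates survive selection but are steadily penalized."""
    def fitness(x):
        return objective(x) + rho * sum(max(0.0, g(x)) ** 2
                                        for g in constraints)
    return fitness
```

Combined with constraint-proof initialization (seeding only feasible individuals), the penalty only has to police constraint violations introduced later by crossover and mutation.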

18.
BMC Genomics ; 19(Suppl 5): 252, 2018 May 08.
Article in English | MEDLINE | ID: mdl-29745851

ABSTRACT

BACKGROUND: Many supertree estimation and multi-locus species tree estimation methods compute trees by combining trees on subsets of the species set based on some NP-hard optimization criterion. A recent approach to computing large trees has been to constrain the search space by defining a set of "allowed bipartitions", and then use dynamic programming to find provably optimal solutions in polynomial time. Several phylogenomic estimation methods, such as ASTRAL, the MDC algorithm in PhyloNet, FastRFS, and ALE, use this approach. RESULTS: We present SIESTA, a method that can be combined with these dynamic programming algorithms to return a data structure that compactly represents all the optimal trees in the search space. As a result, SIESTA provides multiple capabilities, including: (1) counting the number of optimal trees, (2) calculating consensus trees, (3) generating a random optimal tree, and (4) annotating branches in a given optimal tree by the proportion of optimal trees each branch appears in. CONCLUSIONS: SIESTA improves the accuracy of FastRFS and ASTRAL, and is a general technique for enhancing dynamic programming methods for constrained optimization.


Subject(s)
Algorithms , Computer Simulation , Phylogeny , Software , Animals , Computational Biology/methods , Humans
19.
Comput Biol Chem ; 74: 379-390, 2018 Jun.
Article in English | MEDLINE | ID: mdl-29650458

ABSTRACT

In this paper, we introduce and analyze two graph-based models for assigning orthologs in the presence of whole-genome duplications, using similarity information between pairs of genes. The common feature of our two models is that genes of the first genome may be assigned two orthologs from the second genome, which has undergone a whole-genome duplication. Additionally, our models incorporate the new notion of a duplication bonus, a parameter that reflects how assigning two orthologs to a given gene should be rewarded or penalized. Our work is mainly focused on developing exact algorithms with reasonable running times for these two models: we show that the first one is polynomial-time solvable, while the second is NP-hard. For the latter, we thus design two fixed-parameter algorithms, i.e. exact algorithms whose running times are exponential only with respect to a small and well-chosen input parameter. Finally, for both models, we evaluate our algorithms on pairs of plant genomes. Our experiments show that the NP-hard model yields a better cluster quality at the cost of lower coverage, due to the fact that our instances cannot be completely solved by our algorithms. However, our results are altogether encouraging and show that our methods yield biologically significant predictions of orthologs when the duplication bonus value is properly chosen.


Subject(s)
Algorithms , Gene Duplication/genetics , Genetic Models