ABSTRACT
In this work, we focus on the philosophical aspects and technical challenges that underlie the axiomatization of the non-Kolmogorovian probability framework, in connection with the problem of quantum contextuality. This fundamental feature of quantum theory has received a lot of attention recently, given that it might be connected to the speed-up of quantum computers, a phenomenon that is not fully understood. Although this problem has been extensively studied in the physics community, many philosophical questions remain to be properly formulated. We analyze different problems from a conceptual standpoint, using the non-Kolmogorovian probability approach as a technical tool.
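For orientation, the core of the non-Kolmogorovian framework can be stated in one formula (this is the standard von Neumann-Gleason setting, not a result specific to this work): probabilities are defined not on a Boolean sigma-algebra of sets, but on the lattice \(\mathcal{P}(\mathcal{H})\) of projections of a Hilbert space,

\[
\mu:\mathcal{P}(\mathcal{H})\to[0,1],\qquad \mu(\mathbf{1})=1,\qquad \mu\Big(\bigvee_i P_i\Big)=\sum_i \mu(P_i)
\]

for every countable family of pairwise orthogonal projections \(P_i\). Since \(\mathcal{P}(\mathcal{H})\) is non-distributive, such a measure cannot in general be reduced to a single classical probability space, which is the formal root of contextuality.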
ABSTRACT
In this work, we discuss the failure of the principle of truth functionality in the quantum formalism. By exploiting this failure, we import the formalism of N-matrix theory and non-deterministic semantics into the foundations of quantum mechanics. This is done by describing quantum states as particular valuations associated with infinite non-deterministic truth tables, which allows us to introduce a natural interpretation of quantum states in terms of a non-deterministic semantics. We also provide a similar construction for arbitrary probabilistic theories based on orthomodular lattices, which makes it possible to study post-quantum models using logical techniques.
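To fix intuitions, here is a minimal two-valued toy example of a non-deterministic truth table (an N-matrix in the sense of Avron and co-workers; the construction in the paper uses infinite tables, but the mechanism is the same). Define a connective \(\tilde{\vee}\) by

\[
\tilde{\vee}(1,1)=\{1\},\qquad \tilde{\vee}(0,0)=\{0\},\qquad \tilde{\vee}(0,1)=\tilde{\vee}(1,0)=\{0,1\},
\]

and call a valuation \(v\) legal whenever \(v(a\vee b)\in\tilde{\vee}(v(a),v(b))\). Truth functionality fails because the values \(v(a)\) and \(v(b)\) no longer determine the value of the compound formula; the valuation itself makes a further choice.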
ABSTRACT
In this work, we advance a generalization of quantum computational logics capable of dealing with some important examples of quantum algorithms. We outline an algebraic axiomatization of these structures.
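As a concrete illustration of the kind of semantics these logics generalize, the following minimal NumPy sketch implements the standard quregister reading of quantum computational logics (truth is the probability of finding the last qubit in \(|1\rangle\); conjunction is computed by a Toffoli gate). All names here are ours for illustration; this is the textbook semantics, not the generalized structures axiomatized in the paper.

import numpy as np

# One-qubit computational basis states.
ket0 = np.array([1.0, 0.0], dtype=complex)
ket1 = np.array([0.0, 1.0], dtype=complex)

def toffoli():
    # 8x8 Toffoli gate: flips the third qubit iff the first two are |1>.
    T = np.eye(8, dtype=complex)
    T[[6, 7]] = T[[7, 6]]  # swap the |110> and |111> rows
    return T

def prob_true(psi):
    # Truth value of a quregister: probability that its last qubit is |1>.
    amps = psi.reshape(-1, 2)
    return float(np.sum(np.abs(amps[:, 1]) ** 2))

def q_and(psi, phi):
    # Quantum-computational conjunction: Toffoli applied to psi (x) phi (x) |0>.
    return toffoli() @ np.kron(np.kron(psi, phi), ket0)

plus = (ket0 + ket1) / np.sqrt(2)    # a genuinely superposed truth value
print(prob_true(q_and(plus, ket1)))  # 0.5: the conjunction inherits the indeterminacy

With definite inputs (ket0 or ket1), q_and reproduces the classical truth table for conjunction; superposed inputs yield intermediate truth probabilities, which is the feature these logics are built to capture.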
ABSTRACT
The VII Conference on Quantum Foundations: 90 years of uncertainty (https://sites [...].
ABSTRACT
Motivated by the problem of lossless quantum data compression, we present here an operational interpretation for the family of quantum Rényi entropies. To do this, we appeal to a very general quantum encoding scheme that satisfies a quantum version of the Kraft-McMillan inequality. In the standard situation, where one aims to minimize the usual average length of the quantum codewords, we recover the known result that the von Neumann entropy of the source bounds the average length of the optimal codes. Beyond this case, we show that by invoking an exponential average length, which penalizes long codewords exponentially, the quantum Rényi entropies arise as the natural quantities relating the optimal encoding schemes to the source description, playing a role analogous to that of the von Neumann entropy.
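For orientation, the classical analogue of this result (Campbell, 1965) can be stated compactly; the quantum version described in the abstract replaces probabilities by a density operator and the Shannon and Rényi entropies by their quantum counterparts. Given codeword lengths \(\ell_i\) satisfying the Kraft-McMillan inequality \(\sum_i 2^{-\ell_i}\le 1\), the exponential average length of order \(t>0\) is

\[
L_t=\frac{1}{t}\log_2\Big(\sum_i p_i\,2^{t\,\ell_i}\Big),
\]

and the optimal codes satisfy

\[
H_\alpha(p)\le L_t < H_\alpha(p)+1,\qquad \alpha=\frac{1}{1+t},
\]

where \(H_\alpha(p)=\frac{1}{1-\alpha}\log_2\sum_i p_i^{\alpha}\) is the Rényi entropy. As \(t\to 0\), \(L_t\) tends to the ordinary average length \(\sum_i p_i\,\ell_i\) and \(\alpha\to 1\), recovering the Shannon bound, in parallel with the von Neumann bound of the standard quantum case.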