Results 1 - 9 of 9
1.
Entropy (Basel); 25(1), 2023 Jan 04.
Article in English | MEDLINE | ID: mdl-36673244

ABSTRACT

This paper provides new observations on the Lovász θ-function of graphs. These include a simple closed-form expression of that function for all strongly regular graphs, together with upper and lower bounds on that function for all regular graphs. These bounds are expressed in terms of the second-largest and smallest eigenvalues of the adjacency matrix of the regular graph, together with sufficient conditions for equality (the upper bound is due to Lovász, and is followed here by a new sufficient condition for its tightness). These results are shown to be useful in many ways, leading to the determination of the exact value of the Shannon capacity of various graphs, eigenvalue inequalities, and bounds on the clique and chromatic numbers of graphs. Since the Lovász θ-function factorizes for the strong product of graphs, the results are also particularly useful for parameters of strong products or strong powers of graphs. Bounds on the smallest and second-largest eigenvalues of strong products of regular graphs are consequently derived, expressed as functions of the Lovász θ-function (or the smallest eigenvalue) of each factor. The resulting lower bound on the second-largest eigenvalue of a k-fold strong power of a regular graph is compared to the Alon-Boppana bound; under a certain condition, the new bound is superior in its exponential growth rate (in k). Lower bounds on the chromatic number of strong products of graphs are expressed in terms of the order and the Lovász θ-function of each factor. The utility of these bounds is exemplified, leading in some cases to an exact determination of the chromatic numbers of strong products or strong powers of graphs. The present paper is also intended to have tutorial value.
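As an illustrative sketch (not part of the original abstract): the eigenvalue-based upper bound attributed above to Lovász states that, for a d-regular graph on n vertices with smallest adjacency eigenvalue λ_n, θ(G) ≤ −nλ_n/(d − λ_n), with equality for edge-transitive graphs. The snippet below evaluates this bound for the Petersen graph (chosen here as an example; it is strongly regular and edge-transitive, and the bound is tight with θ = 4).

```python
import itertools
import numpy as np

# Petersen graph as the Kneser graph K(5,2): vertices are the 2-subsets of
# {0,...,4}, adjacent iff disjoint.  It is 3-regular and edge-transitive.
verts = list(itertools.combinations(range(5), 2))
n = len(verts)
A = np.zeros((n, n))
for i, u in enumerate(verts):
    for j, v in enumerate(verts):
        if not set(u) & set(v):
            A[i, j] = 1.0

eig = np.sort(np.linalg.eigvalsh(A))  # eigenvalues in ascending order
lam_min, d = eig[0], eig[-1]          # smallest eigenvalue; degree (largest)

# Lovász's upper bound for a d-regular graph on n vertices.
theta_upper = -n * lam_min / (d - lam_min)
print(d, lam_min, theta_upper)  # approximately 3, -2, 4
```

For the Petersen graph the bound returns 4, which is the exact value of θ (and of the chromatic number of its complement's independence-related parameters discussed in the abstract).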

2.
Entropy (Basel); 24(5), 2022 Apr 25.
Article in English | MEDLINE | ID: mdl-35626483

ABSTRACT

The present paper offers, in its first part, a unified approach for the derivation of families of inequalities for set functions that satisfy sub/supermodularity properties, and applies this approach to the derivation of information inequalities with Shannon information measures. Connections of this approach to a generalized version of Shearer's lemma and to other related results in the literature are examined. Some of the derived information inequalities are new, while known results (such as a generalized version of Han's inequality) are reproduced in a simple and unified way. In its second part, the paper applies the generalized Han's inequality to analyze a problem in extremal graph theory. This problem is motivated and analyzed from the perspective of information theory, and the analysis leads to generalized and refined bounds. The two parts of the paper are meant to be independently accessible to the reader.
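As an illustrative sketch (not part of the original abstract): in its classical form, Han's inequality mentioned above states H(X_1,…,X_n) ≤ (1/(n−1)) Σ_i H(X_1,…,X_{i−1},X_{i+1},…,X_n). The check below verifies it numerically for a randomly chosen joint distribution of three binary random variables (the distribution is an arbitrary choice made here for illustration).

```python
import numpy as np

rng = np.random.default_rng(0)

def H(p):
    """Shannon entropy (in nats) of a probability array."""
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

# A randomly chosen joint pmf of three binary random variables (X1, X2, X3).
p = rng.random((2, 2, 2))
p /= p.sum()

n = p.ndim
# Entropy of each (n-1)-variable marginal, i.e. all variables except X_i.
h_marg = [H(p.sum(axis=i).ravel()) for i in range(n)]

# Han's inequality: H(X1,...,Xn) <= (1/(n-1)) * sum_i H(X_{-i}).
lhs = H(p.ravel())
rhs = sum(h_marg) / (n - 1)
print(lhs, rhs)  # lhs should not exceed rhs
```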

3.
Entropy (Basel); 24(5), 2022 May 16.
Article in English | MEDLINE | ID: mdl-35626595

ABSTRACT

Data science, information theory, probability theory, statistical learning, statistical signal processing, and other related disciplines greatly benefit from non-negative measures of dissimilarity between pairs of probability measures [...].

4.
Entropy (Basel); 23(3), 2021 Feb 25.
Article in English | MEDLINE | ID: mdl-33668754

ABSTRACT

This paper studies the problem of upper bounding the number of independent sets in a graph, expressed in terms of its degree distribution. For bipartite regular graphs, Kahn (2001) established a tight upper bound using an information-theoretic approach, and he also conjectured an upper bound for general graphs. His conjectured bound was recently proved by Sah et al. (2019), using different techniques not involving information theory. The main contribution of this work is the extension of Kahn's information-theoretic proof technique to handle irregular bipartite graphs. In particular, when the bipartite graph is regular on one side, but may be irregular on the other, the extended entropy-based proof technique yields the same bound as was conjectured by Kahn (2001) and proved by Sah et al. (2019).
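As an illustrative sketch (not part of the original abstract): for an n-vertex d-regular bipartite graph G, Kahn's bound states that the number of independent sets satisfies i(G) ≤ (2^{d+1} − 1)^{n/(2d)}, with equality for disjoint unions of complete bipartite graphs K_{d,d}. The snippet below verifies tightness by brute force for K_{3,3} (a graph chosen here purely for illustration).

```python
import itertools

# K_{3,3}: left vertices 0..2, right vertices 3..5, all left-right edges.
left, right = range(0, 3), range(3, 6)
edges = {(u, v) for u in left for v in right}

def is_independent(S):
    """True if no edge of K_{3,3} has both endpoints in S."""
    return all((u, v) not in edges for u in S for v in S)

# Brute-force count of independent sets (the empty set included).
count = sum(1 for r in range(7)
              for S in itertools.combinations(range(6), r)
              if is_independent(S))

n, d = 6, 3
kahn_bound = (2 ** (d + 1) - 1) ** (n / (2 * d))
print(count, kahn_bound)  # 15 15.0 -- the bound is attained
```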

5.
Entropy (Basel); 22(5), 2020 May 18.
Article in English | MEDLINE | ID: mdl-33286335

ABSTRACT

The relative entropy and the chi-squared divergence are fundamental divergence measures in information theory and statistics. This paper is focused on a study of integral relations between the two divergences, the implications of these relations, their information-theoretic applications, and some generalizations pertaining to the rich class of f-divergences. Applications that are studied in this paper refer to lossless compression, the method of types and large deviations, strong data-processing inequalities, bounds on contraction coefficients and maximal correlation, and the convergence rate to stationarity of a type of discrete-time Markov chains.
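As an illustrative sketch (not part of the original abstract): the snippet below computes the two divergences named above for a pair of discrete distributions with full support and checks the classical relation D(P‖Q) ≤ log(1 + χ²(P‖Q)), which follows from Jensen's inequality; the paper's integral relations are finer statements not reproduced here. The distributions are arbitrary choices made for illustration.

```python
import numpy as np

def kl(p, q):
    """Relative entropy D(P||Q), in nats, for full-support pmfs."""
    return float(np.sum(p * np.log(p / q)))

def chi2(p, q):
    """Chi-squared divergence chi^2(P||Q)."""
    return float(np.sum((p - q) ** 2 / q))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.5, 0.3])

# Classical relation (by Jensen's inequality):
#   D(P||Q) <= log(1 + chi^2(P||Q)).
print(kl(p, q), np.log(1.0 + chi2(p, q)))  # about 0.224 <= about 0.447
```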

6.
Entropy (Basel); 22(6), 2020 Jun 26.
Article in English | MEDLINE | ID: mdl-33286479

ABSTRACT

This work is an extension of our earlier article, in which a well-known integral representation of the logarithmic function was explored and accompanied by demonstrations of its usefulness in obtaining compact, easily calculable, exact formulas for quantities that involve expectations of the logarithm of a positive random variable. Here, in the same spirit, we derive an exact integral representation (in one or two dimensions) of the moment of a nonnegative random variable, or of a sum of such independent random variables, where the moment order is a general positive non-integer real number (a so-called fractional moment). The proposed formula is applied to a variety of examples with an information-theoretic motivation, and it is shown how it facilitates their numerical evaluation. In particular, when applied to the calculation of a moment of the sum of a large number, n, of nonnegative random variables, integration over one or two dimensions, as suggested by the proposed integral representation, is significantly easier than the alternative of integrating over n dimensions, as needed in the direct calculation of the desired moment.
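As an illustrative sketch (not part of the original abstract): a classical one-dimensional integral representation of a fractional power, of the kind underlying such derivations, is x^ρ = (ρ/Γ(1−ρ)) ∫_0^∞ (1 − e^{−ux}) u^{−ρ−1} du for ρ ∈ (0, 1). The snippet below verifies it numerically at x = 2, ρ = 1/2 (values chosen here for illustration); the paper's representations for moments of sums are more general and are not reproduced here.

```python
import math
import numpy as np

x, rho = 2.0, 0.5

# Classical representation, for rho in (0, 1):
#   x**rho = rho / Gamma(1-rho) * integral_0^inf (1 - exp(-u*x)) * u**(-rho-1) du.
# Substitute u = exp(t) so the endpoint behavior is integrated smoothly.
t = np.linspace(math.log(1e-12), math.log(1e10), 200_001)
u = np.exp(t)
g = (1.0 - np.exp(-u * x)) * u ** (-rho - 1.0) * u  # du = u dt
h = t[1] - t[0]
integral = h * (g.sum() - 0.5 * (g[0] + g[-1]))     # trapezoidal rule
val = rho / math.gamma(1.0 - rho) * integral

print(val, x ** rho)  # both close to sqrt(2) ~ 1.41421
```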

7.
Entropy (Basel); 22(1), 2019 Dec 30.
Article in English | MEDLINE | ID: mdl-33285826

ABSTRACT

We explore a well-known integral representation of the logarithmic function and demonstrate its usefulness in obtaining compact, easily computable, exact formulas for quantities that involve expectations and higher moments of the logarithm of a positive random variable (or of the logarithm of a sum of i.i.d. positive random variables). The integral representation of the logarithm proves useful in a variety of information-theoretic applications, including universal lossless data compression, entropy and differential entropy evaluations, and the calculation of the ergodic capacity of the single-input, multiple-output (SIMO) Gaussian channel with random parameters (known to both transmitter and receiver). This integral representation and its variants are anticipated to serve as a useful tool in additional applications, as a rigorous alternative to the popular (but non-rigorous) replica method (at least in some situations).
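As an illustrative sketch (not part of the original abstract): a well-known integral representation of the logarithm, of the Frullani type, is ln(x) = ∫_0^∞ (e^{−t} − e^{−tx})/t dt for x > 0. The snippet below verifies it numerically at x = 3 (a value chosen here for illustration).

```python
import math
import numpy as np

x = 3.0

# Frullani-type representation:
#   ln(x) = integral_0^inf (exp(-t) - exp(-t*x)) / t dt,  x > 0.
# The integrand is finite at t -> 0 (it tends to x - 1) and decays fast.
t = np.linspace(1e-9, 60.0, 200_001)
g = (np.exp(-t) - np.exp(-t * x)) / t
h = t[1] - t[0]
val = h * (g.sum() - 0.5 * (g[0] + g[-1]))  # trapezoidal rule

print(val, math.log(x))  # both about 1.0986
```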

8.
Entropy (Basel); 20(5), 2018 May 19.
Article in English | MEDLINE | ID: mdl-33265473

ABSTRACT

This paper focuses on f-divergences and consists of three main contributions. The first introduces integral representations of a general f-divergence by means of the relative information spectrum. The second provides a new approach for the derivation of f-divergence inequalities and exemplifies their utility in the setup of Bayesian binary hypothesis testing. The third studies the local behavior of f-divergences.
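As an illustrative sketch (not part of the original abstract): an f-divergence is defined, for a convex f with f(1) = 0, by D_f(P‖Q) = Σ_y q(y) f(p(y)/q(y)). The snippet below instantiates this definition for two standard choices of f, recovering the relative entropy and the total variation distance (the distributions are arbitrary choices made for illustration).

```python
import numpy as np

def f_divergence(p, q, f):
    """D_f(P||Q) = sum_y q(y) * f(p(y)/q(y)), for convex f with f(1) = 0."""
    return float(np.sum(q * f(p / q)))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.5, 0.3])

kl = f_divergence(p, q, lambda t: t * np.log(t))        # f(t) = t log t
tv = f_divergence(p, q, lambda t: 0.5 * np.abs(t - 1))  # f(t) = |t - 1| / 2

print(kl, tv)  # kl about 0.2238, tv about 0.3
```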

9.
Entropy (Basel); 20(12), 2018 Nov 22.
Article in English | MEDLINE | ID: mdl-33266620

ABSTRACT

This paper provides tight bounds on the Rényi entropy of a function of a discrete random variable with a finite number of possible values, where the considered function is not one-to-one. To that end, a tight lower bound on the Rényi entropy of a discrete random variable with finite support is derived as a function of the size of the support and the ratio of the maximal to minimal probability masses. This work was inspired by a recently published paper by Cicalese et al., which focuses on the Shannon entropy; the present work strengthens and generalizes the results of that paper to Rényi entropies of arbitrary positive orders. In view of these generalized bounds and the works of Arikan and Campbell, non-asymptotic bounds are derived for guessing moments and lossless data compression of discrete memoryless sources.
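As an illustrative sketch (not part of the original abstract): the Rényi entropy of order α ∈ (0, 1) ∪ (1, ∞) is H_α(X) = (1/(1−α)) log Σ_i p_i^α; it is non-increasing in α and tends to the Shannon entropy as α → 1. The snippet below illustrates both properties on a distribution chosen arbitrarily for illustration.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha (in nats), for alpha > 0, alpha != 1."""
    return float(np.log(np.sum(p ** alpha)) / (1.0 - alpha))

p = np.array([0.5, 0.25, 0.125, 0.125])
shannon = float(-np.sum(p * np.log(p)))  # equals 1.75 * ln 2 for this pmf

# H_alpha is non-increasing in alpha and tends to H(X) as alpha -> 1.
for a in (0.5, 0.999, 1.001, 2.0):
    print(a, renyi_entropy(p, a))
print("Shannon:", shannon)
```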
