It is known that Kaniadakis entropy, a generalization of the Shannon-Boltzmann-Gibbs entropic form, is always super-additive for any bipartite statistically independent distributions. In this paper, we show that, when a suitable constraint is imposed, there exist classes of maximal-entropy distributions labeled by a positive real number ℵ>0 that make Kaniadakis entropy multi-additive, i.e., Sκ[pA∪B]=(1+ℵ)Sκ[pA]+Sκ[pB], under the composition of two statistically independent and identically distributed distributions pA∪B(x,y)=pA(x)pB(y), with reduced distributions pA(x) and pB(y) belonging to the same class.
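As an illustrative sketch (not the paper's derivation), the κ-entropy Sκ[p] = −Σᵢ pᵢ lnκ(pᵢ), with lnκ(x) = (x^κ − x^(−κ))/(2κ), can be computed numerically, and the super-additivity claimed above checked directly for an independent product distribution; the distributions and the value κ = 0.5 below are arbitrary choices.

```python
import numpy as np

def kappa_entropy(p, kappa=0.5):
    """Kaniadakis kappa-entropy S_k[p] = -sum_i p_i * ln_k(p_i),
    with ln_k(x) = (x**k - x**(-k)) / (2k); recovers Shannon as k -> 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # zero-probability states contribute nothing
    return -np.sum((p**(1 + kappa) - p**(1 - kappa)) / (2 * kappa))

# Two independent marginal distributions and their product (joint) distribution.
pa = np.array([0.2, 0.3, 0.5])
pb = np.array([0.6, 0.4])
joint = np.outer(pa, pb).ravel()

s_a, s_b, s_ab = kappa_entropy(pa), kappa_entropy(pb), kappa_entropy(joint)
print(s_ab >= s_a + s_b)  # super-additivity for independent composition: True
```

The `p[p > 0]` filter mirrors the usual convention 0·ln 0 = 0; in the κ → 0 limit the function returns the ordinary Shannon entropy.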
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10814381
DOI: http://dx.doi.org/10.3390/e26010077
Sensors (Basel)
September 2024
National Key Laboratory of Laser Spatial Information, Institute of Opto-Electronic, Harbin Institute of Technology, Harbin 150001, China.
Photon-counting laser detection and ranging (LiDAR), especially Geiger-mode avalanche photodiode (Gm-APD) LiDAR, can obtain three-dimensional images of a scene with single-photon sensitivity, but background noise limits the imaging quality. To solve this problem, a depth-image estimation method based on a two-dimensional (2D) Kaniadakis entropy thresholding method is proposed, which transforms a weak-signal extraction problem into a denoising problem for point-cloud data. The method exploits the signal-peak aggregation in the data and the spatio-temporal correlation between target image elements in the point cloud-intensity data.
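The paper's method operates on 2D histograms of Gm-APD returns; as a much-simplified illustration (1D rather than the 2D formulation, with a hypothetical photon-count histogram), a Kaniadakis-entropy threshold can be selected Kapur-style by maximizing the summed κ-entropies of the two classes split at each candidate bin:

```python
import numpy as np

def kappa_entropy(p, kappa=0.5):
    # Kaniadakis entropy of a discrete distribution; zero bins are skipped.
    p = p[p > 0]
    return -np.sum((p**(1 + kappa) - p**(1 - kappa)) / (2 * kappa))

def kaniadakis_threshold(hist, kappa=0.5):
    """Return the bin index t maximizing S_k(background) + S_k(signal),
    splitting the histogram at t (a 1D, Kapur-style simplification)."""
    hist = hist / hist.sum()
    best_t, best_s = 1, -np.inf
    for t in range(1, len(hist)):
        bg, fg = hist[:t], hist[t:]
        if bg.sum() == 0 or fg.sum() == 0:
            continue
        s = (kappa_entropy(bg / bg.sum(), kappa)
             + kappa_entropy(fg / fg.sum(), kappa))
        if s > best_s:
            best_s, best_t = s, t
    return best_t

# Hypothetical histogram: a noise floor, a gap, then a signal peak.
hist = np.array([10.0, 10, 10, 0, 0, 0, 7, 7, 7])
t = kaniadakis_threshold(hist)
```

With the gap between the two modes, any split inside the gap scores equally, and the search returns the first such bin; the real method additionally uses the 2D spatio-temporal structure of the point cloud.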
Entropy (Basel)
May 2024
Dipartimento di Scienza Applicata e Tecnologia, Politecnico di Torino, Corso Duca degli Abruzzi 24, 10129 Torino, Italy.
The axiomatic structure of the κ-statistical theory is proven. In addition to the first three standard Khinchin-Shannon axioms of continuity, maximality, and expansibility, two further axioms are identified, namely the self-duality axiom and the scaling axiom. It is shown that both the κ-entropy and its special limiting case, the classical Boltzmann-Gibbs-Shannon entropy, follow unambiguously from the above new set of five axioms.
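The axioms constrain the deformed logarithm lnκ(x) = (x^κ − x^(−κ))/(2κ); in particular, self-duality (lnκ(1/x) = −lnκ(x)) and the recovery of the ordinary logarithm as κ → 0 are easy to verify numerically (the values of κ and x below are arbitrary):

```python
import numpy as np

def ln_kappa(x, kappa=0.3):
    # Kaniadakis deformed logarithm; -> np.log(x) as kappa -> 0.
    return (x**kappa - x**(-kappa)) / (2 * kappa)

x = 2.7
print(np.isclose(ln_kappa(1 / x), -ln_kappa(x)))        # self-duality: True
print(np.isclose(ln_kappa(x, kappa=1e-8), np.log(x)))   # kappa -> 0 limit: True
```

Self-duality holds exactly, since swapping x for 1/x simply exchanges the two terms in the numerator.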
Entropy (Basel)
January 2024
Region of Electrical and Electronic Systems Engineering, Ibaraki University, 4-12-1 Nakanarusawa-cho, Hitachi 316-8511, Ibaraki, Japan.
It is known that Kaniadakis entropy, a generalization of the Shannon-Boltzmann-Gibbs entropic form, is always super-additive for any bipartite statistically independent distributions. In this paper, we show that when imposing a suitable constraint, there exist classes of maximal entropy distributions labeled by a positive real number ℵ>0 that make Kaniadakis entropy multi-additive, i.e., Sκ[pA∪B]=(1+ℵ)Sκ[pA]+Sκ[pB].
Entropy (Basel)
July 2023
Department of Political Science, Communication and International Relations, University of Macerata, Via Don Minzoni 22/A, 62100 Macerata, Italy.
The paper reviews the "κ-generalized distribution", a statistical model for the analysis of income data. Basic analytical properties, interrelationships with other distributions, and standard measures of inequality such as the Gini index and the Lorenz curve are covered. An extension of the basic model that best fits wealth data is also discussed.
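In the commonly used parameterization (assumed here, not taken from the abstract), the κ-generalized model has complementary CDF exp_κ(−βx^α) with exp_κ(t) = (√(1+κ²t²) + κt)^(1/κ), interpolating between a Weibull-like bulk and a Pareto power-law tail with exponent α/κ; a sketch with arbitrary parameters:

```python
import numpy as np

def exp_kappa(t, kappa):
    # Kaniadakis exponential: reduces to exp(t) as kappa -> 0.
    return (np.sqrt(1 + (kappa * t)**2) + kappa * t)**(1 / kappa)

def ccdf(x, alpha=2.0, beta=1.0, kappa=0.5):
    # Complementary CDF of the kappa-generalized model (assumed form).
    return exp_kappa(-beta * x**alpha, kappa)

# Pareto tail: on a log-log plot the CCDF slope approaches -alpha/kappa = -4.
slope = (np.log(ccdf(1000.0)) - np.log(ccdf(100.0))) / np.log(10.0)
```

The heavy Pareto tail is what makes the model a better fit for the top of the income distribution than a plain Weibull, while the κ → 0 limit recovers the Weibull CCDF exp(−βx^α).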
Entropy (Basel)
July 2023
Department of Mathematics, University of Genoa, 16144 Genova, Italy.
We propose to use a particular case of Kaniadakis' logarithm for the exploratory analysis of compositional data following the Aitchison approach. The affine information geometry derived from Kaniadakis' logarithm provides a consistent setup for the geometric analysis of compositional data. Moreover, the affine setup suggests a rationale for choosing a specific divergence, which we name the Kaniadakis divergence.
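The divergence adopted in the paper is not reproduced here; as an illustrative sketch, one natural κ-deformation of the Kullback-Leibler divergence, D_κ(p‖q) = Σᵢ pᵢ lnκ(pᵢ/qᵢ), is nonnegative for 0 < κ < 1 (by concavity of lnκ and Jensen's inequality) and recovers KL as κ → 0:

```python
import numpy as np

def ln_kappa(x, kappa):
    return (x**kappa - x**(-kappa)) / (2 * kappa)

def d_kappa(p, q, kappa=0.5):
    # A kappa-deformed KL divergence (illustrative; not necessarily the
    # exact Kaniadakis divergence defined in the paper).
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.sum(p * ln_kappa(p / q, kappa))

# Two compositions (points on the probability simplex).
p = np.array([0.2, 0.3, 0.5])
q = np.array([0.4, 0.4, 0.2])
```

Like KL, this quantity vanishes iff p = q, since lnκ(1) = 0 and lnκ is strictly increasing.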