Analyzing the Effect of Bin-width on the Computed Entropy

Authors

  • Sri Purwani, Department of Mathematics, Padjadjaran University, Bandung
  • Sudradjat Supian, Department of Mathematics, Padjadjaran University, Bandung
  • Carole Twining, Imaging Science, The University of Manchester, Manchester

DOI:

https://doi.org/10.26713/jims.v9i4.1023

Keywords:

Entropy, Objective function, Gaussian distribution, Flat distribution

Abstract

The Shannon entropy is a mathematical expression for quantifying the amount of randomness, which can be used to measure information content, and it appears in objective functions. Mutual Information (MI) uses the Shannon entropy to determine the shared information content of two images. The Shannon entropy, originally derived by Shannon in the context of lossless encoding of messages, is also used to define the optimum message length in the Minimum Description Length (MDL) principle for groupwise registration. We first derive the Shannon entropy from the integral of a probability density function (pdf), and show that the Gaussian has the maximum entropy over all distributions with a given variance. We also show that the entropy of the flat distribution is less than the entropy of the Gaussian distribution with the same variance. We then investigate the effect of bin-width on the computed entropy, analyzing the relationship between the computed entropy and the integral entropy when the bin-width is varied while the variance and the number of samples are held fixed. We find that the value of the computed entropy lies within the theoretical predictions at small and large bin-widths. We also exhibit two types of bias in entropy estimators.
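
As a quick check on the ordering claimed in the abstract (a sketch based on standard differential-entropy formulas, not text quoted from the article), a Gaussian and a flat (uniform) distribution with the same variance \( \sigma^{2} \) have entropies

\[
h_{\mathrm{Gauss}} = \tfrac{1}{2}\ln\!\left(2\pi e\,\sigma^{2}\right),
\qquad
h_{\mathrm{flat}} = \ln\!\left(\sqrt{12}\,\sigma\right) = \tfrac{1}{2}\ln\!\left(12\,\sigma^{2}\right),
\]

so that

\[
h_{\mathrm{Gauss}} - h_{\mathrm{flat}} = \tfrac{1}{2}\ln\frac{2\pi e}{12} \approx 0.18\ \text{nats} > 0 .
\]

For a histogram estimate with bin-width \( \Delta \), the standard small-bin relation \( H_{\Delta} \approx h - \ln\Delta \) links the computed (discrete) entropy to the integral (differential) entropy \( h \); the theoretical predictions at small and large bin-widths, and the estimator biases analyzed in the article, refine this relation for finite samples.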

References

L.G. Brown, A survey of image registration techniques, ACM Computing Surveys 24 (1992), 325 – 376.

C.C. Brun, N. Leporé, X. Pennec, Y. Chou, A.D. Lee, G.D. Zubicaray, M.J. Wright, J.C. Gee, P.M. Thompson et al., A non-conservative Lagrangian framework for statistical fluid registration - SAFIRA, IEEE Transactions on Medical Imaging 30(2) (2011), 184 – 202.

A.G. Carlton, On the bias of information estimates, Psychological Bulletin 71(2) (1969), 108 – 109.

J.H.C. Lisman and M.C.A. van Zuylen, Note on the generation of most probable frequency distributions, Statistica Neerlandica 26(1) (1972), 19 – 23.

R. Moddemeijer, On estimation of entropy and mutual information of continuous distributions, Signal Processing 16(3) (1989), 233 – 248.

L. Paninski, Estimation of entropy and mutual information, Neural Computation 15(6) (2003), 1191 – 1253.

J.P.W. Pluim, J.B.A. Maintz and M.A. Viergever, Interpolation artefacts in mutual information-based image registration, Computer Vision and Image Understanding 77 (2000), 211 – 232.

J.P.W. Pluim, J.B.A. Maintz and M.A. Viergever, Mutual-information-based registration of medical images: a survey, IEEE Transactions on Medical Imaging 22(8) (2003), 986 – 1004.

M.R. Sabuncu and P. Ramadge, Using spanning graphs for efficient image registration, IEEE Transactions on Image Processing 17(5) (2008), 788 – 797.

C.E. Shannon, A mathematical theory of communication, The Bell System Technical Journal 27 (1948), 379 – 423.

C. Studholme, D.L.G. Hill and D.J. Hawkes, Multiresolution voxel similarity measures for MR-PET registration, in Information Processing in Medical Imaging, Vol. 3 of Computational Imaging and Vision (1995), 287 – 298, Kluwer.

C.J. Twining and C.J. Taylor, Specificity: a graph-based estimator of divergence, IEEE Transactions on Pattern Analysis and Machine Intelligence 33(12) (2011), 2492 – 2505.

P.A. Viola, Alignment by Maximization of Mutual Information, PhD thesis, Massachusetts Institute of Technology (1995).

P. Viola and W.M. Wells III, Alignment by maximization of mutual information, International Journal of Computer Vision 24(2) (1997), 137 – 154.

Published

2017-12-30

How to Cite

Purwani, S., Supian, S., & Twining, C. (2017). Analyzing the Effect of Bin-width on the Computed Entropy. Journal of Informatics and Mathematical Sciences, 9(4), 1117–1123. https://doi.org/10.26713/jims.v9i4.1023

Issue

Vol. 9, No. 4 (2017)

Section

Research Articles