Generalized Information Measures and Their Applications
by Inder Jeet Taneja
Publisher: Universidade Federal de Santa Catarina, 2001
Contents: Shannon's Entropy; Information and Divergence Measures; Entropy-Type Measures; Generalized Information and Divergence Measures; M-Dimensional Divergence Measures and Their Generalizations; Unified (r,s)-Multivariate Entropies; Noiseless Coding and Generalized Information Measures; Channel Capacity and Source Coding Theorems; Statistical Aspects of Information Measures; Bayesian Probability of Error and Generalized Information Measures; Fuzzy Sets and Information Measures.
by Peter D. Gruenwald, Paul M.B. Vitanyi - CWI
We introduce algorithmic information theory, also known as the theory of Kolmogorov complexity. We explain this quantitative approach to defining information and discuss the extent to which Kolmogorov's and Shannon's theories share a common purpose.
by John Watrous - University of Calgary
The focus is on the mathematical theory of quantum information. We will begin with basic principles and methods for reasoning about quantum information, and then move on to a discussion of various results concerning quantum information.
by Robert H. Schumann - arXiv
A short review of ideas in quantum information theory. Quantum mechanics is presented together with some useful tools for the quantum mechanics of open systems. The treatment is pedagogical and suitable for beginning graduate students in the field.
by Neri Merhav - arXiv
Lecture notes for a graduate course on the relations between Information Theory and Statistical Physics. The course is aimed at EE graduate students in the area of Communications and Information Theory, and at graduate students in Physics.