
## Distribution of Mutual Information from Complete and Incomplete Data

Authors: Marcus Hutter and Marco Zaffalon (2002-2005)
Comments: 26 pages, 5 figures, 4 tables
Subj-class: Learning; Artificial Intelligence
ACM-class: I.2
Reference: Computational Statistics & Data Analysis, 48:3 (2005) 633-657
Report-no: IDSIA-11-02 and cs.LG/0403025
Paper: LaTeX - PostScript - PDF - Html/Gif
Slides: PowerPoint - PDF

Keywords: Mutual information, cross entropy, Dirichlet distribution, second order distribution, expectation and variance of mutual information, feature selection, filters, naive Bayes classifier, Bayesian statistics.

Abstract: Mutual information is widely used, in a descriptive way, to measure the stochastic dependence of categorical random variables. In order to address questions such as the reliability of the descriptive value, one must consider sample-to-population inferential approaches. This paper deals with the posterior distribution of mutual information, as obtained in a Bayesian framework by a second-order Dirichlet prior distribution. The exact analytical expression for the mean, and analytical approximations for the variance, skewness, and kurtosis are derived. These approximations have a guaranteed accuracy level of the order O(1/n^3), where n is the sample size. Leading-order approximations for the mean and the variance are derived in the case of incomplete samples. The derived analytical expressions allow the distribution of mutual information to be approximated reliably and quickly. In fact, the derived expressions can be computed with the same order of complexity needed for descriptive mutual information. This makes the distribution of mutual information a concrete alternative to descriptive mutual information in many applications that would benefit from moving to the inductive side. Some of these prospective applications are discussed, and one of them, namely feature selection, is shown to perform significantly better when inductive mutual information is used.
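The abstract's claim that the posterior mean costs no more to compute than descriptive mutual information can be illustrated with a short sketch. The exact-mean formula below is reconstructed from the standard digamma form associated with this line of work, E[I] = (1/n) Σ_ij n_ij [ψ(n_ij+1) − ψ(n_i+ +1) − ψ(n_+j +1) + ψ(n+1)] with Dirichlet prior counts folded into the n_ij; it is not quoted from this page, and the function names are illustrative. For integer counts the digammas reduce to harmonic numbers, since the Euler-Mascheroni constants cancel across the four terms.

```python
import math

def harmonic(m):
    """H_m = sum_{k=1}^m 1/k; psi(m+1) = H_m - gamma, and the gammas cancel."""
    return sum(1.0 / k for k in range(1, m + 1))

def descriptive_mi(counts):
    """Plug-in (maximum-likelihood) mutual information of a contingency table, in nats."""
    n = sum(sum(row) for row in counts)
    row_tot = [sum(row) for row in counts]
    col_tot = [sum(col) for col in zip(*counts)]
    mi = 0.0
    for i, row in enumerate(counts):
        for j, nij in enumerate(row):
            if nij > 0:
                mi += (nij / n) * math.log(nij * n / (row_tot[i] * col_tot[j]))
    return mi

def posterior_mean_mi(counts, alpha=1):
    """Posterior mean of mutual information under a Dirichlet(alpha) prior
    (integer alpha assumed, so digammas become harmonic numbers).
    Same O(rs) complexity as descriptive_mi, as the abstract asserts."""
    c = [[nij + alpha for nij in row] for row in counts]  # fold prior into counts
    n = sum(sum(row) for row in c)
    row_tot = [sum(row) for row in c]
    col_tot = [sum(col) for col in zip(*c)]
    mean = 0.0
    for i, row in enumerate(c):
        for j, nij in enumerate(row):
            mean += (nij / n) * (harmonic(nij) - harmonic(row_tot[i])
                                 - harmonic(col_tot[j]) + harmonic(n))
    return mean
```

For a strongly dependent table such as [[10, 0], [0, 10]] the descriptive value is log 2, while for the independent table [[5, 5], [5, 5]] it is exactly 0; the posterior mean stays strictly positive in both cases, which is precisely the small-sample bias that the paper's inferential treatment makes quantifiable.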


@Article{Hutter:04mifs,
  author    = "Marcus Hutter and Marco Zaffalon",
  title     = "Distribution of Mutual Information from Complete and Incomplete Data",
  _number   = "IDSIA-11-02",
  journal   = "Computational Statistics \& Data Analysis",
  volume    = "48",
  number    = "3",
  pages     = "633--657",
  year      = "2005",
  _month    = mar,
  publisher = "Elsevier Science",
  url       = "http://www.hutter1.net/ai/mifs.htm",
  url2      = "http://arxiv.org/abs/cs.LG/0403025",
  ftp       = "ftp://ftp.idsia.ch/pub/techrep/IDSIA-11-02.pdf",
  categories = "I.2. [Artificial Intelligence]",
  keywords  = "Mutual information, cross entropy, Dirichlet distribution, second order distribution, expectation and variance of mutual information, feature selection, filters, naive Bayes classifier, Bayesian statistics.",
  abstract  = "Mutual information is widely used, in a descriptive way, to measure the stochastic dependence of categorical random variables. In order to address questions such as the reliability of the descriptive value, one must consider sample-to-population inferential approaches. This paper deals with the posterior distribution of mutual information, as obtained in a Bayesian framework by a second-order Dirichlet prior distribution. The exact analytical expression for the mean, and analytical approximations for the variance, skewness and kurtosis are derived. These approximations have a guaranteed accuracy level of the order O(1/n^3), where n is the sample size. Leading order approximations for the mean and the variance are derived in the case of incomplete samples. The derived analytical expressions allow the distribution of mutual information to be approximated reliably and quickly. In fact, the derived expressions can be computed with the same order of complexity needed for descriptive mutual information. This makes the distribution of mutual information become a concrete alternative to descriptive mutual information in many applications which would benefit from moving to the inductive side. Some of these prospective applications are discussed, and one of them, namely feature selection, is shown to perform significantly better when inductive mutual information is used.",
  note      = "to appear",
}
