Theis, Fabian J. and Bauer, Christoph and Lang, Elmar (2002) Comparison of maximum entropy and minimal mutual information in a nonlinear setting. Signal Processing 82, pp. 971-980.
Full text not available from this repository.
In blind source separation (BSS), two separation techniques are mainly used: Minimal Mutual Information (MMI), where minimization of the mutual output information yields an independent random vector, and Maximum Entropy (ME), where the output entropy is maximized. However, it is still unclear why ME should solve the separation problem, i.e., result in an independent vector. Yang and Amari have given a partial confirmation for ME in the linear case, proving that under the assumption of vanishing expectation of the sources, ME does not change the solutions of MMI except for scaling and permutation. In this paper, we generalize Yang and Amari's approach to nonlinear BSS problems, where random vectors are mixed by output functions of layered neural networks. We show that certain solution points of MMI are kept fixed by ME if no scaling is allowed in any layer. In general, however, ME might also change the scaling in the non-output network layers, thereby leaving the MMI solution points. We therefore conclude the paper by suggesting that in nonlinear ME algorithms the norm of all weight matrix rows of each non-output layer should be kept fixed in later epochs during network training.
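The concluding suggestion can be sketched as a simple row-norm constraint. The snippet below is a minimal illustration, not the authors' implementation: the network, the mock gradient step, and the helper `fix_row_norms` are all assumptions made for demonstration. It shows how, in later epochs, each row of a non-output weight matrix could be rescaled back to a fixed norm after every update, preventing ME from drifting away from an MMI solution through hidden-layer scaling.

```python
import numpy as np

def fix_row_norms(W, target_norms):
    """Rescale each row of W so its Euclidean norm matches target_norms.

    Hypothetical helper: freezes the per-row scaling of a non-output
    layer, as suggested in the paper's conclusion.
    """
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    return W * (np.asarray(target_norms).reshape(-1, 1) / norms)

rng = np.random.default_rng(0)
W_hidden = rng.normal(size=(3, 3))          # non-output layer weights
target = np.linalg.norm(W_hidden, axis=1)   # norms to keep fixed

# Mock "later epoch" update that would otherwise rescale the rows;
# a real ME algorithm would use its entropy-gradient step here.
W_hidden += 0.1 * rng.normal(size=W_hidden.shape)
W_hidden = fix_row_norms(W_hidden, target)

print(np.allclose(np.linalg.norm(W_hidden, axis=1), target))  # True
```

Only the direction of each row can change under this constraint, which is exactly the freedom ME retains once per-row scaling in hidden layers is removed.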
Institutions: Biology, Preclinical Medicine > Institut für Biophysik und physikalische Biochemie > Prof. Dr. Elmar Lang
Projects: Graduiertenkolleg Nichtlinearität und Nichtgleichgewicht
Subjects: 500 Science > 530 Physics; 500 Science > 570 Life sciences
Refereed: Yes, this version has been refereed
Created at the University of Regensburg: Yes
Deposited On: 20 Mar 2007
Last Modified: 04 Oct 2010 11:25