  Comparison of image decomposition techniques: results from psychophysics and computation

Nielsen, K., Rainer, G., & Logothetis, N. (2001). Comparison of image decomposition techniques: results from psychophysics and computation. Poster presented at 4. Tübinger Wahrnehmungskonferenz (TWK 2001), Tübingen, Germany.

Creators

Creators:
Nielsen, K¹, Author
Rainer, G¹, Author
Logothetis, NK¹, Author
Affiliations:
¹ Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society

Content

Keywords: -
Abstract: Our visual environment is highly structured, generating complex statistical dependencies in natural scenes. In the framework of information theory, these dependencies between input variables generate redundancy in the input. Removing the redundancies would lead to the construction of an efficient code for natural scenes. A number of computational algorithms have been developed that are based on the assumption of redundancy reduction. Four such algorithms, often discussed in the context of natural scene statistics, were used here: Principal Component Analysis, Independent Component Analysis, Sparsenet, and Non-negative Matrix Factorization. Each of the algorithms decomposes the natural scenes into basis features or functions. The input can then be reconstructed by a linear combination of these basis functions. Thus, the decomposition algorithms can be seen as filtering the input data, such that the basis functions filter statistical dependencies. Because each of the four algorithms has a different statistical criterion for constructing the basis functions, each of them filters different types of statistics in the natural images. Usually, the performance of these decomposition algorithms is evaluated by calculating analytically defined measures, e.g. the reconstruction error between the original and the reconstructed image or the redundancy reduction achieved with the decomposition. Here, in addition to the calculation of reconstruction errors, we tested the algorithms in three psychophysical experiments. In each of the experiments, subjects had to match reconstructed images to their originals in a delayed match-to-sample paradigm. The performance of subjects in this task depends on the ability of the algorithms to preserve the information present in the original scenes. The three experiments tested different properties of the algorithms. The first experiment was concerned with the dependence of psychophysical performance on the number of basis functions used in the reconstruction. Since each of the algorithms derives its basis functions by applying a statistical criterion to a set of training images, an important question is how general these basis functions are; this was tested in the second experiment. Finally, the third experiment assessed the robustness of the algorithms against noise added in the reconstruction process. This evaluation of decomposition algorithms in a more natural context provides new insights that can extend the computational results. It also helps to clarify which statistical criteria are of importance in natural vision as opposed to purely computational purposes.
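
(The following is an illustrative sketch of the decompose-and-reconstruct scheme described in the abstract, not the authors' code: it uses scikit-learn's PCA and NMF as stand-ins for the four algorithms, omits Sparsenet and the specific ICA variant, and substitutes synthetic random patches for the natural-scene patches used in the study.)

import numpy as np
from sklearn.decomposition import PCA, NMF

rng = np.random.default_rng(0)
# 500 flattened 16x16 patches; the study would use natural-scene patches instead
patches = rng.random((500, 256))

for name, model in [("PCA", PCA(n_components=64)),
                    ("NMF", NMF(n_components=64, max_iter=500))]:
    codes = model.fit_transform(patches)   # per-patch coefficients
    recon = codes @ model.components_      # linear combination of basis functions
    if name == "PCA":
        recon += model.mean_               # PCA bases are defined around the mean patch
    mse = np.mean((patches - recon) ** 2)  # analytically defined reconstruction error
    print(f"{name}: mean squared reconstruction error with 64 bases = {mse:.5f}")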

Details

Language(s): -
Date: 2001-03
Publication status: Published
Pages: -
Place, publisher, edition: -
Table of contents: -
Type of review: -
Identifiers: URI: http://www.twk.tuebingen.mpg.de/twk01/Pcomp.htm
BibTeX citekey: NielsenRL2001
Type of degree: -

Veranstaltung

einblenden:
ausblenden:
Titel: 4. Tübinger Wahrnehmungskonferenz (TWK 2001)
Veranstaltungsort: Tübingen, Germany
Start-/Enddatum: -
