
Record

  Partial information decomposition as a unified approach to the specification of neural goal functions

Wibral, M., Priesemann, V., Kay, J. W., Lizier, J. T., & Phillips, W. A. (2017). Partial information decomposition as a unified approach to the specification of neural goal functions. Brain and Cognition, 112, 25-38. doi:10.1016/j.bandc.2015.09.004.


Basic data

Genre: Journal article

Creators

Creators:
Wibral, Michael, Author
Priesemann, Viola 1, Author
Kay, Jim W., Author
Lizier, Joseph T., Author
Phillips, William A., Author
Affiliations:
1Department of Nonlinear Dynamics, Max Planck Institute for Dynamics and Self-Organization, Max Planck Society, ou_2063286              

Content

Keywords: Information theory; Unique information; Shared information; Synergy; Redundancy; Predictive coding; Neural coding; Coherent infomax; Neural goal function
Abstract: In many neural systems anatomical motifs are present repeatedly, but despite their structural similarity they can serve very different tasks. A prime example of such a motif is the canonical microcircuit of six-layered neo-cortex, which is repeated across cortical areas and is involved in a number of different tasks (e.g. sensory, cognitive, or motor tasks). This observation has spawned interest in finding a common underlying principle, a ‘goal function’, of information processing implemented in this structure. By definition such a goal function, if universal, cannot be cast in processing-domain-specific language (e.g. ‘edge filtering’, ‘working memory’). Thus, to formulate such a principle, we have to use a domain-independent framework. Information theory offers such a framework. However, while the classical framework of information theory focuses on the relation between one input and one output (Shannon’s mutual information), we argue that neural information processing crucially depends on the combination of multiple inputs to create the output of a processor. To account for this, we use a very recent extension of Shannon information theory, called partial information decomposition (PID). PID allows one to quantify the information that several inputs provide individually (unique information), redundantly (shared information) or only jointly (synergistic information) about the output. First, we review the framework of PID. Then we apply it to reevaluate and analyze several earlier proposals of information-theoretic neural goal functions (predictive coding, infomax and coherent infomax, efficient coding). We find that PID allows these goal functions to be compared in a common framework, and also provides a versatile approach to designing new goal functions from first principles. Building on this, we design and analyze a novel goal function, called ‘coding with synergy’, which builds on combining external input and prior knowledge in a synergistic manner. We suggest that this novel goal function may be highly useful in neural information processing.
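
A minimal illustrative sketch (not the authors' code, and not part of this record): for the XOR relation Y = X1 XOR X2, the standard example in the PID literature, each input alone carries zero Shannon mutual information about the output, while both inputs together determine it completely. Classical single-input mutual information therefore misses this purely synergistic contribution, which is exactly the kind of term a partial information decomposition makes explicit. All function and variable names below are our own illustration.

import numpy as np
from collections import Counter

def mutual_information(xs, ys):
    # Shannon mutual information I(X;Y) in bits, estimated from paired samples.
    n = len(xs)
    joint = Counter(zip(xs, ys))
    px = Counter(xs)
    py = Counter(ys)
    mi = 0.0
    for (x, y), c in joint.items():
        # p(x,y) * log2( p(x,y) / (p(x) p(y)) ), rewritten in terms of counts
        mi += (c / n) * np.log2(c * n / (px[x] * py[y]))
    return mi

# The four equally likely input patterns of an XOR gate.
x1 = [0, 0, 1, 1]
x2 = [0, 1, 0, 1]
y  = [a ^ b for a, b in zip(x1, x2)]

print(mutual_information(x1, y))                 # 0.0 bits from input 1 alone
print(mutual_information(x2, y))                 # 0.0 bits from input 2 alone
print(mutual_information(list(zip(x1, x2)), y))  # 1.0 bit, available only jointly (synergy)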

Details

Language(s): eng - English
Date: 2015-10-21, 2017-03
Publication status: Published
Pages: -
Place, publisher, edition: -
Table of contents: -
Review type: Peer review
Identifiers: DOI: 10.1016/j.bandc.2015.09.004
BibTex Citekey: WibralPriesemannKayEtAl2015
Degree type: -

Source 1

Title: Brain and Cognition
Source genre: Journal
Creators:
Affiliations:
Place, publisher, edition: -
Pages: -
Volume / Issue: 112
Article number: -
Start / end page: 25 - 38
Identifier: -