
Record


Released

Journal Article

Thermodynamic efficiency of information and heat flow

MPG Authors

Janzing, D.
Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Full texts (restricted access)
No full texts are currently released for your IP range.
Full texts (freely accessible)
No freely accessible full texts are available in PuRe
Supplementary material (freely accessible)
No freely accessible supplementary materials are available
Citation

Allahverdyan, A., Janzing, D., & Mahler, G. (2009). Thermodynamic efficiency of information and heat flow. Journal of Statistical Mechanics: Theory and Experiment, 2009(9): P09011, pp. 1-35. doi:10.1088/1742-5468/2009/09/P09011.


Citation link: https://hdl.handle.net/11858/00-001M-0000-0013-C2F8-9
Abstract
A basic task of information processing is information transfer (flow). Here we study a pair of Brownian particles each coupled to a thermal bath at temperatures T1 and T2. The information flow in such a system is defined via the time-shifted mutual information. The information flow nullifies at equilibrium, and its efficiency is defined as the ratio of the flow to the total entropy production in the system. For a stationary state the information flows from higher to lower temperatures, and its efficiency is bounded from above by (max[T1, T2])/(|T1 − T2|). This upper bound is imposed by the second law and it quantifies the thermodynamic cost for information flow in the present class of systems. It can be reached in the adiabatic situation, where the particles have widely different characteristic times. The efficiency of heat flow—defined as the heat flow over the total amount of dissipated heat—is limited from above by the same factor. There is a complementarity between heat and information flow: the set-up which is most efficient for the former is the least efficient for the latter and vice versa. The above bound for the efficiency can be (transiently) overcome in certain non-stationary situations, but the efficiency is still limited from above. We study yet another measure of information processing (transfer entropy) proposed in the literature. Though this measure does not require any thermodynamic cost, the information flow and transfer entropy are shown to be intimately related for stationary states.
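
To make the setting described in the abstract concrete, the following is a minimal numerical sketch, not the paper's actual model or estimator: it simulates two harmonically coupled overdamped Langevin particles attached to baths at assumed temperatures T1 and T2, and estimates the time-shifted mutual information between x1(t) and x2(t + tau) via the Gaussian formula I = -1/2 ln(1 - rho^2), which applies here because the linear dynamics have a Gaussian stationary state. All parameter values (stiffnesses, unit friction, time step, lag) are illustrative assumptions, not taken from the paper.

import numpy as np

# Illustrative sketch (assumed model, not the paper's): two overdamped Brownian
# particles x1, x2 in harmonic traps, harmonically coupled to each other, each
# attached to its own thermal bath at temperature T1 resp. T2.
# Units: k_B = 1, friction coefficients = 1.

rng = np.random.default_rng(0)

T1, T2 = 2.0, 1.0        # assumed bath temperatures
k, kc = 1.0, 0.5         # assumed trap and coupling stiffnesses
dt = 1e-3                # integration time step
n_steps = 500_000        # trajectory length
tau_steps = 200          # time shift tau = tau_steps * dt

x1 = x2 = 0.0
traj1 = np.empty(n_steps)
traj2 = np.empty(n_steps)
s1 = np.sqrt(2.0 * T1 * dt)   # noise amplitudes (fluctuation-dissipation)
s2 = np.sqrt(2.0 * T2 * dt)

for i in range(n_steps):      # Euler-Maruyama integration
    f1 = -k * x1 - kc * (x1 - x2)
    f2 = -k * x2 - kc * (x2 - x1)
    x1 += f1 * dt + s1 * rng.standard_normal()
    x2 += f2 * dt + s2 * rng.standard_normal()
    traj1[i] = x1
    traj2[i] = x2

# Discard the transient, then estimate the time-shifted mutual information
# I(x1(t); x2(t + tau)). For a Gaussian stationary state it reduces to
# -0.5 * ln(1 - rho^2), with rho the correlation coefficient.
burn = n_steps // 10
a = traj1[burn:-tau_steps]            # x1(t)
b = traj2[burn + tau_steps:]          # x2(t + tau)
rho = np.corrcoef(a, b)[0, 1]
mi_shifted = -0.5 * np.log(1.0 - rho**2)

print(f"corr(x1(t), x2(t+tau))              = {rho:.4f}")
print(f"time-shifted mutual information     = {mi_shifted:.4f} nats")
print(f"bound max(T1,T2)/|T1-T2| (abstract) = {max(T1, T2) / abs(T1 - T2):.2f}")

For the assumed temperatures T1 = 2 and T2 = 1, the second-law bound quoted in the abstract evaluates to max[T1, T2]/|T1 − T2| = 2; the script prints it alongside the estimated time-shifted mutual information for comparison.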