"Directed Information Measures in Neuroscience"
edited by Michael Wibral, Raul Vicente, Joseph T. Lizier
in series "Understanding Complex Systems",
Springer, Berlin, 2014.
About -- Downloads (via Springer) -- Purchase (via Amazon)
The book grew out of a workshop I co-organised with Michael and Raul in Frankfurt in April 2014 -- the NeFF-Workshop on Non-linear and model-free Interdependence Measures in Neuroscience. Our workshop focussed on the use of transfer entropy in computational neuroscience. We managed to attract several strong speakers from this field, including Daniele Marinazzo, Daniel Chicharro, Luca Faes and Vasily Vakorin, as well as a good crowd of participants, many of whom were quite knowledgeable in this area, such as Demian Battaglia. In our humble opinion, the meeting was quite a success, culminating in lively discussion sessions at the end of each day. We were delighted to host Leontina Di Cecco from Springer at the workshop, and the book project grew from our discussions there.
The contributions were largely decided at the workshop, and with chapters contributed by the aforementioned authors, we managed to span most of the research taking place on information transfer in neuroscience. The book serves as a thorough introduction to measuring directed information transfer in computational neuroscience: how such measures are applied, what they can reveal, and what directions this research may take in the future. We're really happy with the end result, and even more pleased that we can now share it with you.
For a little more information, here's the teaser from the back of the book:
Analysis of information transfer has found rapid adoption in neuroscience, where a highly dynamic transfer of information continuously runs on top of the brain's slowly-changing anatomical connectivity. Measuring such transfer is crucial to understanding how flexible information routing and processing give rise to higher cognitive function. Directed Information Measures in Neuroscience reviews recent developments of concepts and tools for measuring information transfer, their application to neurophysiological recordings, and the analysis of interactions. Written by the most active researchers in the field, the book discusses the state of the art, future prospects, and challenges on the way to an efficient assessment of neuronal information transfer. Highlights include the theoretical quantification and practical estimation of information transfer, description of transfer locally in space and time, multivariate directed measures, information decomposition among a set of stimulus/response variables, and the relation between interventional and observational causality. Applications to neural data sets and pointers to open source software highlight the usefulness of these measures in experimental neuroscience. With state-of-the-art mathematical developments, computational techniques and applications to real data sets, this book will be of benefit to all graduate students and researchers interested in detecting and understanding the information transfer between components of complex systems.