
Sci Rep. 2018 Nov 01;8(1):16181. doi: 10.1038/s41598-018-34528-y.

Information spectra and optimal background states for dynamical networks.


Delsin Menolascino, ShiNung Ching

Affiliations

  1. Department of Electrical and Systems Engineering, Washington University in St. Louis, St. Louis, MO, 63130, USA. [email protected].
  2. Department of Electrical and Systems Engineering, Washington University in St. Louis, St. Louis, MO, 63130, USA.
  3. Department of Biology and Biomedical Sciences, Washington University in St. Louis, St. Louis, MO, 63130, USA.

PMID: 30385795 PMCID: PMC6212402 DOI: 10.1038/s41598-018-34528-y

Abstract

We consider the notion of stimulus representation over dynamic networks, wherein the network states encode information about the identity of an afferent input (i.e., stimulus). Our goal is to understand how the structure and temporal dynamics of networks support information processing. In particular, we conduct a theoretical study to reveal how the background or 'default' state of a network with linear dynamics allows it to best promote discrimination over a continuum of stimuli. Our principal contribution is the derivation of a matrix whose spectrum (eigenvalues) quantifies the extent to which the state of a network encodes its inputs. This measure, based on the notion of a Fisher linear discriminant, is relativistic in the sense that it provides an information value quantifying the 'knowability' of an input based on its projection onto the background state. We subsequently optimize the background state and highlight its relationship to the underlying state noise covariance. This result demonstrates how the best idle state of a network may be informed by its structure and dynamics. Further, we relate the proposed information spectrum to the controllability Gramian matrix, establishing a link between fundamental control-theoretic network analysis and information processing.
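What follows is a minimal, illustrative Python sketch, not the paper's derivation, of the ingredients named in the abstract for a discrete-time linear network x[k+1] = A x[k] + B u[k] + w[k]: the steady-state background covariance obtained from a Lyapunov equation, the controllability Gramian, and a Fisher-discriminant-style matrix, formed here by whitening the Gramian with the background covariance, whose eigenvalues serve as a rough 'information spectrum'. The discrete-time setting, variable names, and the specific matrix construction are all assumptions chosen for illustration.

    # Illustrative sketch only: a Fisher-style "information spectrum" for a
    # stable discrete-time linear network, relating input-driven directions
    # (controllability Gramian) to background state-noise covariance.
    import numpy as np
    from scipy.linalg import solve_discrete_lyapunov, sqrtm

    rng = np.random.default_rng(0)
    n, m = 6, 2                                    # network size, input dimension

    # Random stable network (spectral radius < 1) and input matrix (assumed)
    A = rng.standard_normal((n, n))
    A *= 0.9 / np.max(np.abs(np.linalg.eigvals(A)))
    B = rng.standard_normal((n, m))

    Q = 0.1 * np.eye(n)                            # process-noise covariance (assumed)

    # Steady-state background covariance: Sigma = A Sigma A^T + Q
    Sigma = solve_discrete_lyapunov(A, Q)

    # Controllability Gramian: W = sum_k A^k B B^T (A^T)^k
    W = solve_discrete_lyapunov(A, B @ B.T)

    # Whiten the stimulus-driven directions by the background covariance; the
    # eigenvalues of the result play the role of an information spectrum here.
    S = np.real(sqrtm(np.linalg.inv(Sigma)))
    M = S @ W @ S
    info_spectrum = np.sort(np.linalg.eigvalsh(M))[::-1]
    print("information spectrum (illustrative):", np.round(info_spectrum, 3))

The sketch is only meant to convey the flavor of the link asserted in the abstract: directions the input can excite strongly (large Gramian eigenvalues) carry more discriminative information once measured against the background noise covariance.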

