Evaluating performance of neural codes in model neural communication networks

Chris G. Antonopoulos (Corresponding Author), Ezequiel B Martinez, Murilo S Baptista

Research output: Contribution to journal › Article

1 Citation (Scopus)

Abstract

Information needs to be appropriately encoded to be reliably transmitted over physical media. Similarly, neurons have their own codes to convey information in the brain. Even though it is well known that neurons exchange information using a pool of several protocols of spatio-temporal encodings, the suitability of each code and its performance as a function of network parameters and external stimuli is still one of the great mysteries in neuroscience. This paper sheds light on this question by modeling small-size networks of chemically and electrically coupled Hindmarsh-Rose spiking neurons. We focus on a class of temporal and firing-rate codes that result from the neurons’ membrane potentials and phases, and quantify their performance numerically by estimating the Mutual Information Rate, i.e., the rate of information exchange. Our results suggest that the firing-rate and interspike-interval codes are more robust to additive Gaussian white noise. In a network of four interconnected neurons and in the absence of such noise, the pairs of neurons with the largest rate of information exchange under the interspike-interval and firing-rate codes are not adjacent in the network, whereas the spike-timing and phase (temporal) codes promote a large rate of information exchange for adjacent neurons. If this result could be extended to larger neural networks, it would suggest that small microcircuits preferably exchange information using temporal codes (spike timings and phases), whereas on the macroscopic scale, where pairs of neurons are typically not directly connected due to the brain’s sparsity, the firing-rate and interspike-interval codes would be the most efficient.
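As an illustrative sketch of the kind of model the abstract describes (not the authors’ actual code), the snippet below integrates a single Hindmarsh-Rose neuron with the Euler method and extracts the two readouts the paper compares: interspike intervals (a temporal code) and the firing rate. The parameter values (r, s, x0, I) and the spike-detection threshold are common choices from the Hindmarsh-Rose literature and are assumptions here, not the paper’s settings.

```python
# Minimal Hindmarsh-Rose (HR) neuron: Euler integration, spike detection,
# and the interspike-interval / firing-rate readouts. All parameters are
# standard illustrative values, not those used in the paper.

def simulate_hr(steps=200_000, dt=0.01, I=3.25):
    """Integrate the 3D HR model; return the membrane-potential trace x(t)."""
    r, s, x0 = 0.005, 4.0, -1.6   # slow-variable rate, scale, resting level
    x, y, z = -1.0, 0.0, 0.0      # arbitrary initial condition
    trace = []
    for _ in range(steps):
        dx = y + 3.0 * x**2 - x**3 - z + I   # fast membrane dynamics
        dy = 1.0 - 5.0 * x**2 - y            # recovery variable
        dz = r * (s * (x - x0) - z)          # slow adaptation current
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        trace.append(x)
    return trace

def detect_spikes(trace, dt=0.01, threshold=1.0):
    """Spike times = upward crossings of the membrane potential."""
    return [i * dt for i in range(1, len(trace))
            if trace[i - 1] < threshold <= trace[i]]

trace = simulate_hr()
spike_times = detect_spikes(trace)

# Temporal code: the sequence of interspike intervals (ISIs).
isis = [t1 - t0 for t0, t1 in zip(spike_times, spike_times[1:])]

# Rate code: spikes per unit time over the whole run.
rate = len(spike_times) / (len(trace) * 0.01)

print(f"{len(spike_times)} spikes, mean ISI {sum(isis) / len(isis):.2f}, "
      f"firing rate {rate:.3f}")
```

The Mutual Information Rate analysis in the paper would then be computed between such ISI or rate sequences recorded from pairs of coupled neurons; the single-neuron sketch above only shows how the codes themselves are read off the membrane trace.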
Original language: English
Pages (from-to): 90-102
Number of pages: 13
Journal: Neural Networks
Volume: 109
Early online date: 23 Oct 2018
DOI: 10.1016/j.neunet.2018.10.008
Publication status: Published - Jan 2019

Keywords

  • information
  • mutual information
  • Mutual Information Rate
  • neuroscience
  • brain
  • neurons
  • external stimuli
  • neural codes
  • Hindmarsh-Rose system
  • neural networks
  • interspike-interval code
  • firing-rate code

Cite this

Evaluating performance of neural codes in model neural communication networks. / Antonopoulos, Chris G. (Corresponding Author); Martinez, Ezequiel B; Baptista, Murilo S.

In: Neural Networks, Vol. 109, 01.2019, p. 90-102.

Research output: Contribution to journal › Article

Antonopoulos, Chris G.; Martinez, Ezequiel B; Baptista, Murilo S. / Evaluating performance of neural codes in model neural communication networks. In: Neural Networks. 2019; Vol. 109. pp. 90-102.
@article{fab9683dd6654fb38eecbd6bdd666bb2,
title = "Evaluating performance of neural codes in model neural communication networks",
abstract = "Information needs to be appropriately encoded to be reliably transmitted over physical media. Similarly, neurons have their own codes to convey information in the brain. Even though it is well known that neurons exchange information using a pool of several protocols of spatio-temporal encodings, the suitability of each code and its performance as a function of network parameters and external stimuli is still one of the great mysteries in neuroscience. This paper sheds light on this question by modeling small-size networks of chemically and electrically coupled Hindmarsh-Rose spiking neurons. We focus on a class of temporal and firing-rate codes that result from the neurons’ membrane potentials and phases, and quantify their performance numerically by estimating the Mutual Information Rate, i.e., the rate of information exchange. Our results suggest that the firing-rate and interspike-interval codes are more robust to additive Gaussian white noise. In a network of four interconnected neurons and in the absence of such noise, the pairs of neurons with the largest rate of information exchange under the interspike-interval and firing-rate codes are not adjacent in the network, whereas the spike-timing and phase (temporal) codes promote a large rate of information exchange for adjacent neurons. If this result could be extended to larger neural networks, it would suggest that small microcircuits preferably exchange information using temporal codes (spike timings and phases), whereas on the macroscopic scale, where pairs of neurons are typically not directly connected due to the brain’s sparsity, the firing-rate and interspike-interval codes would be the most efficient.",
keywords = "information, mutual information, Mutual Information Rate, neuroscience, brain, neurons, external stimuli, neural codes, Hindmarsh-Rose system, neural networks, interspike-interval code, firing-rate code",
author = "Antonopoulos, {Chris G.} and Martinez, {Ezequiel B} and Baptista, {Murilo S}",
note = "This work was performed using the Maxwell high performance and ICSMB computer clusters of the University of Aberdeen. All authors acknowledge financial support provided by the EPSRC Ref: EP/I032606/1 grant. C. G. A. contributed to this work while working at the University of Aberdeen and then, while working at the University of Essex.",
year = "2019",
month = jan,
doi = "10.1016/j.neunet.2018.10.008",
language = "English",
volume = "109",
pages = "90--102",
journal = "Neural Networks",
issn = "0893-6080",
publisher = "Elsevier Limited",

}

TY - JOUR

T1 - Evaluating performance of neural codes in model neural communication networks

AU - Antonopoulos, Chris G.

AU - Martinez, Ezequiel B

AU - Baptista, Murilo S

N1 - This work was performed using the Maxwell high performance and ICSMB computer clusters of the University of Aberdeen. All authors acknowledge financial support provided by the EPSRC Ref: EP/I032606/1 grant. C. G. A. contributed to this work while working at the University of Aberdeen and then, while working at the University of Essex.

PY - 2019/1

Y1 - 2019/1

N2 - Information needs to be appropriately encoded to be reliably transmitted over physical media. Similarly, neurons have their own codes to convey information in the brain. Even though it is well known that neurons exchange information using a pool of several protocols of spatio-temporal encodings, the suitability of each code and its performance as a function of network parameters and external stimuli is still one of the great mysteries in neuroscience. This paper sheds light on this question by modeling small-size networks of chemically and electrically coupled Hindmarsh-Rose spiking neurons. We focus on a class of temporal and firing-rate codes that result from the neurons’ membrane potentials and phases, and quantify their performance numerically by estimating the Mutual Information Rate, i.e., the rate of information exchange. Our results suggest that the firing-rate and interspike-interval codes are more robust to additive Gaussian white noise. In a network of four interconnected neurons and in the absence of such noise, the pairs of neurons with the largest rate of information exchange under the interspike-interval and firing-rate codes are not adjacent in the network, whereas the spike-timing and phase (temporal) codes promote a large rate of information exchange for adjacent neurons. If this result could be extended to larger neural networks, it would suggest that small microcircuits preferably exchange information using temporal codes (spike timings and phases), whereas on the macroscopic scale, where pairs of neurons are typically not directly connected due to the brain’s sparsity, the firing-rate and interspike-interval codes would be the most efficient.

AB - Information needs to be appropriately encoded to be reliably transmitted over physical media. Similarly, neurons have their own codes to convey information in the brain. Even though it is well known that neurons exchange information using a pool of several protocols of spatio-temporal encodings, the suitability of each code and its performance as a function of network parameters and external stimuli is still one of the great mysteries in neuroscience. This paper sheds light on this question by modeling small-size networks of chemically and electrically coupled Hindmarsh-Rose spiking neurons. We focus on a class of temporal and firing-rate codes that result from the neurons’ membrane potentials and phases, and quantify their performance numerically by estimating the Mutual Information Rate, i.e., the rate of information exchange. Our results suggest that the firing-rate and interspike-interval codes are more robust to additive Gaussian white noise. In a network of four interconnected neurons and in the absence of such noise, the pairs of neurons with the largest rate of information exchange under the interspike-interval and firing-rate codes are not adjacent in the network, whereas the spike-timing and phase (temporal) codes promote a large rate of information exchange for adjacent neurons. If this result could be extended to larger neural networks, it would suggest that small microcircuits preferably exchange information using temporal codes (spike timings and phases), whereas on the macroscopic scale, where pairs of neurons are typically not directly connected due to the brain’s sparsity, the firing-rate and interspike-interval codes would be the most efficient.

KW - information

KW - mutual information

KW - Mutual Information Rate

KW - neuroscience

KW - brain

KW - neurons

KW - external stimuli

KW - neural codes

KW - Hindmarsh-Rose system

KW - neural networks

KW - interspike-interval code

KW - firing-rate code

UR - http://www.mendeley.com/research/evaluating-performance-neural-codes-model-neural-communication-networks

U2 - 10.1016/j.neunet.2018.10.008

DO - 10.1016/j.neunet.2018.10.008

M3 - Article

VL - 109

SP - 90

EP - 102

JO - Neural Networks

JF - Neural Networks

SN - 0893-6080

ER -