Shannon's capacity theorem is a central result in the theory of communication. It states that if the amount of information contained in a signal is smaller than the channel capacity of the physical medium of communication, the signal can be transmitted with an arbitrarily small probability of error. The theorem applies to ideal channels of communication, in which the information to be transmitted does not alter the characteristics of the channel, a passive medium that essentially tries to reproduce the source of information. For an active channel, a network formed by elements that are dynamical systems (such as neurons, or chaotic or periodic oscillators), it is unclear whether the theorem applies, since an active channel can adapt to an input signal, altering its capacity. To shed light on this matter, we show, among other results, how to calculate the information capacity of an active channel of communication. We then show that the channel capacity depends on whether the active channel is self-excitable and that, contrary to a current belief, desynchronization can provide an environment in which large amounts of information can be transmitted through a self-excitable channel. An interesting example of a self-excitable active channel is a network of electrically coupled chaotic Hindmarsh-Rose neurons.
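As a concrete illustration of the kind of active channel the abstract describes, the sketch below integrates two Hindmarsh-Rose neurons with electrical (diffusive) coupling, using the standard textbook form of the model and conventional parameter values for chaotic bursting. The coupling strength `g`, initial conditions, and the simple Euler integration scheme are illustrative assumptions, not the specific network or numerics used in the paper.

```python
import numpy as np

# Standard Hindmarsh-Rose parameters (conventional values for the
# chaotic-bursting regime; the paper's settings may differ).
a, b, c, d = 1.0, 3.0, 1.0, 5.0
r, s, x_rest = 0.006, 4.0, -1.6
I_ext = 3.25  # constant external current

def hr_deriv(state, g, x_other):
    """Derivatives of one HR neuron, with electrical (diffusive)
    coupling g*(x_other - x) through the membrane-potential variable x."""
    x, y, z = state
    dx = y + b * x**2 - a * x**3 - z + I_ext + g * (x_other - x)
    dy = c - d * x**2 - y
    dz = r * (s * (x - x_rest) - z)
    return np.array([dx, dy, dz])

def simulate(g=0.5, dt=0.01, steps=100_000):
    """Euler-integrate two electrically coupled HR neurons and return
    the two membrane-potential time series (shape: steps x 2)."""
    s1 = np.array([-1.0, 0.0, 3.0])   # illustrative initial conditions
    s2 = np.array([-1.2, 0.1, 3.1])
    traj = np.empty((steps, 2))
    for k in range(steps):
        d1 = hr_deriv(s1, g, s2[0])   # both updates use the old states
        d2 = hr_deriv(s2, g, s1[0])
        s1 = s1 + dt * d1
        s2 = s2 + dt * d2
        traj[k] = s1[0], s2[0]
    return traj
```

Sweeping `g` changes how strongly the two neurons synchronize, which is the knob relevant to the abstract's point that desynchronization can affect how much information such a channel carries.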
|Number of pages||13|
|Journal||Physical Review E: Statistical, Nonlinear, and Soft Matter Physics|
|Publication status||Published - 8 Feb 2008|
|Keywords||inferior olive|