A Feedforward Neural Logic based on Synaptic and Volume Transmission
We consider a homeostatic mechanism that keeps a plastic layer of a feedforward neural network reactive to a long sequence of signals, neither collapsing into a fixed point of the state space nor overfitting. Homeostasis is achieved without requiring the neural network to pursue an offset through local feedbacks. Rather, each neuron evolves monotonically in the direction that increases its own parameter, while a global feedback emerges from volume transmission of a homeostatic signal. Namely: 1) each neuron is triggered to increase its own parameter so as to exceed the mean value of all the other neurons' parameters, and 2) a global feedback on the population emerges from the composition of the single neurons' behaviors, paired with a reasonable rule by which surrounding neurons in the same layer are activated. We provide a formal description of the model, which we implement in an ad hoc version of the π-calculus. Numerical simulations depict some typical behaviors that admit a plausible biological interpretation.
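The two rules above can be sketched numerically. The following is a minimal illustration, not the paper's π-calculus implementation: each neuron's scalar parameter, the learning rate `lr`, and the update schedule are all assumptions chosen for the sketch. Each neuron moves monotonically upward when it lies below the mean of the other neurons' parameters (rule 1), and that shared mean plays the role of the globally diffused, volume-transmitted feedback signal (rule 2).

```python
import random

def step(params, lr=0.1):
    """One update of the sketched homeostatic rule (hypothetical
    parameters): a neuron increases its parameter when it is below
    the mean of the OTHER neurons' parameters; it never decreases."""
    n = len(params)
    total = sum(params)
    updated = []
    for p in params:
        others_mean = (total - p) / (n - 1)  # mean over all other neurons
        # monotone dynamics: move up only when below the others' mean
        updated.append(p + lr * max(0.0, others_mean - p))
    return updated

random.seed(0)
params = [random.uniform(0.0, 1.0) for _ in range(8)]
initial = params[:]
initial_spread = max(params) - min(params)

for _ in range(500):
    params = step(params)

final_spread = max(params) - min(params)
print(f"spread: {initial_spread:.3f} -> {final_spread:.3f}")
```

Under these assumed dynamics the lagging neurons climb toward the population mean while the leading neuron stays put, so the spread of parameters contracts without any neuron ever decreasing its own parameter, a toy analogue of the emergent global feedback described in the abstract.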