DEEP Neural Networks

The neural networks proposed in the Related Articles are dynamic, explicit, evolutionary, and predictive (DEEP). 

The networks' dynamic operation means that the only changes during processing are in the levels of neuron activity.  No structural change is required, such as neurogenesis, synaptogenesis, or pruning, nor is any change required in the way neurons function, such as a change in synaptic strength or in the strength of action potentials.  This makes the networks' processing speed consistent with the "real time" of most brain functions (a few milliseconds). 

All neurons, connections, and types of synapses are shown explicitly, and all assumptions of neuron capabilities are stated explicitly.  Only minimal neuron capabilities are assumed, and no network capabilities are assumed.  So the networks are not "black boxes." 
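The two properties above can be illustrated with a minimal sketch.  This is not the authors' actual model; the neuron names, connections, and the simple fire-on-net-excitation rule are hypothetical, chosen only to show what "explicit connections, minimal neuron capabilities, activity-only dynamics" can look like in code.

```python
# Minimal sketch (hypothetical, not the model from the Related Articles):
# a fixed network whose only changing state is which neurons are active.
# Connections and synapse types (excitatory +1, inhibitory -1) are listed
# explicitly and never change, mirroring the "dynamic" and "explicit"
# properties described in the text.

from typing import Dict, List, Set, Tuple

# Explicit connectivity: target neuron -> list of (source neuron, synapse sign).
# Neurons A-D and their connections are illustrative assumptions.
CONNECTIONS: Dict[str, List[Tuple[str, int]]] = {
    "C": [("A", +1), ("B", -1)],   # C is excited by A, inhibited by B
    "D": [("C", +1)],              # D is excited by C
}

def step(active: Set[str]) -> Set[str]:
    """One update cycle: recompute which neurons fire.

    Only activity changes; the structure (CONNECTIONS) and synapse
    signs are fixed.  The sole assumed neuron capability is firing
    when net excitatory input exceeds inhibitory input.
    """
    nxt = set()
    for target, inputs in CONNECTIONS.items():
        net = sum(sign for src, sign in inputs if src in active)
        if net > 0:
            nxt.add(target)
    # Input neurons (no listed inputs) retain their external drive.
    nxt |= {n for n in active if n not in CONNECTIONS}
    return nxt

state = {"A"}          # external input activates A
state = step(state)    # C fires (excited by A; inhibitory B is silent)
state = step(state)    # D fires (excited by C)
```

Because every connection and every assumed capability is written out, nothing in the sketch is a "black box": the behavior follows entirely from the listed synapses and the single threshold rule.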

The networks are evolutionary in the sense that they demonstrate selective advantages for the phenomena they generate.  This includes phenomena whose functions remain uncertain, such as the matched periods of neural activity found in EEGs. 

Finally, the networks are predictive of nervous system phenomena.  That is, based on the explicit connections and assumed neuron capabilities, the models can be shown to generate known nervous system phenomena, and they may predict testable phenomena that are as yet unknown. 

DEEP neural networks may show how neurons are actually connected to process information with the speed of most brain functions.  The networks presented in the Related Articles are apparently the only theoretical neural networks in the literature that have the four DEEP properties.  The dual purpose of the articles is to demonstrate that it is possible to design DEEP neural networks and to call attention to the dearth of such models in the literature.  DEEP neural networks can promote discoveries of how synaptic connections are organized at the local level.