Given current assumptions about the biology of neural organization, some connectionists believe that it may not be possible to accurately model the brain's neural architecture. We have identified five restrictive neurobiological dogmas that we believe have limited the exploration of more fundamental correlations between computational and biological neural networks. We postulate that: 1) the dendritic tree serves as a synapse storage device rather than a simple summation device; 2) connection strength between neurons depends on the number and location of synapses of similar weight, not on synapses of variable weights; 3) axonal sprouting occurs regularly in adult organisms; 4) the postsynaptic genome directly controls the presynaptic cell via mRNA, rather than indirectly by the expression of NCAMs, reverse neurotransmitters, etc.; 5) dendritic spines serve a trophic function by controlling the development of new sprouts via a process we term retroduction. We entertain an alternative formulation of a computational neural element that is fully consistent with modern neuroscience research. We then show how our model neuron can learn under Hebbian conditions, and extend the model to explain non-Hebbian, one-trial learning. This work is significant because, by stretching the theoretical boundaries of modern neuroscience, we show how connectionists can potentially create new, more biologically based neural elements which, when interconnected into networks, exhibit not only the properties of existing backpropagation networks but other physiological properties as well.
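The Hebbian learning the abstract refers to can be illustrated with a minimal sketch of the classical correlation-based weight update (the function names, learning rate, and vectorless formulation here are hypothetical illustrations, not the paper's actual model neuron):

```python
def hebbian_update(weights, pre, eta=0.1):
    """One Hebbian step: synapses whose presynaptic input coincides with
    postsynaptic firing are strengthened (delta_w_i = eta * pre_i * post)."""
    # Postsynaptic activation as a weighted sum of presynaptic inputs
    # (the simple summation view the abstract's first postulate questions).
    post = sum(w * x for w, x in zip(weights, pre))
    # Strengthen each weight in proportion to pre/post co-activity.
    return [w + eta * x * post for w, x in zip(weights, pre)]

weights = [0.5, -0.2, 0.1]
pre = [1.0, 0.0, 1.0]          # inputs 0 and 2 are active, input 1 silent
updated = hebbian_update(weights, pre)
# Active inputs are potentiated; the silent input's weight is unchanged.
```

Under the abstract's own postulates, a change in connection strength would instead be realized by adding or removing synapses of similar weight rather than by scaling a single variable-weight synapse, but the correlational learning condition is the same.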