
2.2: Basic Biology of a Neuron as Detector


    Figure 2.2 shows the correspondence between neural biology and the detection functions it serves. Synapses are the connection points between sending neurons (the ones firing an alarm and sending a signal) and receiving neurons (the ones receiving that signal). Most synapses are on dendrites, the large branching trees (the word "dendrite" derives from the Greek "dendron," meaning tree) where the neuron integrates all the input signals. Like tributaries flowing into a major river, all these signals flow into the main dendritic trunk and then into the cell body, where the final integration of the signal takes place. The thresholding takes place at the very start of the output end of the neuron, called the axon (this starting place is called the axon hillock, apparently because it looks like a little hill). The axon also branches widely, and it forms the other side of the synapses onto other neurons' dendrites, completing the next link in the chain of communication. And onward it goes.


    Figure \(2.2\): Neuron as a detector, with corresponding biological components.

    This is all you need to know about neuron biology to understand the basic detector functionality: the neuron receives inputs, integrates them, and decides whether the integrated input is sufficiently strong to trigger an output signal.

    There are some additional biological properties regarding the nature of the input signals, which we'll see have various implications for neural function, including making the integration process better able to deal with large changes in overall input signal strength. There are at least three major sources of input signals to the neuron:

    • Excitatory inputs -- these are the "normal", most prevalent type of input from other neurons (roughly 85% of all inputs), which have the effect of exciting the receiving neuron (making it more likely to get over threshold and fire an "alarm"). They are conveyed via a synaptic channel called AMPA, which is opened by the neurotransmitter glutamate.
    • Inhibitory inputs -- these are the other 15% of inputs, which have the opposite effect to the excitatory inputs -- they cause the neuron to be less likely to fire, and serve to make the integration process much more robust by keeping the excitation in check. There are specialized neurons in the brain called inhibitory interneurons that generate this inhibitory input (we'll learn a lot more about these in the Networks chapter). This input comes in via GABA synaptic channels, driven by the neurotransmitter GABA.
    • Leak inputs -- these aren't technically inputs, as they are always present and active rather than driven by other neurons, but they serve a similar function to the inhibitory inputs by counteracting the excitation and keeping the neuron in balance overall. Biologically, leak channels are potassium (K+) channels.
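    The balance among these three input sources can be sketched as conductances that each pull the neuron's membrane potential toward their own equilibrium level -- excitation pulls it up, while inhibition and leak pull it back down. The following is a minimal illustrative sketch; the function name and all parameter values here are hypothetical placeholders, not values given in the text.

    ```python
    def net_current(v_m, g_e, g_i, g_l=0.1, e_e=1.0, e_i=0.25, e_l=0.3):
        """Net input current at membrane potential v_m (normalized units).

        g_e -- excitatory (AMPA) conductance, driven by glutamate
        g_i -- inhibitory (GABA) conductance, from inhibitory interneurons
        g_l -- leak (potassium channel) conductance, always active
        Each term drives v_m toward its equilibrium potential (e_e, e_i, e_l).
        """
        i_e = g_e * (e_e - v_m)   # excitation pulls v_m up toward e_e
        i_i = g_i * (e_i - v_m)   # inhibition pulls v_m down toward e_i
        i_l = g_l * (e_l - v_m)   # leak holds v_m near its resting level e_l
        return i_e + i_i + i_l

    # With no synaptic input, the neuron sits at its resting (leak) level,
    # so the net current there is zero:
    net_current(0.3, g_e=0.0, g_i=0.0)   # -> 0.0
    ```

    The key point of this form is that inhibition and leak don't simply subtract a fixed amount: their pull scales with how far the potential has been driven up, which is part of what makes the integration robust to large changes in overall input strength.
    
    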

    The inhibitory and excitatory inputs come from different neurons in the cortex: a given neuron can only send either excitatory or inhibitory outputs to other neurons, not both (although neurons in other brain areas do violate this constraint, neocortical pyramidal neurons appear to obey it). We will see the multiple implications of this constraint throughout the text.

    Finally, we introduce the notion of the net synaptic efficacy, or weight, which represents the total impact that a sending neuron's activity signal can have on the receiving neuron via its synaptic connection. The synaptic weight is one of the most important concepts in the entire field of computational cognitive neuroscience! We will be exploring it in many different ways as we go along. Biologically, it represents the net ability of the sending neuron's action potential to release neurotransmitter, and the ability of that neurotransmitter to open synaptic channels on the postsynaptic side (including the total number of such channels that are available to be opened). For the excitatory inputs, it is thus the amount of glutamate released by the sending neuron into the synapse, and the number and efficacy of AMPA channels on the receiving neuron's side of the synapse. Computationally, the weights determine what a neuron is detecting. A strong weight value indicates that the neuron is very sensitive to that particular input neuron, while a low weight means that input is relatively unimportant. The entire process of Learning amounts to changing these synaptic weights as a function of neural activity patterns in the sending and receiving neurons. In short, everything you know, every cherished memory in your brain, is encoded as a pattern of synaptic weights!
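    Putting the pieces together, the detector story above can be sketched as a weighted sum followed by a threshold: the weights determine what pattern the neuron is tuned to, the sum stands in for dendritic integration, and the threshold stands in for the axon hillock. This is an illustrative sketch only; the function name, weights, and threshold value are made up for the example.

    ```python
    def detect(inputs, weights, threshold=0.5):
        """Neuron-as-detector sketch: integrate weighted inputs,
        fire (output 1) if the total crosses threshold, else stay silent (0)."""
        net = sum(w * x for w, x in zip(weights, inputs))  # dendritic integration
        return 1 if net > threshold else 0                 # thresholding at the axon hillock

    # A neuron "tuned" to its first two inputs via strong weights:
    weights = [0.9, 0.8, 0.1]
    detect([1, 1, 0], weights)   # strong match to the tuning -> 1 (fires)
    detect([0, 0, 1], weights)   # weak match -> 0 (silent)
    ```

    Changing the weights changes what input pattern makes this detector fire, which is the computational content of the claim that learning amounts to changing synaptic weights.
    
    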

    To learn more about the biology of the neuron, see Neuron/Biology.