Biological neurons integrate electrochemical signals across thousands of dendritic inputs. The soma sums weighted inputs continuously. When membrane potential crosses threshold (~-55mV) at the axon hillock, an action potential fires - a stereotyped all-or-nothing voltage spike propagating down the axon.
Signals transmit at synapses via neurotransmitter release into the synaptic cleft, binding receptors on the postsynaptic cell. Both excitatory (AMPA, NMDA) and inhibitory (GABA) synapses exist. The ratio and timing of both shape output.
Resting state: -70mV. Na⁺/K⁺ pumps maintain the electrochemical gradient. Membrane is polarized. Na⁺ channels closed, K⁺ leak channels open. Cell is integrating dendritic input continuously.
Depolarization: sufficient input drives membrane to ~-55mV threshold. Voltage-gated Na⁺ channels open. Na⁺ floods in. Membrane potential rockets to +40mV in ~1ms. This is the spike.
Repolarization: Na⁺ channels inactivate. Voltage-gated K⁺ channels open. K⁺ flows out. Membrane returns toward resting potential. Slight hyperpolarization (undershoot, ~-80mV).
Absolute refractory: cannot fire again for ~1-2ms regardless of input. Relative refractory: can fire, but only with a stronger stimulus. Limits max firing rate to ~500-1000Hz. Enforces one-way propagation down the axon.
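The refractory ceiling follows from simple arithmetic: one spike per absolute refractory period, at best. A minimal sketch (the helper name is ours, not from the text):

```python
# Max sustained firing rate is bounded by the absolute refractory period:
# at most one spike per refractory window.
def max_firing_rate_hz(refractory_ms: float) -> float:
    """Upper bound on firing rate given an absolute refractory period in ms."""
    return 1000.0 / refractory_ms

print(max_firing_rate_hz(1.0))  # 1 ms refractory -> 1000 Hz ceiling
print(max_firing_rate_hz(2.0))  # 2 ms refractory -> 500 Hz ceiling
```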
Rate coding: information in spike frequency. Temporal coding: information in precise timing and burst patterns. Brain likely uses both. ANNs use neither - they output continuous values.
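The two coding schemes can be contrasted on a toy spike train (the spike times below are invented for illustration): a rate readout averages over a window, while inter-spike intervals expose temporal structure the average discards.

```python
# Hypothetical spike train (times in ms), including a burst near 41 ms.
spikes_ms = [3.1, 7.8, 12.2, 18.0, 40.5, 41.2, 41.9]

# Rate code: spike count over a 50 ms window, converted to Hz.
window_ms = 50.0
rate_hz = len(spikes_ms) / (window_ms / 1000.0)

# Temporal code: inter-spike intervals reveal the burst (sub-millisecond
# gaps) that the rate readout averages away.
isis = [b - a for a, b in zip(spikes_ms, spikes_ms[1:])]

print(rate_hz)  # ~140 Hz
print(isis)
```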
Human: highly myelinated long-range axons. Prefrontal cortex enables abstraction and planning. Dense cortico-thalamic loops. Glial cells (astrocytes) actively modulate synaptic transmission - not just support cells.
Octopus: each arm contains an autonomous ganglion that processes sensory data and executes motor commands without central brain input. Distributed intelligence. Cephalopods have no myelin but use giant axons for the escape reflex (a Mauthner-cell equivalent, best studied in squid). Unique: they recode proteins via RNA editing to adapt to temperature.
Insect neurons are small, often monopolar (single process), and tightly packed. Mushroom bodies (the insect analogue of cortex) support associative learning, memory, and navigation. Bees perform path integration, symbolic communication (waggle dance), and concept formation - all from roughly one million neurons.
C. elegans is the only organism with a fully mapped connectome (White et al. 1986). Its 302 neurons produce locomotion, chemotaxis, thermotaxis, and learning. A critical model organism for neural circuit research; the OpenWorm project aims to simulate the full nervous system in software.
| | BIOLOGICAL | ARTIFICIAL |
|---|---|---|
| Signal type | Discrete spikes (binary events) | Continuous floats |
| Computation | Spatial + temporal integration | Weighted sum + nonlinearity |
| Time | Temporal dynamics matter | No time dimension |
| Learning | Hebbian, STDP, neuromod | Backprop + gradient descent |
| Synapse type | Excitatory + inhibitory + modulatory | Weights only (positive or negative) |
| Dendrites | Compute themselves (active) | Not modeled |
| Plasticity | Continuous, lifelong | Train-time only (mostly) |
| Noise | Inherent stochasticity | Deterministic (mostly) |
| Glial cells | Active role in computation | Not modeled at all |
| Fan-in | 1,000 - 100,000 synapses | Thousands to millions |
| Fan-out | 100 - 10,000 targets | Architecture-defined |
| Refractory | Hard physiological limit | None |
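The "Artificial" column of the table reduces to one line of math: a weighted sum passed through a nonlinearity, producing a continuous value with no spikes and no time dimension. A minimal sketch (the sigmoid choice and the input values are illustrative):

```python
import math

# An artificial neuron: weighted sum + nonlinearity. Continuous output,
# no spikes, no temporal dynamics, no refractory period.
def artificial_neuron(inputs, weights, bias):
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid nonlinearity

y = artificial_neuron([0.5, -1.0, 0.25], [0.8, 0.1, -0.4], bias=0.2)
print(y)  # a single float in (0, 1) -- contrast with a discrete spike
```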
Spiking Neural Networks attempt to close the gap - neurons fire discrete spikes and time carries information. The standard SNN neuron model is the Leaky Integrate-and-Fire (LIF): τ·dV/dt = -(V - V_rest) + R·I(t), with a spike and a reset to V_reset whenever V crosses V_th.
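The LIF dynamics fit in a few lines of forward-Euler integration. A minimal sketch - the parameter values (τ = 10 ms, threshold -55 mV, etc.) are illustrative defaults, not prescribed by the text:

```python
# Leaky Integrate-and-Fire neuron, forward-Euler integration.
# All voltages in mV, times in ms; parameter values are illustrative.
def simulate_lif(input_current, dt=0.1, tau=10.0, v_rest=-70.0,
                 v_thresh=-55.0, v_reset=-75.0, r_m=10.0):
    """Return (voltage trace, spike times in ms) for a list of input currents."""
    v = v_rest
    trace, spikes = [], []
    for step, i_in in enumerate(input_current):
        # tau * dV/dt = -(V - V_rest) + R * I
        v += dt / tau * (-(v - v_rest) + r_m * i_in)
        if v >= v_thresh:              # threshold crossing: spike, then reset
            spikes.append(step * dt)
            v = v_reset
        trace.append(v)
    return trace, spikes

# Constant drive for 100 ms of simulated time produces regular spiking,
# since the steady-state voltage (v_rest + R*I = -50 mV) sits above threshold.
trace, spikes = simulate_lif([2.0] * 1000)
print(len(spikes))
```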
More complex models are computationally heavier: Hodgkin-Huxley simulates actual ion channel dynamics, while Izhikevich reproduces diverse spiking patterns with just two coupled equations. The trade-off: biological fidelity vs tractability.
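The Izhikevich model is small enough to sketch directly: v' = 0.04v² + 5v + 140 - u + I and u' = a(bv - u), with v reset to c and u bumped by d after each spike. The parameters below are the standard "regular spiking" set from Izhikevich's 2003 formulation, not values given in the text:

```python
# Izhikevich neuron: two coupled ODEs reproduce many cortical spiking
# patterns at a fraction of Hodgkin-Huxley's cost. "Regular spiking" params.
def simulate_izhikevich(i_in, steps=1000, dt=0.5,
                        a=0.02, b=0.2, c=-65.0, d=8.0):
    """Return spike times (ms) under a constant input current."""
    v, u = -65.0, -65.0 * b
    spikes = []
    for step in range(steps):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + i_in)
        u += dt * a * (b * v - u)
        if v >= 30.0:                  # spike cutoff: reset v, bump recovery u
            spikes.append(step * dt)
            v, u = c, u + d
    return spikes

print(len(simulate_izhikevich(10.0)))  # spikes over 500 ms of simulated time
```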
SNNs are event-driven - computation only happens when spikes occur. On neuromorphic hardware this is extremely efficient. Intel Loihi 2 and IBM TrueNorth implement spiking networks in silicon, achieving orders of magnitude better energy per inference than GPU-based ANNs for certain tasks.
The brain's efficiency advantage is the single most motivating fact in neuromorphic computing. 20 watts to run a human brain. The biological solution uses sparse spiking, analog computation, co-located memory and processing, and massive parallelism.
The brain's trick: von Neumann architectures separate memory from compute, requiring massive data movement (the "memory wall"). The brain co-locates both at the synapse. Each synaptic weight is stored exactly where computation happens. No bus, no bandwidth bottleneck.