NEURON

BIO vs MACHINE // COMPARATIVE ANATOMY ACTION POTENTIALS // ANN UNITS // SPIKING NETWORKS
HUMAN // INSECT // CEPHALOPOD // SILICON
01 // Biological Neuron
THE REAL THING
DENDRITES SOMA NUCLEUS HILLOCK AXON MYELIN SHEATH SYNAPSES -70mV rest → +40mV spike → refractory

Biological neurons integrate electrochemical signals across thousands of dendritic inputs. The soma sums weighted inputs continuously. When membrane potential crosses threshold (~-55mV) at the axon hillock, an action potential fires - a stereotyped all-or-nothing voltage spike propagating down the axon.

Signals transmit at synapses via neurotransmitter release into the synaptic cleft, binding receptors on the postsynaptic cell. Both excitatory (AMPA, NMDA) and inhibitory (GABA) synapses exist. The ratio and timing of both shape output.

02 // Action Potential
THE SPIKE
RESTING

-70mV. Na⁺/K⁺ pumps maintain electrochemical gradient. Membrane is polarized. Na⁺ channels closed, K⁺ leak channels open. Cell is integrating dendritic input continuously.

DEPOLARIZE

Sufficient input drives membrane to ~-55mV threshold. Voltage-gated Na⁺ channels open. Na⁺ floods in. Membrane potential rockets to +40mV in ~1ms. This is the spike.

REPOLARIZE

Na⁺ channels inactivate. Voltage-gated K⁺ channels open. K⁺ flows out. Membrane returns toward resting potential. Slight hyperpolarization (undershoot, ~-80mV).

REFRACTORY

Absolute refractory: cannot fire again for ~1-2ms regardless of input. Relative refractory: can fire with stronger stimulus. Limits max firing rate ~500-1000Hz. Enforces directionality.

spike train encoding // time →
RATE CODE
TEMPORAL
SILENCE

Rate coding: information in spike frequency. Temporal coding: information in precise timing and burst patterns. Brain likely uses both. ANNs use neither - they output continuous values.
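The two schemes can be sketched in a few lines of Python. The spike times below are invented for illustration:

```python
# Made-up spike times (seconds) for one neuron over a 100 ms window.
spikes = [0.012, 0.031, 0.048, 0.071, 0.090]
window = 0.100

# Rate code: only the count per unit time matters.
rate_hz = len(spikes) / window                      # 50 Hz

# Temporal code: the precise inter-spike intervals carry information too.
isis_ms = [round((b - a) * 1e3) for a, b in zip(spikes, spikes[1:])]
# isis_ms == [19, 17, 23, 19]: same mean rate, but a distinct temporal pattern
```

Two trains with identical rates can carry different temporal codes; an ANN unit collapses both into a single float.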

HUMAN
HOMO SAPIENS
Neurons: 86 billion
Synapses: ~100 trillion
Speed: 0.5–120 m/s
Power: ~20 W
Neuron type: Centralized CNS
Plasticity: Hebbian / LTP

Highly myelinated long-range axons. Prefrontal cortex enables abstraction and planning. Dense cortico-thalamic loops. Glial cells (astrocytes) actively modulate synaptic transmission; they are not mere support cells.

cortex cerebellum myelin
OCTOPUS
OCTOPUS VULGARIS
Neurons: 500 million
In arms: ~350M (70%)
Speed: slow, unmyelinated
Power: very low
Neuron type: Distributed
Plasticity: RNA editing

Each arm contains an autonomous ganglion that processes sensory data and executes motor commands without central brain input. Distributed intelligence. No myelin; cephalopods get fast conduction from giant axons instead, as in the squid escape reflex. Unique: recodes neural proteins via RNA editing to adapt to temperature.

distributed RNA edit no myelin
HONEYBEE
APIS MELLIFERA
Neurons: ~1 million
Brain vol: 1 mm³
Speed: fast for size
Power: <1 mW
Neuron type: Compact ganglia
Plasticity: Mushroom body

Insect neurons are small, often monopolar (single process), and tightly packed. Mushroom bodies (the insect analogue of cortex) support associative learning, memory, navigation. Bees perform path integration, symbolic communication (waggle dance), concept formation - all from 1 million neurons.
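Path integration, at its core, is running vector summation: keep adding outbound displacements and the home vector is the negation of the total. A toy sketch (headings and distances are invented, not bee data):

```python
import math

def home_vector(legs):
    """legs: (heading_deg, distance_m) pairs for the outbound trip."""
    x = sum(d * math.cos(math.radians(h)) for h, d in legs)
    y = sum(d * math.sin(math.radians(h)) for h, d in legs)
    return (-x, -y)   # vector pointing straight back to the nest

hx, hy = home_vector([(0, 100), (90, 50)])  # 100 m east, then 50 m north
# home vector ≈ (-100, -50): head back west and south in one straight line
```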

monopolar ganglia compact
C. ELEGANS
NEMATODE // FULLY MAPPED
Neurons: 302 exactly
Synapses: ~7,000
Connectome: fully known
Power: sub-microwatt
Neuron type: Simple ganglia
Plasticity: limited

The only organism with a fully mapped connectome (White, Southgate, Thomson & Brenner 1986). 302 neurons producing locomotion, chemotaxis, thermotaxis, learning. Critical model organism for neural circuit research. The OpenWorm project has simulated the full nervous system in software.

connectome model org 302 neurons
05 // Artificial Neuron
THE ANN UNIT
[diagram: inputs x₁, x₂, x₃ weighted by w₁, w₂, w₃ → Σ(xᵢwᵢ) + b → activation σ(z) → output ŷ]
z = Σᵢ (xᵢ · wᵢ) + b
output = σ(z)
// continuous float in, continuous float out. no time dimension. no refractory period. no ion channels. just linear algebra + nonlinearity.
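The whole unit fits in a few lines of plain Python. Inputs, weights, and bias below are arbitrary illustrative values, with sigmoid chosen as the nonlinearity:

```python
import math

def neuron(x, w, b):
    """z = Σ xᵢ·wᵢ + b, then a sigmoid nonlinearity σ(z)."""
    z = sum(xi * wi for xi, wi in zip(x, w)) + b
    return 1.0 / (1.0 + math.exp(-z))

# arbitrary illustrative values: z = 0.2 - 0.3 + 0.2 + 0.1 = 0.2
y = neuron(x=[0.5, -1.0, 2.0], w=[0.4, 0.3, 0.1], b=0.1)  # σ(0.2) ≈ 0.550
```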
ReLU
max(0, z)
sparse activation, fast. most used in deep networks. units can die (stuck at zero output) for negative z.
SOFTMAX
exp(zᵢ) / Σⱼ exp(zⱼ)
probability distribution over outputs. used in classification heads.
SIGMOID
1 / (1 + e⁻ᶻ)
biological inspiration. vanishing gradient problem at extremes.
GELU
z·Φ(z)
used in transformers (GPT, BERT). smooth, probabilistic gating.
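All four activations can be written in pure Python, no framework needed. The GELU below uses the exact CDF form rather than the tanh approximation, and the softmax shifts by max(zs), a standard numerical-stability trick:

```python
import math

def relu(z):
    """max(0, z): cheap, sparse; gradient is zero for z < 0 (dead units)."""
    return max(0.0, z)

def sigmoid(z):
    """1 / (1 + e^-z): squashes to (0, 1); gradients vanish at extremes."""
    return 1.0 / (1.0 + math.exp(-z))

def gelu(z):
    """z·Φ(z) with Φ the standard normal CDF (exact form, not the tanh approx)."""
    return z * 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def softmax(zs):
    """exp(zᵢ) / Σⱼ exp(zⱼ), shifted by max(zs) for numerical stability."""
    m = max(zs)
    exps = [math.exp(z - m) for z in zs]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([1.0, 2.0, 3.0])   # ≈ [0.090, 0.245, 0.665], sums to 1
```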
06 // Head to Head
BIO vs ANN
// BIOLOGICAL // ARTIFICIAL
Signal type // Discrete spikes (binary events) // Continuous floats
Computation // Spatial + temporal integration // Weighted sum + nonlinearity
Time // Temporal dynamics matter // No time dimension
Learning // Hebbian, STDP, neuromodulation // Backprop + gradient descent
Synapse type // Excitatory + inhibitory + modulatory // Weights only (positive or negative)
Dendrites // Actively compute themselves // Not modeled
Plasticity // Continuous, lifelong // Train-time only (mostly)
Noise // Inherent stochasticity // Deterministic (mostly)
Glial cells // Active role in computation // Not modeled at all
Fan-in // 1,000–100,000 synapses // Thousands to millions
Fan-out // 100–10,000 targets // Architecture-defined
Refractory // Hard physiological limit // None
07 // Spiking Neural Networks
SNN // THE BRIDGE

Spiking Neural Networks attempt to close the gap - neurons fire discrete spikes and time carries information. The standard SNN neuron model is the Leaky Integrate-and-Fire (LIF):

τ · (dV/dt) = -V(t) + R·I(t)
if V(t) ≥ V_threshold → SPIKE → V reset to V_rest
// membrane potential leaks toward rest. integrates current. fires when threshold crossed. refractory period enforced. much closer to biology.
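The LIF update above is a few lines of Euler integration. Parameters here are illustrative (τ = 20 ms, unit resistance, threshold at 1.0), not fitted to any cell:

```python
# Minimal Euler-stepped LIF neuron. Parameters are illustrative, not fitted.
def lif(current, dt=1e-4, tau=0.02, R=1.0, v_rest=0.0,
        v_thresh=1.0, refractory=0.002):
    """current: input I(t) sampled every dt seconds. Returns spike times (s)."""
    v, spikes, lockout = v_rest, [], 0.0
    for step, I in enumerate(current):
        t = step * dt
        if t < lockout:
            continue                                  # absolute refractory
        v += (dt / tau) * (-(v - v_rest) + R * I)     # leak + integrate
        if v >= v_thresh:
            spikes.append(t)                          # SPIKE
            v = v_rest                                # reset
            lockout = t + refractory                  # enforce refractory
    return spikes

# constant suprathreshold drive (I = 2x threshold) for 100 ms
spike_times = lif([2.0] * 1000)
```

With this drive the membrane charges toward V = R·I = 2, crossing threshold roughly every τ·ln 2 ≈ 14 ms plus the 2 ms refractory window, so the output is a regular spike train, about six spikes in the 100 ms window.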

More complex models (Hodgkin-Huxley, Izhikevich) simulate actual ion channel dynamics but are computationally heavier. The trade-off: biological fidelity vs tractability.
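The Izhikevich model is simple enough to sketch directly. The update below follows his published 2003 formulation (two 0.5 ms half-steps for v per 1 ms step), with the standard "regular spiking" parameter set:

```python
# Izhikevich (2003) two-variable model: near-LIF cost, rich spiking repertoire.
# Standard "regular spiking" parameters; v in mV, one entry of input per 1 ms.
def izhikevich(current_ms, a=0.02, b=0.2, c=-65.0, d=8.0):
    v, u, spikes = c, b * c, []
    for t, I in enumerate(current_ms):
        for _ in range(2):                       # two 0.5 ms half-steps for v
            v += 0.5 * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += a * (b * v - u)                     # slow recovery variable
        if v >= 30.0:                            # spike apex reached
            spikes.append(t)                     # time in ms
            v, u = c, u + d                      # reset, with adaptation
    return spikes

spike_times = izhikevich([10.0] * 1000)          # 1 s of constant drive
```

Swapping (a, b, c, d) reproduces bursting, chattering, fast-spiking, and other firing classes, which is the model's main appeal over LIF.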

SNNs are event-driven - computation only happens when spikes occur. On neuromorphic hardware this is extremely efficient. Intel Loihi 2 and IBM TrueNorth implement spiking networks in silicon, achieving orders of magnitude better energy per inference than GPU-based ANNs for certain tasks.

HARDWARE TARGET // NEUROMORPHIC
INTEL LOIHI 2
128 neuro-cores
1M spiking neurons
120M synapses
~1W at full load
IBM TRUENORTH
4096 neuro-cores
256 neurons/core
256M synapses
70mW inference
08 // Energy // The Real Gap
POWER CONSUMPTION

The brain's efficiency advantage is the single most motivating fact in neuromorphic computing. 20 watts to run a human brain. The biological solution uses sparse spiking, analog computation, co-located memory and processing, and massive parallelism.

Human brain
20 W
Honeybee brain
<1 mW
Intel Loihi 2
~1 W
RTX 4090
450 W
H100 (training)
700 W
GPT-4 cluster est.
>50 MW

The brain's trick is co-location: von Neumann architectures separate memory from compute, forcing massive data movement (the "memory wall"), while the brain stores each synaptic weight exactly where the computation happens. No bus, no bandwidth bottleneck.

10¹⁵
ops/sec/watt // brain estimate
10¹²
ops/sec/watt // best GPU (FP16)