Interactive 3D Educational Simulation — Explore the parallels between neurons and artificial networks
Grey-matter neurons receive electrochemical signals through their dendrites, integrate them in the soma, and fire action potentials down the axon. Each neuron forms synapses with roughly 7,000 other neurons.
An artificial neuron is a mathematical model: inputs are multiplied by weights, summed with a bias, and passed through an activation function. Millions to billions of these units form deep neural networks.
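The weighted-sum-plus-activation model above can be sketched in a few lines of Python. The weights, bias, and inputs here are arbitrary values chosen for illustration, not anything from the simulation itself:

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: weighted sum of inputs plus bias,
    passed through a sigmoid activation."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid squashes z into (0, 1)

# Hypothetical example: 3 inputs with made-up weights and bias
out = neuron([0.5, -1.0, 2.0], weights=[0.8, 0.2, -0.5], bias=0.1)
```

Swapping the sigmoid for another activation (ReLU, tanh) changes only the last line; the weighted sum is the same in every case.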
The neocortex contains on the order of 150,000 cortical columns, each with roughly 100,000 neurons arranged in 6 layers. Each column may act as a pattern-recognition module.
Deep networks stack layers of neurons. Each layer extracts increasingly abstract features — edges → shapes → objects in vision, for example.
When a neuron's membrane potential crosses about −55 mV (threshold), voltage-gated sodium channels open, causing a rapid depolarization spike to about +40 mV. This all-or-none signal travels down the axon at up to 120 m/s in large myelinated fibers.
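The threshold-and-spike behavior can be captured by a toy leaky integrate-and-fire model. This is a simplified sketch, not the simulation's actual dynamics; the time constant, input current, and step count are assumed values:

```python
def simulate_lif(input_current, dt=0.1, tau=10.0, steps=1000,
                 v_rest=-70.0, v_thresh=-55.0, v_spike=40.0):
    """Toy leaky integrate-and-fire neuron (mV and ms units).
    The potential leaks toward rest while input current pushes it up;
    crossing -55 mV triggers an all-or-none spike to +40 mV, then a reset."""
    v = v_rest
    spikes = 0
    trace = []
    for _ in range(steps):
        # leak toward resting potential plus constant injected current
        v += dt * ((v_rest - v) / tau + input_current)
        if v >= v_thresh:       # threshold crossed: fire
            trace.append(v_spike)
            v = v_rest          # reset after the spike
            spikes += 1
        else:
            trace.append(v)
    return spikes, trace

n_spikes, _ = simulate_lif(input_current=2.0)  # 100 ms of simulated time
```

With this current the potential settles toward −50 mV, so it repeatedly crosses the −55 mV threshold and fires at a regular rate; a weaker current that settles below threshold never fires at all, which is the all-or-none property in miniature.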
In ANNs, signals flow layer by layer: z = Wx + b, then a = σ(z). The activation σ (sigmoid, ReLU, etc.) introduces nonlinearity, enabling complex function approximation.
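The layer-by-layer flow z = Wx + b, a = σ(z) can be written directly. Here σ is ReLU, and the two-layer network's weights are hypothetical numbers for illustration:

```python
def relu(z):
    """Elementwise ReLU activation: max(0, z)."""
    return [max(0.0, v) for v in z]

def layer(x, W, b):
    """One layer: z = Wx + b (row-by-row dot products), then a = relu(z)."""
    z = [sum(w * xi for w, xi in zip(row, x)) + bi
         for row, bi in zip(W, b)]
    return relu(z)

# Toy 2-layer forward pass with made-up weights (illustration only)
x = [1.0, 2.0]
h = layer(x, W=[[0.5, -0.3], [0.1, 0.8]], b=[0.0, 0.1])  # hidden layer
y = layer(h, W=[[1.0, -1.0]], b=[0.2])                   # output layer
```

Without the nonlinearity σ, stacking layers would collapse into a single matrix multiply; the activation is what lets depth buy expressive power.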
A single neuron fires anywhere from about 1 to 200 times per second. The brain's ~86 billion neurons operating in parallel perform on the order of 10^16 synaptic operations per second, comparable to a petascale supercomputer, while consuming roughly 20 W (the power of a dim lightbulb).
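The ~10^16 figure is a back-of-envelope estimate, and it can be reproduced with rough arithmetic. The average firing rate below is an assumption chosen to match the order of magnitude; published estimates of mean cortical firing rates vary widely:

```python
neurons = 86e9        # ~86 billion neurons
synapses_per = 7_000  # ~7,000 synapses per neuron (rough average)
avg_rate_hz = 20      # assumed average firing rate; illustration only

# Each spike triggers one event at each downstream synapse
ops_per_sec = neurons * synapses_per * avg_rate_hz  # ~1.2e16
```

Even shifting the assumed rate an order of magnitude in either direction keeps the estimate within the 10^15 to 10^17 range, which is why the comparison to petascale machines is reasonable despite the uncertainty.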