
Hopfield Associative Memory

A classical Hopfield network stores binary patterns ξ^μ on N = 100 spins arranged as a 10×10 grid. Symmetric Hebbian weights w_ij = (1/N) Σ_μ ξ_i^μ ξ_j^μ (zero diagonal) define the Lyapunov (energy) function E = −½ Σ_{i,j} w_ij S_i S_j. Asynchronous zero-temperature updates S_i ← sign(Σ_j w_ij S_j) never increase E, so noisy states relax toward stored attractors: the standard cartoon of associative memory as descent on an energy landscape, with caveats about capacity and spurious minima as the number of stored patterns P grows.
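
For concreteness, here is a minimal sketch of those two definitions, assuming NumPy and patterns stored as ±1 integer arrays (the function names are illustrative, not the simulator's source):

```python
import numpy as np

N = 100  # 10x10 grid flattened to N spins

def hebb_weights(patterns):
    """Hebbian couplings w_ij = (1/N) sum_mu xi_i^mu xi_j^mu, zero diagonal."""
    W = sum(np.outer(xi, xi) for xi in patterns) / N
    np.fill_diagonal(W, 0.0)  # no self-coupling
    return W

def energy(W, S):
    """Lyapunov (energy) function E = -1/2 sum_ij w_ij S_i S_j."""
    return -0.5 * S @ W @ S
```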

Who it's for: Students in statistical mechanics, neural-networks introductions, or information theory who want a hands-on Hopfield energy picture.

Key terms

  • Hopfield model
  • Hebbian learning
  • Associative memory
  • Energy function
  • Attractor network
  • Asynchronous dynamics

Hopfield net


Presets (load into grid)

N = 100 binary neurons; Hebb rule with P ≤ 7 patterns. Click-drag the lattice to paint. Async updates drive E = −½ Σ w_ij S_i S_j downhill (zero-temperature recall).

Shortcuts

  • Space / Enter — run / pause recall
  • P — pause / resume
  • R — reset patterns & spins

Measured values

Energy E: 0.00
Stored patterns P: 0
Best overlap m_μ: —
Locally stable fraction: 1.00
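
The page does not define the "locally stable fraction" readout; a plausible reading is the fraction of spins already aligned with (or tied on) their local field, in which case a sketch is:

```python
def locally_stable_fraction(W, S):
    """Assumed definition: fraction of spins i with S_i = sign(h_i),
    counting zero-field ties as stable (the tie rule keeps S_i)."""
    h = W @ S
    return np.mean((np.sign(h) == S) | (h == 0))
```

Under this reading, with no stored patterns W is all zeros and every spin is tied, which matches the initial 1.00 readout.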

How it works

Small Hopfield associative memory on a 10×10 grid: Hebbian weights, energy landscape, paint states, memorize patterns, add noise, and watch zero-temperature async dynamics retrieve a stored attractor.
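
A minimal end-to-end sketch of that loop, reusing the helpers above (the three random patterns and the 25% flip fraction are illustrative choices, not the simulator's defaults):

```python
rng = np.random.default_rng(0)

patterns = [rng.choice([-1, 1], size=N) for _ in range(3)]
W = hebb_weights(patterns)

# Corrupt one stored pattern by flipping 25% of its spins.
S = patterns[0].copy()
S[rng.random(N) < 0.25] *= -1

# Zero-temperature async recall: sweep in random order until no spin flips.
changed = True
while changed:
    changed = False
    for i in rng.permutation(N):
        h = W[i] @ S
        new = S[i] if h == 0 else (1 if h > 0 else -1)  # tie keeps S_i
        if new != S[i]:
            S[i] = new
            changed = True

print("recovered pattern 0:", np.array_equal(S, patterns[0]))
```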

Key equations

w_ij = (1/N) Σ_μ ξ_i^μ ξ_j^μ (i ≠ j), E = −½ Σ_{i,j} w_ij S_i S_j, update: pick i at random, set S_i = sign(Σ_j w_ij S_j) (tie keeps S_i).
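
The "never increases E" claim follows directly from these equations: because W is symmetric with zero diagonal, flipping only spin i changes the energy by ΔE = 2 S_i h_i with h_i = Σ_j w_ij S_j, and the rule only flips S_i when it disagrees with sign(h_i), i.e. when S_i h_i < 0. A sketch that cross-checks this against the direct energy difference, reusing W, S, and energy from above:

```python
def delta_E(W, S, i):
    """Energy change from flipping spin i: dE = 2 * S_i * h_i
    (valid for symmetric W with zero diagonal)."""
    return 2.0 * S[i] * (W[i] @ S)

# Cross-check against the direct difference E(flipped) - E(current).
i = int(rng.integers(N))
S_flip = S.copy()
S_flip[i] *= -1
assert np.isclose(delta_E(W, S, i), energy(W, S_flip) - energy(W, S))
```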

Frequently asked questions

Why cap at seven patterns?
The classic Hopfield capacity on random unbiased patterns scales like P ≈ 0.14 N in this normalization, i.e. about 14 patterns at N = 100; seven is a conservative teaching cap before overlaps and spurious states dominate.
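
A rough empirical probe of that estimate, reusing the helpers above: store P random patterns and measure how stable the memories themselves remain (the P values and seed are illustrative):

```python
for P in (5, 10, 14, 20):
    pats = [rng.choice([-1, 1], size=N) for _ in range(P)]
    W_P = hebb_weights(pats)
    stable = np.mean([locally_stable_fraction(W_P, xi) for xi in pats])
    print(P, round(float(stable), 3))  # stability degrades as P passes ~0.14 N
```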
Does synchronous updating also lower E?
Not guaranteed. The Hopfield Lyapunov argument is standard for random sequential (single-unit) updates; parallel sweeps can increase energy and oscillate.
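
A hypothetical two-spin counterexample: with the single stored pattern ξ = (+1, −1) and Hebbian weights, a synchronous sweep started from (+1, +1) flips both spins at once and the state 2-cycles forever instead of settling into an attractor:

```python
xi = np.array([1, -1])
W2 = np.outer(xi, xi) / 2.0
np.fill_diagonal(W2, 0.0)

S2 = np.array([1, 1])
for step in range(4):
    S2 = np.where(W2 @ S2 >= 0, 1, -1)  # parallel update of all spins
    print(step, S2)  # alternates between [-1 -1] and [1 1] forever
```

Under single-spin updates the same start instead relaxes to ±ξ and stays there.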
What does the overlap readout mean?
For each stored pattern μ, m_μ = N⁻¹ Σ_i ξ_i^μ S_i measures alignment of the current state with that memory; the sidebar highlights the largest magnitude among them as a quick “which memory wins” indicator near convergence.
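
In code, reusing patterns and S from the recall sketch above, that readout is just:

```python
def overlaps(patterns, S):
    """m_mu = (1/N) sum_i xi_i^mu S_i for each stored pattern."""
    return np.array([xi @ S / len(S) for xi in patterns])

m = overlaps(patterns, S)
best = int(np.argmax(np.abs(m)))  # "which memory wins"
print(best, m[best])              # near convergence, |m| -> 1 for the winner
```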