Local Minima experiment
This experiment shows that neural networks really can get stuck in a local minimum.
The goal of the neural network is to learn the parity function: whether the number of 1 bits in the input is odd. Upon startup, the weights are hardcoded to a human-understandable solution: the first hidden layer's nth node computes whether the number of 1 bits is at least n, the second hidden layer's nth node computes whether the number of 1 bits is exactly n, and the output node checks whether that count is 1, 3, 5, or 7. This demonstrates that a solution exists.
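To make the construction concrete, here is a minimal sketch of such a hardcoded solution in Python with NumPy. The 7-bit input width (inferred from the count checks of 1, 3, 5, 7), the hard threshold activation, and all names are illustrative assumptions, not taken from the experiment's actual code:

```python
import numpy as np

N = 7  # assumed input width; the output checks counts 1, 3, 5, 7

def step(z):
    # Hard threshold as a stand-in for whatever steep activation the demo uses
    return (z > 0).astype(float)

# Hidden layer 1: node n fires iff the input contains at least n one-bits.
W1 = np.ones((N, N))               # every input bit contributes +1 to the count
b1 = -(np.arange(1, N + 1) - 0.5)  # threshold sits between n-1 and n

# Hidden layer 2: node n fires iff the count is exactly n,
# i.e. "at least n" AND NOT "at least n+1".
W2 = np.eye(N) - np.eye(N, k=1)
b2 = np.full(N, -0.5)

# Output: fires iff the count is 1, 3, 5, or 7 (an odd number of bits).
w3 = np.array([1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0])
b3 = -0.5

def parity_net(x):
    h1 = step(W1 @ x + b1)
    h2 = step(W2 @ h1 + b2)
    return step(w3 @ h2 + b3)

# Verify that the hardcoded weights classify all 2^7 inputs correctly
for v in range(2 ** N):
    x = np.array([(v >> i) & 1 for i in range(N)], dtype=float)
    assert parity_net(x) == bin(v).count("1") % 2
```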
If you click the reset button and then train the network, it will usually converge to a fixed point that is almost perfect, but not quite: one or two inputs will be misclassified. This demonstrates that the network is stuck in a local minimum: if it weren't stuck, it could keep improving its weights and reach a fully working solution.
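This reset-and-train behavior is easy to probe numerically. Below is a minimal training-loop sketch under assumptions of mine: sigmoid activations, squared-error loss, full-batch gradient descent, and an arbitrary seed, learning rate, and iteration count; the demo's actual training setup may differ. Depending on the seed, the loop may settle near a fixed point with a handful of misclassified inputs, or occasionally find the full solution:

```python
import numpy as np

rng = np.random.default_rng(0)  # seed chosen arbitrarily

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

N = 7  # assumed input width, matching the hardcoded solution above
X = np.array([[(v >> i) & 1 for i in range(N)] for v in range(2 ** N)], dtype=float)
y = X.sum(axis=1) % 2  # parity targets over all 2^7 inputs

# "Reset": random weights, as when clicking the demo's reset button
W1 = rng.normal(size=(N, N)); b1 = np.zeros(N)
W2 = rng.normal(size=(N, N)); b2 = np.zeros(N)
w3 = rng.normal(size=N);      b3 = 0.0

def forward(X):
    h1 = sigmoid(X @ W1.T + b1)
    h2 = sigmoid(h1 @ W2.T + b2)
    return h1, h2, sigmoid(h2 @ w3 + b3)

lr = 0.5  # illustrative learning rate
for _ in range(20000):  # full-batch gradient descent on squared error
    h1, h2, out = forward(X)
    d3 = (out - y) * out * (1 - out)       # error at the output node
    d2 = np.outer(d3, w3) * h2 * (1 - h2)  # backpropagated to hidden layer 2
    d1 = (d2 @ W2) * h1 * (1 - h1)         # backpropagated to hidden layer 1
    w3 -= lr * (h2.T @ d3) / len(X); b3 -= lr * d3.mean()
    W2 -= lr * (d2.T @ h1) / len(X); b2 -= lr * d2.mean(axis=0)
    W1 -= lr * (d1.T @ X) / len(X);  b1 -= lr * d1.mean(axis=0)

_, _, out = forward(X)
wrong = int(np.sum((out > 0.5) != (y > 0.5)))
print(f"Misclassified inputs after training: {wrong} of {2 ** N}")
```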