Precision issues for learning with analog VLSI multilayer perceptrons
Cairns G., Tarassenko L.
Precision issues in training analog VLSI multilayer perceptron (MLP) networks are examined. At present, three techniques are used to train analog VLSI chips: off-chip learning, chip-in-the-loop learning, and on-chip learning. It is shown that successful MLP gradient-descent learning requires weight precisions of between 9 and 13 bits. If the weight precision falls below about 12 bits, learning can stall because weight updates smaller than the least significant bit of the weight representation are lost to quantization and thus have no effect.
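The stalling mechanism can be illustrated with a minimal sketch (not from the paper): with b-bit fixed-point weights on [-1, 1), the least significant bit is 2 / 2^b, and any update smaller than half that value rounds away, leaving the weight unchanged. The `quantize` helper and the specific numbers below are illustrative assumptions, not values from the paper.

```python
def quantize(w: float, bits: int) -> float:
    """Round w to the nearest representable b-bit fixed-point value on [-1, 1)."""
    lsb = 2.0 / (1 << bits)  # least significant bit for b-bit weights on [-1, 1)
    return round(w / lsb) * lsb

w = quantize(0.5, 8)             # an 8-bit weight (LSB ~ 7.8e-3)
update = 1e-4                    # a gradient-descent update far below the LSB
w_new = quantize(w + update, 8)
stalled = (w_new == w)           # True: the update is lost to quantization
```

With a higher precision (e.g. 16 bits, LSB ~ 3.1e-5) the same update does move the weight, which is why increasing weight precision restores gradient-descent learning.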
