Reverse Differentiation via Predictive Coding

Salvatori T
Song Y
Xu Z
Lukasiewicz T
Bogacz R

This paper proposes how networks of nerve cells in the brain could learn using an efficient method called "error back-propagation". This method is widely used to train artificial neural networks, but it is not clear whether the brain could use it. Previous studies described how networks of nerve cells could implement this method for simple computations; this paper extends the approach to more complex tasks.

Scientific Abstract

Deep learning has redefined AI thanks to the rise of artificial neural networks, which are inspired by neuronal networks in the brain. Over the years, interactions between AI and neuroscience have brought immense benefits to both fields, allowing neural networks to be used in a plethora of applications. Neural networks are trained using an efficient implementation of reverse differentiation, called backpropagation (BP). This algorithm, however, is often criticized for its biological implausibility (e.g., the lack of local update rules for the parameters). Therefore, biologically plausible learning methods that rely on predictive coding (PC), a framework for describing information processing in the brain, are increasingly studied. Recent works prove that these methods can approximate BP up to a certain margin on multilayer perceptrons (MLPs), and asymptotically on any other complex model, and that zero-divergence inference learning (Z-IL), a variant of PC, is able to exactly implement BP on MLPs. However, the recent literature also shows that no biologically plausible method yet exactly replicates the weight updates of BP on complex models. To fill this gap, in this paper, we generalize PC and Z-IL by defining them directly on computational graphs, and show that the result can perform exact reverse differentiation. This yields the first PC (and thus biologically plausible) algorithm that updates parameters exactly as BP does on any neural network, providing a bridge between the research communities of neuroscience and deep learning. Furthermore, these results immediately provide a novel local and parallel implementation of BP.
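To make the Z-IL claim concrete, below is a minimal sketch (our own illustration, not the authors' code) of zero-divergence inference learning on a two-layer MLP, checked against hand-derived backprop gradients. The conventions used here, value nodes holding pre-activations, the nonlinearity applied inside the prediction, the layer indexing of the update schedule, are assumptions chosen to match one common PC formulation; the key ingredients from the abstract are feedforward initialization, an integration step of 1, and updating each weight matrix only at the inference step matching its depth from the output.

import numpy as np

rng = np.random.default_rng(0)
f, df = np.tanh, lambda h: 1.0 - np.tanh(h) ** 2

# Tiny MLP: y = W2 @ f(W1 @ x), with squared-error loss 0.5 * ||t - y||^2.
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(2, 4))
x = rng.normal(size=(3, 1))
t = rng.normal(size=(2, 1))

# --- Standard backprop, written out by hand ---
h1 = W1 @ x
y = W2 @ f(h1)
delta2 = t - y                      # output error (negative loss gradient)
delta1 = df(h1) * (W2.T @ delta2)   # error propagated to the hidden layer
gW2_bp = delta2 @ f(h1).T
gW1_bp = delta1 @ x.T

# --- Z-IL: predictive coding with feedforward initialization, ---
# --- step size gamma = 1, and the weights of layer l read out ---
# --- at inference step t = l (counting from the output).      ---
x1 = W1 @ x                         # value node initialized by the forward pass
x2 = t.copy()                       # output value node clamped to the target

eps1 = x1 - W1 @ x                  # prediction errors: zero everywhere at
eps2 = x2 - W2 @ f(x1)              # initialization except at the clamped output
gW2_pc = eps2 @ f(x1).T             # local Hebbian-style update, step t = 0

x1 = x1 + 1.0 * (-eps1 + df(x1) * (W2.T @ eps2))  # one inference step, gamma = 1
eps1 = x1 - W1 @ x                  # hidden-layer error after the step
gW1_pc = eps1 @ x.T                 # local update, step t = 1

assert np.allclose(gW2_pc, gW2_bp)  # the prediction errors coincide with the
assert np.allclose(gW1_pc, gW1_bp)  # backprop deltas, so the updates are identical
print("Z-IL updates match backprop exactly.")

Note that every update here is local: each weight change depends only on the error node and value node at its own layer, which is what makes the scheme biologically plausible while still reproducing BP.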

Figure: Sample computational graph (left) and its adjusted version, which is easier for neural implementation.
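The paper's generalization is defined on computational graphs like the one in the figure. The sketch below (our own illustration under assumed conventions, not the paper's construction) shows the standard reverse-mode differentiation that such a graph supports: each node stores its parents and the local derivatives toward them, and the backward sweep applies the chain rule edge by edge in reverse topological order. The node fields and helper functions here are hypothetical names.

import math

class Node:
    def __init__(self, value, parents=(), local_grads=()):
        self.value = value              # forward value of this node
        self.parents = parents          # parent nodes in the graph
        self.local_grads = local_grads  # d(self)/d(parent), one per parent
        self.grad = 0.0                 # accumulated reverse-mode gradient

def mul(a, b):
    return Node(a.value * b.value, (a, b), (b.value, a.value))

def tanh(a):
    v = math.tanh(a.value)
    return Node(v, (a,), (1.0 - v * v,))

def backward(output):
    # Collect nodes in topological order (parents before children).
    order, seen = [], set()
    def visit(n):
        if id(n) not in seen:
            seen.add(id(n))
            for p in n.parents:
                visit(p)
            order.append(n)
    visit(output)
    output.grad = 1.0
    for n in reversed(order):
        for parent, g in zip(n.parents, n.local_grads):
            parent.grad += g * n.grad  # chain rule, purely local to each edge

# y = w2 * tanh(w1 * x): the same reverse sweep BP performs on a tiny MLP.
x, w1, w2 = Node(0.5), Node(-1.2), Node(2.0)
y = mul(w2, tanh(mul(w1, x)))
backward(y)
print(y.value, w1.grad, w2.grad)

Because every gradient contribution flows along a single edge of the graph, the same bookkeeping can be carried by local error nodes, which is the property the paper exploits to implement exact reverse differentiation with predictive coding.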
Citation
In Proceedings of the 36th AAAI Conference on Artificial Intelligence, AAAI 2022, Vancouver, BC, Canada, February 22 - March 1, 2022 (Vol. 10177, pp. 507-524). AAAI Press.